I've been reading a great deal lately on testing and development methodologies. Certainly all the rage these days are buzzwords like Test Driven Development (TDD) and agile/RAD programming. Exactly what these terms mean varies by interpretation, and I think few development shops implement them precisely as any given description (except, perhaps, their own) presents them. Anyway, in reading blogs, books, Wikipedia articles, etc., I have tried to assimilate, filter, and aggregate in order to come up with my own development methodology, admitting that I can't claim to be an expert in the field given that many others have spent a great deal more time thinking about the subject. Still, I feel entitled to an opinion, so here goes.
First of all, I want to say that I do in principle agree with the one-week SDLC cycle emphasized in Agile and RAD methods. It's my experience that the waterfall SDLC concept is generally not effective. Insisting on a rigorous and complete set of requirements up front is, I feel, a near guarantee of project failure. Requirements change constantly, at least on every project I've been on. More to the point, we learn about the requirements in the process of trying to implement them. Things that sounded like great ideas up front just don't look so good as implementations. So a continuous analyze/prototype/evaluate cycle makes sense. Get ideas, see if they work, and if they do, make those the requirements. I think this is efficient and effective, and given the popularity of Agile methods so, apparently, do many others.
I do think it is important to ensure that the rapid Agile cycles don't become excuses for cutting corners, though. One thing, for instance, that I keep reading in descriptions of Agile methods is that they "involve less documentation". I don't buy that. Why, just because we have quick cycles, do we get away with less documentation? This just sounds like a ploy by developers, famous for not wanting to document, to get away with it. I feel that every week the team should spend time keeping running documentation up to date. If they can design and program as they go, they can document as they go.
Basically, the way I see it, in Agile one shouldn't be abandoning the steps of the traditional waterfall SDLC; one should just be compressing them all into a single week. So planning/analysis/design/coding/testing/implementation/maintenance (well, replace implementation with prototyping, which may eventually become implementation) gets done every week. No cutting corners in any of these steps should be allowed. The only difference is that the project is broken into bite-sized pieces that can be accomplished in weekly cycles. I think this is doable and practical.
The "less documentation" paradigm is especially odd given the emphasis on testing in Agile methods, specifically unit testing. Testing is another area notoriously neglected by software developers, and Agile testing methods were specifically proposed to address this deficiency. Still, I'm concerned that simply declaring that every method (and ideally every logical input type to the method) needs to be tested does not guarantee good, effective tests. It's easy to toss together a quick test for a method and accomplish little more than not testing at all.
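To make that concern concrete, here's a small sketch of what I mean by a tossed-together test versus an effective one. The function under test, `parse_age`, is entirely made up for illustration; the point is the contrast between a test that merely calls the method and one that actually checks results, boundaries, and the inputs that must fail:

```python
# parse_age is a hypothetical example function, invented for illustration.
def parse_age(text):
    """Parse a non-negative integer age from a string."""
    value = int(text)
    if value < 0:
        raise ValueError("age cannot be negative")
    return value

def weak_test():
    # A quick, tossed-together test: it exercises the method but asserts
    # nothing, so it passes as long as no exception happens to be raised.
    parse_age("30")

def effective_test():
    # Check the actual result, not just that the call survived.
    assert parse_age("30") == 30
    # Check the boundary: zero is a valid age.
    assert parse_age("0") == 0
    # Check each logical input type that must be rejected.
    for bad in ("-1", "abc", ""):
        try:
            parse_age(bad)
            assert False, f"expected ValueError for {bad!r}"
        except ValueError:
            pass
```

Both tests "cover" the method, and a coverage report can't tell them apart; only the second one would catch a regression.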
So, here's my proposal. I think there should be some kind of guideline such that for every unit of time spent programming, the thing most of us programmers enjoy doing most and would do all day if we were allowed, there should be some proportional unit of time devoted to the less popular activities, testing and documentation. As an initial estimate, I'm going to propose 2:2:1. So for every 2 units (say 2 hours) of programming there should be an equal 2 units of testing and 1 unit of documentation. One might object that such rigid rules are unreasonable, that every day is different and requires adjusting the time spent on different tasks. Some projects, the objection goes, might have complex programming requirements but not need as much testing and documentation. I don't buy that. If a programming task is complex, then it logically requires proportional testing to validate that complexity and proportional documentation to understand it. That seems reasonable to me, so I'm sticking to my guns on that.
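The arithmetic of the proposal is simple enough to sketch. This is just an illustration of the 2:2:1 split (the function name and return shape are my own, not part of any methodology):

```python
# Illustrative sketch of the proposed programming:testing:documentation
# time split. allocate_hours is a made-up helper, not an established API.
def allocate_hours(total_hours, ratio=(2, 2, 1)):
    """Divide a block of work time according to the given
    programming:testing:documentation ratio."""
    parts = sum(ratio)
    return {
        "programming": total_hours * ratio[0] / parts,
        "testing": total_hours * ratio[1] / parts,
        "documentation": total_hours * ratio[2] / parts,
    }

# An 8-hour day under 2:2:1 comes out to 3.2 hours of programming,
# 3.2 hours of testing, and 1.6 hours of documentation.
```

Seen that way, the proposal really just says that testing gets as many hours as coding and documentation gets half as many.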
The proportions themselves I will leave negotiable, since I would need to put them into practice and reevaluate. They seem reasonable to me on the surface, but I believe in empirical evidence, so I'm keeping an open mind. The testing requirement I feel pretty confident in, since it seems natural that the complexity of the testing can't be less than the complexity of the coding. In fact, maybe the testing time should be higher. Testing, let's face it, is hard, involved work. I don't think developers should spend the same amount of time documenting as on these other two tasks, though. The primary job of a developer is not writing documentation, although documentation is a critical aspect of the job. Besides, even 2:1 between coding and documentation is much more than is typical (e.g. 0). A team could get very nice documents out of such an effort, I think. I doubt it should be much less, though. While not the primary role of the developer, we don't want to de-emphasize the importance of the task.
So, there's my proposal in a nutshell. I hope to think, and write, more about the subject as time goes by. Maybe I'll completely change my mind. These are admittedly initial thoughts. I definitely plan to focus on this area more in this blog, though. After all, algorithms are fun, but this is what really matters in real life software development.
2009-05-06