When Good is Enough (and when it isn’t)
01 Dec 2009

It’s a constant debate we fight on a daily basis: is the work we’ve completed good enough, or is it important enough to spend an extra half day (or days) perfecting?
Oftentimes we’re forced to make sacrifices, but when is the right time to push back, and when should we accept defeat?
Obviously what follows here are my own opinions, but I’m a firm believer that a well-executed plan that’s only 80% complete is worth leaps and bounds more than a flawed plan that’s 100% complete.
A few questions I commonly ask myself:
- Is what I’m working on tied to a software release? We release on a monthly basis in two 2-week iterations. I know that if I want to make it into a release, I need to have it done sufficiently ahead of time to ensure adequate testing. This may mean cutting scope. Cutting scope is fine, IMO; just do so in a way that avoids silly shortcuts that create headaches for yourself when you inevitably have to come back and add functionality.
- Who is the consumer? Is it a paying customer (i.e. they have already paid)? Is it for a demo? Is it for another developer? If it’s a customer, what are they actually trying to accomplish? Oftentimes we get caught up in our own expectations of how something should look or behave and lose sight of what the customer is actually trying to do. If the customer is currently using pen and paper to track something, a simple (ugly?) web form may be perfectly acceptable. Don’t overcomplicate. With good feedback, you can always come back and do a better job. Focus on the present; don’t get distracted by what you *might* need to do in the future.
- How often is the functionality used? If the feature or function being worked on is used by a handful of backend administrators, you may not need to go over the top on day one. If they come back and tell you they absolutely cannot use the system without A, B, or C, make sure you understand what A, B, and C are and get cracking. Quick turnaround builds loyalty and trust, and you need both as early in the customer relationship as possible. If the feature is public-facing and used by a large percentage of the audience, by all means, perfect it.
- Do I need to modify the database? From a technical perspective, this has always been a killer. Make a flub manipulating the database and you’ll cost yourself countless hours trying to correct customer datasets. Get representative datasets, migrate them automatically, and test, test, test! Speaking from experience: do not cut corners testing or working with your database migrations.
- Have I been burnt in this area before? Maybe the work you’re doing is in direct response to a filed bug. Either way, if you’re working in a particularly risky area of the codebase, it’s worth going the extra step to ensure you’ve got solid code coverage (unit, regression, Selenium tests, etc.). Fool me once… fool me twice…
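The database point above can be sketched in miniature. Everything here is invented for illustration — the `Customer` shape and the split of a single `name` column into first/last columns — and a real migration would run SQL against a copy of production data, but the row-by-row verification idea is the same:

```java
import java.util.List;

// A minimal sketch of "get a representative dataset, migrate it, test it".
// The Customer class and the name-splitting migration are hypothetical;
// the point is checking that no row is lost or mangled by the migration.
public class MigrationCheck {

    static class Customer {
        final String name;          // old schema: one combined column
        String firstName, lastName; // new schema: two split columns
        Customer(String name) { this.name = name; }
    }

    // The "migration": populate the new columns from the old one.
    static void migrate(List<Customer> rows) {
        for (Customer c : rows) {
            int space = c.name.indexOf(' ');
            c.firstName = space < 0 ? c.name : c.name.substring(0, space);
            c.lastName  = space < 0 ? ""     : c.name.substring(space + 1);
        }
    }

    // The check: every row survives, and the old value can be rebuilt
    // from the new columns -- i.e. the migration lost nothing.
    static boolean verify(List<Customer> rows, int expectedCount) {
        if (rows.size() != expectedCount) return false;
        for (Customer c : rows) {
            String rebuilt = c.lastName.isEmpty()
                    ? c.firstName
                    : c.firstName + " " + c.lastName;
            if (!rebuilt.equals(c.name)) return false;
        }
        return true;
    }
}
```

Run this against a representative dataset before the release, not after a customer reports mangled names.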
There are plenty more, but in general, I err on the side of trying to get releases out the door. If we spend 20 working days in a release (4 work weeks), up to 6 of those will be spent either performing non-automated regression tests (don’t worry, we have automated tests as well) or fixing post-code-complete deficiencies.
Test cases are a given. If you’re working in Java or any other programming language with tooling like JUnit or TestNG, I really see no excuse not to use it. If nothing else, it’s a good safety net if you ever need to refactor a particular area of the codebase.
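As a reminder of how little ceremony that safety net requires, here’s what a basic JUnit 4 test might look like. The invoice helper and its figures are invented for this sketch:

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Hypothetical example: a trivial invoice-total helper plus the JUnit 4
// tests that guard it against a future refactoring.
public class InvoiceTest {

    // Code under test: something we might rewrite later.
    static int totalCents(int[] lineItemCents) {
        int sum = 0;
        for (int cents : lineItemCents) {
            sum += cents;
        }
        return sum;
    }

    @Test
    public void totalIsSumOfLineItems() {
        assertEquals(1500, totalCents(new int[] { 500, 1000 }));
    }

    @Test
    public void emptyInvoiceTotalsToZero() {
        assertEquals(0, totalCents(new int[0]));
    }
}
```

Any JUnit runner (your IDE, Ant, Maven) will pick this up; when you later refactor `totalCents`, a red bar tells you immediately instead of a customer.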
If I had to pick a guiding principle, it would be: Get it in, Make it right, Make it fast.
**Get it in** *because you need the feedback.*
**Make it right** *because that’s what we’re paid to do and stakeholders expect it.*
**Make it fast** *because it’s no fun staying up until 2:00am trying to debug performance problems on a customer’s live system.*