Silver bullets have plagued software engineering for many years. Time after time, there’s some hot new thing that will solve all our problems. Then, a short while later, it’s pronounced a grand failure and never heard from again. More often than not, that pronouncement is the harbinger of the next hype cycle. Talk about wasteful!
The infection runs deep; it has thoroughly penetrated our social structures:
- Many conferences are about all the cool things we might do with this or that. All the latest and greatest, the hot new things. Tutorials and workarounds for common problems.
- There are many online tutorials, plenty of write-ups of this or that collection of patterns with new or old names, and a pile of process, technology, tool, language and library documentation.
- Case studies and whitepapers abound with positive stories of big money savings and so on.
- There are advocates for everything. You gotta be agile or lean, you must use this or that DD, you have to be using this language, etc.
The infection is further assisted by corporate behaviours such as the jealous guarding of knowledge via various intellectual property protection mechanisms. Somewhat ironic, as without a level of public disclosure it’s difficult for companies to identify, attract and retain the kind of talent they require to sustain themselves[^1].
Just as importantly, the silver bullet is no respecter of history. Often the supposedly new thing is not, in any meaningful way, different from or better than what has come before.
Too rarely do we see real, unbiased, evidence-based discussion of suitability, or comparison against other options including the historical. Something as substantial as an examination of the success and failure of a thing across time and a meaningful number of real-world applications[^2].
In the absence of such material, it’s no surprise that the silver bullet virus thrives. How could one possibly build a meaningful body of knowledge that moves the discipline forward? How could any engineer make meaningful judgements about what might be appropriate or otherwise in any relevant arena (e.g. process, design, testing, debugging, operations), except from personal experience or the good fortune of finding the right mentors? How could silver bullets be seen for what they are?
Can we do better? I believe so.
Firstly, we must get to grips with being honest about failure. Folks in the operational disciplines have a long tradition of publishing postmortem material (aka disaster porn). They’re also focused on improving methods for identifying and eliminating their shortcomings. We must adopt similar disciplines in software development, covering architecture, process, development practices and more. However, we must go further still, because knowledge work is fundamentally about humans and the systems they’re embedded in. Failures in practice at this level must also be examined.
Secondly, I believe we must actually put the science back into computer science[^3]. Quite simply, we’ve lost the ability to do our research properly, hypothesise adequately and analyse the results of our choices appropriately. Not everything we do in software development can be treated this way, but we can get much better at evaluating our methods, tools, design decisions, patterns and such. We have many opportunities to run, document and publish the results of experiments[^4]. Publishing is critical whether we succeed or fail (remember the above: we must be honest). Without it we cannot peer review or test for repeatability. Nor do we create a body of knowledge, a documented history.
If we can combine this honest, critically evaluated history with an appropriate discipline of research and experimentation, I believe we can avoid wasteful duplication, enable learning and create genuine progress. Further, the silver bullets formed from an ignorance of history and insubstantial claims can be seen for what they are and consigned to the trash can[^5].
*Update 22/3/14:* Added a tweeted reference to Gilb on architecture that appeared at just the right moment.
[^1]: These company practices inhibit the building of portfolios with substance (architecture diagrams, key bits of code and such). In their absence, CVs loaded with buzzwords and valueless references to the largely unseen are somewhat inevitable. From this intangible material, companies are supposed to identify talent.
[^2]: There are of course exceptions, such as the body of work Google publishes.
[^3]: That means ensuring each undergrad understands the scientific process and its applications within their work.
[^4]: For some ideas on how this might be done for architecture, see Gilb speaking or skim the slides.
[^5]: No doubt, some will be quite happy to continue on as they are. At least though, these cargo cult engineers will be more easily discernible. Others can then make informed decisions in respect of hiring and such.