Friday, March 20, 2009

Toxic Code

These days, the word toxic is often associated with investments hidden in some bank vault: their risk has been underestimated, and the result is a backlash with devastating effects.

I think many institutions have the same toxicity problem in their software, but still fail to admit it: key portions of their software act like small time bombs, quietly undermining their businesses.

There are countless examples of what I consider toxic code; here is a small checklist some might be familiar with:
  • Useless code: it’s there but nobody uses it
  • Untested code: it’s there, it probably works, but nobody wants to touch it
  • Valueless code: software that is not delivering any business value
  • Annoying code: every change to the system takes more time than is reasonably necessary.
  • Annoying software: the code is “working”, but in a way that makes things harder instead of easier. Users waste time every time they perform an operation.
  • Leaking software: key portions of the application are not tested enough, and some exceptions go unhandled, resulting in random, untraced errors.
  • Countdown software: software with a time-related flaw that will wreak havoc at a given point in time (this and the previous item are sketched in the example after this list).
  • ...(add yours here)
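To make a couple of these concrete, here is a small, purely hypothetical sketch in Java (the class name, the date and the file handling are made up for illustration): one method carries a “countdown” flaw built on a hardcoded expiry date, the other “leaks” by swallowing an exception without leaving any trace.

    import java.io.File;
    import java.nio.file.Files;
    import java.time.LocalDate;

    // Hypothetical example combining two items from the checklist above.
    public class LicenseChecker {

        // "Countdown software": a hardcoded date quietly waiting to break things.
        private static final LocalDate HARDCODED_EXPIRY = LocalDate.of(2010, 1, 1);

        public boolean isLicenseValid() {
            // Works fine today, wreaks havoc the day the hardcoded date is reached.
            return LocalDate.now().isBefore(HARDCODED_EXPIRY);
        }

        // "Leaking software": the exception is swallowed, so failures leave no trace.
        public String readLicenseOwner(File licenseFile) {
            try {
                return new String(Files.readAllBytes(licenseFile.toPath()));
            } catch (Exception e) {
                // No log, no rethrow: errors surface later as random, untraced behaviour.
                return "";
            }
        }
    }

Neither method crashes in a demo, which is exactly why this kind of code survives reviews and keeps ticking away in production.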

Well... it’s basically the same idea as technical debt (Ward Cunningham posted an excellent short video on that - thanks to Piergiuliano Bossi for pointing it out), which should be familiar to any agile developer and was itself coined as an analogy to the financial world. But my worries are more related to the business side of software than to its production process. Ideally, we should have 100% trust in our IT systems (don't laugh please... I said "ideally"), because every time one of them isn’t working there is waste somewhere (be it time, money, or a customer abandoning the company).

Sometimes companies are aware of the real quality of their running software, and decisions to fix or improve specific portions of the application are just the result of conscious, strategic budget allocation. More often, this is not the case: like toxic assets, software with risky behavior is treated like normal production code, and the associated risks and drawbacks are underestimated.

Anyway, even when companies do take into account the costs of a less-than-optimal solution, they generally consider only the direct effects of that choice. Something like “this feature might be more user friendly, but it will cost us $xxx to rewrite and the possible revenue will only be $yyy, so it’s not worth developing” is reasonable, but it doesn’t consider the cost of not developing it, which is pushed onto the users: they get a less-than-optimal user experience (well... sometimes a miserable one) and waste time and/or money every time they use the system. It's like polluting: if you don't count it, it's cheaper to do dirty things.

I am still intimately convinced that computers should make life simpler and not more complicated. And I am also convinced that they can.
For some reason, they don’t.

