The search for simplicity

There are several ways in which computer programming and physics are very similar. Possibly the most important is that both disciplines are, fundamentally, a search for simplicity.

In physics, we have a big pile of experimental results and we want to find the simplest theory that satisfies them all. Just listing all the experiments and their results gives you a theory of physics, but not a particularly useful one since it's not very simple and doesn't predict the results of future experiments (only the past ones). Rather than just listing the results, we would like to find a general theory, an equation, a straight line through the points of data which allows for interpolation and extrapolation. This is a much more difficult thing to do as it requires insight and imagination.
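To make the analogy concrete, here's a toy sketch (with made-up data) of fitting a straight line through noisy measurements. The line is far simpler than the raw list of observations, yet it lets us interpolate between data points and extrapolate beyond them - which the bare list cannot do:

```python
def fit_line(points):
    """Least-squares fit of y = m*x + b to a list of (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

# Hypothetical noisy measurements, roughly following y = 2x + 1:
data = [(0, 1.1), (1, 2.9), (2, 5.2), (3, 6.8)]
m, b = fit_line(data)

def predict(x):
    # The fitted line predicts y for x values we never measured.
    return m * x + b
```

The four stored points only "predict" the four experiments already run; the two fitted numbers m and b predict every experiment we haven't run yet.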

In computer programming, we generally have a big pile of specifications about what a program should do - maybe a list of possible interactions with the user (what they input and what they should expect to see as output). These might be encapsulated as testcases. To write a program that satisfies all the testcases, we could just go through them one by one, writing code to detect each particular input and hard-coding the corresponding output. That wouldn't be very useful though, as the program would fail as soon as the user tried to do something that wasn't exactly one of the scenarios the designers had anticipated. Instead we want to write programs for the general case - programs that do the right thing no matter what the input is. When the "right thing" isn't precisely specified, we get to choose the output that makes the most sense according to our internal model of how the program should act.
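Here's a deliberately silly sketch of the two approaches, using an invented specification ("the program should total a list of prices"). The first version passes every known testcase by recognizing each one; the second captures the underlying rule:

```python
def total_hardcoded(prices):
    # "Teach to the test": recognize each anticipated input and
    # hard-code its expected output.
    if prices == []:
        return 0
    if prices == [5]:
        return 5
    if prices == [1, 2]:
        return 3
    # Fails the moment the user does something unanticipated:
    raise ValueError("unanticipated input")

def total_general(prices):
    # The general case: works for any input, including ones
    # no testcase ever mentioned.
    result = 0
    for p in prices:
        result += p
    return result
```

Both functions pass the three testcases, but only the second one is a useful program - just as only the general theory, not the list of experimental results, is a useful theory of physics.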

I think a number of software companies in recent years (Microsoft in particular, but others as well) have started to fall into the trap of writing software that concentrates too much on what the behavior of the software should be for particular (sometimes quite specific) scenarios, at the expense of doing the right thing in the most general case. Windows is chock-full of "special case" code ("epicycles", if you will) to work around particular problems when the right thing to do would have been to fix the general problem, or sometimes even to explain that this is how we should expect it to work. Here is one example of this kind of band-aiding. I discovered another the other day: I was running some older Windows software in Vista and accessed the "Help" functionality, which was implemented as an old-style .hlp file. Vista told me that it no longer includes the .hlp viewer by default (I guess it was a piece of the OS that doesn't get a lot of use these days, and they had just dropped it from the default distribution to avoid having to bring it up to the latest coding standards). I was pointed to the download location, where I had to install an ActiveX control to verify that my copy of Windows was genuine before I was allowed to download the viewer.

Part of the problem is that (at Microsoft at least) it's very difficult to make big changes. Rewriting some core piece of functionality, even if the programming itself is easy, would involve months of planning, scheduling, designing, specification writing, testcase writing, test-plan reviewing, management sign-off meetings, threat modelling, localization planning, documentation planning, API reviewing, performance testing, static analysis, political correctness checking, code reviewing and integrating. And of course everyone whose code might possibly be affected by the change needs to sign off on it and put in their two cents about the correct design. And it must comply with the coding standards du jour, which change every year or two (so delay too long and you'll probably have to start all over again). When you come to understand all this, the long gap between XP and Vista becomes less surprising (in fact, it's quite a surprise to me that it only took a little over 5 years, considering how many pieces were completely rewritten). All this process exists for a reason (mostly the politician's fallacy), but being widely accepted is not the same as being rigorously justified.

Because it's difficult to make big changes, people tend to make little changes instead ("hey, we can work around this by just doing x in case y - it's just one extra line of code"). These don't require much process (usually just a code review - most of the rest of the process for such small changes is automated). But all these small changes add up to a great deal of extra code complexity, which makes it very difficult for newcomers to understand the code, and even more difficult to rewrite it in the future because people will have come to depend on these edge cases.
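You can watch this happen in miniature. Here's an invented example of a pricing routine after a few years of "just one extra line" workarounds, next to the general fix that a bigger (and therefore harder-to-approve) change would have made:

```python
def discount_patched(customer, price):
    # Each branch below was once a quick, low-process fix.
    if customer == "acme":       # workaround added for bug #1234
        return price * 0.90
    if customer == "globex":     # special deal, see old email thread
        return price * 0.85
    if price > 100:              # someone's quick fix for large orders
        return price * 0.95
    return price

# The general fix: one rule, driven by data rather than hard-coded cases.
SPECIAL_RATES = {"acme": 0.90, "globex": 0.85}

def discount_general(customer, price, rates=SPECIAL_RATES):
    rate = rates.get(customer, 0.95 if price > 100 else 1.0)
    return price * rate
```

The patched version is what accretes when each fix must be small; every branch is another edge case that some caller may now depend on, making the eventual rewrite riskier each year it's deferred.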
