Coping with legacy

28 September 2014
Tom Breur

In an ideal world, we never accrue technical debt. The reality is that we do. When technical debt is stacked on top of even more technical debt, and general sloppiness is allowed to persist, you sometimes wind up with what in software vernacular is called “legacy.”

As Richard Mark Soley once said: “How do you recognize a legacy system? Easy … it’s one that is running and has users.” IMHO, legacy starts to accrue as soon as somebody else relies on your system. Their needs will diverge from what was once envisioned, so a misfit (Weinberg, 1990) between the original solution and contemporary needs is likely to develop. And grow worse over time.

The word “legacy” itself does not have to mean the code is inherently bad. It just means that the owner of a system has to cater to some inherited (but more likely self-inflicted) obligations to third parties. And as the number and diversity of users’ needs grow, this is more likely to lead to awkward straddles.

A brief side note on bad code and technical debt: my friend JB Rainsberger pointed out to me that “accidental” (unintended) accrual of technical debt should better be labeled “sloppiness.” I find “lack of craftsmanship” (as Uncle Bob would call it) a more politically correct term, but I’m convinced we mean pretty much the same thing. I prefer to talk about lack of craftsmanship because I find it more palatable and it carries an implicit challenge (to up your game).

When you’re working with legacy code, it sometimes feels like you’re trapped in a swamp: no matter how hard you struggle, you don’t seem to be able to get out of it. In fact, sometimes “trying harder” gets you into deeper trouble rather than out of it…

This is a well-known phenomenon. John Sterman (Business Dynamics, 2000) writes: “All too often, well-intentioned efforts to solve pressing problems lead to policy resistance, where our policies are delayed, diluted, or defeated by the unforeseen reactions of other people or of nature. Many times our best efforts to solve a problem actually make it worse” (p. 3). And: “Most of the unintended effects of decisions leading to policy resistance involve feedbacks with long delays, far removed from the point of decision or the problem” (p. 91).

In software development, a classic example of these “long delays” that Sterman refers to occurs when code that has been in production (maybe for a long time) needs to be changed. The reality is that most software development is classed as “maintenance,” which effectively means changing existing, working code. Think: legacy. We made a design decision during initial development, and wind up regretting it. Or rather: someone else winds up regretting it…

How do you know you’re caught in a morass? Every move you make, every change you try, appears to get you into bigger trouble. It’s the code that nobody dares to touch; that nobody dares to change, for fear of breaking something. This “something” is someone’s application, out there somewhere, that relies on this code.

When you find yourself in a morass, doing nothing is rarely an option. That will make you sink slowly, and that type of behavior is usually what got you here in the first place. You can “smell” it’s legacy when people are reluctant, on the verge of panic, to make any changes because they “know” it will break someone’s deployment.

How can you deal with this? You are a professional and want to deal with this in a professional manner. You may or may not have been involved in designing and maintaining this system. Regardless, nobody is helped by pointing to someone else’s design errors. Besides, the design may well have seemed the most efficient choice, given the stakeholders known at the time.

When a desire to change the system puts you at odds with some stakeholders, pressure can mount to “make you” change your mind. Your congruent stance may be put to the test. As with many systemic conflicts, injecting objective information can short-circuit the escalation.

Make a concerted effort to surface the costs and consequences of choosing alternative futures. What if we change the system? How? Who would be impacted? By how much? What if we don’t change the system? What will happen? What would the direct and opportunity costs be? Etc.

The very effort of making the business case for these various scenarios explicit can help you learn things that may change either your own or someone else’s point of view. But even if you don’t come to an agreement, at least you have taken the personal or political edge out of the discussion by sharing what can be agreed on, and making explicit where people do not agree.
