Quality-Speed Tradeoff — You’re kidding yourself.

Some people think that there is a necessary tradeoff between internal code quality and getting things done rapidly. I believe they are quite mistaken.

On the agile-testing list, Steven Gordon wrote:

The reality is that meeting the actual needs of the market beats quality (which is why we observe so many low quality systems surviving in the wild). Get over it. Just focus on how to attain the most quality we can while still delivering fast enough to discover, evolve and implement the right requirements faster than our competitors.

First of all, whenever anyone says anything starting with “The reality is,” the one thing you know for sure is that they are not describing Reality.

“I reject your reality and substitute my own.” — Adam Savage

I reject Steven’s reality and substitute my own, and it goes like this.

While I would agree with John Ruskin: “There is no product that some man can not make a little worse and sell a little cheaper; those who buy on price alone are this man’s lawful prey,” I believe that the EITHER/OR thinking above is troublesome in three ways.

First, while I am no perfectionist, and certainly not anywhere near perfect, I do hold quality in high regard. I think that is the natural state for many of us, the view of what we may call the “craftsman”. I think it’s a valid view and one that leads to joy and beauty.

Second, low quality is a dangerous business strategy for a company, and a dangerous business strategy for a software developer. Product buyers, and employers, want the best they can get for their money, not the worst.

Third, and a far more practical concern, is the EITHER/OR position taken above, which assumes — incorrectly in my opinion — that there is a tradeoff between quality and time to delivery. We see rather clearly in manufacturing that high precision reduces costs while getting things done faster and better. It may well be the same in software: high precision in building software gets it done faster as well. Here’s why:

Fantasy

The fantasy of cutting quality to deliver faster lives on in part because we do not measure that long period at the end where we are taking out the critical bugs that we put in. We assume, wrongly, that that period is necessary, part of the universe. The fact is that if we hadn’t put the critical bugs in, or had discovered them immediately upon doing so and fixed them, that “stabilization” period would be truncated or eliminated.

Low quality has an immediate and visible negative impact on delivering “fast enough”. That negative impact is the stabilization period. That is lost time due directly to low quality. It also has immediate effects that are harder to quantify:

  • Working at a low quality level injects more defects. Some of these are so egregious that they have to be fixed immediately. This slows down progress as it is always easier to prevent a defect than to find it and fix it.
  • Working in defect-ridden code makes building new features take much longer. We use some existing code that is thought to work, and when our new feature does not work, we assume that there is a defect in our code. Too often, the defect is elsewhere. This slows down progress, often substantially.
  • Working in low-quality code makes building features take much longer. It is harder to find the things we need to use or change, and they are harder to separate out for use and modification than they would be in clean code.

Low quality therefore slows down progress directly, through increased fixing, delays in making things work, delays in figuring out how to make things work, and finally in the long delay of the so-called stabilization period.

As far as I can tell, the sole impact of reducing quality is to make things take longer. I can identify no aspect of low quality that does not have an associated side effect of increased defect insertion and increased confusion. Yes, you can argue that if you don’t add too many defects and if you don’t make the code too grubby, you might come out ahead. But that is my point: higher quality makes you go faster; lower quality makes you go slower.

Now what of high quality? Is it possible to spend all our time polishing something and never release it? Certainly it is. Most of us have done it at some point in our lives. This possibility, however, is not a justification for lowering quality, especially not in most of the projects we actually encounter. If your project has a bug list with an upward slope, if you are planning a “stabilization period” that really means you’ll mostly be fixing bugs, then your quality is not too high and you will not go faster by lowering it. You will go slower.

To a first approximation, then, looking at a random project, if we want to speed up, we must increase quality.

Not Everyone Agrees (too bad for them)

I guess we can assume that Steven doesn’t agree with what I’m saying here. Neither does Joe Rainsberger in his posting on The Quality / Speed Barrier. If I understand him, Joe is saying that if your code sucks sufficiently, trying to make it better will slow you down, and that the only way to maintain or increase speed is to suck more. It seems clear to me that, on the face of it, this plan is not sustainable, since sufficiently bad code, which is not at all hard to attain, will drag you to a halt.

Now there is of course an element of truth to what Joe is saying and I imagine that Steven would agree as well. Suppose you are in the middle of putting a feature into some code and the code needs improvement. Suppose you have noticed a variable name “joe” that should really be given a meaningful name like “personBeingHired”. It will clearly take more than zero time to do that renaming. Therefore, trying to improve the code will slow you down.

Well, maybe. But if you’ll walk just a little faster to the coffee room next time, you can make that time up and it’ll do wonders for your heart as well. The slowdown from simple improvements can be quite trivial, quite capable of being absorbed in the natural flow of things, without any real impact on delivery at all. And that’s this time.

Next time you pass through this code, all that thinking you had to do to figure out what the hell “joe” meant can be saved. “personBeingHired” makes sense. Your understanding will come faster, and the feature will go in sooner.

This effect might even hit you on the first time through. Is the code a bit intricate? Then improving the variable names can help you understand it this time, and help you avoid errors as you make your changes this time.
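To make that concrete, here is a minimal before-and-after sketch of the rename we’re talking about. The Person type, the candidates list, and the payroll and badgeOffice objects are all invented purely for illustration:

    // Before: "joe" forces every reader to stop and work out
    // what role this variable plays. (All names here are
    // hypothetical, just to give the rename a home.)
    Person joe = candidates.get(0);
    payroll.register(joe);
    badgeOffice.issueBadge(joe);

    // After Rename Variable: the intent is stated at every use,
    // and nothing else about the code has changed.
    Person personBeingHired = candidates.get(0);
    payroll.register(personBeingHired);
    badgeOffice.issueBadge(personBeingHired);

The behavior is identical; only the burden on the next reader has changed.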

This Does Require Skill and Judgment

You do have to know how to come up with a decent variable name. You do need to know how to change the variable name throughout all the code that is impacted. You do have to know how to perform a few more simple refactorings, such as Extract Method. Most of all, you have to know to do a little bit, but not to do too much.
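As a hedged sketch of what that looks like in practice, Extract Method takes a stretch of inline logic and gives each piece a name. The Order and LineItem types and the discount rule below are invented for illustration:

    // Before: the subtotal loop and the discount rule are tangled
    // together in one method.
    double total(Order order) {
        double sum = 0;
        for (LineItem item : order.items()) {
            sum += item.price() * item.quantity();
        }
        if (sum > 1000) {
            sum = sum * 0.95; // volume discount
        }
        return sum;
    }

    // After Extract Method: the top-level method reads like a
    // sentence, and each piece can be understood, tested, and
    // changed on its own.
    double total(Order order) {
        return applyVolumeDiscount(subtotal(order));
    }

    double subtotal(Order order) {
        double sum = 0;
        for (LineItem item : order.items()) {
            sum += item.price() * item.quantity();
        }
        return sum;
    }

    double applyVolumeDiscount(double sum) {
        return sum > 1000 ? sum * 0.95 : sum;
    }

The point is not to rewrite the whole class; it is to do this much, where you happen to be working, and move on.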

Have you ever been in a company rest room and noticed water on the sink or a couple of towels on the floor? Did you find it just a little offputting? Well, so do other people who enter that room, all day long. At your high pay rate, it would not make sense to pick the lock on the supplies door and scrub down the room. It might not make sense to wait until you are in a rage and then go hunt down a janitor or the head of HR.

But it wouldn’t hurt you to pick up a fallen towel with yours and put it into the waste can, and it wouldn’t hurt you to use your towel to wipe up some of the splashes. It’ll make you feel better, and it’ll make everyone who comes through the room later feel better.

Do that with your code. It makes things go more smoothly and it makes things go faster. And that, my friends, is reality.

11 Responses to “Quality-Speed Tradeoff — You’re kidding yourself.”

gdinwiddie · February 1, 2009, 6:06 pm

+1

I’m coming to the belief that when someone says, “It’ll take me longer to do this with higher quality,” they’re often saying “I don’t know how to do this with higher quality” and therefore the “longer” is essentially infinite time in their mind. Even if they say “I could learn to do this better, but it will take me longer to learn than it will to do it the way I’ve always done it,” they’ll perceive the better way as a net loss. And it could be so, if they get hit by a bus tomorrow and never have to do something similar again.

Odds are, though, that the knowledge and skill they acquire to be a little bit better will come in handy many times in the future. Being prepared to do things better more easily in the future compounds with being able to do things better with this system because it’s cleaner. We’re generally talking about small improvements at any point, so they might not be immediately obvious. Over time they’ll multiply like compound interest, and one day they’ll look back and marvel at the difference.

Oh, and if they take a minute now to do something better, maybe they won’t be in the place where they’ll get hit by the bus. :-) As Will Rogers said, “Even if you’re on the right track, you’ll get run over if you just sit there.”

George

sf105 · February 1, 2009, 7:06 pm

I very much believe this too, although I wish I had clear numbers to prove it.

There’s another effect that’s worth mentioning, which is the unpredictability of low quality code. If I have to make a change in a messy code base, it’s hard for me to tell how long it’ll take, which makes it harder for the business to decide whether a feature is worth the effort.

gus · February 2, 2009, 8:15 am

Great post. (or +1 should I say)

The ‘cut quality to go faster’ view represents a lack of trust in the capabilities of the people actually building the product (whether justified or not). Dropping quality simply means you get garbage, later.

You’re absolutely right that keeping quality high requires continuous attention (no broken windows), balance (don’t try to do too much at once) and skill (focus on the right stuff).

I wonder, is the quality-or-speed view most prevalent among developers or non-developers?

anthonyw · February 2, 2009, 8:56 am

I agree with everything you said here. However, I’d like to take the original quote from Steven Gordon, and expand on a point:

“meeting the actual needs of the market beats quality (which is why we observe so many low quality systems surviving in the wild)”

There seem to be two sorts of quality at play here: “code quality”, which matters primarily to the developers and only indirectly to the business (developers take longer to implement things, and the product has more bugs), versus “product quality”, which is a measure of how well the product provides value to the target market.

A product with poor code quality, but which has valuable features that mostly work for what the users try and do with them might well fare better in the marketplace than a product with excellent code quality, but which lacks the features completely.

jeffries · February 2, 2009, 9:23 am

“A product with poor code quality, but which has valuable features that mostly work for what the users try and do with them might well fare better in the marketplace than a product with excellent code quality, but which lacks the features completely.”

Yes, sure. Just as a useful product will fare better than a useless one. The formulation above, however, is another implicit EITHER|OR comment, suggesting that if we have excellent code quality, we would somehow miss out on important features.

That could be true ONLY IF code quality slows us down. Then it MIGHT happen that we would release with fewer features.

However, code quality does not slow us down. Therefore the comparison is not apt at all. Code quality makes us go FASTER. The high quality product will have more good features, not fewer.

ericjacobson · February 4, 2009, 2:51 pm

Great post Ron! I agree that one can have both quality and a competitive feature set. However, IMHO unless all dev teams understand how to increase their feature set by increasing their quality, there is still value in the EITHER|OR comparison.

Quality is a difficult attribute to nail down. It means something different to everyone. And I have often seen it slow down dev teams, especially when the devs attempt to fix every defect a skilled tester can fling at them.

jbrains · February 7, 2009, 1:23 pm

Wow, Ron. Not what I meant at all. I suppose I need to edit my article, since another person misinterpreted it, albeit more slightly than you have.

I meant this: below the speed/quality barrier, more speed requires an investment in more quality, meaning a short-term outlay of time and effort, which will slow us down for some time; above the speed/quality barrier, even more quality leads to even more speed. Perhaps a metaphor would help.

(snip)

In the process of describing the metaphor, I realize what I meant to write in the first place, and I don’t have enough room in this margin to make that happen. I need to go back to the drawing board, then try again. When I do that, I’ll share my refined perspective with you for your amusement and castigation.

Take care.

AgileMan · February 7, 2009, 4:32 pm

Very good post!

In your listing of the two ways (or was it three?) in which you found the EITHER/OR thinking to be troublesome, I noticed that you didn’t include the one that has always bothered me the most.

Whenever a project manager, customer or other stakeholder, in the grip of fear at the thought of a project shipping late, would insist that a development team “cut quality in the interests of meeting a date”, I always stumbled over the problem of definition. Specifically, where writing software is involved, “lowered quality” is generally only definable on a case-by-case basis, and there will be an almost limitless number of cases!

To see what I mean, consider “lowered quality” in a conversation between me and a TV saleswoman. She has told me that this 50-inch flat screen HD TV is going for 25% off, and so I naturally ask her, “Why?” She tells me that the model is discontinued, for one thing, and this particular TV has a small crack in the casing at the back (which she shows me). Faced with those two pieces of data, I can certainly make an informed decision about whether or not to accept the “lowered quality”, in exchange for a cheaper price tag. And either way, it’s an acceptable resolution for us both, as she’ll just find someone else to sell it to if I happen to pass on it.

Compare that to the typical software conversation between me and a customer around “lowered quality:”

Customer: “I’m OK with the quality being traded off in order to get the features done sooner.”
Me: “So… You won’t mind if there’s lots of bugs when we ship it?”
Customer: “Well, what do you mean by LOTS?”
Me: “I have no idea. I can’t really predict, because as soon as we stop worrying about quality, all I know is: the bug count will grow. I have no way of knowing by how much.”
Customer: “OK, well, what SORT of bugs are we talking about?”
Me: “Again: no idea. After all, if we knew ahead of time which bugs would be in the product when we released it, we’d avoid introducing them in the first place. We can identify some specific goals, like performance numbers that have to be met, but generally bugs are very difficult to predict.”
Customer: “How about if we say, no more than 10 high priority bugs by release date?”
Me: “OK, that’s helpful. But we can’t guarantee to get below that number at the end, because there just may not be time. If we leave considerations of quality out of the mix during development, then by the time the bugs are being found, there will likely be a lot more than 10 high priority ones, and insufficient time to do anything about them.”
Customer: “This is a very frustrating conversation we’re having…”

Part of the problem is that the customer was envisioning “lowered quality” as meaning the equivalent of a discontinued line of TV products or a TV with a small crack in its case. She didn’t have in mind that, metaphorically, the Blue function (of Red/Green/Blue fame) might not work, that the image might be jittery, that the TV might shut itself off every 20 minutes, that one of the buttons on the front might not work, that one of the HDMI inputs was defective, or any of a thousand other things that might REALLY embody “lowered quality.” What the customer believes they’re getting when they agree to “lowered quality” as a priority, and what they actually end up receiving as a result of that decision, are rarely aligned when it comes to software.

And THAT’s why I don’t think you can trade quality for expediency, in the software world (in addition to all the fine points made in your blog post).

DylanSmith · February 9, 2009, 3:18 pm

I’ve been struggling with some of these issues myself lately. What if you have a code-base where the internal quality is horribly low, and you want to make significant improvements in quality quickly and you are willing to invest to do so? Do you take a different approach than what you described above? If so how do you justify the time investment required? In your mind is there ever a point where the quality can get so low that a more drastic approach (and investment) is warranted? How do you judge when you’re at that point and justify the additional investment in quality?

I’ve been trying to tackle some of these questions with my current project. I posted my thoughts in more detail over at http://geekswithblogs.net/Optikal/archive/2009/02/09/129299.aspx

belteshazzar.mouse@gmail.com · February 12, 2009, 8:56 am

Though I approach building software with the same attitude of quality, sadly it is true that it does not work this way in the world. Quality does lead to joy and beauty.

“Product buyers, and employers, want the best they can get for their money, not the worst.” “For their money” is the linchpin of the argument, and where it fails in reality.

“Lower quality makes you go slower” is true only if you include the user in that “you”. Successful software companies are willing to trade quality for speed when the speed impact falls on the customer and not on them. Many users are willing to restart their software, use a ten-step process where one step would do, or put up with a little wait, if their life is somewhat better than before. The impact of low quality is spread across many users, each of whom is willing to suffer through it.

One phenomenon that I have seen, not only in software but in many service oriented industries, is that the customer often does not know what they want. Does the customer want a great hamburger? Then why aren’t the finest restaurants making the finest quality hamburgers crowded with customers and why don’t those businesses have chains all over the world and sell billions of hamburgers? Because customers are willing to sacrifice quality for time and convenience.

acarlos1000 · March 10, 2009, 12:40 am

Ron,

Excellent post! Very well put and crystal clear.
