 August 27, 1999

Geek News: Useful software requires teamwork

Here’s Kent’s Rule of Thumb for estimating the likelihood of software success. There’s a 90 percent chance the software will be garbage. That 90 percent covers a wide range, of course, from “this is such a piece of garbage it’ll never be used,” to “it’s horrible, but we’ll just have to put up with it.”
There’s a 5 percent chance that the software will be mediocre. It’ll be usable, but not particularly good. And there’s a 5 percent chance that it’ll be good — anywhere from “hey, this is pretty good,” to “wow, this is great!” I’ve discussed this rule of thumb with a variety of people in the software business, and have discovered that many agree with the percentages, more or less.
The problem is that most software is written by people who don’t know anything about the task for which they’re developing the software. Most software developers these days will tell you that they talk to the users, so they can find out what the users need. But “we ask users what they want” is a real cop-out. It’s simply not enough to ensure good software.
Here’s an analogy. Let’s say I’m an architect. You come to me and ask me to design a beautiful house. I want to make sure I design a great house, so I ask you what you want. Of course, you’re not an architect, so you can’t tell me how to design a house. So you tell me that you want five bedrooms, four and a half baths, and, oh yes, a really big kitchen. Does this give me enough information to design a great house? Well, let’s do a test. If anyone out there is planning to build a house soon, send me $400,000, and I’ll bet I can build you a real piece of trash with five bedrooms, four and a half baths, and a really big kitchen. An architect should know how to design a house, and although he has to understand the buyer’s needs, that just isn’t enough.
Let me give you another, real-life example. In my first software-testing job, back in 1981, I worked on a system used for monitoring drilling on oil rigs. I was a geologist, and had been working on rigs for a couple of years, so I knew what the system was supposed to do.
When you’re drilling an oil well, you string 90-foot sections of steel pipe together, up to 5,000 or 10,000 feet, sometimes more. Tons of steel are pressing down on the drill bit at the bottom of the hole. But you need to adjust this weight — too much weight and you’ll damage the drill bit, not enough and you’ll drill too slowly. So the drilling staff monitor what’s called, not surprisingly, “Weight On Bit.” Well, the software developers knew that. That was pretty basic stuff, and they learned it by working on the rigs for short periods, and talking to the people on the rigs.
Now, what happens when the drill bit wears down? You have to pull all that steel out of the hole, change the bit, then put it all back into the hole again. But the programmers noticed that when the pipe was being run back into the hole, the instruments measuring Weight On Bit seemed to be showing a positive number for some strange reason. That didn’t make sense to the programmers — if the drill bit wasn’t on the bottom of the hole, how could there be weight bearing down on it? — so they fixed it. They told the program, “if the pipe isn’t on the bottom of the hole, show the Weight On Bit as zero.” That’ll fix that, eh?
The problem is that there can be Weight On Bit, thousands of feet from the bottom of the hole. When you pull the pipe out, the hole “sloughs” in; that is, the walls of the hole you’ve just drilled cave in, and you actually have to drill through the hole again. It’s generally fairly quick, but nonetheless you may be drilling 5,000 feet above the bottom of the hole, so there is Weight On Bit. And it’s useful information, because it gives you an idea of what’s going on at a particular section of the hole. (I recall having a big argument with the development team leader, who refused to believe me when I explained just why he couldn’t zero the Weight On Bit data.)
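To make the bug concrete, here’s a minimal sketch of the two behaviors in Python. It’s only an illustration; the 1981 system certainly wasn’t written this way, and the function and variable names are mine:

    # The developers' "fix": suppress the reading whenever the bit is off bottom.
    def displayed_weight_on_bit_buggy(measured_weight: float, bit_on_bottom: bool) -> float:
        if not bit_on_bottom:
            return 0.0  # hides the real load when re-drilling a sloughed-in section
        return measured_weight

    # The correct behavior: report what the sensor measures. The hole can cave in,
    # so the bit may be drilling, and bearing weight, thousands of feet off bottom.
    def displayed_weight_on_bit(measured_weight: float) -> float:
        return measured_weight

The point isn’t the code; it’s the assumption baked into that conditional. “Off bottom” does not imply “no weight,” and only someone who had worked on a rig would know it.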
Now, consider the “we ask our users” method of software development. First, this development team did ask the users, and it didn’t do them any good. But in any case, what user would ever tell them “oh, by the way, when we’re running the bit back to the bottom, don’t zero out Weight On Bit”? Why would a user ever think of that sort of detail?
If “we ask the user what they want” is not enough, then what is? It’s my belief that the only software that turns out right is software developed with the assistance of a “user advocate.” You have to have someone on the development team who understands the process for which the software is designed, but who also understands the software-development process. This person must be deeply involved, testing the software and helping design the user interface, and that takes more than just a few minutes a week. The user advocate must be an integral part of the team.
And that’s the real weakness in the software-development process: as things stand, software development is a mess. Until the industry as a whole learns this very simple concept — include somebody who understands the process on your development team — things won’t get better.

“Poor Richard’s Internet Marketing and Promotions,” the “sequel” to “Poor Richard’s Web Site,” is now in print. Visit http://poorrichard.com/promo/.
