Quality Software

What is Quality?

I've always remembered the following quote from E. W. Dijkstra (in fact, it's posted on the door to my office in his handwriting).

"Being a better programmer means being able to design more effective and trustworthy programs and knowing how to do that efficiently"

My interpretation: quality means efficiency. Quality software is efficient for the user of that software; quality programming is efficiency in developing the software. I believe both can be objectively measured.

Note that reliability (trustworthiness) is normally a prerequisite to efficiency, since in most cases programs that break end up consuming more time than those that don't. In theory, however, the cost of recovering from faulty software plus the normal running time of that software may still be less than the total time required by a competing product (for instance, a tool that finishes a task in ten minutes but costs three minutes of recovery still beats a flawless tool that needs twenty). In that case, by my measurement, the faulty product would indeed provide higher "quality".

Note that "showstopper" bugs = zero quality, since the task cannot be completed at all.

I often try to get the overall point across by saying something like this:

We could have a butt-ugly GUI and no documentation, but if the user is able to get the task done faster with our tool than with the competitor's tool, then our software is higher quality, bottom line.

For the non-interactive parts of programs we can measure efficiency in terms of resource utilization: CPU time, network bandwidth, and memory use. To optimize these aspects we use less resource-intensive algorithms, data structures, and communication protocols.
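
These measurements don't require anything exotic. Here is a minimal Java sketch using only the standard java.lang.management and Runtime APIs; the workload (filling and summing an array) is just a stand-in for whatever non-interactive work your program actually does:

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadMXBean;

    // Minimal sketch: measure CPU time and heap growth around one unit of work.
    // The workload below is only a placeholder.
    public class ResourceMeter {

        public static void main(String[] args) {
            ThreadMXBean threads = ManagementFactory.getThreadMXBean();
            Runtime rt = Runtime.getRuntime();

            long cpuBefore  = threads.getCurrentThreadCpuTime();  // nanoseconds; -1 if unsupported
            long heapBefore = rt.totalMemory() - rt.freeMemory(); // bytes currently in use

            long sum = 0;
            int[] data = new int[10_000_000];
            for (int i = 0; i < data.length; i++) {
                data[i] = i;
                sum += data[i];
            }

            long cpuAfter  = threads.getCurrentThreadCpuTime();
            long heapAfter = rt.totalMemory() - rt.freeMemory();

            System.out.println("result          = " + sum);
            System.out.println("cpu time   (ms) = " + (cpuAfter - cpuBefore) / 1_000_000);
            // Heap delta is approximate; a garbage collection in between can skew it.
            System.out.println("heap delta (MB) = " + (heapAfter - heapBefore) / (1024 * 1024));
        }
    }

A serious measurement would repeat the workload and discount JIT warm-up, but even this crude version yields numbers you can compare from one release of the code to the next.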

There is a limit to how much runtime performance improves quality. For example, I don't need a car that can drive 200 mph if I always obey the 65 mph speed limit.

For interactive programs we also have to measure the amount of time consumed by the human user.

To optimize this aspect we provide "tools". Tools are programs that partially automate tasks on behalf of the user. The combination of user plus tool should complete the task faster than the user alone (or the user plus your competitor's tool). (There is also a learnability aspect of the tool, in addition to usability, which must be factored in, but let's ignore that for the moment.)

Experienced programmers know that to improve the efficiency (in terms of CPU use) of the non-interactive parts of programs you should not visually inspect your code, but rather use a profiler, as follows (a crude stand-in is sketched after the list):

  1. Run the application under the CPU profiler. The profiler outputs a list of the methods called by your application, sorted by time consumed.
  2. Ignore everything after the first item in the list (important!).
  3. Open the source file containing that method and improve the code.
  4. Go to 1 (if you've done your job, a new method will now be first in the list).
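
A real profiler (the NetBeans profiler, for example) produces that sorted list without touching your code, but the shape of the loop can be sketched with nothing more than a timing map. The section names and busy-wait durations below are invented; the only point is the "sort by time consumed, fix the top entry, rerun" report:

    import java.util.Comparator;
    import java.util.LinkedHashMap;
    import java.util.Map;

    // Crude stand-in for a CPU profiler: time named sections of work, then
    // report them sorted by total time so the top entry is the one to fix.
    public class PoorMansProfiler {

        private static final Map<String, Long> totals = new LinkedHashMap<>();

        static void time(String label, Runnable work) {
            long start = System.nanoTime();
            work.run();
            totals.merge(label, System.nanoTime() - start, Long::sum);
        }

        public static void main(String[] args) {
            time("parseInput",   () -> busyWork(50));
            time("computeStats", () -> busyWork(400));
            time("writeReport",  () -> busyWork(120));

            // Sorted report, most expensive section first.
            totals.entrySet().stream()
                  .sorted(Map.Entry.<String, Long>comparingByValue(Comparator.reverseOrder()))
                  .forEach(e -> System.out.printf("%-13s %5d ms%n",
                                                  e.getKey(), e.getValue() / 1_000_000));
        }

        // Placeholder workload: burn roughly the given number of milliseconds.
        private static void busyWork(long millis) {
            long end = System.currentTimeMillis() + millis;
            while (System.currentTimeMillis() < end) { /* spin */ }
        }
    }

The output is read exactly as in step 2: only the first line matters. Fix it, rerun, and a different line should rise to the top.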

My argument is that the same approach should be applied to interactive programs. We don't necessarily have a tool like the CPU profiler, but that doesn't matter. An approach might look like this:

  1. Sit a test user in front of your user interface.
  2. Use a tool like Macromedia Captivate to record the session. Start recording now.
  3. If the user hits a showstopper bug, that is the #1 problem. Stop and fix it now, then go to 1.
  4. Stop recording when the user completes the task.
  5. Assemble your team and review the session; break it down into discrete functional steps and note the amount of time spent in each one (see the sketch after this list).
  6. Order the steps by the amount of time consumed.
  7. Assign as many resources as necessary to the first step in the list.
  8. Have them alter the design or implementation, then go to 1.
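
A back-of-the-envelope version of the review in steps 5 through 7 needs nothing more than a sorted list. The step names and durations below are made up purely for illustration; the only point is that the team's attention goes to whatever floats to the top:

    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.List;

    // Sketch of the session review: break the recording into named steps,
    // note the time spent in each, and sort so the costliest step comes first.
    public class SessionReview {

        static final class Step {
            final String name;
            final int seconds;
            Step(String name, int seconds) { this.name = name; this.seconds = seconds; }
        }

        public static void main(String[] args) {
            List<Step> session = new ArrayList<>();
            // Hypothetical steps observed in the recorded session.
            session.add(new Step("find the 'new project' wizard", 95));
            session.add(new Step("fill in project settings",      310));
            session.add(new Step("locate the deploy action",      180));
            session.add(new Step("wait for deployment",            60));

            session.sort(Comparator.comparingInt((Step s) -> s.seconds).reversed());

            System.out.println("Fix this first: " + session.get(0).name
                               + " (" + session.get(0).seconds + "s)");
            for (Step s : session) {
                System.out.printf("%-35s %4ds%n", s.name, s.seconds);
            }
        }
    }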

In the case of software development tools like those provided by Sun, the human user is another programmer and we need to measure his performance to determine the quality of our tools.

The most efficient way of doing this is to have the tool developer himself act as the user (eat your own dog food). In cases where this is easy to do you can definitely see a difference in quality: for example, the NetBeans and Eclipse Java editor plugins are noticeably superior to any other plugins, largely (in my opinion) because the developers themselves use them on their own code in their daily work.

Unfortunately, in cases where this isn't so easy (in particular enterprise tools), typical managers and developers don't even bother to test their tools on real use cases: for example, Java plugin developers who have never written an applet, or JSP-editor developers who have never tried to build a real web site.

In such cases the result is typically tools that are highly unusable in obvious ways, ways that aren't identified until after a release, when real users try to use them.

I used to work in telecom, where we built C programs that ran on embedded systems (switches, routers, and other devices). When it came time to test our software, nobody said "here you go, why don't you test it out on our digital switch". Instead we had to rely on simulation, which we did by means of scripts. So if we had a digital subscriber line talking to a switch, we had one script that simulated the DSL and another that simulated the switch. Essentially we tested our software against itself. Because standard wire protocols were used, this worked well. Note that this strategy is commonplace in life, for example in sports: during practice your starting team plays against a "scout" team which simulates your real opponent.

In my opinion, the same approach can be used for enterprise software tools integrated with Web Services, as in Sun Java CAPS. Note that, as above, nobody is going to say "Hey, here you go, why don't you test your enterprise tools on my enterprise". Instead, in each case, we need to simulate the enterprise software problem that our tool is supposed to solve.

WSDL provides a standard, machine-readable way to describe service interfaces, and it can effectively be used to script a simulated enterprise application environment, which can then be used to test the quality of our development tools in the manner described above.
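
As a rough sketch of what one such simulation script might look like, here is a throwaway JAX-WS service (JAX-WS shipped with Java SE 6; on newer JDKs it has to be added as a separate dependency). The service name, operation, and port number are invented for illustration:

    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;

    // Simulated enterprise endpoint: an imaginary purchase-order system with
    // canned behaviour, just enough to exercise the development tool under test.
    @WebService
    public class FakePurchaseOrderService {

        public String submitOrder(String itemId, int quantity) {
            // A real back end would do far more; the tool only needs the WSDL contract.
            return "ACCEPTED:" + itemId + ":" + quantity;
        }

        public static void main(String[] args) {
            Endpoint.publish("http://localhost:8080/fake-po",
                             new FakePurchaseOrderService());
            System.out.println("Simulated endpoint up; WSDL at "
                               + "http://localhost:8080/fake-po?wsdl");
        }
    }

A development tool can then be pointed at the generated WSDL and exercised end to end, much as our telecom scripts exercised each other over standard wire protocols.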

Comments:

Excellent and very convincing summary of the argument you've put to me several times. I'm very glad to see that you've written this up. Really, what I think you describe is not applicable to just quality, but also to usability, or at least one element of it (efficiency of performing a task is not the only relevant usability measurement; e.g. approachability and reduction of cognitive load across tasks are also important factors).

One point I've tried to make to you in the past, though not very well I think, is that quality by your definition doesn't necessarily correlate with market appeal, which is one of my chief concerns. In particular, I'm not convinced that measuring according to the competition is a sufficient yardstick. Surely, that's one axis that should be measured, but are there no absolutes? And what about adherence to convention, or other market forces? For example, you can have the highest quality tool by your definition, but no ability to appeal to the market because you've ignored domain conventions, or are up against an entrenched product monoculture (e.g. no matter what you do, it's nearly impossible to compete with Photoshop on its terms because of the sheer momentum in the visual design industry). Or, if you have a highly "usable" tool, quality could potentially be lower than the competition without compromising the ability to appeal to customers above other competitors.

Thus, I don't see your definition of quality as the end of the discussion when talking about the lifecycle of a product and its marketability. You've identified a very important criterion, but I'll need to sleep on this a bit more and try to identify exactly what it is that leaves me wanting more when we've had this discussion in the past. Hopefully I'll have some time to blog about it soon; in particular, I wonder if there is any possible connection or synthesis with bug-driven development as I've formulated it on my most recent blog entry.

Posted by Todd Fast on January 09, 2007 at 03:11 PM PST #

