Friday, October 27, 2000

The Inmates Are Running the Asylum

This article appears in slightly different form in the September/October 2000 issue of IEEE Micro © 2000 IEEE.

Interaction Design

This time I look at a thin book by a certified software visionary who wants to restore sanity to the way we develop computer-based products.

The Inmates Are Running the Asylum -- Why High-Tech Products Drive Us Crazy and How to Restore the Sanity by Alan Cooper (Sams, Indianapolis, 1999, 278pp, ISBN 0672316498, $25.00)

The jacket of my copy of this book identifies Alan Cooper as the father of Visual Basic and recipient of Microsoft's "Windows Pioneer" and "Software Visionary" awards. What impresses me more than these awards, however, is Cooper's story of how he became the father of Visual Basic. 

In 1988 Cooper made a deal with Bill Gates to develop a programming language that combined Microsoft's QuickBasic with Cooper's Ruby. Cooper's prototype of Ruby had impressed Bill Gates, but after they signed the deal, Cooper threw away the prototype and started from scratch. He kept "nothing but the wisdom and experience," a decision that shocked and outraged the product manager he worked with at Microsoft. To illustrate the results of this decision, he contrasts today's Visual Basic, a highly successful product, with Microsoft Windows, which is "notoriously handicapped [by] its profuse quantities of vestigial prototype code."

This story shows the depth of Cooper's commitment to investing the time to build a solid foundation at the beginning of a project. It also gives a hint of the sometimes strident tone that makes parts of his book sound a little sophomoric. Cooper has seen and accepted a few simple basic truths, and he often shows his frustration with the programmers and project managers -- the vast majority -- who seem neither to see nor to accept those truths.

Cooper's basic thesis begins with the fact that business executives are not in control of the high-tech industry. This is not a new idea. Reading this assertion sent me back to my copy of The New Industrial State by John Kenneth Galbraith (Houghton Mifflin, 1967). When that book came out, I was designing and programming turnkey systems -- computer-based systems that bundle the hardware and software necessary to implement specific applications. We designed and built systems for clinical laboratory specimen handling, data acquisition, and reporting. My boss recommended Galbraith's book because Galbraith understood how decision making in technology-based businesses was passing from the executives at the top to the teams at the bottom who do the work.

Now, a third of a century later, Cooper believes programmers and engineers are firmly in charge. According to Cooper, this situation leads to products and processes that waste huge amounts of money, squander customer loyalty, and erode competitive advantage. This is what Cooper means when he says the inmates are running the asylum. 

Computers are hard to use, and everyday objects are becoming more like computers. Cooper begins his book by asking riddles: What do you get when you cross a computer with an airplane, a camera, an alarm clock, a car, a bank, a warship? In each case, the answer is "a computer." That is, the same opaque, unforgiving interfaces that plague computer users are displacing the much more humane interfaces that many devices once had. Cooper has made it his personal mission to reverse this trend.

In 1992 Cooper gave up programming to devote himself to making products easier to use. The first fruit of this effort is his highly regarded book, About Face: The Essentials of User Interface Design (IDG, 1995). If you have anything to do with producing software for people to use, you should read About Face.

While the design principles in About Face are right, Cooper now believes the audience is wrong. The book starts from the premise that programmers, by default, do most user interface design. As a result, Cooper talks primarily to them, trying to help them design good interfaces. Unfortunately, however, even though programmers would like to design good interfaces, they don't have the time, and often don't have the skills to do so. Moreover, programmers face an overwhelming conflict of interest. 

Programming is hard. For as long as I can remember, competitive demands have forced programmers to push their systems and their tools beyond their limits. In order to survive, programmers must simplify the development process in every way they can. Making programs easy to use can conflict with making development easy. When programmers must choose between a hard-to-develop interface that mirrors the user's view of the task and an easy-to-develop interface that mirrors the program's inner workings, they invariably choose the latter. Programmers understand the interfaces that result, so they don't see why those interfaces confuse and frustrate users.

The facts that programmers are in control and that programmers can't design good interfaces lead Cooper to his main point: we must throw out the current development model and replace it with one that requires us to design the product before we build it.

This bald statement is a slap in the face to the multitude of software firms that have well documented product development processes. Such processes invariably start with a written document that makes the marketing case for the product or feature. The programmers then prepare, and circulate for approval, written documents that say what the proposed product does. Next they prepare detailed specifications of the product's inner workings. The programmers produce all of these documents before they write a single line of code.

Cooper points out that even when companies follow such a process to the letter, they rarely analyze user goals, and they never specify the interactions that the program must provide to satisfy those goals efficiently. Neither the marketing needs statement nor the functional specification identifies users clearly enough to enable programmers to make decisions as they go along. Instead, programmers try to put themselves into the user's shoes. But without clear mental pictures of actual users, they create target users who look and act like programmers. They create interactions that make sense only to other programmers.

Because he started his career as a programmer and because he has worked with many development teams as an outside consultant, Cooper knows what he is talking about here. He offers many insights into how current development practices lead to bad design and how the environment in most high-tech companies works to keep things as they are. He peppers his work with colorful phrases like cognitive friction, apologists vs survivors, the dancing bear syndrome, Homo Logicus, and conceptual integrity is a core competency. They mean little out of context, but they make the concepts easier to grasp and remember.

Cooper wants to create a new field, interaction design. He wants interaction designers to design user interactions before programmers get anywhere near the project. And he wants programmers to regard these designs the way they regard the target hardware -- as constraints, not as suggestions from which they can pick and choose.

Having envisioned the central role of interaction design, Cooper wanted to write a book about how to do it. He quickly saw, though, that there is a bigger problem: overcoming the huge inertia of the current system. He wrote this book, therefore, to make the business case for a product design methodology that starts from interaction design. Presumably he will write a how-to book about interaction design next. I look forward to reading it.

The business case for interaction design is simple. Cooper tells executives:
  • Your products are too hard to use.
  • This costs you money, because customers buy only what they absolutely need, and unhappy customers are not loyal customers.
  • There is a better way to do it (here Cooper gives a glimpse of the book he hopes to write).
  • You can make it happen if you take programmers off the hook for product quality and put that burden on the interaction design team.
While this is the business case book, not the how-to book, Cooper tantalizes us with interesting ideas and marvelous case studies. User profiles called personas are an important part of his methodology. He gives examples of actual products and shows how personas helped him simplify and focus the way people interact with those products.

There is a great deal of insight and good advice in this thin book. I strongly recommend it to anybody who has anything to do with designing computer-based products. If you're an executive in a high-tech firm, it can help you take control of a central part of your business. If you're a programmer, read it and take it to heart. It can help you understand and thrive in the coming design revolution.

Thursday, August 17, 2000

Summer Cleanup

This article appears in slightly different form in the July/August 2000 issue of IEEE Micro © 2000 IEEE.

I try to write columns around coherent themes. After a while, the number of interesting items that just didn't fit into recent columns grows to the point where I need to try to make a theme out of no theme.

This month I talk about a cautionary article, algorithms, Windows, Linux, programming utilities, and an authoring tool.

Bill Joy's Warning

In the April 2000 issue of Wired Magazine, Bill Joy, inventor of the vi editor and a whole lot more since then, calls on his fellow technologists to consider the ethical implications and possible dangers of their work.

New technology has often had bad side effects. The early nineteenth century Luddites fought the Industrial Revolution, because it did so much immediate harm. The Luddites lost, but the side effects of the Industrial Revolution were fortunately not fatal to the society at large. In time, society sorted out the problems, and the benefits soon outweighed them.

Humanity, so far, has muddled through the problems of technological change. But technology is becoming more powerful, more potentially destructive, and more widely accessible. The chance that humanity will survive its side effects, accidental consequences, or malicious misuse, drops from the near certainty of the past to a lower, more frightening, level.

Joy begins by quoting at length a plausible but exaggerated and slightly hysterical passage from the Unabomber Manifesto of technologist-turned-murderer, Ted Kaczynski. Kaczynski makes good points, but I can easily see the flaws. I don't worry that we will all become slaves of robots.

Joy's approach, by contrast, is quiet, understated, well reasoned. He looks at three technologies in which our progress is quickly outpacing our ability to control them: genetics, nanotechnology, and robotics. He calls attention to some of the possible consequences of these technologies -- plague, intelligent germ warfare, out-of-control self-replicating robots, and many others. Some of the consequences seem highly unlikely, others very likely. Many are potentially catastrophic, perhaps fatal to humanity. 

Then Joy says, "OK, what do you think about it?" That's his whole point: think about it.

Most people react to an article like this by worrying about it for a little while, then putting it out of their minds. We all have more immediate problems. We want somebody else to figure it out. We've always muddled through.

Joy is right. There are real dangers and not enough public concern. He has taken these matters to scientific and other associations, but without much success. I agree with him that we need to consider the ethical implications of what we do. But even more than that, whether we're personally involved in these technologies or not, we must all learn more and think more about how to preserve, protect, and defend ourselves, our fellow creatures, and our planet.

I have heard Joy say in interviews that he will soon establish a means for people concerned about these issues to communicate and to become involved. Watch for it.


The Advent of the Algorithm -- The Idea That Rules the World by David Berlinski (Harcourt, New York, 2000, 366pp, ISBN 0-15-100338-6, $28.00)

Math historians see David Berlinski's work the way political historians see Edmund Morris's memoir of Ronald Reagan, Dutch (Random House, 1999). They respect it as a work of art, but as history they see it as a lost opportunity.

Berlinski moves easily between fact and fiction, between scientific exposition and impressionistic evocations of the time, place, and human context of that science. He sketches a set of mathematical and philosophical trends and builds them into a coherent theme. The theme may be oversimplified, but much of it rings true.

An algorithm, according to Berlinski, is
  • a finite procedure,
  • written in a fixed symbolic vocabulary,
  • governed by precise instructions,
  • moving in discrete steps, 1, 2, 3, . . .,
  • whose execution requires no insight, cleverness, intuition, intelligence, or perspicacity,
  • and that sooner or later comes to an end.
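Euclid's procedure for the greatest common divisor is the textbook instance of a definition like this one. Here is a minimal Python sketch; the example and its comments are mine, not Berlinski's:

```python
def gcd(a, b):
    """Euclid's algorithm, meeting each of Berlinski's criteria:
    a finite procedure, in a fixed vocabulary (Python), with
    precise instructions, moving in discrete steps, needing no
    insight to execute, and certain to come to an end."""
    while b != 0:            # discrete steps: 1, 2, 3, . . .
        a, b = b, a % b      # a precise, mechanical instruction
    return a                 # sooner or later it stops

print(gcd(1071, 462))        # prints 21
```

Each pass of the loop strictly shrinks the second argument toward zero, which is why termination is guaranteed rather than hoped for.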
We all know this, but Berlinski contrasts it with the calculus (the subject of a companion book). Calculus is the basis of mathematical physics. Its continuous variables correspond directly to the real world. Its laws need no further interpretation. Physics, according to Berlinski, seeks to trace everything to final laws, which physicists discover but do not create, and historical accidents. Algorithms, on the other hand, embody intention. They manipulate symbols, which someone must then interpret, in a sequence of steps that represent a design and a purpose.

If this is so, then where does the interpretation of the genetic code fit in? Where is the design and purpose? These are the kinds of questions Berlinski attacks after he finishes his basic narrative.

In Berlinski's view, the basic history of the algorithm (like a two-minute summary of Hamlet) is as follows. 
Leibniz devised the idea of reducing thinking to manipulating symbols for a finite library of discrete concepts. Later, Peano established axioms for arithmetic, Frege incorporated them into the predicate calculus, and Gödel, in proving his incompleteness theorem, defined the class of primitive recursive functions. Shortly afterward, Church invented lambda-calculus, Turing invented his machine, and all three formulations, as well as a similar one from Post, turned out to be equivalent.
Berlinski goes into somewhat greater detail than this, and if you read the book, you can get a good sense of the essence of all of this work. Unfortunately, the book suffers from careless editing and proofreading, which can make it harder to understand such terse formulations as Church's lambda calculus.
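To give a taste of that terseness: in Church's lambda calculus a number n is encoded as a function that applies another function n times. The following Python sketch is my own illustration, not Berlinski's notation:

```python
# Church numerals: the number n is "apply f, n times over".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(two)))   # prints 4
```

Everything here is built from nothing but function application, which is exactly the economy of means that makes the original notation hard to read without careful editing.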

Berlinski intersperses little stories within the narrative. They are imaginary dialogs, sometimes involving him, or brief stories that resonate in some way with the text. At one point Berlinski talks about complexity by comparing Jackson Pollock and Andy Warhol. A Pollock painting is complex because the only way to describe it is to point to it. 

In some ways this book is simple, but in other ways it's complex. You just have to look at it for yourself. If you're interested in computers and modern thought, I think you'll find the book worth reading.

The Firewall

Earlier this year (Micro Review, March/April 2000) I wrote about Windows 2000 Server and my attempts to use a personal firewall from a company in San Francisco. I don't know where the fault lies, because I often install software that's not quite ready for prime time, but after a while my Windows 2000 system became completely unstable, so I reinstalled from the CD that Microsoft had sent me. In the interests of stability, I did not reinstall the personal firewall, because it was not certified to run on Windows 2000 Server.

My new configuration is stable, but it needs some sort of security, so I decided to set up a Linux firewall on a separate machine. Another benefit of the Linux machine, I reasoned, is that it can allow me to route Internet traffic for all of the machines on my network, even when one of them is connected to a corporate intranet via virtual private networking (VPN). I had this capability on the Windows system before I reinstalled Windows 2000 Server, but I have not been able to make it work since the reinstallation. Despite the fact that my old and new configurations both identify themselves as Build 2195, the reinstalled version seems to handle VPN and routing completely differently from the way the original version did.

In recent years I have worked almost exclusively with Windows systems, so I decided to ask a friend for help setting up Linux. It went smoothly enough, but it was like a blast from the past. My conscious mind cannot tell you how to operate the vi editor, but my fingers, perversely, deftly execute the arcane commands for inserting text, replacing characters, and so forth.

Quickly, the great benefits and the evident drawbacks of Linux sprang into view. I found a surplus machine (Pentium, 166MHz) that is more than adequate for Linux. To use it as a firewall, I installed a second Ethernet card (an old 10-Mbit one) to connect to the digital subscriber line (DSL) modem. Figuring out how to set the jumpers for IRQ and memory mapped I/O went smoothly enough. A Linux command produced a list of devices and the resources they use. IRQ 10 and memory address range 300-31F (hex) showed up as unused, so we assigned them to the Ethernet card.

This all went smoothly, but then Linux began to crash horribly. After a while we determined that the crash occurred every time we touched the mouse. Carefully avoiding the mouse, we prowled around the Linux sources and discovered that the mouse driver uses the same I/O addresses we assigned to the Ethernet card. The mouse driver, probably because it predates the facility, fails to register its IRQ and I/O addresses with the Linux command that produces the list of devices and resources. We moved the Ethernet card's I/O address block to 320 (hex), and now everything works.
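The underlying bug amounts to two address ranges colliding. A small Python sketch of the check; the mouse driver's exact range is my assumption, since the column establishes only that it overlapped the card's:

```python
def ranges_overlap(a, b):
    """True if two half-open [start, end) address ranges collide."""
    return a[0] < b[1] and b[0] < a[1]

# (start, end) pairs in hex. The mouse driver's extent is assumed;
# the Ethernet ranges are the ones described in the column.
ethernet_old = (0x300, 0x320)   # the block we first assigned
ethernet_new = (0x320, 0x340)   # after moving the card
mouse        = (0x300, 0x320)   # assumed: same block as the card

print(ranges_overlap(ethernet_old, mouse))  # prints True  (crash)
print(ranges_overlap(ethernet_new, mouse))  # prints False (fixed)
```

A driver that registered its resources would have let the kernel run exactly this test at assignment time instead of at the first mouse movement.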

Because we could look at the sources, we were able to correct the problem. Because nobody is responsible, there is no easy way to prevent other people from having the same problem. The fundamental benefits and drawbacks of Linux, side by side.

With the second Ethernet card working properly, it was easy to set up IP masquerading, the facility by which the firewall machine routes Internet traffic for the local network. The next steps, still to come, are
  • configuring the firewall for the right balance between security and convenience, and
  • enabling the firewall to route VPN traffic.
In future columns I'll report on those projects. [Postscript (November 2008): I never got these features to work, but I finally managed to get the Windows networking to work.]
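For readers who haven't met it, IP masquerading rewrites the source address of each outbound packet to the firewall's own address and records the mapping so replies find their way back. Here is a toy Python model of that bookkeeping; the addresses and port numbers are invented, and real masquerading happens inside the kernel:

```python
# Toy model of a masquerading firewall's translation table.
FIREWALL_IP = ""            # the firewall's public address

table = {}        # public port -> (private ip, private port)
next_port = 60000 # next public port to hand out

def masquerade(src_ip, src_port):
    """Rewrite an outbound packet's source, remembering the mapping."""
    global next_port
    public_port = next_port
    next_port += 1
    table[public_port] = (src_ip, src_port)
    return FIREWALL_IP, public_port

def unmasquerade(public_port):
    """Route a reply back to the machine that opened the connection."""
    return table[public_port]

pub_ip, pub_port = masquerade("", 4321)
print(pub_ip)                    # prints
print(unmasquerade(pub_port))    # prints ('', 4321)
```

The outside world sees only the firewall's address, which is why the internal machines need no public addresses of their own.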

In the course of working on the Linux and Windows systems described above, I was extremely pleased at the ease and reliability with which I was able to manipulate disk partitions using PartitionMagic from PowerQuest. I'm not elaborating on the details, because I know they have released another version since they sent me my copy, but if you need to add, remove, resize, or perform a variety of other operations on disk partitions, you should take a look at PartitionMagic.

Programming Utilities

I have reported on the Visual SlickEdit editor and the MKS Toolkit in past columns. Both of these programming tools are highly regarded throughout the industry. Visual SlickEdit is a cross-platform programmer's editor. MKS Toolkit makes Unix commands and tools available in the Windows environment. Both are recently out in new versions.

Visual SlickEdit, version 5, MicroEdge, Apex, NC, $275.00.

Visual SlickEdit for Linux is the first program I installed on my Linux system. It wasn't difficult, but it was a lot trickier than a Windows installation. It was wonderful to see its familiar screen in that programmer-friendly but user-hostile environment.

I've praised Visual SlickEdit highly in the past, but the new version is even more impressive. I don't do enough programming to use all of its features, but I recently used some of its new features when I had to publish printed versions of a set of Java classes. Java sources in electronic form tend to use vertical space lavishly, which can spread them out over several pages, making them hard to read. Reformatting them by hand can lead to errors, so I used the Visual SlickEdit code beautifier to do most of the job for me. The beautifier gives you a choice of styles, and, as with everything else in Visual SlickEdit, if you don't like the available choices, you can write your own in MicroEdge's macro language, Slick C.

While testing my new versions of the code, I realized that I must not have started from the latest version of the sources. I had to compare the old and new versions to see what I would have to change. Visual SlickEdit provides an amazing tool for doing this. It's called DIFFzilla, and while it's a little hard to figure out what the controls do, the side-by-side views it presents are awe-inspiring in their clarity and accuracy.

I didn't need to merge changes, but Visual SlickEdit does that even more smoothly now than it did in earlier versions. It also handles braces (curly brackets) more intelligently than it used to.

Previous versions of Visual SlickEdit had context tagging, a feature that allows the editor to give you language-specific help while you're entering a program. The current version improves on this with support for HTML, Cobol, Ada, and Delphi. It also provides support for displaying and editing Javadoc comments, the Java facility for embedding documentation in source programs.

If you're a current Visual SlickEdit user, the new features are well worth the upgrade price. If you don't have a powerful programmer's editor, rush out and get Visual SlickEdit. It will do wonders for your productivity.

MKS Toolkit, Mortice Kern Systems, Waterloo, Ontario, $400 - $2500.

Since I first reviewed this product (in 1992), it has matured and grown, and its marketing has become much more sophisticated. Its basic focus, however, remains the same: give programmers working in a Microsoft environment the same powerful programming tools that Unix programmers take for granted.

The product now comes in a variety of versions -- from the simplest package for a single developer to complete support for cross-platform, web-enabled enterprise development. The high end products are pricey, so be sure you know what you're buying, and why, if you take that route. I don't have enough experience with the high-end products to make a recommendation.

The basic developer tools, however, are essential productivity enhancers for any Windows programmer. If you're a current user, take a look at the new versions to see if they offer anything important that you're not getting now. If you're not a current user, you need to get MKS Toolkit and learn how to use it. It will pay off for you in the long run.

FrameMaker 6

FrameMaker is the leading tool, at least in and around Silicon Valley, for producing printed hardware and software manuals. FrameMaker was not originally an Adobe product. Adobe acquired it a few years ago, and for the last year or so FrameMaker users have worried that Adobe was letting it die. The recent appearance of FrameMaker 6, and its cousin FrameMaker Plus SGML 6, has quelled those worries for now.

FrameMaker 6 and FrameMaker Plus SGML 6, Adobe, San Jose, CA, $849 and $1449.

These new versions are not vastly different from the 5.5 versions. The biggest story is that they came out at all, given the doubts in the FrameMaker community about Adobe's commitment to the product.

While there are no big differences, there are a number of important improvements. The first is a greatly improved book mechanism. In the new version, the book file controls the structure more cleanly and closely, and many operations -- like search and replace -- now work across an entire book, where the old version made you perform them one file at a time. The new version also supports the notion of a collection of books, and you can set volume, chapter, and page numbering in the book file. You can now specify a chapter's number as "continue from previous," so that you need not specify chapter numbers explicitly. This makes it easier to rearrange the chapters of a book or to share a chapter between two books.

FrameMaker 6 simplifies the process of including chapter numbers in table, figure, and page numbers. The chapnum variable replaces a complex scheme based on paragraph numbering.

Another problem with the old version was the inadequate mechanism for generating HTML from FrameMaker files. Rather than trying to improve this mechanism, Adobe has dropped it entirely and now bundles a version of Quadralay's WebWorks Publisher with FrameMaker. WebWorks Publisher handles this task extremely well.

The FrameMaker Plus SGML product is just like FrameMaker in most respects, but it facilitates generating SGML (or XML) documents. Rather than tagging with paragraph and character formats (you still can, if you like), you apply tags that show the element's place in the document structure.

If you are upgrading from FrameMaker 5.5 and you have any interest in generating XML documents, consider upgrading to FrameMaker Plus SGML 6. Adobe charges the same amount for an upgrade to FrameMaker Plus SGML 6, whether you are upgrading from FrameMaker 5.5 or FrameMaker Plus SGML 5.5.

FrameMaker is still the standard for large printed manuals, and with version 6, it looks like it will continue to be the standard for some time to come.

Sunday, June 25, 2000

The Humane Interface, The Deadline

This article appears in slightly different form in the May/June 2000 issue of IEEE Micro © 2000 IEEE.

Doing It Right

This time I look at two books that aim at making life easier for humans. The first tells how to make computer interfaces that humans can use. The second uses fiction to explore the ways humans can work together to produce good software.

The Humane Interface -- New Directions for Designing Interactive Systems by Jef Raskin (Addison-Wesley, Reading MA, 2000, 254pp, ISBN 0-201-37937-6, $24.95)

[NOTE: Jef Raskin died shortly after this review appeared.] Jef Raskin's best known work is the Macintosh. That product brought about a revolution in computer interface design, but yesterday's revolutionary idea has become today's entrenched paradigm. Raskin has moved on. In this book he shows the flaws of the desktop-and-application approach and explains how it can evolve into something much easier for humans to use.

Raskin centers his ideal system on your content -- not named files in a hierarchy of directories, but just content. You can have files and directories, but only if you decide to mark their boundaries in an otherwise undifferentiated sea of content.

Rather than applications to process your content, Raskin gives you individual commands. Thus, rather than buying Photoshop and Word, you buy individual (or perhaps groups of) image processing commands from Adobe and text manipulation commands from Microsoft. You can use the text commands to add text to images, and you can use the image processing commands to manipulate the images in your printed documents.

This sounds like component software or Unix filters. It's not a radically new idea, but Raskin arrives at it from a different direction. He begins by asking how he can make interfaces that humans can learn easily and use efficiently. To answer that question, he looks at the equipment on both sides of that interface: the computer and the human.

Many people have studied human cognition, but few have applied what we know about the capabilities and limitations of humans to the problems of interface design. To do this, Raskin applies techniques and observations that the cognitive psychologist Bernard J. Baars discusses in his book A Cognitive Theory of Consciousness (Cambridge, 1988).

Raskin takes a fundamental principle from Baars' work: humans can accomplish many tasks in parallel, but can only pay attention to one at a time. We all know this, but many people design interfaces as if it weren't true. Raskin gives examples of this error -- taken from widely used software products.

The fact that we have at most one locus of attention, while most tasks we perform with computers require us to accomplish a variety of subtasks in parallel, leads to the principle of automaticity: the more we can do without thinking, the more efficient we are. Anything that makes us think about what we already know how to do slows us down. This principle leads to the following conclusions:
  • Interfaces should be modeless - the way to accomplish a task should be the same under all circumstances.
  • Interfaces that change in an attempt to adapt to your actions can actually slow you down.
Raskin elaborates on these points with many examples. Some of the examples are surprising. They show the inefficiency of widely practiced interface design techniques.

Raskin turns a lot of attention to the problems of navigation. He likens current navigation methods in applications, operating systems, and the web to trying to find your way around a maze with only the ground-level view of where you are and where you've been. He proposes a two-pronged approach to improving this situation: a zooming video camera metaphor for finding your way around a pictorial representation of your content, and a text-search facility that differs sharply from the most commonly used search facilities.

Raskin also applies quantitative methods to interface design. He uses the goals, operators, methods, and selection rules (GOMS) technique developed by Stuart Card, Thomas Moran, and Allen Newell to measure the relative efficiencies of alternative interfaces.
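In its keystroke-level form, GOMS assigns an average duration to each primitive operator and sums them over a task. The Python sketch below compares two invented ways of issuing the same command; the operator times are the commonly cited estimates from Card, Moran, and Newell's work, and the operator sequences are mine:

```python
# Keystroke-level model: average seconds per primitive operator,
# as commonly cited from Card, Moran, and Newell.
OPERATOR_TIME = {
    "K": 0.2,    # press a key or button
    "P": 1.1,    # point with the mouse
    "H": 0.4,    # move hands between keyboard and mouse
    "M": 1.35,   # mental preparation
}

def estimate(sequence):
    """Total predicted time for a string of operator codes."""
    return sum(OPERATOR_TIME[op] for op in sequence)

# Two invented ways to invoke the same command:
menu_version     = "HMPKPK"   # reach for mouse, two menu picks
shortcut_version = "MKK"      # a two-key keyboard shortcut

print(round(estimate(menu_version), 2))      # prints 4.35
print(round(estimate(shortcut_version), 2))  # prints 1.75
```

Crude as the model is, comparisons like this one are what let Raskin put numbers behind claims that one interface is more efficient than another.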

I haven't come near to covering all of the topics Raskin addresses in this marvelous book -- icons, programming environments, documentation, the number of buttons on a mouse, and even cables and connectors.

If you have anything to do with designing any aspect of computer systems for use by humans, you should read this book. People will be talking about it for a long time.

The Deadline -- A Novel about Project Management by Tom DeMarco (Dorset House, New York, 1997, 320pp, ISBN 0-932633-39-0)

In 1964 I knew assembly language for the IBM 650, and I had recently learned Fortran II. I sat down with the IBM Cobol manuals to see what that language was all about. It took me very little time to decide that it was not a language I wanted to program in. I don't regret this decision, but it shut me out of the world in which Tom DeMarco was later to flourish.

Fifteen years and many languages later, I wound up (briefly) advising a large California bank, and the book that bridged the gap between my way of thinking and theirs was Tom DeMarco's Structured Analysis and System Specification (Prentice Hall, 1979).  I still remember how exciting it was to use DeMarco's methods to produce diagrams that represented the elements of the bank's proposed application.  The bank, however, was less concerned with the quality of the information than with the fact that my "work product" was drawn in pencil on D-size graph paper.  We parted company.  They later went out of business.  

In 1979, DeMarco was insightful.  Today he is wise.  Then he was concerned with how software modules work together.  Today he focuses on how people work together.  His book is fiction, but it teaches many real lessons about project management and team building.  

Here is the essence of the plot. Mr. Tompkins, a middle-aged middle manager, is laid off from a thinly disguised AT&T, kidnapped by a beautiful and resourceful industrial spy, and spirited away to Moravia, a post-Communist third world country somewhere on the Adriatic coast. A thinly disguised Bill Gates has acquired this country secretly in a stock swap and has decided to help it dominate the shrink-wrap software business by producing knockoffs of Quicken, Photoshop, Quark XPress, PageMill, Painter, and Lotus Notes, and giving them away.

Tompkins takes the job of managing this development. Because Moravia has far more programmers than required to develop these six products, Tompkins sets up three parallel projects for each product, turning the whole operation into a project management laboratory. He gets carte blanche from Bill, and everything seems to be going smoothly.

Just as Tompkins is beginning to feel complacent, Bill returns to the States to work on his house, leaving the sinister bean counter Allair Belok in charge. Belok embodies every stupid, unscrupulous, bullying executive you've ever worked for. Sadly, his tactics seem true to life. They certainly rang a few bells for me. Tompkins must face arbitrarily shortened schedules, merging of his parallel projects, and forced overtime. On top of this, he must contend with the well-meaning purveyors of process improvement, whom Belok sics on him with even more unreasonable goals.

Tompkins is not alone in his struggles, however. Belinda Binda, the world's greatest project manager, burnt out and now a bag lady, agrees to help Tompkins staff his projects. So does ex-General Markov, former head of software development for the Moravian army. Lahksa, the beautiful, resourceful spy, runs around the world sending Tompkins consultants for brief visits. The reclusive Aristotle Keneros, Moravia's first programmer, helps to divert the process improvement folks and teaches Tompkins the importance of debugging the decomposition and interfaces during the design phase.

That's it for the plot. DeMarco patterned Mr. Tompkins after George Gamow's character of the same name. Gamow had a wonderful ability to explain physics and mathematics with stories. In One Two Three . . . Infinity, one of my favorite books when I was in high school, he explains complex numbers with a story about buried treasure. Gamow's Mr. Tompkins is a bank clerk who goes to lectures on science, falls asleep, and has wonderful dreams that make the concepts clear.

DeMarco has followed that tradition admirably. His chapters are little vignettes of project management. A problem arises, a consultant shows up to help solve the problem, and Tompkins adds a few aphorisms to his journal. According to DeMarco, most of these aphorisms come from his own journal and represent lessons he learned the hard way.

Here are a few of the aphorisms that I especially like: 
  • Four Essentials of Good Management: Get the right people, match them to the right jobs, keep them motivated, and help their teams to jell and stay jelled. (All the rest is administrivia.)
  • There are infinitely many ways to lose a day . . . but not even one way to get one back.
  • People under pressure don't think any faster.
  • It's not what you don't know that kills you . . . it's what you know that isn't so.
The last one is an old proverb, but DeMarco applies it to one of the sacred cows of software development, code inspection.  

I've discussed the more general aspects of DeMarco's book here, but parts of it get pretty technical -- though rarely enough to bog down the story.  DeMarco believes in metrics and modeling as project management tools, and several of his vignettes show surprising ways to use those tools.  

At the end of the book, Tompkins gives away his journal, saying "I can never imagine opening it again. I don't need to. I carry those hundred and one principles everywhere I go. They're carved into my hide." The book is a crash course in project management and team building. If you do any sort of technical development, you should read it and absorb it.

Thursday, April 27, 2000

Windows 2000

This article appears in slightly different form in the March/April 2000 issue of IEEE Micro © 2000 IEEE.

On February 17, 2000, I attended Bill Gates' kickoff of Windows 2000. I've been using beta versions for a year, but now it's official, so I can talk about it.

The kickoff was an extravagant production. Television actors provided glamour as they struggled through ghastly scripts. The captain of the starship Enterprise, for example, came on stage to complain when Gates used the word enterprise to describe his new product's target applications. The great rock guitarist Carlos Santana and his band closed the proceedings, while an army of reporters and publicists fiddled with their laptops and cell phones and pretended to enjoy the music.

Unlike Santana, the kickoff show may never win a Grammy, but the demonstrations of features and performance inspired awe among the attendees. Gates trotted out benchmarks that put Windows 2000 price/performance ahead of all of Microsoft's competitors -- and well beyond that of prior Microsoft systems. Even more impressive were the demonstrations of dynamic reconfiguration and load balancing in multiprocessor clusters. An operator at a console put machines into and out of service by dragging and dropping icons, and processor usage gauges immediately reflected the automatic rebalancing. Other demos showed how easily a user with a laptop can synchronize with a server and how effectively an administrator can control and allocate resources with Active Directory (see below).

The most impressive measurements that Gates announced were of how infrequently the systems crash. Windows 2000 seems, at least in these tests, to be much more reliable and fault tolerant than its predecessors. My own experience (see below) hasn't been nearly as good as the studies Gates reported, but I test lots of software, and I reconfigure often -- all without the benefit of a trained system administrator.

The Operating System

Windows 2000 is a family of operating systems, all targeted at business users. It does not, as you might have thought from the name, replace Windows 98. It is instead an evolution of Windows NT 4. In fact, the earliest beta versions I received were called Windows NT 5.

The bottom of the line is Windows 2000 Professional, which is aimed at business desktops and laptops and at high-end workstations. There is nothing to prevent you from using this product at home, of course, but in seeking to improve security, Microsoft has removed hooks into the hardware that many video games depend on. Drivers for many older devices have also had difficulty migrating to Windows 2000, which is another obstacle to its home use.

The next member of the family is Windows 2000 Server. It's not very different from Professional, and many users will prefer it. Microsoft intends it for use as a file, printer, communications, or web server.

For high-end server applications, Microsoft provides Windows 2000 Advanced Server. This version supports huge memories and symmetric multiprocessor (SMP) configurations. It supports clustering and rolling upgrades. You might use a cluster of Advanced Server machines to support a high-traffic website.

The fourth member of the family is Windows 2000 Datacenter Server. When Microsoft releases it later this year, it will do what the other family members do, but it will support even larger memories and more processors.

I quickly decided that I needed Windows 2000 Server for the tasks I want to run, so that's the only version I have direct experience with. I'm happy with it, because its user interface is much more like that of Windows 98 than that of Windows NT 4, and it's easier to set up unusual network configurations. I connect my Windows 2000 Server machine to the Internet via a digital subscriber line (DSL). I have a local Ethernet, and I use the Windows 2000 Server as a gateway to give the other machines Internet access. At the same time I connect the Windows 2000 Server as a client to an enterprise intranet via virtual private networking (VPN). This configuration may have been possible with Windows NT 4, but try as I might, I could never make it work.

I won't run through all of the features of Windows 2000. You can find a summary on the Microsoft website. If you use Windows NT 4, or even Windows 98 with standard business software and devices, you'll find Windows 2000 a significantly more capable, usable and reliable product.


Windows 2000 will spawn a huge supply of third party books, and many have appeared already. I look at four good ones.

Active Directory for Dummies by Marcia Loughry (IDG, 2000, 402pp plus CD, ISBN 0-7645-0659-5, $24.99)

Windows 2000 Registry for Dummies by Glenn Weadock (IDG, 2000, 378pp plus CD, ISBN 0-7645-0489-4, $24.99)

It may seem incongruous to talk about anything as complex as Windows 2000 as being for dummies, but these two books, and the one that I discuss under security (below), adhere to the same formula that has made the dummies series such a runaway success. The authors know their audiences, and they talk to them as intelligent people who are just beginning to learn the subjects. While they must assume a certain degree of sophistication and general background, the authors explain everything about the topics of the books.

In addition to targeting their audiences accurately and not taking anything for granted, the dummies books enhance communication in other widely known but less widely used ways. Their page layouts, font selection, and clear illustrations draw readers in. The icons for warnings, tips, technical details, and other aspects of the text are consistent from book to book, so the more dummies books you read, the easier it is to find the information you're looking for. The informal and slightly humorous tone of the books helps establish a rapport between author and reader, despite the highly formulaic structure.

Another great strength of the dummies books is that they are task oriented. They identify the tasks you're likely to wish to perform, and they show you how to perform them. Yet the authors don't restrict themselves to a cookbook approach. With each task they give you the background to understand what you're doing and why you're doing it. This technique is sometimes called just-in-time learning, and studies show that it is an effective way to learn.

Blatantly stealing Apple's famous catch phrase, the dummies books proclaim themselves "a reference for the rest of us." And in fact, in addition to their tutorial elements, they contain many aspects of good reference works -- starting in every case with an excellent index. The Active Directory book also has several helpful appendixes.

The registry book contains information every Windows 2000 user should know about. The registry is an evolution of the registries of Windows NT 4 and Windows 98. If those were always a mystery to you, read this book now.

The Active Directory book may be more interesting to administrators than to average users. Active Directory centralizes resource allocation and control. It is a database of configuration information, some of which would have been stored in the registry under Windows NT 4. The book leads you through the daunting task of setting up directory services for an enterprise. 

Given the important role Windows 2000 will play over the next few years, I recommend that you spend a few hours reading these books and getting the ideas straight now. That knowledge is bound to pay off as time goes by. 

Inside Windows 2000 Server by William Boswell (New Riders, 2000, 1496pp, ISBN 1-56205-929-7, $49.99)

Boswell's book is very different from the dummies books. It contains a great deal more detail, but you may have to dig harder to get it out. It is definitely more of a reference work than a tutorial. Although it contains many how-to procedures, it is not fundamentally task oriented.

The book has an attractive and readable layout, and its binding allows it to lie flat when open to almost any of its 1500 pages. It is comprehensive and clearly written. While I can't attest to the accuracy of everything in it, it does list three technical reviewers.

If you're a system administrator, you need this kind of reference book. This one seems like a good investment.


I became a regular Internet user in the early 1980s, but I never worried much about security until recently. This was not because the threats weren't real. I remember reading a congressional report on databases and invasion of privacy more than 30 years ago, and the threats were scary then. And if you want to see how that aspect of the problem has developed, read Simson Garfinkel's new book Database Nation (O'Reilly, 2000, ISBN 1-56592-653-6, $24.95).

While I think the kinds of threats that Garfinkel describes are more ominous, I'm concerned here with the kinds of threats you can reasonably do something about on your own Windows 2000 Server machine, namely, the threats of unauthorized access to your machine and to other machines on the networks your machine connects to.

Windows 2000 Server Security for Dummies by Paul Sanna (IDG, 2000, 378pp plus CD, ISBN 0-7645-0470-3, $24.99)

If you're a system administrator, this book will lead you through the basic steps you can take to protect your system without totally disconnecting it from the world. Like the dummies books I describe above, this book leads you through the tasks you need to accomplish, supplying you at each stage with the information you need to understand the procedure you're following.

If you're responsible for the security of a Windows 2000 Server system, reading this book is a really good idea.

I found another excellent resource for securing Windows-based systems (not necessarily Windows 2000): the website of Gibson Research of Laguna Hills, California. When you go to this site and the first thing you see is your name on the screen, you know you have work to do. Steve Gibson leads you through some simple steps that will greatly reduce your vulnerability.

One of Gibson's recommendations is to use the Zone Alarm 2.0 personal firewall, which you can obtain free from Zonelabs of San Francisco, California. I downloaded this product and have been using it more or less successfully. Zonelabs does not certify the product for the configuration I'm running, so I don't blame them completely for the occasional blue screens of death that occur when the program fails to handle kernel mode exceptions properly.

Using Zone Alarm has been very revealing. It quite regularly reports probes from unauthorized IP addresses. I'm still tweaking the settings and learning to use it effectively, but I'm happy to have it running, despite the occasional crashes. It's certainly a product worth investigating, and you can't beat the price.

Monday, February 14, 2000

Happy New Year 2000

This article appears in slightly different form in the January/February 2000 issue of IEEE Micro © 2000 IEEE.

A Look Back

I looked back at my first columns of previous years to see if I could spot trends. It's an interesting progression. I'll let you decide about the trends.

In 1988 I complained about the way Word 3.01 implemented styles. I also reviewed books about Word Perfect and microcomputer busses.

In 1990 I reviewed a book about the problems of Japanese language computing. Most of those problems have evaporated in the intervening 10 years.

In 1992 I wrote about upgrading my Macintosh SE/30 to 8 megabytes of main memory and installing the System 7 operating system.

In 1993 I compared Word 5.1 for the Macintosh with Word for Windows 2.0, and reviewed MKS Toolkit 4.1 for DOS and books about debugging, Windows NT, friendly software design, and a realtime kernel.

In 1994 I reviewed some Macintosh utilities and a book about the Intel Pentium architecture.

In 1995 I reviewed four books about the Internet and the TCP/IP protocol.

In 1996 I reviewed Bill Gates' The Road Ahead, a book about the coming information highway. He suggests that you view mergers of ISPs and content providers skeptically. I wonder what he thinks of AOL's acquisition of Time Warner.

In 1997 I wrote about object-relational databases and reviewed a book about how Microsoft conducts its business.

In 1998 I wrote about how Apple was beginning to make a comeback, and I reviewed three interesting books about how technology will affect the society of the future.

In 1999 I wrote about Apple's G3 and iMac computers, reviewed a four-volume handbook of programming languages, and talked about my difficulties getting the Windows NT 5 beta software to run on my PC.

This year I'm still struggling with Windows NT 5 (now they call it Windows 2000 Server) beta software, but it's a lot better. At the moment I'm running Build 2195, and I'm quite happy with it. I'll say more when I see the released product.

Practical Programming Advice

I've read many books about programming in the last 35 years, and from time to time I review especially good ones. In the past the good ones have been few and far between, but that seems to be changing. In my Nov/Dec 1999 column, I reviewed Extreme Programming Explained by Kent Beck. This time I was delighted to discover another excellent programming book from the same publisher. 

The Pragmatic Programmer -- From Journeyman to Master by Andrew Hunt and David Thomas (Addison-Wesley, Reading MA, 1999, 321pp, ISBN 0-201-61622-X, $34.95)

Andrew Hunt and David Thomas are software consultants doing business as The Pragmatic Programmers. They have produced an outstanding book about programming, and many of the insights they offer apply to other engineering disciplines as well.

In August 1993 I reviewed Steve McConnell's Code Complete (Microsoft, 1993), which assembles in one thick book everything that a journeyman programmer should know. Hunt and Thomas, in a thin book, give you a glimpse of the kind of thinking that leads to the next step -- mastery of the craft of programming. More broadly, if you participate in technological innovation and have access to computer-based tools, you can probably apply many pragmatic programming principles to your own work.

In his foreword to the book, Ward Cunningham says:
Imagine that you are sitting in a meeting . . . thinking that you would rather be programming. Dave and Andy would be thinking about why they were having the meeting, wondering if they could do something to take the place of the meeting, and deciding if that something could be automated so that the work of the meeting just happens in the future. Then they would do it.
To follow an approach like this, you must always be thinking about what you are doing, and you must be so comfortable with your tools that you can take the step from thought to reality.

Hunt and Thomas talk about tools and how to be comfortable with them. They focus strongly on basics like using plain text, command shells, scripting languages, and a good text editor. The main thrust of their book, however, is about attitude -- an approach to life. This could have led them to a programming version of the Tao Te Ching, but instead they have stayed true to their title. They find practical examples to illustrate everything they say.

Much of what Hunt and Thomas say in this book is simple folk wisdom -- care about your work, take responsibility, know when to stop, and so forth. Yet when I look at the whole package, with each homily applied to specific technical contexts, I find the book inspiring. I enjoy the affirmation of many things I've believed for years, and it moves me to return to paths I've strayed from.

The late Rudolph Langer, a great programmer and for many years the editor-in-chief of Sybex, used to collect books of aphorisms. He would have loved the way the authors boiled down much of their advice into 70 catch phrases. For example, "DRY -- don't repeat yourself" summarizes a deep and important discussion of how to eliminate duplication and increase orthogonality. "Configure, don't integrate" summarizes an important insight into how to use metadata to increase the reliability and flexibility of your systems.
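The DRY idea is easy to sketch. Here is a minimal illustration of my own (not an example from the book, and the tax-rate scenario is invented): a figure that the business logic needs in more than one place lives in exactly one, so a change can never leave a stale copy behind.

```python
# A hypothetical DRY sketch: the rate is defined once, so every function
# that needs it stays in sync when it changes.
TAX_RATE = 0.0825  # invented rate; the single source of truth

def item_total(price, quantity):
    """Pre-tax total for one line item."""
    return price * quantity

def invoice_total(items):
    """Tax-included total for a list of (price, quantity) pairs.

    Note that the tax rate is not repeated here; duplicating the
    literal 0.0825 in several functions is exactly the kind of
    repetition DRY warns against.
    """
    subtotal = sum(item_total(p, q) for p, q in items)
    return round(subtotal * (1 + TAX_RATE), 2)
```

The payoff is orthogonal in the authors' sense: changing the rate touches one line, and no function's correctness depends on a copy kept elsewhere.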

When I programmed the PDP-8, I treasured a three-fold instruction card that fit in my shirt pocket. It was the only reference I ever needed. Hunt and Thomas have not matched that size target, but their book comes with a quick reference card. It contains their 70 aphorisms (they call them tips) and 11 checklists. For example, the Law of Demeter for Functions checklist says:
An object's methods should call only methods belonging to:
  • Itself
  • Any parameters passed in
  • Objects it creates
  • Component objects
The authors' discussion of this law occurs in a chapter on flexibility. It's a good example of how they move easily between generalities and specifics.
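The checklist is easy to picture in code. Here is a small sketch of my own, with invented class names, not an example from the book:

```python
# A hypothetical Law of Demeter sketch. Order.charge talks only to a
# component object (self.account); it never reaches through it to some
# deeper object, which would couple Order to Account's internals.

class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        self.balance -= amount

class Order:
    def __init__(self, account, amount):
        self.account = account  # component object
        self.amount = amount

    def charge(self):
        # Allowed: a method on a component object.
        self.account.withdraw(self.amount)
        # A violation would look like reaching through the component,
        # e.g. self.account.owner.bank.transfer(...), tying Order to
        # the internal structure of Account.
```

Keeping calls within the four allowed categories means a change deep inside Account cannot ripple out to Order, which is the flexibility the chapter is after.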

I could go on and on about this book -- the discussions of projects, the exercises and answers, the references, the bibliography and resource lists. It hangs together and communicates its message extremely well. And not surprisingly, the authors applied many of the methods and philosophy they describe to producing it.

If you're a programmer, or even if you're in a seemingly unrelated engineering discipline, Hunt and Thomas have a lot to say to you. Buy their book. Read it. Become a pragmatic programmer.