Friday, April 27, 2001

Pervasive Technologies

This article appears in slightly different form in the March/April 2001 issue of IEEE Micro © 2001 IEEE.

The books I look at this time are about pervasive technologies. The publishers are highly respected producers of books for programmers and systems designers, but these books are for a wider audience.

Computers as Components: Principles of Embedded Computing System Design by Wayne Wolf (Morgan Kaufmann, San Francisco CA, 2001, 688pp, $64.95)

Wayne Wolf is a professor of Electrical Engineering at Princeton University, a former employee of AT&T Bell Laboratories, and a graduate of Stanford University. He is editor-in-chief of IEEE Transactions on VLSI Systems and a well-known researcher in the fields of embedded systems and hardware/software codesign. He is eminently qualified to write this much-needed book.

Embedded systems are as old as microprocessors, which means they have been around for about thirty years -- longer if you count dedicated minicomputer-based "turnkey" applications, such as the clinical laboratory data acquisition and reporting systems I worked on in the 1960s. Creating such systems has remained challenging. Programming and debugging facilities tend to be limited by the often primitive and non-standard nature of the underlying hardware. In the late 1980s, for example, when object-oriented high-level languages and integrated development environments were already widely available, I worked on a new version of a medical instrument. It used an 8-bit Intel 8085 microprocessor (introduced 12 years earlier), programmed in assembly language, to control a complex control panel and an automated data acquisition system. The underlying software was based on a minicomputer operating system I had worked on in the 1960s. The debugger allowed me to examine the contents of memory (in hexadecimal) and to place a breakpoint at a specified memory location. I had no simulator -- the test bed was the instrument itself.

Designing hardware and software to work together has lagged behind software development on standard software platforms. UML diagrams, CRC cards, concurrent engineering, and design reviews, for example, are in the mainstream of software development, but embedded system development projects rarely use them. In her introduction, Lynn Conway, herself the coauthor of a landmark book, praises this book for providing a systematic approach to embedded system design, based on such systems development techniques and processes.

Now, as the market for embedded systems expands exponentially, the time has come to civilize this frontier. Wayne Wolf's book shows the way to do that. It is essentially a course in computer science and software development, oriented totally to the problems of embedded systems designers. Wolf assumes that his readers are not computer scientists, so he covers many topics that other books take for granted. He complements this self-contained introduction to embedded computer science with a number of detailed design examples, taken from real projects at places like Bell Labs.

If you are planning to teach a course in embedded systems design, or if you are a longtime practitioner and want to bring your skills up to date, this is the book for you.


Introducing .NET by James Conrad et al. (Wrox, Birmingham UK, 2000, 462pp, www.wrox.com, ISBN 1-861004-89-3, $34.99)

Sixty-five million years ago, dinosaurs ruled the earth. A huge meteor struck, and the environment changed. The dinosaurs died out, and eventually we arrived at today's flora and fauna.

Ten years ago, Microsoft ruled the computer world. Objects, the web, Java, and XML struck in rapid succession, and the environment changed. Software can evolve more rapidly than dinosaurs could, and Microsoft's new platform, .NET, makes MS-DOS and Windows 3.1 look Jurassic by comparison. The .NET platform addresses the challenges of objects, the web, Java, and XML. 

To support object-oriented development, .NET replaces COM with the common language runtime (CLR), which allows all Microsoft languages to share an object management environment. Objects have formats that are independent of source language. They share garbage collection, exception handling, and a common class library, including common datatypes.

To support deploying applications on intranets or the web, Microsoft is developing ASP.NET, a complete overhaul of the slow, error-prone, language-limited, non-object-oriented ASP (Active Server Pages) that today's developers are forced to use. Complementing that development is ADO.NET, which replaces today's ADO (ActiveX Data Objects) in much the same way that ASP.NET replaces ASP. Finally, the Simple Object Access Protocol (SOAP) and Microsoft's Web Services and Web Forms promise to make web-deployed applications look very similar, in appearance and in implementation, to applications deployed on desktop systems.

Microsoft's response to Java is a little more subtle, entangled as it is in legal wrangling. The .NET platform moves toward Java's "write once, run anywhere" promise by encapsulating the operating system in a common set of classes that have different implementations on different platforms. These serve a function similar to the Java virtual machine. Microsoft has also sidestepped Sun's control of Java by developing its own Java-like language called C#. While similar in form and philosophy to Java, C# introduces a few improvements. For example, it standardizes the getting and setting of object properties.
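To see what C# standardizes, it helps to look at the convention it replaces. In Java, property access is only a naming convention, not a language feature; every field needs hand-written accessor methods. The sketch below is my own illustration (the class and field names are invented), showing the getter/setter boilerplate that C#'s built-in properties formalize.

```java
// Hypothetical example: Java has no property syntax, so each field
// gets accessor methods by convention. In C#, Thermostat would declare
// a Target property and callers would write t.Target = 20.
public class Thermostat {
    private int target;   // desired temperature, in degrees

    public int getTarget() {            // "getter" -- convention only
        return target;
    }

    public void setTarget(int target) { // "setter" -- convention only
        if (target < 0) {
            throw new IllegalArgumentException("target must be non-negative");
        }
        this.target = target;
    }

    public static void main(String[] args) {
        Thermostat t = new Thermostat();
        t.setTarget(20);                // C# property syntax: t.Target = 20;
        System.out.println(t.getTarget());
    }
}
```

Because nothing in Java enforces the get/set naming, tools that rely on it (like JavaBeans introspection) can only guess at a programmer's intent; making properties part of the language removes that guesswork.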

Finally, to mix a metaphor, Microsoft has embraced XML and jumped into it with both feet. Much of the .NET functionality depends on the XML-formatted metadata that accompanies every object. XML also provides the base format for remote interfacing, and it serves as the working data format for ADO.NET. Both the System and System.Data class hierarchies devote a namespace to XML.

Wrox Press specializes in books for programmers. They have assembled a team of 10 men, all with programming backgrounds, to put together a survey of Microsoft's .NET plans, based on Microsoft's public statements and on their own experiences with the public beta release of the .NET software development kit. They give a pretty coherent picture, hedged in disclaimers, of what the first release -- probably still about a year away -- will look like. If you expect to develop software for Microsoft systems in the next few years, you need to know all about .NET, and these authors are excellent guides to the territory. Don't expect definitive information, but if you can stand the missing pieces and loose ends of a work in progress, this book will get you up to speed.


Learning XML: Creating Self-Describing Data by Erik T. Ray (O'Reilly, Sebastopol CA, 2001, 368pp, www.oreilly.com, ISBN 0-596-00046-4, $34.95)

Erik T. Ray (not to be confused with Eric J. Ray, whose HTML books I reviewed here a few years ago) describes himself as a software wrangler and XML guru for O'Reilly and Associates. He wrote this relatively short book to give readers a bird's-eye view of the XML field. He succeeds remarkably well at that, but it is more than a bird's-eye view. At appropriate points Ray delves deeply into the details by presenting complete, clearly written examples.

If you plan to work with XML to produce technical documentation, this book pays for itself many times over. Ray includes a completely worked out DTD called Barebones DocBook, which you can probably use as is. He also includes an XSLT stylesheet for producing HTML from Barebones DocBook documents.

If you plan to write programs to process XML, you might use, and can learn a lot from reading, Ray's Perl code for an XML syntax checker.
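Ray's checker is written in Perl and isn't reproduced here, but the underlying idea -- run the document through a conforming parser and report the first syntax error -- is easy to sketch. Here is my own minimal analogue in Java, using the standard SAX API; it checks well-formedness only (balanced, properly nested tags), not validity against a DTD.

```java
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;
import java.io.StringReader;

// A minimal well-formedness checker: hand the document to a SAX parser
// and see whether it parses cleanly. This is a sketch of the idea
// behind Ray's Perl checker, not his code.
public class WellFormed {
    public static boolean check(String xml) {
        try {
            SAXParserFactory factory = SAXParserFactory.newInstance();
            factory.newSAXParser().parse(
                new InputSource(new StringReader(xml)), new DefaultHandler());
            return true;    // parsed without error: well-formed
        } catch (Exception e) {
            return false;   // parser reported a syntax (or I/O) error
        }
    }

    public static void main(String[] args) {
        System.out.println(check("<book><title>Learning XML</title></book>"));
        System.out.println(check("<book><title>Learning XML</book>")); // bad nesting
    }
}
```

A production checker would report the error's line and column (SAXParseException carries both) rather than a bare boolean, but the shape of the program is the same.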

Many authors dump sample code into their books, but the XML, XSLT, and Perl examples in this book are well organized, clearly formatted, well annotated, and easy to understand.

For brevity, completeness of coverage, clarity of writing, and usefulness of examples, this is the best XML book I have seen. I recommend it highly.

Tuesday, February 27, 2001

Context

This article appears in slightly different form in the January/February 2001 issue of IEEE Micro © 2001 IEEE.

In 1968 Arthur C. Clarke and Stanley Kubrick created the film 2001: A Space Odyssey. In the January 2001 issue of National Geographic, Clarke discusses prospects for achieving in real life some of the fictional events of the film. He does not, however, discuss HAL, the murderously megalomaniacal computer that both enabled and endangered the mission. HAL seems even further from reality today than in 1968.

One of the books I look at this time quotes Berkeley computer science professor Robert Wilensky as saying that the social and psychological issues are so hard that computer scientists can only hope that they aren't on the critical path. I think this hope is wishful thinking, as HAL's continuing unreality suggests.

This time I look at books that confront the context, largely social, in which today's technological advances occur. Many technologies make great leaps forward at the surface level but never really take hold. Entrenched technologies have largely implicit and unnoticed support systems. Recognizing these systems and addressing the problems they solve is the way to achieve deep and lasting change.

The first book I look at asserts that tunnel vision leads pundits and producers alike to misjudge the impact and support needs of new technologies. It surveys the main areas in which infohype has gone badly wrong, and explains why.

The other books I look at describe methodologies for developing software-based products. The first methodology is Extreme Programming (XP), which I discussed in my review of Kent Beck's Extreme Programming Explained (Micro Review, Nov/Dec 1999). XP focuses heavily on the actual practice of programming and the realities of customer-developer communication. XP is a highly effective way for small teams to develop software incrementally.

The second methodology is Contextual Design (CD), which lies at the other end of the size spectrum. It is based on Karen Holtzblatt's Contextual Inquiry method, which places heavy emphasis on observation and interviews to find both explicit and unnoticed aspects of the customer's activity. CD assumes a stable, functioning customer process. It organizes the resources of a large development organization or corporate IT department to achieve an auditable design that relies on structured data about the customer's activity to resolve design questions.

XP is like planning and executing a drive to the store. CD is like planning and executing a voyage to Jupiter. Both methodologies, however, have ideas you can apply to projects of other sizes.
 

Social Context

The Social Life of Information by John Seely Brown & Paul Duguid (Harvard Business School Press, Boston MA, 2000, 332pp, ISBN 0-87584-762-5, www.hbsp.harvard.edu, $25.95)

John Seely Brown is head of the Xerox Palo Alto Research Center (PARC). Paul Duguid is a researcher at UC Berkeley. He specializes in social and cultural issues in education. Both draw heavily on their backgrounds for examples to support the analyses in this book.

Information is a convenient construct. It gives us insight into many aspects of modern technology. Like all models, however, it ignores many details. Such simplifications can lead to great advances -- Newton's equations for planetary orbits, for example. This only works when the ignored details are insignificant to the problem at hand. 

Unenlightened or unscrupulous futurists, business consultants, and product developers have applied the information construct too broadly:
  • Books are information containers.
  • Conversation is information interchange.
  • Learning is information absorption.
  • Organizations are information consolidators.
  • Office work is information handling.
  • Business processes are information flows.
These and many similar oversimplifications all contain grains of truth. Ideas based on them, however, have fallen far short of delivering their promised benefits. For example, the reengineering movement of the mid-1990s succeeded for well-defined processes like procurement or shipping. It failed for fuzzier, less infocentric tasks like insurance claim processing. Such tasks depend on more than just a formal process for moving disembodied information. They have social components: the mutual support and shared knowledge of specific human beings.

In case after case the authors return to the same theme: from product development to company reorganizations, innovations fail when they ignore or try to suppress the social support systems that made the pre-innovation situation work. And sometimes innovations succeed only because people find ways to sneak the support systems back into the picture. This leads to a valuable clue to making struggling systems succeed: pay attention to what won't budge. If it's important to the people using the system, include it in the system. Don't try to stamp it out. Reinforce it instead.

The authors cite an excellent example of this. Dispatchers of field support reps got feedback from the dispatched reps when the reps called in for their next assignments. After a while the dispatchers became skilled at analyzing customer problems and only sending the field reps when necessary. When the company adopted a new system that didn't require the reps to call in, new dispatchers were less effective -- except for one who happened to sit near an old-timer and managed to learn from her. By recognizing and fostering this support system, the company helped the new system to succeed.

The authors summarize their theories of education and learning. They have obviously thought a lot about this subject, but they decided not to elaborate in this book. Their presentation is concise, but thought provoking. The essence of it is to identify the core competences of a university. If these were simply variations on causing students to absorb information, universities would indeed be threatened by the proliferation of online courses. Instead, the authors identify the social factors that have made universities such enduring institutions. 

I especially like the fact that they identify misrepresentation as one of the core competences of a university. What they mean by this is that a degree from a good university bundles in a certain amount of experimentation on the student's part that probably could not stand on its own if unbundled and offered as evidence of the student's qualification for a specific job. It is a cross-subsidy of an essential part of becoming an educated person.

One of the most interesting parts of the book deals with the relationship between organizations and the collective knowledge of their employees. On the one hand, many organizations have difficulty accessing and using their knowledge. The perfect example of that problem is the difficulty Xerox had in applying the GUI technology that their employees at PARC developed. Xerox owned the knowledge, but it could not use it.

On the other hand, knowledge flows into and out of organizations like water -- regardless of intellectual property laws. Networks of practice link employees in different firms, especially in concentrated areas like Silicon Valley. The same example applies here: the folks at Apple had no difficulty understanding, and eventually exploiting, the Xerox PARC GUI work. Furthermore, because knowledge flowed more freely within Apple than it did within Xerox, Apple was able to bring the GUI to market.

The authors also discuss office work and the way some analysts ignore its social aspects. Anyone who has accessed a company network from off-site equipment -- in a home office, for example -- rediscovers the value of division of labor. Functioning as your own system administrator and IT department is an expensive and not very valuable exercise in reinventing the wheel. Also, incidental learning is harder to come by and shared knowledge harder to access from outside the traditional office site. By ignoring the invisible role of social systems, an infocentric view of office work fails to address and solve these problems.

The advertising agency Chiat/Day performed the ultimate experiment in stamping out the social aspects of office work. Employees had no permanent equipment or desk space. Instead they checked out communal equipment from a pool each morning, then sat wherever they could find space. Each night they returned the equipment to the pool and went home. This experiment, as you probably guessed, failed. The authors attribute the failure to Chiat/Day's management's failure to recognize and understand the value of the social aspects of office life.

The authors apply a similar analysis to agents. Viewing humans as goal-pursuing agents hides the importance of the social nature of learning, taste, choosing, brokering, and negotiating. HAL could handle all of these human foibles, which is precisely why HAL remains in the realm of fiction.

Finally, I love the analysis of printed documents and the social importance of their fixity. Among other things, the authors say:

"Efficient communication depends not on how much can be said but on how much can be left unsaid -- and even unread -- in the background. And a certain amount of fixity, both in material documents and in social conventions of interpretation, contributes a great deal to this sort of efficiency."
This book is full of common sense. It deserves to become a strong and beneficial influence on the way we think about designing software and processes.


Extreme Context

Planning Extreme Programming by Kent Beck (Addison Wesley, Boston MA, 2000, 158pp, ISBN 0-201-71091-9, www.awl.com, $29.95)

Extreme Programming Installed by Ron Jeffries, Ann Anderson & Chet Hendrickson (Addison Wesley, Boston MA, 2000, 286pp, ISBN 0-201-70842-6, www.awl.com, $29.95)

The XP community has flourished since Kent Beck's first book on the subject appeared a little over a year ago. Addison Wesley labels the two new books as part of The XP Series, and as Kent Beck points out in his foreword to Extreme Programming Installed, the fact that somebody else wrote it bodes well for the future of the discipline.

Extreme Programming Explained is full of good ideas, but very concise, and Beck's new book is the same way. Extreme Programming Installed gives a more detailed and patiently elaborated view of how to do XP.

Because I discussed XP in my Nov/Dec 1999 column, I don't go into much detail about the basics here. The key tie to the theme of this column is the fact that, as Beck says, XP practices depend on human creativity and accept human frailty. They integrate the social support and informal communication that more mechanical methodologies might ignore or try to suppress. They use index cards and a few simple wall charts rather than lengthy requirements documents, design specifications, and project tracking software.

The XP books include little actual code. Occasional examples in Smalltalk, even as simple as they are, can put off many programmers, so I'm glad that Extreme Programming Installed contains two detailed examples of how to apply XP principles in the Java environment. I think all programmers -- whether they wish to adopt XP or not -- should read the chapters "XPer Tries Java" and "A Java Perspective." These chapters convey a sense of the importance XP assigns to development tools, and they give a remarkably clear explanation of how to let testing drive design.
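To give a flavor of what "letting testing drive design" means in practice, here is a hypothetical sketch of my own (not an example from the book): write a small failing test first, letting it define the interface you want, then write just enough code to make it pass. It uses a plain assertion helper rather than a framework like JUnit.

```java
// Step 1: the test comes first and defines the interface we wish existed.
public class RomanTest {
    static void check(boolean condition) {
        if (!condition) throw new AssertionError("test failed");
    }

    static void testRoman() {
        check(Roman.from(1).equals("I"));
        check(Roman.from(4).equals("IV"));
        check(Roman.from(9).equals("IX"));
        check(Roman.from(1987).equals("MCMLXXXVII"));
    }

    public static void main(String[] args) {
        testRoman();
        System.out.println("all tests pass");
    }
}

// Step 2: the simplest implementation that makes the test pass.
class Roman {
    private static final int[] VALUES =
        {1000, 900, 500, 400, 100, 90, 50, 40, 10, 9, 5, 4, 1};
    private static final String[] NUMERALS =
        {"M", "CM", "D", "CD", "C", "XC", "L", "XL", "X", "IX", "V", "IV", "I"};

    static String from(int n) {
        StringBuilder out = new StringBuilder();
        for (int i = 0; i < VALUES.length; i++) {
            while (n >= VALUES[i]) {   // greedily take the largest numeral
                out.append(NUMERALS[i]);
                n -= VALUES[i];
            }
        }
        return out.toString();
    }
}
```

The point of the exercise is the order of events: the test existed before Roman did, so the class's public interface was designed from the caller's point of view rather than the implementer's.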

XP is simple and powerful. Get these books and read all about it. Then find a group that feels the same way, hook into the XP community, and put XP to work.


Heavyweight Context

Contextual Design by Hugh Beyer and Karen Holtzblatt (Morgan-Kaufmann, San Francisco CA, 1998, 496pp, ISBN 1-55860-411-1, www.mkp.com, $44.95)

This admirably clear book patiently and thoroughly describes a methodology for gathering and using the data necessary for true customer-centered design. By contrast with XP, which adds a customer to the design team, CD starts from the premise that customers can't represent themselves in the design process. When they are outside their usual work environments, they can't explain what they do or what really matters to them. Furthermore, they don't understand the details of developing the software for their own areas, let alone the other issues that developers have to keep straight.

Whereas in XP the customer speaks with a single voice, CD assumes that there are many customers and that nobody in the customer organization can represent them. The only way to produce a system that supports all of their needs is to understand all of their needs.

For these reasons, CD begins with a research project: find out everything about how each user does his or her work. An interviewer goes to the customer's work site and for several hours acts as both an apprentice to the interviewee and the interviewee's partner in documenting the work practice. The interviewer compiles and interprets the information, then goes over it with the interviewee until they reach a common understanding.

Interviewers repeat this process for each class of worker until they have a complete set of data. They then produce narrative descriptions and graphical representations of a variety of models and arrangements of the data.

A key to the process is ensuring that the developers fully understand and assimilate the results of the customer study. This becomes the data that developers turn to when they have questions or disagreements.

This summary hardly does justice to the entire CD methodology, but the book covers every aspect of it thoroughly. Following a pattern that suggests that they applied their methodology to their own process, the authors make it easy for you to survey CD at the executive summary level or to delve into the details of areas that interest you. At every level the writing is clear and the concepts are easy to understand.

This book is not for everyone. If you are an IT manager or the head of a large consulting organization, you should assign someone to investigate whether CD can help your organization understand and serve your customers.

Friday, October 27, 2000

The Inmates Are Running the Asylum

This article appears in slightly different form in the September/October 2000 issue of IEEE Micro © 2000 IEEE.

Interaction Design


This time I look at a thin book by a certified software visionary who wants to restore sanity to the way we develop computer-based products.

The Inmates Are Running the Asylum -- Why High-Tech Products Drive Us Crazy and How to Restore the Sanity by Alan Cooper (Sams, Indianapolis, 1999, 278pp, ISBN 0672316498, www.samspublishing.com, $25.00)

The jacket of my copy of this book identifies Alan Cooper as the father of Visual Basic and recipient of Microsoft's "Windows Pioneer" and "Software Visionary" awards. What impresses me more than these awards, however, is Cooper's story of how he became the father of Visual Basic. 

In 1988 Cooper made a deal with Bill Gates to develop a programming language that combined Microsoft's QuickBasic with Cooper's Ruby. Cooper's prototype of Ruby had impressed Bill Gates, but after they signed the deal, Cooper threw away the prototype and started from scratch. He kept "nothing but the wisdom and experience," a decision that shocked and outraged the product manager he worked with at Microsoft. To illustrate the results of this decision, he contrasts today's Visual Basic, a highly successful product, with Microsoft Windows, which is "notoriously handicapped [by] its profuse quantities of vestigial prototype code."

This story shows the depth of Cooper's commitment to investing the time to build a solid foundation at the beginning of a project. It also gives a hint of the sometimes strident tone that makes parts of his book sound a little sophomoric. Cooper has seen and accepted a few simple basic truths, and he often shows his frustration with the programmers and project managers -- the vast majority -- who seem neither to see nor to accept those truths.

Cooper's basic thesis begins with the fact that business executives are not in control of the high-tech industry. This is not a new idea. Reading this assertion sent me back to my copy of The New Industrial State by John Kenneth Galbraith (Houghton Mifflin, 1967). When that book came out, I was designing and programming turnkey systems -- computer based systems that bundle the hardware and software necessary to implement specific applications. We designed and built systems for clinical laboratory specimen handling, data acquisition, and reporting. My boss recommended Galbraith's book because Galbraith understood how decision making in technology-based businesses was passing from the executives at the top to the teams at the bottom who do the work.  

Now, a third of a century later, Cooper believes programmers and engineers are firmly in charge. According to Cooper, this situation leads to products and processes that waste huge amounts of money, squander customer loyalty, and erode competitive advantage. This is what Cooper means when he says the inmates are running the asylum. 

Computers are hard to use, and everyday objects are becoming more like computers. Cooper begins his book by asking riddles: What do you get when you cross a computer with an airplane, a camera, an alarm clock, a car, a bank, a warship? In each case, the answer is "a computer." That is, the same opaque, unforgiving interfaces that plague computer users are displacing the much more humane interfaces that many devices once had. Cooper has made it his personal mission to reverse this trend.

In 1992 Cooper gave up programming to devote himself to making products easier to use. The first fruit of this effort is his highly regarded book, About Face: The Essentials of User Interface Design (IDG, 1995). If you have anything to do with producing software for people to use, you should read About Face.

While the design principles in About Face are right, Cooper now believes the audience is wrong. The book starts from the premise that programmers, by default, do most user interface design. As a result, Cooper talks primarily to them, trying to help them design good interfaces. Unfortunately, however, even though programmers would like to design good interfaces, they don't have the time, and often don't have the skills to do so. Moreover, programmers face an overwhelming conflict of interest. 

Programming is hard. For as long as I can remember, competitive demands have forced programmers to push their systems and their tools beyond their limits. In order to survive, programmers must simplify the development process in every way they can. Making programs easy to use can conflict with making development easy. When programmers must choose between a hard-to-develop interface that mirrors the user's view of the task and an easy-to-develop interface that mirrors the program's inner workings, they invariably choose the latter. Programmers understand the interfaces that result, so they don't see why those interfaces confuse and frustrate users.

The facts that programmers are in control and that programmers can't design good interfaces lead Cooper to his main point: we must throw out the current development model and replace it with one that requires us to design the product before we build it.

This bald statement is a slap in the face to the multitude of software firms that have well documented product development processes. Such processes invariably start with a written document that makes the marketing case for the product or feature. The programmers then prepare, and circulate for approval, written documents that say what the proposed product does. Next they prepare detailed specifications of the product's inner workings. The programmers produce all of these documents before they write a single line of code.

Cooper points out that even when companies follow such a process to the letter, they rarely analyze user goals, and they never specify the interactions that the program must provide to satisfy those goals efficiently. Neither the marketing needs statement nor the functional specification identifies users clearly enough to enable programmers to make decisions as they go along. Instead, programmers try to put themselves into the user's shoes. But without clear mental pictures of actual users, they create target users who look and act like programmers. They create interactions that make sense only to other programmers.

Because he started his career as a programmer and because he has worked with many development teams as an outside consultant, Cooper knows what he is talking about here. He offers many insights into how current development practices lead to bad design and how the environment in most high-tech companies works to keep things as they are. He peppers his work with colorful phrases like cognitive friction, apologists vs survivors, the dancing bear syndrome, Homo Logicus, and conceptual integrity is a core competency. They mean little out of context, but they make the concepts easier to grasp and remember.

Cooper wants to create a new field, interaction design. He wants interaction designers to design user interactions before programmers get anywhere near the project. And he wants programmers to regard these designs the way they regard the target hardware -- as constraints, not as suggestions from which they can pick and choose.

Having envisioned the central role of interaction design, Cooper wanted to write a book about how to do it. He quickly saw, though, that there is a bigger problem: overcoming the huge inertia of the current system. He wrote this book, therefore, to make the business case for a product design methodology that starts from interaction design. Presumably he will write a how-to book about interaction design next. I look forward to reading it.

The business case for interaction design is simple. Cooper tells executives:
  • Your products are too hard to use.
  • This costs you money, because customers buy only what they absolutely need, and unhappy customers are not loyal customers.
  • There is a better way to do it (here Cooper gives a glimpse of the book he hopes to write).
  • You can make it happen if you take programmers off the hook for product quality and put that burden on the interaction design team.
While this is the business case book, not the how-to book, Cooper tantalizes us with interesting ideas and marvelous case studies. User profiles called personas are an important part of his methodology. He gives examples of actual products and shows how personas helped him simplify and focus the way people interact with those products.

There is a great deal of insight and good advice in this thin book. I strongly recommend it to anybody who has anything to do with designing computer-based products. If you're an executive in a high-tech firm, it can help you take control of a central part of your business. If you're a programmer, read it and take it to heart. It can help you understand and thrive in the coming design revolution.



Thursday, August 17, 2000

Summer Cleanup

This article appears in slightly different form in the July/August 2000 issue of IEEE Micro © 2000 IEEE.

I try to write columns around coherent themes. After a while, the number of interesting items that just didn't fit into recent columns grows to the point where I need to try to make a theme out of no theme.

This month I talk about a cautionary article, algorithms, Windows, Linux, programming utilities, and an authoring tool.


Bill Joy's Warning

In the April 2000 issue of Wired Magazine, Bill Joy, inventor of the vi editor and a whole lot more since then, calls on his fellow technologists to consider the ethical implications and possible dangers of their work.

New technology has often had bad side effects. The early nineteenth century Luddites fought the Industrial Revolution, because it did so much immediate harm. The Luddites lost, but the side effects of the Industrial Revolution were fortunately not fatal to the society at large. In time, society sorted out the problems, and the benefits soon outweighed them.

Humanity, so far, has muddled through the problems of technological change. But technology is becoming more powerful, more potentially destructive, and more widely accessible. The chance that humanity will survive its side effects, accidental consequences, or malicious misuse, drops from the near certainty of the past to a lower, more frightening, level.

Joy begins by quoting at length a plausible but exaggerated and slightly hysterical passage from the Unabomber Manifesto of technologist-turned-murderer Ted Kaczynski. Kaczynski makes good points, but I can easily see the flaws. I don't worry that we will all become slaves of robots.

Joy's approach, by contrast, is quiet, understated, well reasoned. He looks at three technologies in which our progress is quickly outpacing our ability to control them: genetics, nanotechnology, and robotics. He calls attention to some of the possible consequences of these technologies -- plague, intelligent germ warfare, out-of-control self-replicating robots, and many others. Some of the consequences seem highly unlikely, others very likely. Many are potentially catastrophic, perhaps fatal to humanity. 

Then Joy says, "OK, what do you think about it?" That's his whole point: think about it.

Most people react to an article like this by worrying about it for a little while, then putting it out of their minds. We all have more immediate problems. We want somebody else to figure it out. We've always muddled through.
 
Joy is right. There are real dangers and not enough public concern. He has taken these matters to scientific and other associations, but without much success. I agree with him that we need to consider the ethical implications of what we do. But even more than that, whether we're personally involved in these technologies or not, we must all learn more and think more about how to preserve, protect, and defend ourselves, our fellow creatures, and our planet.

I have heard Joy say in interviews that he will soon establish a means for people concerned about these issues to communicate and to become involved. Watch for it.


Algorithms

The Advent of the Algorithm--The Idea That Rules the World by David Berlinski (Harcourt, NY, 2000, 366pp, ISBN 0-15-100338-6, $28.00)

Math historians see David Berlinski's work the way political historians see Edmund Morris's memoir of Ronald Reagan, Dutch (Random House, 1999). They respect it as a work of art, but as history they see it as a lost opportunity.

Berlinski moves easily between fact and fiction, between scientific exposition and impressionistic evocations of the time, place, and human context of that science. He sketches a set of mathematical and philosophical trends and builds them into a coherent theme. The theme may be oversimplified, but much of it rings true.

An algorithm, according to Berlinski, is
  • a finite procedure,
  • written in a fixed symbolic vocabulary,
  • governed by precise instructions,
  • moving in discrete steps, 1, 2, 3, . . .,
  • whose execution requires no insight, cleverness, intuition, intelligence, or perspicacity,
  • and that sooner or later comes to an end.
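Every clause of that definition is on display in the oldest algorithm still in everyday use, Euclid's procedure for the greatest common divisor. Here is a minimal sketch in Python (my illustration, not Berlinski's):

```python
def gcd(a, b):
    # A finite procedure in a fixed symbolic vocabulary, governed by
    # precise instructions and moving in discrete steps. No insight or
    # cleverness is required, and it must come to an end: b strictly
    # decreases on every pass, so the loop terminates.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # prints 21
```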
We all know this, but Berlinski contrasts it with the calculus (the subject of a companion book). Calculus is the basis of mathematical physics. Its continuous variables correspond directly to the real world. Its laws need no further interpretation. Physics, according to Berlinski, seeks to trace everything to final laws, which physicists discover but do not create, and historical accidents. Algorithms, on the other hand, embody intention. They manipulate symbols, which someone must then interpret, in a sequence of steps that represent a design and a purpose.

If this is so, then where does the interpretation of the genetic code fit in? Where is the design and purpose? These are the kinds of questions Berlinski attacks after he finishes his basic narrative.

In Berlinski's view, the basic history of the algorithm (like a two-minute summary of Hamlet) is as follows. 
Leibniz devised the idea of reducing thinking to manipulating symbols for a finite library of discrete concepts. Later, Peano established axioms for arithmetic, Frege incorporated them into the predicate calculus, and Gödel, in proving his incompleteness theorem, defined the class of primitive recursive functions. Shortly afterward, Church invented the lambda calculus, Turing invented his machine, and all three formulations, as well as a similar one from Post, turned out to be equivalent.
Berlinski goes into somewhat greater detail than this, and if you read the book, you can get a good sense of the essence of all of this work. Unfortunately, the book suffers from careless editing and proofreading, which can make it harder to understand such terse formulations as Church's lambda calculus.

Berlinski intersperses little stories within the narrative. They are imaginary dialogs, sometimes involving him, or brief stories that resonate in some way with the text. At one point Berlinski talks about complexity by comparing Jackson Pollock and Andy Warhol. A Pollock painting is complex because the only way to describe it is to point to it. 

In some ways this book is simple, but in other ways it's complex. You just have to look at it for yourself. If you're interested in computers and modern thought, I think you'll find the book worth reading.


The Firewall

Earlier this year (Micro Review, March/April 2000) I wrote about Windows 2000 Server and my attempts to use a personal firewall from Zonelabs.com of San Francisco. I don't know where the fault lies, because I often install software that's not quite ready for prime time, but after a while my Windows 2000 system became completely unstable, so I reinstalled from the CD that Microsoft had sent me. In the interests of stability, I did not reinstall the personal firewall, because it was not certified to run on Windows 2000 Server.

My new configuration is stable, but it needs some sort of security, so I decided to set up a Linux firewall on a separate machine. Another benefit of the Linux machine, I reasoned, is that it can allow me to route Internet traffic for all of the machines on my network, even when one of them is connected to a corporate intranet via virtual private networking (VPN). I had this capability on the Windows system before I reinstalled Windows 2000 Server, but I have not been able to make it work since the reinstallation. Despite the fact that my old and new configurations both identify themselves as Build 2195, the reinstalled version seems to handle VPN and routing completely differently from the way the original version did.

In recent years I have worked almost exclusively with Windows systems, so I decided to ask a friend for help setting up Linux. It went smoothly enough, but it was like a blast from the past. My conscious mind cannot tell you how to operate the vi editor, but my fingers, perversely, deftly execute the arcane commands for inserting text, replacing characters, and so forth.

Quickly, the great benefits and the evident drawbacks of Linux sprang into view. I found a surplus machine (Pentium, 166MHz) that is more than adequate for Linux. To use it as a firewall, I installed a second Ethernet card (an old 10Mbit one) to connect to the digital subscriber line (DSL) modem. Setting the jumpers for IRQ and memory-mapped I/O was straightforward. A Linux command produced a list of devices and the resources they use. IRQ 10 and memory address range 300-31F (hex) showed up as unused, so we assigned them to the Ethernet card.

This all went smoothly, but then Linux began to crash horribly. After a while we determined that the crash occurred every time we touched the mouse. Carefully avoiding the mouse, we prowled around the Linux sources and discovered that the mouse driver uses the same I/O addresses we assigned to the Ethernet card. The mouse driver, probably because it predates the facility, fails to register its IRQ and I/O addresses with the Linux command that produces the list of devices and resources. We moved the Ethernet card's I/O address block to 320 (hex), and now everything works.
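The collision we stumbled into can be checked mechanically once the resource list is trustworthy. Here is a hedged Python sketch of that check (the table entries and addresses are illustrative, not my machine's actual listing); as we learned the hard way, a driver that never registers its resources simply won't appear in such a table:

```python
def overlaps(range_a, range_b):
    """True if two (start, end) address ranges intersect."""
    return range_a[0] <= range_b[1] and range_b[0] <= range_a[1]

def conflicts(proposed, registered):
    """Return the names of registered entries that collide with a proposed range."""
    return [name for (start, end, name) in registered
            if overlaps(proposed, (start, end))]

# Illustrative /proc/ioports-style entries: (hex start, hex end, owner).
# The mouse driver's entry is shown here, but in reality it was never
# registered -- which is exactly why the conflict went undetected.
registered = [
    (0x060, 0x06F, "keyboard"),
    (0x300, 0x31F, "mouse driver"),
    (0x3F8, 0x3FF, "serial"),
]

print(conflicts((0x300, 0x31F), registered))  # the original, clashing choice
print(conflicts((0x320, 0x33F), registered))  # the fix: no collisions
```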

Because we could look at the sources, we were able to correct the problem. Because nobody is responsible, there is no easy way to prevent other people from having the same problem. The fundamental benefits and drawbacks of Linux, side by side.

With the second Ethernet card working properly, it was easy to set up IP masquerading, the facility by which the firewall machine routes Internet traffic for the local network. The next steps, still to come, are
  • Configuring the firewall for the right balance between security and convenience.
  • Enabling the firewall to route VPN traffic.
In future columns I'll report on those projects. [Postscript (November 2008): I never got these features to work, but I finally managed to get the Windows networking to work.]

In the course of working on the Linux and Windows systems described above, I was extremely pleased at the ease and reliability with which I was able to manipulate disk partitions using PartitionMagic from PowerQuest (www.powerquest.com). I'm not elaborating on the details, because I know they have released another version since they sent me my copy, but if you need to add, remove, resize, or perform a variety of other operations on disk partitions, you should take a look at PartitionMagic.


Programming Utilities

I have reported on the Visual SlickEdit editor and the MKS Toolkit in past columns. Both of these programming tools are highly regarded throughout the industry. Visual SlickEdit is a cross-platform programmer's editor. MKS Toolkit makes Unix commands and tools available in the Windows environment. Both are recently out in new versions.

Visual SlickEdit, version 5, MicroEdge, Apex, NC, www.slickedit.com, $275.00.

Visual SlickEdit for Linux is the first program I installed on my Linux system. It wasn't difficult, but it was a lot trickier than a Windows installation. It was wonderful to see its familiar screen in that programmer-friendly but user-hostile environment.

I've praised Visual SlickEdit highly in the past, but the new version is even more impressive. I don't do enough programming to use all of its features, but I recently used some of its new features when I had to publish printed versions of a set of Java classes. Java sources in electronic form tend to use vertical space lavishly, which can spread them out over several pages, making them hard to read. Reformatting them by hand can lead to errors, so I used the Visual SlickEdit code beautifier to do most of the job for me. The beautifier gives you a choice of styles, and, as with everything else in Visual SlickEdit, if you don't like the available choices, you can write your own in MicroEdge's macro language, Slick C.

While testing my new versions of the code, I realized that I must not have started from the latest version of the sources. I had to compare the old and new versions to see what I would have to change. Visual SlickEdit provides an amazing tool for doing this. It's called DIFFzilla, and while it's a little hard to figure out what the controls do, the side-by-side views it presents are awe-inspiring in their clarity and accuracy.

I didn't need to merge changes, but Visual SlickEdit does that even more smoothly now than it did in earlier versions. It also handles braces (curly brackets) more intelligently than it used to.

Previous versions of Visual SlickEdit had context tagging, a feature that allows the editor to give you language-specific help while you're entering a program. The current version improves on this with support for HTML, Cobol, Ada, and Delphi. It also provides support for displaying and editing Javadoc comments, the Java facility for embedding documentation in source programs.

If you're a current Visual SlickEdit user, the new features are well worth the upgrade price. If you don't have a powerful programmer's editor, rush out and get Visual SlickEdit. It will do wonders for your productivity.


MKS Toolkit, Mortice Kern Systems, Waterloo, Ontario, www.mks.com, $400 - $2500.

Since I first reviewed this product (in 1992), it has matured and grown, and its marketing has become much more sophisticated. Its basic focus, however, remains the same: give programmers working in a Microsoft environment the same powerful programming tools that Unix programmers take for granted.

The product now comes in a variety of versions -- from the simplest package for a single developer to complete support for cross-platform, web-enabled enterprise development. The high end products are pricey, so be sure you know what you're buying, and why, if you take that route. I don't have enough experience with the high-end products to make a recommendation.

The basic developer tools, however, are essential productivity enhancers for any Windows programmer. If you're a current user, take a look at the new versions to see if they offer anything important that you're not getting now. If you're not a current user, you need to get MKS Toolkit and learn how to use it. It will pay off for you in the long run.


FrameMaker 6

FrameMaker is the leading tool, at least in and around Silicon Valley, for producing printed hardware and software manuals. FrameMaker was not originally an Adobe product; Adobe acquired it a few years ago, and for the last year or so FrameMaker users have worried that the company was letting it die. The recent appearance of FrameMaker 6, and its cousin FrameMaker Plus SGML 6, has quelled those worries for now.


FrameMaker 6 and FrameMaker Plus SGML 6, Adobe, San Jose, CA, www.adobe.com, $849 and $1449.

These new versions are not vastly different from the 5.5 versions. The biggest story is that they came out at all, given the doubts in the FrameMaker community about Adobe's commitment to the product.

While there are no big differences, there are a number of important improvements. The first is a greatly improved book mechanism. In the new version, the book file controls the structure more cleanly and closely, and many operations -- like search and replace -- that you had to perform one file at a time in the old version now work book-wide. The new version also supports the notion of a collection of books, and you can set volume, chapter, and page numbering in the book file. You can now specify a chapter's number as "continue from previous," so that you need not specify chapter numbers explicitly. This makes it easier to rearrange the chapters of a book or to share a chapter between two books.

FrameMaker 6 simplifies the process of including chapter numbers in table, figure, and page numbers. The chapnum variable replaces a complex scheme based on paragraph numbering.

Another problem with the old version was the inadequate mechanism for generating HTML from FrameMaker files. Rather than trying to improve this mechanism, Adobe has dropped it entirely and now bundles a version of Quadralay's WebWorks Publisher with FrameMaker. WebWorks Publisher handles this task extremely well.

The FrameMaker Plus SGML product is just like FrameMaker in most respects, but it facilitates generating SGML (or XML) documents. Rather than tagging with paragraph and character formats (you still can, if you like), you apply tags that show the element's place in the document structure.

If you are upgrading from FrameMaker 5.5 and you have any interest in generating XML documents, consider upgrading to FrameMaker Plus SGML 6. Adobe charges the same amount for an upgrade to FrameMaker Plus SGML 6, whether you are upgrading from FrameMaker 5.5 or FrameMaker Plus SGML 5.5.

FrameMaker is still the standard for large printed manuals, and with version 6, it looks like it will continue to be the standard for some time to come.

Sunday, June 25, 2000

The Humane Interface, The Deadline

This article appears in slightly different form in the May/June 2000 issue of IEEE Micro © 2000 IEEE.

Doing It Right

This time I look at two books that aim at making life easier for humans. The first tells how to make computer interfaces that humans can use. The second uses fiction to explore the ways humans can work together to produce good software.


The Humane Interface -- New Directions for Designing Interactive Systems by Jef Raskin (Addison-Wesley, Reading MA, 2000, 254pp, www.aw.com, ISBN 0-201-37937-6, $24.95)

[NOTE: Jef Raskin died shortly after this review appeared.] Jef Raskin's best known work is the Macintosh. That product brought about a revolution in computer interface design, but yesterday's revolutionary idea has become today's entrenched paradigm. Raskin has moved on. In this book he shows the flaws of the desktop-and-application approach and explains how it can evolve into something much easier for humans to use.

Raskin centers his ideal system on your content -- not named files in a hierarchy of directories, but just content. You can have files and directories, but only if you decide to mark their boundaries in an otherwise undifferentiated sea of content.

Rather than applications to process your content, Raskin gives you individual commands. Thus, rather than buying Photoshop and Word, you buy individual (or perhaps groups of) image processing commands from Adobe and text manipulation commands from Microsoft. You can use the text commands to add text to images, and you can use the image processing commands to manipulate the images in your printed documents.

This sounds like component software or Unix filters. It's not a radically new idea, but Raskin arrives at it from a different direction. He begins by asking how he can make interfaces that humans can learn easily and use efficiently. To answer that question, he looks at the equipment on both sides of that interface: the computer and the human.

Many people have studied human cognition, but few have applied what we know about the capabilities and limitations of humans to the problems of interface design. To do this, Raskin applies techniques and observations that the cognitive psychologist Bernard J. Baars discusses in his book A Cognitive Theory of Consciousness (Cambridge, 1988).

Raskin takes a fundamental principle from Baars' work: humans can accomplish many tasks in parallel, but can only pay attention to one at a time. We all know this, but many people design interfaces as if it weren't true. Raskin gives examples of this error -- taken from widely used software products.

The fact that we have at most one locus of attention, while most tasks we perform with computers require us to accomplish a variety of subtasks in parallel, leads to the principle of automaticity: the more we can do without thinking, the more efficient we are. Anything that makes us think about what we already know how to do slows us down. This principle leads to the following conclusions:
  • Interfaces should be modeless - the way to accomplish a task should be the same under all circumstances.
  • Interfaces that change in an attempt to adapt to your actions can actually slow you down.
Raskin elaborates on these points with many examples. Some of the examples are surprising. They show the inefficiency of widely practiced interface design techniques.

Raskin turns a lot of attention to the problems of navigation. He likens current navigation methods in applications, operating systems, and the web to trying to find your way around a maze with only the ground-level view of where you are and where you've been. He proposes a two-pronged approach to improving this situation: a zooming video camera metaphor for finding your way around a pictorial representation of your content, and a text-search facility that differs sharply from the most commonly used search facilities.

Raskin also applies quantitative methods to interface design. He uses the goals, objects, methods, and selection rules (GOMS) technique developed by Stuart Card, Thomas Moran, and Allen Newell to measure the relative efficiencies of alternative interfaces.
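To give a flavor of the approach, here is a rough sketch of the keystroke-level model, a simplified relative of GOMS, using the approximate operator times that Card, Moran, and Newell published. The task being compared is my own hypothetical example, not one of Raskin's:

```python
# Keystroke-level model operators with their rough, commonly cited times
# (seconds): K = keystroke or click, P = point with mouse, H = home hands
# between keyboard and mouse, M = mental preparation.
TIMES = {"K": 0.2, "P": 1.1, "H": 0.4, "M": 1.35}

def klm_seconds(ops):
    """Estimate task time from a string of KLM operators."""
    return sum(TIMES[op] for op in ops)

# Hypothetical comparison: deleting an item via a menu (think, reach for
# the mouse, point and click twice) versus a single keyboard shortcut.
menu     = "MHPKPK"
shortcut = "MK"
print(round(klm_seconds(menu), 2))      # prints 4.35
print(round(klm_seconds(shortcut), 2))  # prints 1.55
```

Even this crude arithmetic makes the efficiency argument concrete: the interface that looks friendlier can cost nearly three times as long per operation.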

I haven't come near to covering all of the topics Raskin addresses in this marvelous book -- icons, programming environments, documentation, the number of buttons on a mouse, and even cables and connectors.

If you have anything to do with designing any aspect of computer systems for use by humans, you should read this book. People will be talking about it for a long time.


The Deadline -- A Novel about Project Management by Tom DeMarco (Dorset House, New York NY, 1997, 320pp, www.dorsethouse.com, ISBN 0-932633-39-0, $24.95)

In 1964 I knew assembly language for the IBM 650, and I had recently learned Fortran II.  I sat down with the IBM Cobol manuals to see what that language was all about.  It took me very little time to decide that it was not a language I wanted to program in.  I don't regret this decision, but it shut me out of the world in which Tom DeMarco was later to flourish.

Fifteen years and many languages later, I wound up (briefly) advising a large California bank, and the book that bridged the gap between my way of thinking and theirs was Tom DeMarco's Structured Analysis and System Specification (Prentice Hall, 1979).  I still remember how exciting it was to use DeMarco's methods to produce diagrams that represented the elements of the bank's proposed application.  The bank, however, was less concerned with the quality of the information than with the fact that my "work product" was drawn in pencil on D-size graph paper.  We parted company.  They later went out of business.  

In 1979, DeMarco was insightful.  Today he is wise.  Then he was concerned with how software modules work together.  Today he focuses on how people work together.  His book is fiction, but it teaches many real lessons about project management and team building.  

Here is the essence of the plot.  Mr. Tompkins, a middle-aged middle manager, is laid off from a thinly disguised AT&T, kidnapped by a beautiful and resourceful industrial spy, and spirited away to Moravia, a post-Communist third world country somewhere on the Adriatic coast.  A thinly disguised Bill Gates has acquired this country secretly in a stock swap and has decided to help it dominate the shrink-wrap software business by producing knockoffs of Quicken, PhotoShop, Quark XPress, PageMill, Painter, and Lotus Notes, and giving them away.  Tompkins takes the job of managing this development.  Because Moravia has far more programmers than required to develop these six products, Tompkins sets up three parallel projects for each product, turning the whole operation into a project management laboratory.  He gets carte blanche from Bill, and everything seems to be going smoothly.

Just as Tompkins is beginning to feel complacent, Bill returns to the States to work on his house, leaving the sinister bean counter Allair Belok in charge.  Belok embodies every stupid, unscrupulous, bullying executive you've ever worked for.  Sadly, his tactics seem true to life.  They certainly rang a few bells for me.  Tompkins must face arbitrarily shortened schedules, merging of his parallel projects, and forced overtime.  On top of this, he must contend with the well-meaning purveyors of process improvement, whom Belok sics on him with even more unreasonable goals.

Tompkins is not alone in his struggles, however.  Belinda Binda, the world's greatest project manager, burnt out and now a bag lady, agrees to help Tompkins staff his projects.  So does ex-general Markov, former head of software development for the Moravian army.  Lahksa, the beautiful, resourceful spy, runs around the world sending Tompkins consultants for brief visits.  The reclusive Aristotle Keneros, Moravia's first programmer, helps to divert the process improvement folks and teaches Tompkins the importance of debugging the decomposition and interfaces during the design phase.

That's it for the plot.  DeMarco patterned Mr. Tompkins after George Gamow's character of the same name.  Gamow had a wonderful ability to explain physics and mathematics with stories.  In One Two Three . . . Infinity, one of my favorite books when I was in high school, he explains complex numbers with a story about buried treasure.  Gamow's Mr. Tompkins is a bank clerk who goes to lectures on science, falls asleep, and has wonderful dreams that make the concepts clear.

DeMarco has followed that tradition admirably.  His chapters are little vignettes of project management.  A problem arises, a consultant shows up to help solve the problem, and Tompkins adds a few aphorisms to his journal.  According to DeMarco, most of these aphorisms come from his own journal and represent lessons he learned the hard way.

Here are a few of the aphorisms that I especially like: 
  • Four Essentials of Good Management: Get the right people, match them to the right jobs, keep them motivated, and help their teams to jell and stay jelled. (All the rest is administrivia.)
  • There are infinitely many ways to lose a day  . . . but not even one way to get one back.
  • People under pressure don't think any faster.
  • It's not what you don't know that kills you  . . . it's what you know that isn't so.
The last one is an old proverb, but DeMarco applies it to one of the sacred cows of software development, code inspection.  

I've discussed the more general aspects of DeMarco's book here, but parts of it get pretty technical -- though rarely enough to bog down the story.  DeMarco believes in metrics and modeling as project management tools, and several of his vignettes show surprising ways to use those tools.  

At the end of the book, Tompkins gives away his journal, saying "I can never imagine opening it again.  I don't need to.  I carry those hundred and one principles everywhere I go.  They're carved into my hide."  The book is a crash course in project management and team building.  If you do any sort of technical development, you should read it and absorb it.

Thursday, April 27, 2000

Windows 2000

This article appears in slightly different form in the March/April 2000 issue of IEEE Micro © 2000 IEEE.

On February 17, 2000, I attended Bill Gates' kickoff of Windows 2000. I've been using beta versions for a year, but now it's official, so I can talk about it.

The kickoff was an extravagant production. Television actors provided glamour as they struggled through ghastly scripts. The captain of the starship Enterprise, for example, came on stage to complain when Gates used the word enterprise to describe his new product's target applications. The great rock guitarist Carlos Santana and his band closed the proceedings, while an army of reporters and publicists fiddled with their laptops and cell phones and pretended to enjoy the music.

Unlike Santana, the kickoff show may never win a Grammy, but the demonstrations of features and performance inspired awe among the attendees. Gates trotted out benchmarks that put Windows 2000 price/performance ahead of all of Microsoft's competitors -- and well beyond that of prior Microsoft systems. Even more impressive were the demonstrations of dynamic reconfiguration and load balancing in multiprocessor clusters. An operator at a console put machines into and out of service by dragging and dropping icons, and processor usage gauges immediately reflected the automatic rebalancing. Other demos showed how easily a user with a laptop can synchronize with a server and how effectively an administrator can control and allocate resources with Active Directory (see below).

The most impressive measurements that Gates announced were of how infrequently the systems crash. Windows 2000 seems, at least in these tests, to be much more reliable and fault tolerant than its predecessors. My own experience (see below) hasn't been nearly as good as the studies Gates reported, but I test lots of software, and I reconfigure often -- all without the benefit of a trained system administrator.


The Operating System

Windows 2000 is a family of operating systems, all targeted at business users. It does not, as you might have thought from the name, replace Windows 98. It is instead an evolution of Windows NT 4. In fact, the earliest beta versions I received were called Windows NT 5.

The bottom of the line is Windows 2000 Professional, which is aimed at business desktops and laptops and at high-end workstations. There is nothing to prevent you from using this product at home, of course, but in seeking to improve security, Microsoft has removed hooks into the hardware that many video games depend on. Drivers for many older devices have also had difficulty migrating to Windows 2000, which is another obstacle to its home use.

The next member of the family is Windows 2000 Server. It's not very different from Professional, and many users will prefer it. Microsoft intends it for use as a file, printer, communications, or web server.

For high-end server applications, Microsoft provides Windows 2000 Advanced Server. This version supports huge memories and symmetric multiprocessor (SMP) configurations. It supports clustering and rolling upgrades. You might use a cluster of Advanced Server machines to support a high-traffic website.

The fourth member of the family is Windows 2000 Datacenter Server. When Microsoft releases it later this year, it will do what the other family members do, but it will support larger memories and more multiprocessors.

I quickly decided that I needed Windows 2000 Server for the tasks I want to run, so that's the only version I have direct experience with. I'm happy with it, because its user interface is much more like that of Windows 98 than that of Windows NT 4, and it's easier to set up unusual network configurations. I connect my Windows 2000 Server machine to the Internet via a digital subscriber line (DSL). I have a local Ethernet, and I use the Windows 2000 Server as a gateway to give the other machines Internet access. At the same time I connect the Windows 2000 Server as a client to an enterprise intranet via virtual private networking (VPN). This configuration may have been possible with Windows NT 4, but try as I might, I could never make it work.

I won't run through all of the features of Windows 2000. You can find a summary on the Microsoft website. If you use Windows NT 4, or even Windows 98 with standard business software and devices, you'll find Windows 2000 a significantly more capable, usable and reliable product.


Books

Windows 2000 will spawn a huge supply of third party books, and many have appeared already. I look at four good ones.

Active Directory for Dummies by Marcia Loughry (IDG, www.idgbooks.com, 2000, 402pp plus CD, ISBN 0-7645-0659-5, $24.99)

Windows 2000 Registry for Dummies by Glenn Weadock (IDG, www.idgbooks.com, 2000, 378pp plus CD, ISBN 0-7645-0489-4, $24.99)

It may seem incongruous to talk about anything as complex as Windows 2000 as being for dummies, but these two books, and the one that I discuss under security (below), adhere to the same formula that has made the dummies series such a runaway success. The authors know their audiences, and they talk to them as intelligent people who are just beginning to learn the subjects. While they must assume a certain degree of sophistication and general background, the authors explain everything about the topics of the books.

In addition to targeting their audiences accurately and not taking anything for granted, the dummies books enhance communication in other widely known but less widely used ways. Their page layouts, font selection, and clear illustrations draw readers in. The icons for warnings, tips, technical details, and other aspects of the text are consistent from book to book, so the more dummies books you read, the easier it is to find the information you're looking for. The informal and slightly humorous tone of the books helps establish a rapport between author and reader, despite the highly formulaic structure.

Another great strength of the dummies books is that they are task oriented. They identify the tasks you're likely to wish to perform, and they show you how to perform them. Yet the authors don't restrict themselves to a cookbook approach. With each task they give you the background to understand what you're doing and why you're doing it. This technique is sometimes called just in time learning, and studies show that it is an effective way to learn.

Blatantly stealing Apple's famous catch phrase, the dummies books proclaim themselves "a reference for the rest of us." And in fact, in addition to their tutorial elements, they contain many aspects of good reference works -- starting in every case with an excellent index. The Active Directory book also has several helpful appendixes.

The registry book contains information every Windows 2000 user should know about. The registry is an evolution of the registries of Windows NT 4 and Windows 98. If those were always a mystery to you, read this book now.

The Active Directory book may be more interesting to administrators than to average users. Active Directory centralizes resource allocation and control. It is a database of configuration information, some of which would have been stored in the registry under Windows NT 4. The book leads you through the daunting task of setting up directory services for an enterprise. 

Given the important role Windows 2000 will play over the next few years, I recommend that you spend a few hours reading these books and getting the ideas straight now. That knowledge is bound to pay off as time goes by. 


Inside Windows 2000 Server by William Boswell (New Riders, www.newriders.com, 2000, 1496pp, ISBN 1-56205-929-7, $49.99)

Boswell's book is very different from the dummies books. It contains a great deal more detail, but you may have to dig harder to get it out. It is definitely more of a reference work than a tutorial. Although it contains many how-to procedures, it is not fundamentally task oriented.

The book has an attractive and readable layout, and its binding allows it to lie flat when open to almost any of its 1500 pages. It is comprehensive and clearly written. While I can't attest to the accuracy of everything in it, it does list three technical reviewers.

If you're a system administrator, you need this kind of reference book. This one seems like a good investment.

 
Security

I became a regular Internet user in the early 1980s, but I never worried much about security until recently. This was not because the threats weren't real. I remember reading a congressional report on databases and invasion of privacy more than 30 years ago, and the threats were scary then. And if you want to see how that aspect of the problem has developed, read Simson Garfinkel's new book Database Nation (O'Reilly, www.oreilly.com, 2000, ISBN 1-56592-653-6, $24.95).

While I think the kinds of threats that Garfinkel describes are more ominous, I'm concerned here with the kinds of threats you can reasonably do something about on your own Windows 2000 Server machine, namely, the threats of unauthorized access to your machine and to other machines on the networks your machine connects to.

Windows 2000 Server Security for Dummies by Paul Sanna (IDG, www.idgbooks.com, 2000, 378pp plus CD, ISBN 0-7645-0470-3, $24.99)

If you're a system administrator, this book will lead you through the basic steps you can take to protect your system without totally disconnecting it from the world. Like the dummies books I describe above, this book leads you through the tasks you need to accomplish, supplying you at each stage with the information you need to understand the procedure you're following.

If you're responsible for the security of a Windows 2000 Server system, reading this book is a really good idea.

I found another excellent resource for securing Windows-based systems (not necessarily Windows 2000). This is the website of Gibson Research of Laguna Hills, California (http://grc.com). When you go to this site and the first thing you see is your name on the screen, you know you have work to do. Steve Gibson leads you through some simple steps that will greatly reduce your vulnerability.

One of Gibson's recommendations is to use the ZoneAlarm 2.0 personal firewall, which you can obtain free from Zone Labs (zonelabs.com) of San Francisco, California. I downloaded this product and have been using it more or less successfully. Zone Labs does not certify the product for the configuration I'm running, so I don't blame them completely for the occasional blue screens of death that occur when the program fails to handle kernel-mode exceptions properly.

Using ZoneAlarm has been very revealing. It quite regularly reports probes from unauthorized IP addresses. I'm still tweaking the settings and learning to use it effectively, but I'm happy to have it running, despite the occasional crashes. It's certainly a product worth investigating, and you can't beat the price.