Saturday, December 20, 2003

Me++

This article appears in slightly different form in the November/December 2003 issue of IEEE Micro © 2003 IEEE.

Janus, the god after whom January is named, has two faces. One looks forward, the other back. Recently, I've looked at some of my old columns, and thought again about the topics they cover. The book I focus on this time has its antecedents in earlier works by the same author. At the same time, it follows the Janus model by looking into the future without losing sight of the past.

Me++ -- The Cyborg Self and the Networked City by William J. Mitchell (MIT, Cambridge MA, 2003, 266pp, ISBN 0-262-13434-9, mitpress.mit.edu, $27.95)

William Mitchell is a professor of media arts and sciences at the Massachusetts Institute of Technology (MIT), where he also serves as dean of the School of Architecture and Planning. Me++ is the most recent in a series of books that Mitchell has written about the way that information technology is transforming our lives.

In the November/December 1995 Micro Review, I reviewed Mitchell's City of Bits -- Space, Place, and the Infobahn (MIT, 1995). In that book Mitchell uses vignettes from his personal experience to contrast scenes of the electronically mediated world with their classical counterparts. City of Bits is like an impressionist painting. Mitchell uses it to train your eye (and his own) to see the world in a different way.

The electronically mediated world has changed considerably since 1995, and MIT has reinvented City of Bits as http://mitpress2.mit.edu/e-books/City_of_Bits/, which claims to provide a graphically rich site, enhanced with over 200 links, a sophisticated search engine, and a unique public forum environment. The idea is intriguing, but the implementation appears to be poorly maintained. Most of its links were broken when I visited the site. Nonetheless, I found the entire text of City of Bits there, so you can read it online if you like. But if you'd like a hardcopy version of the book, the link to ordering information works perfectly.

In 1999 Mitchell brought out e-topia -- "Urban Life, Jim, But Not As We Know It" (MIT, 1999), a sequel to City of Bits. In that book Mitchell considers the ways in which architecture and urban planning must change to accommodate broader notions of place and proximity. Virtual places and electronically mediated interconnection provide new opportunities and constraints for architects and planners to work with. Mitchell paints an inviting, optimistic portrait of the resulting communities of the twenty-first century.

Mitchell's latest book continues the train of thought from City of Bits and e-topia, but this time he addresses the subject much more concretely than he did in City of Bits. One reason for this is that people have had time to integrate the world of bits into their physical worlds. The separation between bits and atoms, as described by Nicholas Negroponte in Being Digital (Knopf, 1995), has proved to be a trial separation. Bits and atoms are reconciled. The marriage is back on track. 

The term Me++ suggests a self, extended by portable wireless devices, moving about a networked world. Wireless phones or computers provide access to navigation aids and communication. MIT students bring laptops to class and use wireless internet connections to Google the lecture topic, making for a more intense and interactive learning experience.

Mitchell sees the Dilbert world -- that is, the 1990s workplace divided into cubicles containing PCs -- as dead. Any place can now be a workspace. This gives people another reason to be in public spaces, and it gives architects a chance to focus on reforming these spaces to accommodate the new functions. A café may need well lit seats with their backs to the wall to provide convenience and privacy for laptop users. It may need alcoves into which mobile phone users can go to converse without disturbing or being disturbed by others. Social conventions need to evolve around these new spaces and functions.

The city has always been a system of containers (city walls, buildings, rooms) and networks (transportation, energy supply, communication). To support the extended self, the containers become less physical, and the networks become more numerous and more important. Technological advances also change the nature of the infrastructure. The transportation and communication infrastructure of the twentieth century resulted from government investment or from private investment with government encouragement and support. Infrastructure like WiFi, on the other hand, comes from the bottom up.

One of Mitchell's main messages is that McLuhan's global village is really here. A community is a network of reciprocity. Traditionally, the associated social glue and moral obligations attenuate with distance. We feel most strongly obligated to our families, then to our communities and countries. In the past we felt little responsibility for people in distant lands. Nor did we feel that those distant people could have much effect on us. Events like those of September 11, 2001 show that this is not true. More mundane negative events, such as widespread power grid failures or virus attacks, underline the fact that we must take the idea of global community seriously.

The idea of a global village goes against many strongly held views. Networks and interconnections make boundaries permeable. They are incompatible with the view that nation states can close their borders or isolate themselves from the affairs of other nation states. On the personal level, the idea of "me first" must give way to a Golden Rule that acknowledges our interdependence.

Mitchell does not believe that nation states are doomed, but he does believe that physical places must emphasize the features that make them special. Distinctive subcultures, scenic beauty, desirable climate, and historic connections can all distinguish one place from another and help to perpetuate place-based communities. Physical place, Mitchell believes, will remain at the apex of a pyramid of different kinds of presence. Email may be excellent for routine communication. Synchronous communication by video and phone can satisfy the need for immediate feedback. But the most important situations call for two or more people to be in the same physical location at the same time.

Mitchell's chapter and section titles are filled with allusions. Titles like Virtual Campfires, Cyborg Agonistes, and Downsized Dry Goods reinforce the connections he sees among the electronically mediated world he describes, the world it is evolving out of, and the underlying culture that both worlds reflect. Mitchell sees many connections, and these add value to his work. One literature professor, quoted on the book jacket, says that Mitchell "is able to see the future without losing sight of the past, and he embodies the technological savvy yet still deeply humanistic perspective we need to understand where our technologies are leading us and where we should be leading them." She also praises the book's wittiness, urbanity, and wide range of reference. I agree with these assessments, but I wish Mitchell didn't carry it quite so far as he does.

I think Mitchell sometimes displays his erudition too consciously. Most readers, I suspect, will get little help from the references in the following excerpt:
It opens up the possibility of new, as yet unimagined spatial practices, and the opportunity (in the words of Michel de Certeau) "to rediscover, within an electronicized and computerized megalopolis, the 'art' of the hunters and rural folk of earlier days." Or, if you don't like the pseudo-primitivism of this formulation, you might imagine rediscovering Baudelaire's flânerie, situationist "drift," or whatever it was that Deleuze and Guattari were recommending in A Thousand Plateaus.
Along the same lines, the title of the book will not mystify programmers, but readers who don't get the allusion to the increment operator and to the C++ language will face an unnecessary obstacle to understanding what the book is about.
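For readers who don't, the joke is easy to demonstrate. In C-family languages (Java shown here, where the operator behaves the same way as in C++), the post-increment operator hands back the value a variable already had, then quietly makes the variable one greater. The variable name below is my own illustration, not Mitchell's:

```java
public class MePlusPlus {
    // Post-increment: the expression yields the old value,
    // but the variable itself has grown by one afterward.
    static int[] demo() {
        int me = 1;
        int before = me++; // expression evaluates to 1; me becomes 2
        return new int[] { before, me };
    }

    public static void main(String[] args) {
        int[] r = demo();
        System.out.println("before: " + r[0] + ", after: " + r[1]); // before: 1, after: 2
    }
}
```

Hence the pun: me++ is the familiar self, handed back intact, while a slightly augmented version carries on.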

Mitchell's erudition extends to his vocabulary. I'm sure I knew the meaning of insouciant and carapace when I took the SAT exams, many years ago. Now when I encounter these words, however, I have to look them up. And as for enceinte -- I'm sure this book is the first place I've ever seen it. Sometimes common words communicate better than uncommon ones, even at the expense of lost nuances.

These small annoyances aside, I think that Me++ is an essential read for anyone trying to make sense of the bewildering advances that are transforming our world. Reading it should take you only a few hours. Do it now. Janus would approve.

Saturday, October 25, 2003

So Many Books, So Little Time

This article appears in slightly different form in the September/October 2003 issue of IEEE Micro © 2003 IEEE.

I try to write in-depth reviews of a few worthy items in each column, but sometimes the flow overwhelms the system. Here are short reviews of some books I wish I had more time to examine.

General Interest

How the Internet Works, Seventh Edition by Preston Gralla (Que, Indianapolis IN, 2004, 368pp, ISBN 0-7897-2973-3, www.quepublishing.com, $29.99)

Preston Gralla is a well known technology columnist and author. In this large format book, lavishly illustrated by Michael Troller, Gralla explains all of the concepts that most people are likely to encounter in connecting to the Internet. The explanations are not deep, but they systematically cover the basics. Most people, even experts, will probably find at least some of the explanations helpful. For example, I feel that I know quite a bit about the Internet, but Gralla's explanation of proxy servers clarified the subject for me.

The illustrations are colorful and attractive, but in many cases they are merely decorative. Visual learners may find them helpful, though I doubt that anyone could get much from the book just by looking at the pictures. The text, on the other hand, can probably stand alone.

If you know someone who is just starting to use the Internet and email, consider giving them this book. It may save you from some difficult questions. 


Firewalls and Internet Security - Repelling the Wily Hacker, 2d ed by William Cheswick, Steven Bellovin & Aviel Rubin (Addison-Wesley, 2003, 456pp, ISBN 0-201-63466-X, www.wilyhacker.com, $49.99)

In the interest of full disclosure, I need to say that I know Bill Cheswick and have heard him speak on many occasions. He is a remarkably charming, inventive, enthusiastic, boyish person with a wonderful sense of humor. His personality comes through as you read the book.

The 1994 edition of this book quickly became the standard reference for security professionals. The long awaited new edition brings the work up to date. Some of the issues in the book are of interest only to security professionals, but many of us nowadays have home networks for which we have to provide the security ourselves. We rely on off the shelf virus protection and firewalls, but most of us have no idea how to provide effective security. This book can help.

One of the great strengths of this book is that anyone with a general knowledge of computers and the Internet can read most of it. The Security Truisms section at the start of the book contains aphorisms like "An attacker does not go through security, but around it." Sun Tzu could have written this 2500 years ago.

If you read this book, you'll come out knowing a great deal about security, and you'll have fun doing it. 


Windows XP Hacks - 100 Industrial Strength Tips and Tools by Preston Gralla (O'Reilly, Sebastopol CA, 2003, 412pp, ISBN 0-596-00511-3, www.oreilly.com, $24.95)

Preston Gralla has apparently been busy! In addition to figuring out how the Internet works, he has assembled a useful collection of ways to make Windows XP behave better. The O'Reilly Hacks series (and the associated site, hacks.oreilly.com) uses tools, configuration options, and even unpublicized registry settings to make your life easier. This book tells you how to get your old programs to run under XP, even if they are nominally incompatible with XP. It tells you where to find free tools to convert files from one graphics format to another. It helps you remove unwanted icons from your desktop. It provides strategies for reducing spam.

I love books like this. I don't think it is as thorough or detailed as Windows XP Annoyances (Micro Review, Jan/Feb 2003), but it's worth looking at.


From Gutenberg to the Global Information Infrastructure by Christine L. Borgman (MIT, Cambridge MA, 2003, 344pp, ISBN 0-262-52345-0, mitpress.mit.edu, $21.95)

This book is a paperback reissue of a work that won the 2001 Best Book Award from the American Society for Information Science and Technology. Borgman looks at how well the global information infrastructure does, or might in the future, fit into our daily lives. She views this as the main criterion of its success.

Though Borgman comes to the subject from a background in information studies, she takes a holistic approach that considers sociological factors as well as technical and implementation details. She views digital libraries, electronic publishing, and the life cycle of electronic information in terms of their effects on humans.  

Borgman's presentation is in the academic style, with many references and a dry, impersonal voice. Nonetheless, it is not difficult reading. It provides a valuable and nuanced perspective on a technology that most people see in black and white. If you have not thought much about the human consequences of the global information infrastructure, Borgman's book is a good place to start.


Adobe Acrobat 6.0 Standard Classroom in a Book by Adobe Creative Team (Peachpit, Berkeley CA, 2004, 456pp plus CD, ISBN 0-321-19374-1, www.peachpit.com, $45.00)

The Adobe Classroom in a Book series has a well deserved reputation for thorough, accurate, easy to use tutorials. Adobe Acrobat 6.0 Standard is the latest version of Adobe Acrobat software. I haven't had a chance to look at Acrobat 6 in detail, but it seems to be a lot like Acrobat 5.

If you are new to Acrobat, this tutorial is an excellent way to cover all of the main features in a thorough and authoritative way.


Open Source

Tomcat - The Definitive Guide by Jason Brittain & Ian F Darwin (O'Reilly, Sebastopol CA, 2003, 412pp, ISBN 0-596-00318-8, www.oreilly.com, $39.95)

Tomcat began life as part of the Sun Microsystems development kit for JSP and servlets. A few years ago Sun donated Tomcat to the Apache Software Foundation, where it became an open source product. It is the most widely used platform for JSP and servlets. If you are going to develop J2EE web applications, you will probably use Tomcat.

This book tells you everything you need to know about the theory and the details of all of the Tomcat features you're likely to use, including security.


Code Reading - The Open Source Perspective by Diomidis Spinellis (Addison-Wesley, Boston MA, 2003, 526pp plus CD, ISBN 0-201-79940-5, www.awprofessional.com, $49.99)

This book has a simple premise. Programmers need to be able to read code. There are two main reasons for this. The first is to help learn how to write good code. As with prose writers, reading excellent examples of others' works helps programmers find their own voice and develop their own excellent styles.

The second reason to learn to read code is to be able to modify or augment existing programs. Programmers spend large portions of their lives in just such activities.

Spinellis notes that the open source movement has made a great deal of excellent code available. His book uses this resource to help you become a good code reader and writer. Using numerous examples from actual code, Spinellis discusses a wide variety of programming topics. In the end he has written a computer science textbook with all of the examples taken from real life.


PHP and MySQL for Dynamic Web Sites Visual QuickPro Guide by Larry Ullman (Peachpit, Berkeley CA, 2003, 590pp, ISBN 0-321-18648-6, www.peachpit.com, $24.99)

Because they are open source products and are widely available, PHP and MySQL are widely used tools for server side scripting and database access for personal or small business websites. Ullman systematically covers all of the tasks a web developer might wish to perform and presents procedures for them, with complete code examples and visual representations of screens and output.

The Peachpit visual guides are all very well done, and this one fits right into that mold. If you're developing a website with any sort of server side component, you should read this book.
 

Extreme Programming With Ant by Glenn Niemeyer & Jeremy Poteet (Sams, Indianapolis IN, 2003, 456pp plus CD, ISBN 0-672-32562-4, www.samspublishing.com, $34.99)

I have reviewed many books on extreme programming (XP) in this column over the years. For example, I reviewed Kent Beck's Extreme Programming Explained in the Nov/Dec 1999 Micro Review. Beck is essentially the inventor of XP, and his book explains its principles and the reasons behind them.

Ant is an open source extensible scripting language that many people use to automate processes for building and deploying software. XP relies on automated testing and frequent builds, so Ant is an ideal tool. The authors show how to use Ant to automate testing and builds, but that's just the beginning.
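As a taste of what such automation looks like, here is a minimal, hypothetical build.xml sketch (project name, targets, and directory layout are my own invention, not the authors'). It compiles the source tree and runs every JUnit test on each build, halting on the first failure the way XP's frequent-build discipline expects:

```xml
<!-- Hypothetical minimal Ant build file; names and paths invented for illustration -->
<project name="xp-sample" default="test" basedir=".">
  <target name="compile">
    <mkdir dir="build/classes"/>
    <javac srcdir="src" destdir="build/classes"/>
  </target>

  <!-- XP-style: every build runs the whole test suite and stops on failure -->
  <target name="test" depends="compile">
    <junit haltonfailure="true">
      <classpath path="build/classes"/>
      <batchtest>
        <fileset dir="build/classes" includes="**/*Test.class"/>
      </batchtest>
    </junit>
  </target>
</project>
```

Running plain `ant` invokes the default target, so compiling and testing become a single habitual command.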

XP relies on coding standards, because everybody is always free to change any code at any time. Without standards, chaos would reign. Ant can help to enforce standards. Similarly, the authors proceed to automate every aspect of the XP development cycle.

If you're doing XP, or if you just want some good ideas for how to use Ant to help with whatever development process you follow, check out this book. 


Java and JSP

The next two books total close to 2400 pages. They are part of a series for developers. If you know everything in them, you should have no trouble developing enterprise applications, including web services, in Java.


J2EE Developer's Handbook by Paul J. Perrone, Venkata Chaganti & Tom Schwenk (Sams, Indianapolis IN, 2003, 1536pp plus CD, ISBN 0-672-32348-6, www.developers-library.com, $59.99)

The authors spell out their mission right away: 
Provide a comprehensive, cohesive, and practical guide for building scalable, secure, assured, Web-enabled, and distributed enterprise systems with the Java 2 Platform, Enterprise Edition (J2EE). The technologies presented in this book can be used to rapidly build any enterprise system and integration solution that you can imagine. We describe these enterprise technologies from the ground up, leaving you with a thorough and in depth understanding of the Java enterprise application stack.
That's a tall order, but from what I can tell (I haven't read every word), they do a pretty good job. Nonetheless, if you finish the sections on application servers, servlets, JSP, and web services (about 200 pages) and you still haven't had enough, you can read the next book.


JavaServer Pages Developer's Handbook by Nick Todd & Mark Szolkowski (Sams, Indianapolis IN, 2003, 838pp, ISBN 0-672-32438-5, www.developers-library.com, $49.99)

JavaServer Pages (JSP) is a way of enhancing HTML with server side Java code to make web pages more interactive and to help them deliver access to databases and other server capabilities. If you don't really need to understand the entire J2EE architecture, this book can provide everything you need to know about JSP. The book contains a huge amount of sample code, which you can download from the book's website. 
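To make the idea concrete, a toy JSP page might look like the sketch below (the file name and content are my own invention, not taken from the book). Everything outside the <% ... %> delimiters is ordinary HTML; the embedded Java runs on the server each time the page is requested, using JSP's built-in request object:

```jsp
<%-- hello.jsp: a hypothetical minimal example for illustration --%>
<html>
  <body>
    <% String visitor = request.getParameter("name"); %>
    <p>Hello, <%= (visitor == null) ? "stranger" : visitor %>!</p>
    <p>The server time is <%= new java.util.Date() %>.</p>
  </body>
</html>
```

The container (Tomcat, for instance) translates a page like this into a servlet behind the scenes, which is why JSP and servlets are usually discussed together.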


.NET

Measured by weight or by number of pages, the two books in the Java and JSP section beat the two in this section by about two to one. Nonetheless, these two can hold their own.

Microsoft's .NET is relatively new and corresponds approximately to J2EE. The ASP.NET facility corresponds approximately to JSP.


.NET Framework Essentials by Thuan Thai and Hoang Q. Lam (O'Reilly, Sebastopol CA, 2003, 380pp, ISBN 0-596-00505-9, www.oreilly.com, $29.95)

The .NET framework is the basis for all new Windows development. In it Microsoft responds to the major trends of the last few years: distributed computing, component based development, enterprise services, and sharing of functionality across the web. It also responds to the maturity of information technology, which creates an expectation of interoperability, scalability, availability, security and manageability.

The .NET common language runtime (CLR) is analogous to a Java virtual machine (JVM), but there are important differences. Rather than interpreting bytecodes as the original JVM did, the CLR compiles its intermediate language to native code before executing it. All languages compile into the same object format, use a common library, and interoperate. This means that methods written in different languages can inherit from each other, catch each other's exceptions, and so forth.

The .NET framework eliminates the notorious DLL Hell by drastically reducing the reliance on registry entries and by not using file names to bind programs together. For many programs (not all), you can install them by copying their files to the disk and uninstall them by deleting the files.

This book describes these and many other features of the .NET framework in enough detail for you to feel comfortable using them. It gives you a sufficient overview to make it easy for you to use more detailed reference works.  


ASP.NET in a Nutshell by G. Andrew Duthie and Matthew MacDonald (O'Reilly, Sebastopol CA, 2003, 998pp, ISBN 0-596-00520-2, www.oreilly.com, $44.95)

While the previous book provides a detailed overview, this one gets right down to the nitty gritty. If you are familiar with O'Reilly books in the Nutshell series, you know what to expect with this one. It provides a small amount of overview information, but a great deal of detail about the properties, methods, and events of the main ASP.NET classes. The format of the book and the detailed table of contents and index make it easy to find what you're looking for.

Sunday, August 17, 2003

Waltzing with Bears, Lean Software Development

This article appears in slightly different form in the July/August 2003 issue of IEEE Micro © 2003 IEEE.

Managing Software Projects

This time I look at two books on software project management. One addresses a gap in current standard methodologies. The other provides a toolkit for a new kind of software development -- one that has grown in popularity in recent years.


Waltzing With Bears -- Managing Risk on Software Projects by Tom DeMarco & Timothy Lister (Dorset House, New York NY, 2003, 208pp, ISBN 0-932633-60-9, www.dorsethouse.com, $33.45)

Dorset House Publishing seems to have found a great formula, and I hope they stick to it. In recent years they have produced a steady stream of books with the following characteristics:
  • Short - generally well under 300 small format pages.
  • Packed with practical advice on software project  management.
  • Written by people who have been managing software development successfully for a long time.
  • Iconoclastic - often showing common practices and beliefs to be short sighted, counterproductive, and generally absurd.
This formula works because of the "software crisis" that the industry has been bemoaning for at least 35 years. Software project management, as practiced by the mainstream of the industry, has not advanced significantly during that period. Relational databases, object-oriented design, and powerful development tools have made individual programmers vastly more productive, but project management has not gone far beyond where it was in the 1960s.   

As I noted in my review of DeMarco's novel The Deadline (Micro Review, May/Jun 2000), I was inspired by his Structured Analysis and System Specification when I first read it in 1979, and time has improved his understanding. DeMarco and Lister are long time partners in a consulting firm that specializes in software management issues. They have lectured and written extensively about corporate culture, management, productivity, estimation, and risk.

Managing risk receives more lip service and less real effort than any other issue in software project management. In fact, the authors have seen so much fake risk management that they close the book with a test you can use to determine whether your company is really managing risk, not just talking about it. The test is about a page long, checking for nine key aspects of risk management. Unfortunately, most companies would score poorly on this test.

The book's title comes from a song in the Dr Seuss Cat in the Hat Songbook (Random House, 1967). Uncle Walter (or Uncle Terwilliger in some versions) finds that some risk is essential to his life: 
He goes wa-wa-wa-wa, wa-waltzing with bears, 
Raggy bears, shaggy bears, baggy bears too.
There's nothing on earth Uncle Walter won't do, 
So he can go waltzing, wa-wa-wa-waltzing, 
So he can go waltzing, waltzing with bears!
DeMarco and Lister come to a similar conclusion. If a project has no risk, it's not worth doing. Companies that stick to safe projects often go out of business or have a lot of catching up to do. The authors cite the case of Merrill Lynch in the area of online stock trading. Pioneers like E*Trade and early adopters like Fidelity and Schwab grew dramatically in the 1990s. Merrill Lynch avoided the risks of developing an online trading facility, but struggled to stay even.

All worthy projects entail risk. The key is to manage the risk. The authors cite the case of the Denver International Airport, in which the City of Denver, apparently for political reasons, ignored and made no attempt to mitigate the risks that ultimately cost them a half billion dollars. This case is dramatic and public, but many cases like it occur every year in the more private settings of business projects.  Many companies climb the capability maturity model (CMM) ladder, and they see real benefits from doing so. Nonetheless, companies at high CMM levels often design processes that look good on paper but make no real effort to estimate the effects of risk or the expected value of completing each component of the project.  

DeMarco and Lister outline the risk part of their program this way:
  1. Construct and maintain a census of project risks. Explicitly escalate to the next level any risks you don't plan to, or can't, manage.
  2. Create an ongoing risk discovery process and develop a corporate culture that does not punish discussion of risks.
  3. Estimate the probability that each risk will occur and the cost you will incur if it does.
  4. Use simple math or a tool (the authors provide a free tool on their website) to create diagrams that show the uncertainty in project estimates.
  5. Set goals optimistically, but estimate outcomes realistically, using the uncertainty diagrams.
  6. Include mitigation actions (to be performed unconditionally) for each risk. Include these in the project's work breakdown structure.
  7. Define a transition indicator for each risk, and define a contingency plan to go into effect if the indicator appears. Include the contingent actions in the project's work breakdown structure. 
There's more to it, of course, but you'll have to read the book.
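The "simple math" in step 4 can be as little as a Monte Carlo loop. The sketch below is my own illustration (in Java), not the authors' tool; the task count, duration ranges, and skew are invented. It sums three uncertain task durations many thousands of times and reads the 50% and 90% confidence points off the resulting uncertainty curve:

```java
import java.util.Arrays;
import java.util.Random;

// Hypothetical sketch of the idea behind an uncertainty diagram:
// simulate many possible project outcomes, then report percentiles.
public class RiskSketch {
    // Returns simulated total durations (in weeks), sorted ascending.
    static double[] simulate(int runs, long seed) {
        Random rng = new Random(seed);
        double[] totals = new double[runs];
        for (int i = 0; i < runs; i++) {
            double total = 0;
            // Three tasks, each between 4 and 12 weeks, skewed optimistic
            for (int task = 0; task < 3; task++) {
                double u = rng.nextDouble();
                total += 4 + 8 * u * u;
            }
            totals[i] = total;
        }
        Arrays.sort(totals);
        return totals;
    }

    public static void main(String[] args) {
        double[] totals = simulate(10_000, 42L);
        System.out.printf("50%% confidence: finish within %.1f weeks%n",
                totals[totals.length / 2]);
        System.out.printf("90%% confidence: finish within %.1f weeks%n",
                totals[(int) (totals.length * 0.9)]);
    }
}
```

The gap between the 50% and 90% figures is the point: a single-number schedule hides exactly the spread that this kind of diagram makes visible.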

The other side of the risk coin is value. The authors have no sympathy for companies that skimp on estimating value. They ridicule justifications like "we've got to have it to survive." Their rule is to estimate value with the same precision you use to estimate cost. This should lead to uncertainty diagrams similar to those for risk. Alternatively, if the best value estimate is "we've got to have it," then an appropriate cost estimate should be "it's going to be expensive."

The authors invest a good deal of energy into fighting with general practice. Statements like "risk management is project management for adults" are meant to needle people into changing the status quo. The following statement attacks a few sacred cows:
People who don't have the requisite talent [to be a good manager] fall back on a host of mechanical approaches, such as Management by Objectives, Parkinsonian Scheduling, and "a culture of fear" to scare their subordinates into performing. . . .  These practices are incompatible with any risk management scheme.
One of the biggest parts of risk management is mitigation. This is work that you do unconditionally to lay the groundwork for contingency plans that may or may not become necessary. For the Denver International Airport, for example, this might have meant building tunnels tall enough so that humans could drive carts through them if the contractor for the automated baggage handling software did not finish on time.

Mitigation is an expense that translates into waste if the contingency plan is not necessary. The authors suggest a mitigation strategy that reduces potential waste: incremental implementation. Their plan for incremental implementation requires you to identify which parts of the project deliver the greatest value to customers and which can eliminate potential risks before mitigation actions are necessary. These are the parts you implement first. This approach provides many benefits, including pushing low valued "pet projects" to the end of the schedule, where they can often be eliminated harmlessly.

One of my favorite parts of the book -- though I doubt that it will influence many readers -- is the story of William Kingdon Clifford, who in 1876 scandalized London's Metaphysical Society with his paper, The Ethics of Belief. The authors kindly include the entire text of this paper in an appendix. Clifford held that it is unethical to hold and act upon beliefs that you have no real basis for. The authors take off from Clifford's work to define risk management as the business of believing only what you have a right to believe. 

There is a great deal of useful advice in this little book. The authors also provide practical techniques for implementing their ideas. If you have anything to do with software, you really need the information in this book.


Lean Software Development -- An Agile Toolkit by Mary & Tom Poppendieck (Addison-Wesley, 2003, Boston MA, 232pp, ISBN 0-321-15078-3, www.awprofessional.com, $39.99)

Agile software development is an umbrella term for methodologies like extreme programming (Micro Review, Nov/Dec 1999). The Manifesto for Agile Software Development (http://agilemanifesto.org/) states:
We are uncovering better ways of developing software by doing it and helping others do it. 
Through this work we have come to value: 
  • Individuals and interactions over processes and tools 
  • Working software over comprehensive documentation 
  • Customer collaboration over contract negotiation 
  • Responding to change over following a plan 
That is, while there is value in the items on the right, we value the items on the left more.

Lean manufacturing refers to the kinds of techniques pioneered by Toyota and others, as described in The Machine That Changed the World by Womack and Jones (Harper Collins, 1991). This approach regards every unnecessary element as waste. An unnecessary element can be acquiring inventory before you need to use it, deferring to management on a decision that the workers can make for themselves, or any of a wide array of similar inefficiencies. In this book the authors have pulled these threads together into what they call a set of thinking tools for bringing agile practices into your unique environment.

Like the authors of Waltzing With Bears, these authors are fighting the status quo. For example, they claim that the Project Management Institute (PMI) certification programs teach project managers to use wasteful formalistic substitutes for learning and innovation, the real keys to the success of development projects.

The authors state the following lean principles:
  • Eliminate waste -- anything that does not add to the value the product brings to customers. But don't throw away all documentation.
  • Amplify learning -- recognize that development is not manufacturing. Design, implement, get feedback, and repeat. But don't repeat indefinitely.
  • Decide as late as possible -- keep gathering information to support your decision for as long as you can get away with (but no later -- don't procrastinate). This requires you to structure your project in a way that supports rapid change.
  • Deliver as fast as possible -- use short cycles that give customers what they need when they need it and give you the feedback you need to refine your design. Implementation speed allows you to defer design decisions until you know enough to make them intelligently. But don't rush so much that you do sloppy work.
  • Empower the team -- rely on the team to make decisions on the spot without referring them to a higher authority. But if you're the leader, continue to lead.
  • Build integrity in -- provide highly usable software that gives customers exactly what they need now. Provide the infrastructure to support graceful evolution. But don't rely on a big up front design process.
  • See the whole -- don't allow specialists to overbalance the design by focusing inordinately on their own specialties. But pay attention to details.
The book elaborates these principles into a set of 22 tools -- not a cookbook, but a schema for designing recipes. For example, the principle of building integrity in gives rise to tools called Perceived Integrity, Conceptual Integrity, Refactoring, and Testing. The authors explain the tools in the context of fascinating real world examples.

A complete commitment to agile development may not be ideal for your organization, but you might find the information in this book useful and relevant anyway. The authors bring a lot of practical experience to it, so it's worth reading, no matter what development process you follow.

Wednesday, June 25, 2003

Prey, BASIC, MKS Toolkit, Visual SlickEdit

This article appears in slightly different form in the May/June 2003 issue of IEEE Micro © 2003 IEEE.

Evolution

Evolution binds the otherwise unrelated topics I discuss this time. On one hand, a science fiction novel shows what might happen if fate should bring together the right mixture of nanotechnology, bacteria, viruses, solar power, agent software, adaptive self-organizing systems, rapid evolution, and corporate malfeasance. On the other hand, products that I have been reviewing (and using) for more than fifteen years continue their steady evolution.

Prey by Michael Crichton (Harper Collins, NY, 2002, 382pp, ISBN 0-06-621412-2, $26.95)

In this novel, Michael Crichton, already famous for his earlier works The Andromeda Strain and Jurassic Park, places current technologies into a completely plausible situation, then develops the story into an engrossing thriller. Somewhere in the process he crosses the line from believable to unbelievable. I'm not sure of the exact point in the book when I started to feel that way, but by the time I did, I was already hooked.

I should say at the outset that I did not read this book. I listened to it in 45-minute chunks while driving. George Wilson's reading nicely conveys the matter-of-fact tone in which the protagonist, Jack Forman, a programming manager, recounts his attempts to rein in a project that has gone badly and dangerously out of control.

Forman, fired for blowing the whistle on questionable corporate accounting practices, has been an unemployed house husband for many months when his former company calls him to consult. They have sold Forman's software to another company, which is having trouble with it. The new company, it turns out, is Xymos, the firm for which Forman's wife is a marketing director. She looks at Xymos as her last chance to make a fortune, and she has cut a few corners to bring the story to this point. She should have read some of the project management books that I reviewed recently in this column. They could have saved her from a lot of trouble, but then there would be no story.

Forman heads for the remote desert facility where the problem is, and he finds that Xymos has used agent software that Forman's team had developed at the first company. The software produces agents that exhibit predator-prey behavior, but Xymos has put the agents to another use -- a military contract to create a spy camera made from a swarm of billions of nanomachines. 

Unable to solve the problem of how to prevent air currents from disrupting the swarm's structure, Xymos has turned swarms loose in the desert to see if a solution emerges as they evolve. Through a series of plausible circumstances, the swarms can use solar power and already carry with them their means of manufacture when Xymos turns them loose. They multiply and evolve at a rapid rate. Adaptive behaviors emerge. Humans become their prey, and it looks as if they have a good chance of wiping us out. From this point, the novelist takes over and carries the story forward to its conclusion.

Prey reminds me of a book that was one of my favorites when I was an avid science fiction fan, about 50 years ago. Hal Clement was a high school science teacher who wrote science fiction in his spare time. He built his stories on real science. They were realistic and believable to a large degree. In his novel Needle, two members of an alien race come to Earth -- one a fugitive and one a pursuing police officer. Members of this alien race live by symbiosis, a relationship in which they and their hosts help, but never harm each other. The fugitive has violated this rule. This contributes to the danger that leads to the book's exciting conclusion. In Prey, one strain of the evolving swarms of nanomachines develops a behavior much like symbiosis, contributing added suspense to the climax.

When I read Prey, I thought of many books and concepts that I have reviewed in this column over the years.

Daniel Dennett, in Consciousness Explained (Micro Review, Mar/Apr 1992) talks about the parallel architecture that the billions of neurons of the brain have organized themselves into. Adaptive self-organizing systems are at least as ancient as humans, but we have only begun to understand such systems in the last decade or so. Prey reminds us that humans are not necessarily the only creatures capable of pursuing their own goals powerfully and efficiently, to the detriment of their fellow Earthlings.

Mitchel Resnick, in Turtles, Termites, and Traffic Jams -- Explorations in Massively Parallel Microworlds (Micro Review, Nov/Dec 1994), says that no leader directs birds to fly in formation or ants to form trails from their nests to food sources. Instead, large numbers of independent entities, each following simple rules, produce the large scale patterns that we observe. This is the principle that makes Crichton's Prey plausible. 
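Resnick's point is easy to demonstrate on a computer. The sketch below (my own illustration, not from Resnick's book -- his examples use StarLogo) runs a one-dimensional cellular automaton in which every cell follows the same trivial local rule, Wolfram's rule 30, looking only at itself and its two neighbors. No cell knows anything about the overall picture, yet an intricate global pattern emerges from a single seed.

```python
# Emergence from simple local rules: Wolfram's rule 30 cellular automaton.
# Each cell updates using only its own state and its two neighbors' states.

def step(cells, rule=30):
    """Apply the rule to every cell; the neighborhood (left, self, right)
    forms a 3-bit index into the rule number's binary expansion."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=31, steps=15):
    """Start from a single 'on' cell and record each generation."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells)
        history.append(cells)
    return history

if __name__ == "__main__":
    for row in run():
        print("".join("#" if c else "." for c in row))
```

Printed as rows of `#` and `.`, the output shows a chaotic triangular pattern that no individual cell "intended" -- the same flavor of decentralized order as Resnick's bird flocks and ant trails.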

The theme of that Nov/Dec 1994 column was decentralization. In addition to Resnick's book I talked about object-oriented programming and the Internet, two areas that have evolved enormously since then, both seemingly without any central direction.

In my July/Aug 1997 column, I reviewed George Dyson's book Darwin Among the Machines: The Evolution of Global Intelligence.

Dyson tries to understand many of the themes that underlie Prey. He starts with the work of Thomas Hobbes, over 350 years ago. Hobbes' Leviathan is a group intelligence representing the future of human society. Hobbes believed that life arises from the physical behavior of the underlying objects. The parts of the body give rise to a person whose life and thought are of a higher order than those of its heart, nerves, or muscles. Similarly, people, their institutions, and their machines give rise to a group intelligence of even higher order. This, of course, is what happens with the swarms of nanomachines in Prey.

Dyson's book is short, but intricate and profound. It covers a lot of ground quickly. One subject he touches briefly is the work of Thomas Ray in creating Tierra, a globally networked habitat in which digital organisms can interact and evolve.

In the end, Dyson says:
In the game of life and evolution there are three players at the table: human beings, nature, and machines. I am firmly on the side of nature. But nature, I suspect, is on the side of the machines.
In my July/Aug 2000 column, I looked at Bill Joy's article in the April 2000 issue of Wired. Joy cites three technologies in which our progress is outpacing our ability to control them: genetics, nanotechnology, and robotics. He calls attention to some of the possible consequences of these technologies -- plague, intelligent germ warfare, out-of-control self-replicating robots, and many others. Michael Crichton relies heavily on all three of these technologies, and on some of the possible consequences Joy calls attention to, to create the nightmare scenario in Prey.

Joy's object in writing his article was to get people thinking about these issues. I hope that Crichton's book will contribute to that end too.

Prey is an enthralling thriller, but it is also an elaboration of many threads of thought in computer science and in the ethics of science. I highly recommend it.


Real Evolving Programs

Thinking about the rapid sort of evolution that Crichton allows his swarms of nanomachines to engage in made me think about some current software products that have evolved more slowly, and ostensibly under human control, over many years. In my June 1996 column, for example, I detailed the goals, evolutionary steps, and false starts that led from C to Java. The evolution of Java continues unabated, but still largely under the control of Sun Microsystems.

Unix provides another example. In The Design of the UNIX Operating System (Micro Review, October 1987), Maurice Bach describes a design process carried out under the control of AT&T Bell Laboratories, but beginning to break free. By the time Eric Raymond wrote The Cathedral and the Bazaar (Micro Review, Nov/Dec 1999), however, the open source movement and Unix had collided to produce Linux. Development of Linux is much more like a self-organizing system than Unix development under Bell Labs ever was.

BASIC

In 1967, before the invention of the microprocessor, I wrote BASIC programs on a General Electric timesharing system. The original BASIC was a very primitive language. Kemeny and Kurtz meant it only as a teaching tool. Every variable name was global, and represented by a single uppercase letter. Subroutine calls had no arguments; the routines took their input from global variables.

In the middle 1970s I wrote sophisticated clinical laboratory reporting programs, running on a DTC Microfile (an Intel 8080-based machine). I developed these using Microsoft BASIC, a much more robust language, but still burdened by the limitations of the original. In the years that followed, Microsoft ported BASIC to many microprocessor-based computers. My book Inside BASIC Games (Sybex, 1981) presents BASIC source programs that run virtually unchanged on Apple II, Commodore PET, and Radio Shack TRS-80 computers. Meanwhile, a huge number of commercial packages, programmed in BASIC and running on CP/M systems, appeared. When IBM brought out the Personal Computer in 1981, BASIC was the most important software available on it.

With 16-bit microprocessors and the widespread use of Unix and C, BASIC faded in importance, but by the time I reviewed Microsoft Word for Windows 2.0 (Micro Review, December 1992), Microsoft had given BASIC a new role: letting users customize its packaged software. Then, with Visual Basic (Micro Review, October 1993), Microsoft reincarnated BASIC as a tool to allow developers to produce applications with graphical user interfaces (GUIs) to run on Windows.

In COM and DCOM -- Microsoft's Vision for Distributed Objects (Micro Review Mar/Apr 1998), Roger Sessions describes the position to which BASIC had evolved at Microsoft. He predicts a future in which programmers use Java, and to a lesser extent C++, to implement the logic of applications and use Visual Basic to implement the user interface. If we look at Microsoft's current plans for .NET, this prediction seems to be on target, though Microsoft would probably prefer that you substitute C# for Java in this scenario.

MKS Toolkit

MKS Toolkit 8.5 (MKS, Inc, www.mks.com, $359.00 and up)

This is the latest version of a software package that has a long evolutionary history. Mortice Kern Systems (now simply MKS, Inc) was founded to bring Unix style tools into the Microsoft environment. I reviewed MKS Toolkit 4.1 for DOS in the Jan/Feb 1993 Micro Review. By that time the product was already essentially in its current form. It provided the Korn shell, the vi editor, awk, make, tar, pipes, uucp, and all of the popular Unix utilities.

The toolkit's subsequent development has been in the direction of improving the symbiosis -- creating the Unix environment more completely and less disruptively, and adapting to the evolving host environment. Over the years the package has also picked up X11 and Motif, and popular utilities like Perl and Tcl.

The latest version provides a secure shell for remote access, a much cleaner interface with Windows services, and a communications suite aimed at improving interprocess communication and notification.

Interestingly, the entry level toolkit still costs only about 20% more than the 1993 version, but MKS has implemented a tiered pricing structure in the last few versions. Professional or enterprise developers pay much higher prices, though the entry level version has all of the features that most developers would need.

The 1993 version came with a thick stack of manuals. The current version comes with a thin stack, but installs many PDFs and help files, which contain a great deal of information. The Cross-Platform Developer's Guide, supplied in both PDF and hardcopy, gives a good deal of insight into how hard it is to achieve symbiosis with a host like Windows.

There are other sets of Unix tools for the Windows environment, but for the nitty-gritty details of porting Unix programs into Windows systems, MKS Toolkit is still your best bet.

Visual SlickEdit

Visual SlickEdit 8 (SlickEdit, Inc, www.slickedit.com, $299)

This product has an interesting history. In the beginning, there were two notable programming editors: Bill Joy's vi and Richard Stallman's emacs. They are enormously powerful, but hard to learn, and both are in use today.

In the Nov/Dec 1988 Micro Review, I wrote:
The best program that I use on the PC, and the only one I would recommend unreservedly to anyone is the Brief editor (from Underware). It is the best editor I have ever used. It is modeless, meaning that commands always work the same way, regardless of what you're doing. It is easy to use immediately.
Not long afterward, Underware went out of business, and Brief is no more. Enter Visual SlickEdit (or simply SlickEdit, as it was first called). In my review of Visual SlickEdit 2 (Micro Review Sep/Oct 1996), I quote the manufacturer as saying that Visual SlickEdit emulates Brief so well that even its mother couldn't tell the difference. Visual SlickEdit also emulates vi and emacs, but soon after I started to use it, I moved to the Visual SlickEdit native user interface.

You can trace this product's origin to the vision of the company founder, Clark Maurer. His goal was to give programmers a single powerful, completely reliable, programmable editor that runs on every platform and supports every programming language. Maurer devised a macro language, Slick-C, used it for most standard extensions of the basic editor, and gave users access to it to design their own extensions. 

Maurer's basic design approach for the product was to reach out to working programmers, to find the repetitive tasks and the error-prone situations in their daily routines, and to provide safe, automated support for those activities. Every version of this product that I have seen since then seems to have had that same goal. Subsequent versions have added features like language-specific tagging, auto-completion, a code beautifier, powerful tools for comparing and merging files, and support for builds. The product supports XML, including schemas, and provides a native Java debugger. The latest version makes it easier for programmers to work with other programs for building and debugging software. You can run Ant scripts directly from the editor.
     
In their wonderful, must-read book The Pragmatic Programmer -- From Journeyman to Master (Micro Review, Jan/Feb 2000), Andrew Hunt and David Thomas stress the importance of obtaining a powerful programming editor and mastering its features completely. Doing this with Visual SlickEdit will more than repay the effort.

Saturday, April 26, 2003

Dr Peeling, Haiku.ws

This article appears in slightly different form in the March/April 2003 issue of IEEE Micro © 2003 IEEE.

The book and website that I write about this time deal with the many details of doing a good job. People faced with learning all those details for themselves have written roadmaps for those who follow. In the book, a first line manager writes down the advice he wishes someone had given him when he started. On the website, the site's designer gives a complete step-by-step account of how she went from unemployed technical writer and web design novice to competent practitioner.


Dr Peeling's Principles of Management -- Practical Advice for the Front-Line Manager by Nic Peeling (Dorset House, New York NY, 2003, 288pp, ISBN 0-932633-54-4, www.dorsethouse.com, $29.95)

The law of supply and demand works in funny ways. Over and over, I see some hot area that everyone is interested in. Many thick books about that area suddenly appear on bookstore shelves. They tend to be quite similar to one another, and none of them tell you what you really need to know.

Management, especially first line management, is such an area. When Nic Peeling was promoted from software researcher to manager, he made a beeline for the local bookstore, only to find too many books and too little useful information. Now, more than ten years later, Peeling has learned many lessons -- through his own experience and from watching others make the transition from technical wizard to management novice.

Peeling feels that most new managers fail to understand the basic principles of managing people. I'm sure that this is true, especially in the technical fields. Many of us spend years happily shutting out other people while we master ways of making computers and electronics bend to our will. Then we find ourselves in positions where the only way to advance our careers is to work through other people. Very little of our experience in object oriented programming or VLSI design is immediately applicable to this new situation.

As Peeling says, "throughout the world, the standard for managing people is pretty dismal. Most people remain motivated to do their work in spite of their managers' efforts, not because of them." One reason for this is that while many managers ultimately learn how to do the job, few ever figure out how to teach others to do the job. Each new manager starts from scratch. The lucky ones find mentors who help them through specific situations, but those mentors can rarely give them general principles to apply to the next situation. The few managers who could write such books -- Jack Welch or Andy Grove, for example -- tend to write for high level executives and those who aspire to become executives.

Peeling tries to establish general principles to guide new managers. He doesn't completely succeed, but the book is a big step in the right direction. Except for an excellent chapter at the very end of the book, his text stays somewhat abstract. In place of concrete examples, he presents delightful cartoons of a manager with a huge grin, closed eyes, and a can-do attitude flouting Peeling's principles. Thus, Peeling finds a happy medium between stating abstract principles and solving specific situational problems.

I find parts of the book depressing, because I find parts of management depressing. Over and over, I find myself believing that Peeling is right about some point that I wish he were wrong about. For example, as a manager, you must keep yourself a little apart from those you manage. I think this goes against most people's instincts. Many new managers wish they could still be one of the team most of the time. It's only through hard experience that they learn they must give up that goal.

Another example is that sometimes you can't be completely open. In a way, it's like being a parent. You'd like to tell your children everything, but sometimes you need to make a decision quietly and hope that they are looking the other way.

Despite these Machiavellian examples, Peeling puts forth some principles that everybody can feel good about. On the first page, he says that you must set high expectations, establish clear boundaries, give good feedback, and behave in a way that wins respect.

I don't agree with everything Peeling says. His chapter on working with different types of staff is helpful in many ways, but I believe that he perpetuates many stereotypes. For example, he says:
Why sales people seem neurotically eager to blame problems on others can only be determined through years of psychoanalysis -- either for them or for the front-line manager -- but from what I have observed, blaming others is a very marked behavioral trait in salespeople.
Similarly, Peeling says that consultants have strong but unusual ethics. He likens their attitude to that of a burglar who considers himself ethical because he never carries a weapon and never vandalizes the homes from which he steals.

On the other hand, Peeling says some things that I agree completely with, and in the aftermath of the corporate scandals of the last year or so, I hope more leaders will take to heart:
Consistent behavior by the leader sets the tone for the culture.
And the Golden Rule of management:
You will be judged by your actions, not by your words, and your actions shall set the example for your team to follow.
Don't treat the material in this book as gospel; take it with a grain of salt. In fact, Peeling gives a "literal-reading warning" in the preface. Nonetheless, if you're a first line manager or in danger of becoming one, you can learn a lot from this book. You should buy it and read it.


Haiku.ws (A website by Kristine Hahn)

About a week ago I attended a meeting of the Society for Technical Communication at which technical writer Kristine Hahn told how she had built her website, www.Haiku.ws. I was impressed by the way she had learned from the ground up how to build an interactive website backed by a server-side database. I was even more impressed when I saw the document she had written to describe what she had done.

If you go to www.Haiku.ws, you can download a 78-page manual in PDF format called Creating the Haiku Web Site. The manual describes how to set up several different combinations of editing machine, test machine, and web server. Hahn describes configurations using Windows, Linux, and Macintosh OSX. She shows you how to obtain, install, and configure the key software components: Apache or IIS, PHP, and MySQL -- all of which are essentially free. The only real expenses you encounter setting up a website according to Hahn's instructions are the monthly charge for an internet service provider that supports PHP and MySQL, and the price of the web authoring tool.

Hahn uses Macromedia's Dreamweaver MX to build her site. While there are less expensive authoring tools, Dreamweaver is widely acknowledged to be worth the cost, and you can download a free trial version from www.macromedia.com. In any event, if you don't use Dreamweaver, you can't follow Hahn's procedures.

A little over two years ago Macromedia graciously supplied me with a review copy of Dreamweaver 4. Unfortunately, I never had time to review it, but I installed it the other day to follow along while I read Hahn's document. The older version doesn't match the newer software exactly, but I can see why so many people swear by it. Dreamweaver hides many tedious details behind a well designed user interface.

Many of us have excellent technical backgrounds and skills, but we face a nasty problem in today's tight job market. There are so many different tools and skill sets that we find ourselves rejected out of hand by recruiters for jobs that we know we could quickly learn to do. Hahn, who found herself unemployed after many years as a successful technical writer, talked herself into a web designer job that required a large set of knowledge and skills that she didn't have. Over the course of a few weeks, she demonstrated that hard work and determination can quickly overcome such gaps. She is an inspiration to all of us.

Designing websites with server-side functionality is not difficult, but many people find it mysterious. Hahn knew nothing about it when she started, but a few weeks later she was able to start teaching others how to follow in her footsteps. Rather than generalizing from the work she had done for her client, she acquired the Haiku.ws domain name and built a site from scratch to illustrate all of the techniques she had learned. It is not a polished website, and her document is not a polished technical manual, but they are both instructive.

If this is an area that you want to know more about, Haiku.ws provides a good way to do it. And you can't beat the price.

Wednesday, February 26, 2003

Leadership Roundtable, Windows Annoyances

This article appears in slightly different form in the January/February 2003 issue of IEEE Micro © 2003 IEEE.

The books I look at this time both deal with annoying behavior. One explores stupid programmer tricks to find truths about technical leadership. The other looks at unfortunate design choices in the leading operating system and sees annoyances to be ameliorated rather than obstacles to be cursed at.

Roundtable on Technical Leadership edited by Gerald M. Weinberg, Marie Benesh, and James Bullock (Dorset House, NY, 2002, 176pp, ISBN 0-932633-52-8, www.dorsethouse.com, $21.45)

This book is the second in a series of digests of threads from a moderated discussion group called SHAPE (Software as a human activity, performed effectively). Gerald Weinberg moderates the SHAPE forum. He charges subscribers an annual fee, which compensates him for the time he spends keeping the signal to noise ratio high.

In the July/Aug 2001 Micro Review I reviewed the first book in this series, Roundtable on Project Management. In reading that book, I noticed that the participants take certain background information for granted. The same is true to a lesser extent in the new book. Shared parables, aphorisms, and classifications greatly increase the depth and efficiency of the dialog, but they make it harder for outsiders to follow. For example, if you haven't learned about the Myers-Briggs personality types, you probably have no idea what INTP or ISTJ signify. If you haven't read about the Do You Mean? game, the brief summary in this book won't adequately convey its value. The book's bibliography consists of only 13 items (6 by Weinberg), so it shouldn't be too hard to come up to speed. Even so, I had to go to Google to find the meaning of cargo cult programming.

The editors set the tone of the book with a small quote (unattributed there, but sometimes attributed to Harlan Ellison):
The two most common elements in the universe are hydrogen and stupidity.
At the heart of the Y2K problem was the idea of saving storage space by keeping only the last two digits of the year in applications that store and retrieve dates. This was a clever idea when data storage space was scarce and expensive, and 2000 was many years in the future. By 1999, most people regarded it as a stupid programmer trick. Weinberg began a SHAPE discussion thread on stupid programmer tricks, asking people to identify their favorites, explain why the tricks are stupid, and provide ways to avoid them. That thread occupies most of the book.
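The trap is easy to reproduce. The following Python sketch (the function names are my own invention, for illustration only) shows how a comparison on stored two-digit years goes wrong at the century boundary, and one common repair, windowing the year around a pivot:

```python
# The Y2K trap in miniature: comparisons on two-digit years break
# when the century rolls over.

def later_than(yy_a, yy_b):
    """Naive comparison of two-digit years -- the space-saving trick."""
    return yy_a > yy_b

def later_than_windowed(yy_a, yy_b, pivot=50):
    """A common Y2K repair: expand each two-digit year to four digits
    around a pivot before comparing."""
    def expand(yy):
        return 1900 + yy if yy >= pivot else 2000 + yy
    return expand(yy_a) > expand(yy_b)

# Stored as 00 and 99, the year 2000 appears to come *before* 1999:
print(later_than(0, 99))           # -> False (wrong)
print(later_than_windowed(0, 99))  # -> True (right)
```

The windowing fix is itself a deferred version of the same trick, of course: it merely moves the failure from 2000 to the pivot year.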

I'm not on the SHAPE forum, so I am contributing one of my favorites here. Back in the 1960s, the Digital Equipment Corporation (DEC) gave away large quantities of a book about programming its PDP-8 computer on college campuses. The first programming example in the book used the Increment and Skip if Zero (ISZ) instruction to modify another instruction in the program, so that the other instruction served as both a pointer and a loop counter. Modern computer architectures have made that a worse idea than it was 35 years ago, but even then it was a stupid programmer trick.

Not wishing to be totally negative, Weinberg turned the stupid programmer tricks thread into one that includes truly clever tricks. Two other threads on technical experts and gurus -- their teaching methods, and their behavior under fire -- round out the book. Weinberg considers the result to be an ideal supplement to his well known work Becoming a Technical Leader (Dorset House, 1986).

The first thing that strikes me about the stupid programmer tricks thread is that we learn so slowly. For example, one contributor, Jim Batterson, complains about overloading identifiers, that is, making them carry information about the items that they identify. His example deals with assigning numerical identifiers to people in such a way that numerical order of the identifiers is the same as alphabetical order of the people's names. Reading about the machinations this led to, I couldn't help remembering line numbers in the BASIC language -- a problem that plagued programmers from 1965 until the early 1980s. BASIC line numbers identified the lines for editing (there was no visual editing in 1965). They also provided the targets of jumps and subroutine calls, and they determined the order of program execution. If lines had consecutive line numbers, you could not insert code between them without renumbering them to make room. This put them out of sync with the printed program listing and broke the jumps and subroutine calls.

Another thing that strikes me about this thread is that one person's stupid is another person's clever. For example, James Bullock has this to say about Charles Simonyi's Hungarian notation, a widely used and widely praised scheme for naming identifiers:
So in terms of stupid tricks . . . Hungarian notation is right up there . . . . I don't want to have the type dragging around in the symbol name. Type conversion needs to be explicit, hard, and hidden. The only reason to drag the type around is to make sure that two things have conformable types when they should. But don't you know that? Shouldn't you know? Shouldn't the compiler/linker/environment help you out with this? (Not if it's C with errors suppressed, it won't.)
If this is a stupid programmer trick, it's another old one. The FORTRAN language used the first letter of a variable name to specify its type as long ago as the 1950s.
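For readers who haven't encountered it, Hungarian notation looks roughly like this sketch (the identifiers are invented for illustration; real Hungarian codebases use a much richer prefix vocabulary, and the scheme originated in C, not Python):

```python
# A toy illustration of Hungarian notation, the naming scheme Bullock
# objects to: each identifier drags a type prefix around with it.

iCount = 3               # i  -> integer
szName = "swarm"         # sz -> (zero-terminated) string, in C terms
rgiSizes = [8, 16, 32]   # rg -> range (array), i -> of integers
fDone = False            # f  -> flag (boolean)

# The prefixes are supposed to let a reader catch type mismatches by
# eye; Bullock's point is that the compiler or environment, not the
# name, should be doing that job.
summary = f"{szName}: {iCount} swarms, sizes {rgiSizes}, done={fDone}"
print(summary)
```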

The stupid tricks theme doesn't stop at coding tactics. It addresses the widespread problem of confusing design, documentation, and documents. The design of a project determines the project's future. Knowing the key ideas and alternatives that the designers considered is essential to building, extending, and maintaining the code. Documenting the design is essential, but most design documents fail to do this. In fact, a videotaped interview with the designers is likely to be a more valuable part of the design documentation than the typical functional specification. If you don't read anything else in the book, the 12 pages that carry the discussion of stupid design document tricks will still give you your money's worth.

The thread on tricks arising from social inadequacy provides the segue from programmer tricks to leadership. The Programmer's Drinking Song (source: the Internet) will cut too close for comfort for many project teams:
100 little bugs in the code,
100 bugs in the code,
Fix one bug, compile it again, 101 little bugs in the code.
101 little bugs in the code . . .
Repeat until BUGS = 0.
The threads on leadership distinguish between experts, who can do the expert activity, and gurus, who can teach the expert activity. A guru, according to this thread, remains humble and accessible, while exhibiting patience, gentleness, playfulness, enthusiasm for learning, and empathy. As I write this, I am listening to a 1989 recording of Gurumayi Chidvilasananda, successor to Baba Muktananda, leading a chant of the Guru Gita (Song of the Guru). I hope something rubs off. Experts have answers, gurus have questions. Do you think you'd like to read this book?


Windows XP Annoyances by David A. Karp (O'Reilly, Sebastopol CA, 2003, 586pp, ISBN 0-596-00416-8, www.oreilly.com, $29.95)

This is the most recent in a series of books that began with Windows Annoyances (O'Reilly, 1997), which deals with Windows 95. Subsequent books deal with Windows 98 and Windows ME.

The title of the book might lead you to believe that Karp is asking you to listen to him whine about the inadequacies of the Windows operating system. In fact, though, Karp takes a positive attitude toward the frustrating aspects of the Windows XP design. He doesn't waste much time whining about them. Instead, he shows you ways to work around them and to fashion an environment that you like. Along the way he even points out helpful features that you may not have noticed.

Karp wisely starts by showing you the basics. I've been using Windows heavily for nearly 15 years, but Karp's Basic Explorer Coping Skills chapter taught me a few things I didn't know. And I was delighted to find that Windows XP provides a capability equivalent to the Show All feature of the late, lamented Xtree program. I remember when Xtree's incompatibilities with Windows 95 forced me to switch from Xtree to Windows Explorer. I am glad that Windows Explorer is finally catching up.

Karp teaches you how to work safely with the Windows registry, then proceeds methodically to explore all of the ways to customize Windows XP. The wonderful thing Karp brings to this is his overviews of how each group of options controls the underlying mechanism. Windows is maddening in this regard: it gives you lots of options but no way to understand what they do. Karp is co-author of Windows XP in a Nutshell (O'Reilly, 2002), so he understands all of the details.

I have paged through this book and seen dozens of changes I want to make to my system (probably not the Internet-enabled fish tank, though). The book is comprehensive and detailed. I think it will keep me busy until the next version of Windows comes out. In fact, my only real complaint is that it could use a more thorough index.

Don't be fooled by the title. This is a very useful book. If you use Windows XP, you need it.