Monday, December 20, 2004

Hiring Knowledge Workers, Radiant Cool

This article appears in slightly different form in the November/December 2004 issue of IEEE Micro © 2004 IEEE.

More on Old Themes

The two books I look at this time have little in common, except that each provides a fresh angle on a theme that I have addressed in earlier columns. The first shares valuable information about how to hire knowledge workers. The second seeks to shed light on the way our brains support consciousness.


Hiring the Best Knowledge Workers, Techies, and Nerds by Johanna Rothman (Dorset House, New York NY, 2004, 352pp, ISBN 0-932633-59-5, www.dorsethouse.com, $43.95)

In Micro Review, Jan/Feb 2004, I reviewed How Would You Move Mount Fuji? -- Microsoft's Cult of the Puzzle. In that book, William Poundstone describes Microsoft's hiring strategy: disqualify anyone they're not completely sure about. This strategy works for Microsoft, because the number of highly qualified job applicants vastly exceeds the number of available positions.

Hiring managers at other companies, however, face many different situations. Johanna Rothman shows how to analyze your own situation and devise and execute a hiring strategy that is right for you. 

Johanna Rothman obtained simultaneous bachelor's degrees in English literature and computer science. Later she obtained a master's degree in systems engineering. She says on her website,
I consult, speak, and write on the issues of managing product development -- specifically as a project management consultant, risk management consultant, and people management consultant for software or IT products. I help take the pain out of managing people and projects.
Gerald Weinberg is a highly respected consultant in the area of software project management (see Micro Review, July/August 2001). Rothman has worked with Weinberg for many years. Weinberg wrote a foreword for this book in which he says that he can think of no way to improve it. He calls it essential reading for practically everyone.

In Micro Review, Sept/Oct 2002, I reviewed a collection of essays, edited by Weinberg and others, called Amplifying Your Effectiveness. In one of her essays in that collection, The Perils of Parallel Projects, Rothman quantifies the degree to which context switching reduces effectiveness. A person working on five projects, according to Rothman's numbers, spends only 5% of the time on each project and the remaining 75% on context switching.
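The arithmetic behind that figure is easy to check. Here is a minimal sketch of the five-project case; the percentages come straight from the paragraph above, but the script itself is mine, not Rothman's.

    # A quick check of the five-project figure quoted above (the numbers
    # come from Rothman's essay; the script is mine, not hers).
    projects = 5
    share_per_project = 0.05              # 5% of the time on each project
    productive = projects * share_per_project
    overhead = 1.0 - productive           # everything else is switching
    print(f"productive: {productive:.0%}, context switching: {overhead:.0%}")
    # prints: productive: 25%, context switching: 75%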
  
In the same collection, Rothman's essay It's Just the First Slip tells you to listen to your project. The first slip is a whisper: "Your expectation is not matching my reality. Listen to me. I can tell you my reality." By the fourth slip, it's yelling "You'll pay for this!"

I've focused on Rothman's background and examples of her work, because this book so strongly reflects her way of thinking. Rothman provides an overall roadmap and many procedures and checklists. Nonetheless, the flavor of the book is generally practical and anecdotal. Her mingling of systematic and pragmatic material makes this book unlike most management books. I especially like the boxed anecdotes, each labeled "a true story," that appear throughout the book.

If a central theme underlies this book, it is that knowledge workers are not fungible. Of course, every typist or assembly line worker is also unique. Nonetheless, cookie cutter hiring techniques tend to be more successful with such workers than they are with, say, programmers or technical writers. Thus, Rothman advocates tailoring your hiring strategy and tactics to each specific situation.

One of the most appealing aspects of this book is that Rothman does not try to push you into doing things the way you "ought" to do them. In one of her true stories, for example, she describes a situation in which a "considerate, empathetic, consensus driven" group of testers hired compatible people, only to find that many of the new hires left soon after being exposed to the two main development managers, who were loud and argumentative. Her solution was to include the development managers as part of the hiring team. Rather than trying to change the behavior of the development managers -- the "right" thing to do -- she accepted them as a part of the environment. She made the ability to tolerate the development managers one of the qualifications for the job.

Another example of Rothman's flexibility has to do with multitasking. In the essay noted above, Rothman shows that multitasking is inefficient. In the hiring process, however, she suggests exploring that subject with applicants to find out how well their working styles mesh with the company's. If the company believes in multitasking, regardless of how inefficient it is, then make willingness to multitask a qualification of the job.

Despite Rothman's emphasis on tailoring each hiring project to the specific situation, she does provide an end-to-end procedure for acquiring and keeping good workers. Rothman lays out the tasks and the issues, then addresses actual situations that may arise. She covers the entire subject thoroughly.

In the July/August 2003 Micro Review, I reviewed Waltzing With Bears -- Managing Risk on Software Projects by Tom DeMarco & Timothy Lister. That book made me much more sensitive to risk management issues than I once was. As a result, I'm especially happy to see that Rothman addresses risks, contingency planning, and mitigation actions in the hiring process. For example, she includes a helpful discussion of what to do if you can't find the right person for the job.

If you are a hiring manager in a high tech field, you must read this book.  


Radiant Cool -- A Novel Theory of Consciousness by Dan Lloyd (Bradford/MIT, Cambridge MA, 2003, 386pp, ISBN 0-262-62193-2, mitpress.mit.edu, $14.95)

Over the years, I have reviewed a number of books that deal with human consciousness. The most notable of these is Daniel Dennett's Consciousness Explained (Micro Review, April 1992). On the back cover of Lloyd's book, Dennett says:
I had dreamed of writing a book like this someday, and Lloyd has done it, taking us backstage and explaining how the brain plays its tricks creating the benign illusions of consciousness.
As Lloyd hints in the subtitle to the book, he has built his argument around a novel, a mystery in the film noir style. His hardboiled detective is Miranda Sharpe, a graduate student in philosophy, who investigates the disappearance of her thesis advisor. This leads her to Dan Lloyd -- the author turned protagonist -- who helps her discover a virus designed to modify all of the world's computers. The modification replaces pointers to websites dealing with consciousness and phenomenology with pointers to exact duplicates of those sites.

The novel provides the underlying examples that help explain Lloyd's theory of consciousness. But Lloyd largely avoids the trap that most didactic novels fall into. He doesn't stop to explain everything. Instead, he divides the book into the novel, told from Miranda's point of view, and his own philosophical essay. This allows the novel to move quickly. It is, in fact, a real page turner.

One of the delightful aspects of the novel is Lloyd's frequent use of allusions and unexpected connections. This ultimately ties in with Lloyd's theory, but it also makes the book fun to read. For example, at one point a famous but shallow self-help expert engages in a consultation in which she sounds exactly like Joseph Weizenbaum's famous 1966 computer program, Eliza, the therapist. At another point, in the midst of a farcical climactic encounter, there is a knock at the door and Lloyd says "That'll be the Car Talk guys." You'll have to read the book to see why that line is so funny. And I haven't even mentioned Boris Badenov.

When it comes to Lloyd's theory, the book is no longer a page turner. Perhaps an example from popular fiction can help to explain the problem Lloyd is trying to address. In the Harry Potter books by J K Rowling, the world's preeminent wizard, Albus Dumbledore, is headmaster of Hogwarts, the wizarding school that Harry attends. Dumbledore has a storage device, called a pensieve, for his thoughts. He places his wand tip into his hair, and when he draws it out, wisps of thought adhere to it like threads. He places them into the pensieve, where he can sort them and access them.

On one occasion Dumbledore leaves Harry in his office while he goes to attend to an urgent matter. While Dumbledore is gone, Harry's curiosity leads him to investigate the pensieve. Harry is drawn into a large courtroom and sees Dumbledore sitting and watching the proceedings. Nobody seems to notice Harry's presence, so Harry concludes that he is experiencing one of Dumbledore's memories. Harry, however, does not know what is going on. He experiences the scene as a movie. Events proceed sequentially in time, but Harry imbues them with his own, rather than Dumbledore's, knowledge and feelings. Soon the scene shifts to a different proceeding in the same room. Then the new sequence of events unfolds. Harry remembers the first scene as he experiences the second.

Lloyd is a phenomenologist. He does not see mind and body as separate. He recognizes the importance of time sequences, but he points out that your memories must be as you experienced them, not as an external observer would see them. And each memory carries with it all of the past memories that helped to shape it. This superposition of memories upon memories leads to a recursive model -- perhaps an allusion to Douglas Hofstadter's Gödel, Escher, Bach: An Eternal Golden Braid (Basic Books, 1979).

Lloyd goes into plenty of detail about his theory, but I won't. One interesting point about it is that he uses data from functional magnetic resonance imaging (fMRI) studies to support his conclusions. You can visit his website (www.trincoll.edu/~dlloyd/) to see some of the images that play an important role in the novel and in his theory.

Even if you're not interested in consciousness, Radiant Cool is fun to read. If you are interested in consciousness, you'll find Lloyd's ideas stimulating.

Monday, October 25, 2004

Seek and Show

This article appears in slightly different form in the September/October 2004 issue of IEEE Micro © 2004 IEEE.

Data, information, knowledge, content -- whatever you call it, there is more of it than any of us can keep track of. The volume continues to increase, and the tools for managing it don't keep pace with that growth.

Most people, however, don't even use the existing tools effectively. In The Pragmatic Programmer (Micro Review Jan/Feb 2000), Andrew Hunt and David Thomas admonish you to become an expert on using your tools. Their main example is a programmer's text editor, but the advice applies equally well to Google, Adobe PDF, or Microsoft Excel. The books I look at this time show you how to get more out of tools that you probably already use every day.
  

Online Search Tools

One of the main ways to find individual nuggets in the vast sea of online information is to perform effective online searches. Search sites like Google and Yahoo receive millions of requests for information every day and provide useful results with astonishing speed. Most people with online access know how to use at least one of these sites, but few people know how to exploit their full power.

The books in this section help you to use search tools more effectively. The first focuses on Google and all of its features, many of which are unrelated to searching. The second focuses exclusively on searching, but discusses a broad range of search sites.


How to Do Everything with Google by Fritz Schneider, Nancy Blachman, and Eric Fredricksen (Osborne, Emeryville CA, 2004, 382pp, ISBN 0-07-223174-2, www.osborne.com, $24.99)

Nancy Blachman is president and founder of Variable Symbols, a company that specializes in consulting and training on technical software. She has written books and training materials about Mathematica (Micro Review, August 1991 and February 1992). She holds a BS in mathematics from the University of Birmingham in the UK, an MS in operations research from UC Berkeley, and an MS in computer science from Stanford, where she has taught for eight years. She is the creator of the Google Guide website (www.googleguide.com).

For this book, Blachman teams up with two Google software engineers to produce a comprehensive guide to using Google. Unlike Blachman's Google Guide website, this book does not go into the underlying technical details. It focuses almost completely on how to use the various features of Google. It does this with a large number of step-by-step procedures, examples, and screen shots.

Google is mainly a search tool, but it is much more besides. Everybody who looks at this book, including me, says something like "I had no idea Google could do that!" about some feature Blachman describes. I also found explanations of many features that I had noticed but never looked into. I'm much more likely to use those features now that I've read about them.

If you want to know everything about Google, read this book.


Google and Other Search Engines by Diane Poremsky (Peachpit, Berkeley CA, 2004, 376 pp, ISBN 0-321-24614-4, www.peachpit.com, $19.99)

This is another book in Peachpit's marvelous Visual Quickstart Guide series. Previous editions were written by Alfred and Emily Glossbrenner, but Diane Poremsky (www.poremsky.com), an expert on Microsoft software, has taken over. Peachpit provides no information about the reason for the change.

The cover of the book puts the word Google into much larger type than the rest of the title. This probably reflects the current huge interest in Google and its public offering of stock. It also reflects the fact that Google has indexed far more pages than any other search tool. Nonetheless, the book is not principally about Google. Its message is to use the right search tool for the job at hand, and that may not be Google. Poremsky provides detailed information about Alta Vista, MSN, Yahoo, Ask Jeeves, Excite, Lycos, and even AOL. Moreover, she explains the differences between these search tools, so you can choose the right one for the search at hand.

If you want to know how to perform online searches effectively, read this book.


PDF and Acrobat

Another widespread approach to handling information is the Adobe portable document format (PDF) and the Adobe Acrobat tools for managing documents in PDF format. All publishing applications and all Microsoft Office programs can generate PDF documents. Almost everyone who goes online encounters PDF documents and can view them within a browser.

Beyond what everyone knows, however, are enormously powerful and varied publishing, viewing, and searching capabilities. Most people who use PDF every day know very little about these capabilities. The books in this section seek to correct that situation. The first focuses on Adobe Acrobat. The second focuses on PDF.


Carl Young's Adobe Acrobat 6 -- Getting Professional Results from Your PDFs by Carl Young  (Osborne, Emeryville CA, 2004, 412pp, ISBN 0-07-223138-6, www.osborne.com, $34.99)

Carl Young is an Adobe certified expert in Acrobat and FrameMaker, as well as a certified technical trainer. Adobe selected him to run the first public Acrobat 6 training sessions at their worldwide launch of Acrobat 6. As eminent as Carl Young is, however, I heard about this book from someone even more eminent.

Shlomo Perets has specialized in online documentation applications since 1993. He trains technical communicators to get the most out of FrameMaker and Acrobat and the combination of these tools. Perets started his company, MicroType (www.microtype.com) in 1989 to train and consult about electronic publishing tools and techniques. Based in Israel, Perets often comes to the United States on business. In June 2004, I saw him speak at the Berkeley chapter of the Society for Technical Communication (STC), where he presented a good deal of useful information about using Adobe Acrobat. Perets served as technical editor for Young's book, so perhaps his recommendation is not entirely unbiased. Nonetheless, Perets recommends Young's book as the best book to read about Acrobat 6.

Acrobat 6 has many powerful features, but its maddening user interface and its arcane options and settings make it opaque to most users. Furthermore, it is easy to produce a PDF that has mysterious flaws. For example, you may make a beautiful PDF document, but when someone else views it, it looks bizarre, because all the fonts have changed. Or the PDF may look right on screen, but the graphics look terrible when you print the document.

These problems and many more arise because users do not understand the consequences of their choices. Young explains how to make choices that lead to PDFs that behave the way you expect them to.

Once you understand the basics, you can try adding movies or sound. You can make your documents accessible to people with disabilities. You can investigate automation, reviewing tools, forms, digital signatures, differing access permissions, and many other advanced features.

To understand everything that Acrobat can do to make publishing, reading, and reviewing PDF documents go smoothly, read this book.


PDF Hacks -- 100 Industrial Strength Tips & Tools by Sid Steward (O'Reilly, Sebastopol CA, 2004, 296pp, ISBN 0-596-00655-1, www.oreilly.com, $24.95)

The O'Reilly Hacks series focuses on enabling clever programmers to customize software applications in ways the inventors of those applications never envisioned, or at least never publicized. O'Reilly has tried to reclaim the positive connotations that the term hacker had before it came to mean computer criminal. Hacks are clever, quick-and-dirty solutions to small problems. 

According to the book's publicity, Sid Steward has analyzed, extended, secured, cracked, authored, converted, embellished, and consumed PDF over the last five years. He has created and maintained custom software and pushed the envelope of Acrobat API programming. He has developed a toolset that is the core of a PDF conversion service bureau. He also performs PDF finishing, which includes optimizing PDF file size and adding navigation features.

This book is not about Adobe Acrobat, though Steward does provide some tools and tricks that apply to Acrobat. PDF adheres to a published specification, so anyone can process it. The Macintosh operating system has tools to process PDF. So does Ghostscript (www.cs.wisc.edu/~ghost).
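To give a feel for what "anyone can process it" means in practice, here is a minimal sketch that inspects a PDF without Acrobat. It uses the open source pypdf library and a made-up file name; both are my choices for illustration and not necessarily tools the book discusses.

    # Minimal sketch: reading a PDF programmatically with the open source
    # pypdf library (pip install pypdf). The file name is hypothetical.
    from pypdf import PdfReader

    reader = PdfReader("example.pdf")
    print("pages:", len(reader.pages))
    if reader.metadata:
        print("title:", reader.metadata.title)

    # Pull the text of the first page -- crude, but enough to show that an
    # openly specified format can be processed by any program, not just Acrobat.
    print(reader.pages[0].extract_text()[:200])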

I use Acrobat all the time, so my favorite hacks apply to Acrobat. For example, Steward shows how to shorten the time Acrobat takes to start, by disabling plugins that you are not likely to use. He also shows how to let Acrobat 6 use the important Acrobat 5 TAPS plugin, which Adobe does not provide with Acrobat 6. TAPS allows you to copy tables and other formatted text from PDF documents and paste them into other documents. This bit of information alone is worth the price of the book.

If you use PDF and like to tinker with your tools, this book is for you.

Wednesday, August 25, 2004

Attacking Complexity

This article appears in slightly different form in the July/August 2004 issue of IEEE Micro © 2004 IEEE.

For as long as I have been in the computer industry, increasing complexity has been a constant problem. Every advance in computer speed and storage capacity has been met by increased requirements and expectations for the software that runs on those computers. Moore's Law continues to bail us out, but it does not encourage efficiency. 

Efficiency is not a problem if we simply trade inexpensive hardware for expensive development time. Bloated and inefficient code and cumbersome development methodologies cause few problems for products that provide basic functionality with minimal demands on computing resources. Inefficiency becomes a problem at the cutting edge, when we try to squeeze out that last bit of added functionality to gain commercial advantage over our competitors.

This time I look at three books. One book shows how to apply the techniques of agile software development to projects of large size and scope. Another explains the ins and outs of an operating system that might be a little simpler than the one you're currently using. The third provides principles and techniques for simplifying Java application programs.


Agile Software Development in the Large -- Diving into the Deep by Jutta Eckstein (Dorset House, New York NY, 2004, 246pp, ISBN 0-932633-57-9, www.dorsethouse.com, $39.95)

Since reviewing Extreme Programming Explained (Micro Review, Nov-Dec 1999), I have often written about agile software development techniques. Extreme Programming is the most famous of these techniques. Some others are Scrum, Crystal Methodologies, Feature Driven Development, and Adaptive Software Development. The proponents of these and other techniques have banded together into the Agile Alliance (www.agilealliance.org). The alliance has agreed upon a set of choices and a set of principles, known collectively as the Agile Manifesto.

Their basic choices are to value:
  • Individuals and interactions over processes and tools.
  • Working software over comprehensive documentation. 
  • Customer collaboration over contract negotiation. 
  • Responding to change over following a plan.
You can find the agile principles that flow from these choices expressed in twelve short paragraphs on the Agile Alliance website.

Many firms have tried to adopt or adapt agile techniques. Many of these attempts simply entail calling their existing processes agile or adopting a few superficial aspects of Extreme Programming. Other firms have seized on the second value choice as an excuse to eliminate documentation or to stop commenting code.

Such superficial or misguided attempts to use agile techniques have little practical benefit. Other attempts, however, have been successful, but almost always in projects of relatively small scale. The techniques described in most books on agile development do not apply easily to large projects. For example, using Extreme Programming on a large project might entail placing 100 people in a single room. This is likely to be impractical for most projects.

Jutta Eckstein is a member of the board of Agile Alliance and a software development consultant. In this book she shows  that there are practical ways to adapt agile development techniques to projects of large scale. As you might expect, these adaptations rely heavily on good project communication and on finding ways to assign tasks to small subteams.

Eckstein proceeds methodically through the aspects of agile processes that work differently in large projects. She brings the benefit of her real world experience to these questions. If you work on large projects, you'll find it well worth your time to read this short book.


How Linux Works -- What Every Superuser Should Know by Brian Ward (No Starch, San Francisco, CA, 2004, 366pp, ISBN 1-59327-035-6, www.nostarch.com, $37.95)

Unix in its many forms has been around for about thirty-five years. When it was half that age, I reviewed Maurice Bach's The Design of the Unix Operating System (Micro Review, Sept/Oct 1987). In the same column I reviewed Douglas Comer's book Operating System Design -- The XINU Approach. XINU is a recursive acronym that stands for "Xinu is not Unix." It is also Unix spelled backwards.

Xinu is one of many attempts to simplify and reinvent Unix, wholly or in part. Linux is the most successful of these attempts. It began in late 1991 as the work of Linus Torvalds and has grown into a poster child for open source development. Linux is the base operating system for many production web servers.

Brian Ward has been working with Linux since 1993. In this book he tries to give you an understanding of the inner workings of Linux. Rather than providing procedures for common Linux tasks, Ward provides conceptual information. He hopes that after you read his book, you can read and understand the documentation of any system program.

One interesting part of the book is Ward's explanation of the boot process. If you've ever watched a Unix system start up, you know that screen after screen of information goes scrolling by. In about ten pages, Ward explains what all of that information is meant to tell you.

Because Linux is frequently the base operating system for web servers, Linux administrators must understand networking and firewalls. Ward explains how these features work.

Operating systems must often support software developed by third parties. For example, in Windows, the .NET framework provides a way to encapsulate applications and avoid the DLL hell of earlier Windows systems. Shared libraries provide support for third party software in Linux. Ward shows how to manage -- and avoid the pitfalls of -- shared libraries.
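As a tiny illustration of what a shared library provides, the sketch below (mine, not Ward's) loads the system C library at run time from Python and calls one of its functions; the .so files involved are exactly the shared libraries Ward discusses.

    # Loading a shared library at run time and calling into it.
    # This is my own sketch, not an example from Ward's book.
    import ctypes
    import ctypes.util

    name = ctypes.util.find_library("c") or "libc.so.6"   # e.g. "libc.so.6" on Linux
    libc = ctypes.CDLL(name)
    print(libc.abs(-42))   # prints 42, using code that lives in the shared library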

One of the most powerful features of any Unix system is the programmable command shell. Many books explain how to use different shells. Ward gives a clear account of the basics of using the Bourne shell.

Ward addresses all of the main tasks that Linux administrators must face, but this book is not for current Linux administrators. They already know all of this and more. But if you have little experience with Unix and you want to set up a Linux system -- possibly on an old computer that can no longer run the latest Windows version -- this book is essential reading. You'll probably find the chapter on hardware especially useful.


Better, Faster, Lighter Java by Bruce A Tate & Justin Gehtland (O'Reilly, Sebastopol, CA, 2004, 262pp, ISBN 0-596-00676-4, www.oreilly.com, $34.95)

Tate and Gehtland are well known authors of books on software development. In this book they attack the current state of Java development, which they characterize as follows:
Development is getting so cumbersome and complex that it's threatening to collapse under its own weight. Typical applications use too many design patterns, too much XML, and too many Enterprise JavaBeans.
This is not, as you might have expected from the title, a book about improving your personal coding habits. Its main approach to producing better, faster, lighter Java applications may require you to change things that you do not control. Specifically, the authors ask you to consider abandoning heavyweight frameworks like WebLogic, JBoss, and WebSphere in favor of lightweight open source architectures like Hibernate and Spring. If such decisions are outside your control, you probably don't need to read this book. But even if you can't follow the book's advice, you may still find it enlightening.

The authors begin by painting a bleak picture of creeping bloat, though their principal target seems to be container managed Enterprise JavaBeans. They then enunciate basic principles that few are likely to disagree with:
  • Keep It Simple.
  • Do One Thing, and Do It Well.
  • Strive for Transparency.
  • Allow for Extension.
  • You Are What You Eat.
These are understandable enough, except for the last. In that one, the authors appear to mean that you shouldn't swallow every bit of conventional wisdom and vendor hype that comes your way. Instead they want you to be a heretic -- challenge a few things that "everybody knows."

From this point, the plan of the book is clear: discuss the selection of underlying technology in the light of each of the basic principles. Then introduce Hibernate and Spring. Finally, develop actual applications.

The authors have a lot to say, and they are fighting a good fight. It won't take you a long time to read this book, and you might learn something important.

Friday, June 25, 2004

Dvorak Predicted

This article appears in slightly different form in the May/June 2004 issue of IEEE Micro © 2004 IEEE.

This time I look at a ten year old book of predictions about the computer industry. Most of the predictions are wrong to some degree, yet the book provides a valuable look at a pivotal time in our industry.

Dvorak Predicts -- An Insider's Look at the Computer Industry by John C. Dvorak (Osborne McGraw Hill, Berkeley CA, 1994, 184pp, ISBN 0-07-881981-4, $16.95)

I met John Dvorak in the early 1980s. I don't think we've talked since then, but I've caught many of his radio and TV programs. I think we met at a reception put on by Media Alliance. In a room full of writers enjoying the late afternoon view of the San Francisco Bay and chatting, over drinks, about all manner of subjects, we were the only two who wrote about computers. It's amazing to compare that image with the jostling throngs of computer media representatives queued up to enter the hall for a Bill Gates or Steve Jobs keynote today.

In 1994, when this book appeared, Dvorak had been in the computer business as an entrepreneur, and then as a writer, for nearly twenty years. He had immersed himself in the computer industry. His columns dealt with nuts and bolts issues about computers, peripherals, storage media, firmware, operating systems, development tools, word processors, spreadsheets, and anything else you might think of. 

Dvorak's book is out of print now, but I found copies on the Internet for 80 cents and up. After all, who wants a ten year old book of predictions about the computer industry? The title is deceptive, though. Dvorak makes many predictions, but he also brings in a good deal of history and analysis to support them. Reading this book provides a fascinating look back at an important time in the history of our industry.

In 1994 practically nobody had heard of browsers and the web. The closest approximation to a website was a bulletin board system (BBS). The Java language had not yet sprung into prominence, and the closest approximation to "write once, run anywhere" was software or hardware emulation of the Intel 486 microprocessor. Online pornography was in its infancy, and junk email was a minor nuisance. Broadband to the home was a thing of the future, though a few people were using ISDN connections. Hardly anybody remained connected all the time. Most people dialed up their service providers when they wanted their mail. Even though Robert Morris's worm in 1988 had spawned the virus epidemic, malicious attachments were few and easily avoided. Rapidly spreading viruses and continually updated virus protection software were rare.

Dvorak does not mention Java, browsers, the World Wide Web, or spam. It's unfair to expect complete prescience, though. His analysis and some of his predictions capture the spirit, if not the details, of many developments that he did not completely foresee. He predicts a few things that have definitely not come to pass, but by and large, his predictions are good. In fact, the worst thing about the book is the many correct predictions he makes about things that do not matter at all ten years later. I think this points up the central problem with reading predictions -- it's rarely clear how to act to take advantage of them.

Though Dvorak didn't see the web coming, he saw something similar in his analysis of other people's predictions of 500 interactive TV channels. He said
It won't be 500 channels. It will be 50,000 or more channels all individually pumped out of homes and businesses in much the same way as computer bulletin boards work today.

. . . We can assume that people might just put a camera in their dining room, allowing us to watch a family eat and argue.
He imagined that this might be implemented using BBS technology, with people dialing up over ISDN. From this he extrapolated to an amazingly prescient prediction.

Dvorak predicted that "Little brother will be watching, and little brother will be everywhere." From videotape of the Rodney King beating in 1991, to Amy Goodman's trickle-up journalism, to recent photographs of the coffins of war dead, we see how Big Brother cannot control the content of the news.

In 1994, Dvorak grudgingly conceded that we would have to take windowing seriously and that command line interfaces would die out. He felt that Unix, despite a nifty graphical user interface (Motif), would nonetheless remain a niche operating system. He did not foresee Linux and the whole open source movement, though he did consider the possibility that the Public Windows Interface, as proposed by Sun Microsystems, might help keep Microsoft from building obstacles to competing products into its operating systems ("DOS isn't done until Lotus won't run").

Dvorak felt that Microsoft would try to achieve a monopoly in the industry. In 1994 it was not obvious that this could happen. IBM was still in the picture with OS/2. Apple was continuing to grow and was looking forward to a future based on the PowerPC architecture. WordPerfect was a popular competitor to Microsoft Word. Windows for Workgroups and its successors had not yet shown that they could push Novell from its network dominance. In fact, Dvorak believed that if Novell gave away DR DOS 6.0 to the entire industry, it could seriously undercut Microsoft. Novell did not do this, of course, so there is no way to know how effective that action would have been.

Dvorak also predicted that Microsoft would open a chain of software stores, so it could control both the shelf space and the sales pitch. This did not happen, of course, but that may be largely because Microsoft achieved those goals through other means. In the 1960s, IBM was amazingly successful at selling to data processing (DP) managers, which gave IBM market dominance at that time. IBM was a safe buy, so why risk anything else? Today's equivalent of the DP manager is the information technology (IT) manager, and Microsoft can reach them with a similar argument about networked Windows machines. On this basis Dvorak concluded that Macintosh sales to business would fall off drastically.

Dvorak's predictions of Microsoft's future were not all rosy. He said "Microsoft's domination will come to an abrupt end." He didn't say when, so he could still be right, but ten years later, there's no sign that this will happen. Dvorak's reasoning is interesting, though. He saw Microsoft as a company of youngsters willing to work 70 hour weeks and take gratuitous abuse from the management, because the rising stock price kept turning them into millionaires. He reasoned that the stock price could not keep going up without a few dips and that nobody would put up with Microsoft working conditions without that incentive. On top of that he says
Lotus has reemerged as a leader in the spreadsheet arena and pretty much owns the groupware category. WordPerfect is spending all its extra money marketing its word processor. . . . The best development tools are now produced by Borland and others. . . . NT does not look like it will be much of a success either as a network server operating system or as a standalone operating system.
Dvorak also believed that hiring "old pros" would hurt Microsoft by eroding the general naivete and diverting energy into corporate empire building games. 

Dvorak devotes a large portion of his book to the microprocessor wars. He had great hopes for the PowerPC architecture. I was the guest editor for the Micro issue on PowerPC (Sept/Oct 1994), so I remember quite well the enthusiasm that everyone felt at the time. That architecture has been a success, and Apple has successfully moved to it. Dvorak believed that IBM had a secret strategy to include a clone of the Intel 486 chip as part of a PowerPC chip. I don't recall whether IBM actually did this. In any event, PowerPC has done little to affect Intel's dominance.

It is a cliché of futurism that people tend to overestimate the short term impact and underestimate the long term impact of new technologies. In 1994 Dvorak said that voice recognition would be the killer app of the 1990s and would take off explosively in the next two years. This prediction is probably correct for the long run, but it hasn't happened as fast as Dvorak predicted. Similarly, Dvorak predicted that the Unicode character encoding scheme would lead to the death of ASCII by 1995. ASCII is alive and well today, though Unicode has become increasingly important.

Dvorak predicted that virtual reality would be a dud and that the TV and the computer would not merge. Dvorak saw that the marketing hype of the time overestimated the short term impact of these technologies. I suspect that Dvorak underestimated their long term impact.

I could go on and on. This short book contains a great deal of material. If you are at all interested in how we got where we are today, you should find a copy of this book and read it.

Wednesday, February 25, 2004

Single Sourcing, Mount Fuji

This article appears in slightly different form in the January/February 2004 issue of IEEE Micro © 2004 IEEE.

This time I look at two books that have little in common. One provides a nuts and bolts account of how to approach a difficult real world problem. The other looks at the kinds of fanciful, artificial problems that interviewers at some high tech companies -- and their imitators -- pose to job applicants.


Single Sourcing -- Building Modular Documentation by Kurt Ament (William Andrew, Norwich NY, 2003, 246pp, www.williamandrew.com, ISBN 0-8155-1491-3, $33.95)

Single sourcing is a way to produce documentation so that you only have to write the meat of it once. You create a collection of modules, each of which says one thing well. Then you use those modules to publish the same information in many forms -- print and online, reference guides and training materials, English and Japanese, and so forth. At least, that's the theory. The devil is in the details. 

Kurt Ament is an information architect. He has used XML and its predecessor, SGML, to produce single sourced documentation for Hewlett-Packard, Hughes Aircraft, Xerox, and other large companies. He has distilled the experience he gained from these jobs into a clear and concise guide to approaching large documentation projects. Others have written thick volumes on aspects of this problem. Ament does a better job in a lot less space by focusing on the key issues:
  • structured information
  • teamwork
  • modular writing
He addresses these issues independently of the underlying technology, but his examples refer to popular tools.

There are many systematic techniques for producing structured information. Ament settles on a fairly simple one. If you are converting old documents into structured form, you can start by analyzing the information in the old documents. Otherwise you can start from scratch. In either case, Ament's taxonomy has two levels: primary modules and secondary modules. The primary modules are topics, processes, procedures, definition lists, and similar entities. The secondary modules are examples, figures, tables, notes, itemized lists, and entities of that sort. Ament's method calls for identifying, labeling, organizing, and producing these modules.

One step in producing structured information is to integrate the secondary modules into primary modules. Ament provides guidelines for this. Some types of secondary modules are inappropriate for some types of primary modules. Ament is particularly hard on tables. He wants to keep them out of most kinds of primary modules. In general, he finds them hard to work with when producing output in more than one format.

Another step is arranging modules into hierarchical documents. Ament recognizes that using a hierarchical structure helps writers ensure that their documents are comprehensive and coherent. Unfortunately, readers, especially online readers, have difficulty perceiving and navigating within a hierarchy. Ament recommends flattening the natural hierarchies of documents to make them more usable. This is much easier said than done, and Ament does not give much practical advice in this area.

Another important step is to design the linkages that underlie the navigational aids in the target output. These linkages take the form of tables of contents, indexes, and inter-module references. Doing this carefully is important for all documents, but especially those intended to be viewed online. Ament gives guidelines for these tasks, but he cannot teach you how to perform them. I understand that he has written a separate book on indexing, but I have not seen it.

Ament's process is ongoing. After you build and test the documents that embody your structured information, you look at the lessons learned and adjust your process. This is an important step in all projects, but it is especially important in single sourcing large document sets. Here success depends on having a process within which individual writers can work together to speak with a single voice.

The success of single sourcing projects depends heavily on teamwork. Individual writers must adapt their personal writing styles to the team voice. Ament recommends that team members develop writing guidelines based on what works in actual projects and adopt these guidelines by unanimous consent. This limits the style guide to rules that everybody agrees to. People have difficulty following rules they don't believe in. If team members don't follow team rules, the team falls apart.

Ament suggests a division of labor within teams. He wants to centralize information architecture and distribute information development. These functions tend to have competing objectives, and the people who perform these functions have disparate viewpoints. As a result, Ament stresses the need for these groups to overcommunicate.

Ament devotes about half of this short book to what amounts to a sample style guide for a single sourcing team. The fact that he can write a style guide in 130 pages shows that he follows his own rules: he sticks to what works in actual projects, and he proposes rules that everybody is likely to agree with. This section is the heart of the book, because it addresses the difficult problems of writing modularly. Modular writing does not come naturally to most writers, but without it you cannot produce successful single sourcing projects.

Ament makes an interesting point: what works in print may not work online, but whatever works online usually works even better in print. Thus, his guidelines for modular writing come mostly from the needs and constraints of online documentation. 

For a documentation department planning a single sourcing project, this book is pure gold. Ament writes clearly and concisely. He brings real world experience to the most difficult issues and offers practical advice for addressing them. If your success depends on producing usable product documentation in a variety of forms, you must read this book.


How Would You Move Mount Fuji? -- Microsoft's Cult of the Puzzle by William Poundstone (Little Brown, Boston MA, 2003, 290pp, ISBN 0-316-91916-0, $22.95)

William Poundstone is a well known science writer. He has produced a thought provoking and highly entertaining book. The book is thought provoking because it examines a widespread practice, the posing of tricky and artificial logic puzzles to job applicants. It is entertaining because it includes many representative puzzles and because it pokes fun at Microsoft.

I have no personal bias either against or in favor of the Microsoft Corporation. I use their software every day. I enjoy its benefits, but often curse its faults. I have never interviewed for a job with Microsoft, so I have no idea how true or fair anything in this book is. Nonetheless, I thoroughly enjoyed reading it.

Poundstone describes Microsoft job interviews as a gauntlet, akin to fraternity hazing. He includes accounts of several actual interviews -- each quite funny and each resulting in "no hire." Microsoft has many more job applicants than open positions. Their principal objective in interviewing is to disqualify anyone they're not completely sure about.

Part of the hazing is to confront applicants with questions like "Count in base negative 2," "If you could remove any of the fifty U.S. states, which would it be?" or "How would you move Mount Fuji?" Other questions, however, are logic puzzles like finding the underweight billiard ball with a minimum number of weighings or determining how to get the missionaries and the cannibals across the river. The justification for asking questions of this variety usually begins with "It stands to reason that . . ." and asserts a correlation between the ability to solve such puzzles and future success writing computer programs. The degree of correlation is known in the testing business as the validity of the test. Even under controlled conditions, standardized IQ tests have questionable validity as predictors of any sort of job success. Logic puzzles, in the chaotic environment of a job interview, have no established validity at all.
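As an aside, the "Count in base negative 2" question has a tidy algorithmic answer. The sketch below is my own solution, not Poundstone's write-up: repeatedly divide by -2 and force each remainder into 0 or 1.

    # One answer to "Count in base negative 2" (my solution, not Poundstone's):
    # repeatedly divide by -2, adjusting so that every digit is 0 or 1.
    def to_negabinary(n):
        if n == 0:
            return "0"
        digits = []
        while n != 0:
            n, r = divmod(n, -2)
            if r < 0:              # keep the digit in {0, 1}
                r += 2
                n += 1
            digits.append(str(r))
        return "".join(reversed(digits))

    # Counting from 1 to 5 in base negative 2: 1, 110, 111, 100, 101
    print([to_negabinary(i) for i in range(1, 6)])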

Even though questions of this sort have little value in job interviews, they have become increasingly popular in a wide range of industries. Companies try to emulate Microsoft -- either out of a wish to achieve similar success or because more traditional hiring approaches have become completely ineffective. Legal considerations have made obtaining useful information -- especially negative information -- from personal references difficult. Candidates come in with well rehearsed answers to traditional questions like "What's your greatest fault?" or "Where do you see yourself five years from now?" A question about the probability of dying in a game of Russian roulette adds a little spice to the process -- at least until all potential applicants have rehearsed answers to that question too.

While I have little interest in job interviews, I found the book fascinating. I enjoyed the opportunity to think of plausible answers to the impossible/stupid questions and to work out the logic puzzles. My ten year old daughter got a big kick out of the questions too. We spent a whole afternoon discussing the problems and reading some of Poundstone's answers. Then she went off to tell her mother how to move Mount Fuji.

Poundstone provides answers to the questions he mentions in the book. He also puts forth a few general rules for addressing this type of question. He frames the discussion in terms of how you should interact with the interviewer. 

Given the widespread nature of this interviewing technique, if you are looking for work, you should definitely read this book.