Sunday, December 20, 1998

New Ideas, New Tools

This article appears in slightly different form in the November/December 1998 issue of IEEE Micro © 1998 IEEE.

Three interesting new books, a powerful software package, and two books about the software package form the basis of this column.

One book deals with a promising new technique for object-oriented programming. Another applies basic economic principles to information-based commerce and arrives at practical strategies. A third explores the way new visual media stand on the shoulders of their predecessors.

As a counterpoint to all this analysis, I look at the new version of the most powerful and widely used image-processing software package. I also review its official tutorial and an unofficial one.


Books

Putting Metaclasses to Work by Ira R. Forman and Scott H. Danforth (Addison-Wesley, Reading MA, 1999, 300pp, ISBN 0-201-43305-2, www.awl.com/cseng/, $39.95)

Ira Forman and Scott Danforth are expert practitioners of object-oriented programming. Both have been involved in IBM's efforts in this area. The book describes techniques the authors developed as part of IBM's SOMobjects Toolkit 3.0.

This book is not for everyone. In fact, it's not even for every C++ or Java programmer, even though the techniques it describes can be implemented in, and may someday be incorporated into, those languages.

Proponents of object-oriented technology say that these methods increase the possibility of software reuse. Skeptics point out that a very small proportion of object-oriented software actually is reused. The authors' work arises out of a desire to increase that proportion.

Metaphorically, programming has evolved from a focus on verbs (operations) to a focus on nouns (data). In object-oriented programming, a class defines the data structure and supported operations of all objects that implement that class. In the authors' system, classes themselves are objects - instances of a higher-order class called a metaclass. Metaclasses shift the focus to adjectives, and the authors' contribution is a systematic foundation for metaclasses that ensures that adjectives behave in an orderly and intuitive way.

The authors call the key aspect of their system the inheritance of metaclass constraints. It means that, for example, if one metaclass implements the property "secure" and another implements "thread safe," any class that derives from both also derives from an automatically generated metaclass that implements "secure and thread safe."

The book explains the systematic foundation that underlies this idea and shows how it works in practice. To read the book you must be generally familiar with the theory of object-oriented programming. It requires about the same level of mathematical sophistication as an upper division class in set theory.

To illustrate why their system is useful, the authors begin with a scenario that many programmers may find familiar. A programmer, George, owns a class called Dog. It works only in single-threaded applications. George must make it work with multithreaded applications, which means he must make the following mechanical changes to his source code:
  • Acquire a semaphore at the beginning of each method.
  • Release the semaphore at each return point from the method.
  • Override each inherited method with a method that uses a semaphore to protect a call to the inherited method.
This is a long, boring, error-prone operation in most systems. In the authors' system, George writes
class ThreadSafeDog is subclass of Dog and is ThreadSafe
and the job is done.
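
To give a concrete feel for the mechanism -- this is my own minimal sketch, not the authors' SOM-based system, and Python metaclasses lack the automatic composition of metaclass constraints that is the book's real contribution -- here is how a hypothetical ThreadSafeMeta metaclass in Python could perform George's three mechanical steps automatically:

import threading
import functools

class ThreadSafeMeta(type):
    # Wrap every public method, including inherited ones, so that a
    # per-instance lock is held for the duration of each call.
    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        for attr_name in dir(cls):
            if attr_name.startswith('_'):
                continue
            attr = getattr(cls, attr_name)
            if callable(attr):
                setattr(cls, attr_name, mcls._locked(attr))
        return cls

    def __call__(cls, *args, **kwargs):
        # Give each new instance its own lock before __init__ runs.
        obj = cls.__new__(cls)
        obj._lock = threading.RLock()
        obj.__init__(*args, **kwargs)
        return obj

    @staticmethod
    def _locked(method):
        @functools.wraps(method)
        def wrapper(self, *args, **kwargs):
            with self._lock:                          # acquire on entry,
                return method(self, *args, **kwargs)  # release at every return point
        return wrapper

class Dog:
    def fetch(self):
        return "stick"

class ThreadSafeDog(Dog, metaclass=ThreadSafeMeta):
    pass

The single ThreadSafeDog declaration is all George writes; the metaclass overrides each inherited method with a lock-protected version.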

If you struggle with this sort of issue and you're willing to put a few weeks into learning about and implementing this method in your favorite programming language, rush out and buy this book. 


Information Rules by Carl Shapiro and Hal R. Varian (Harvard Business School Press, Boston MA, 1998, 384pp, ISBN 0-87584-863-X, www.inforules.com, $29.95)

As a provider, user, and observer of computer-related technology for nearly forty years, I have often watched us reinvent the wheel. Recently, entrepreneurs on the frontiers of cyberspace have been doing the same thing. Shapiro and Varian, professors at Berkeley's Haas School of Business, have noticed this and have cautioned those who will listen that "while technology changes, economic laws do not."

This book is a practical guide. It translates established economic principles into practical terms that business leaders can apply to their own information-based enterprises. While the book contains many useful strategies about subjects like differential pricing and personalized content, I'm more interested in what the authors have to say about intellectual property and standardization.

The underlying principle the authors want you to understand about intellectual property is that managing rights is more important than protecting rights. Proper management of intellectual property rights seeks to maximize their value.

Shapiro and Varian contrast Disney's policy of threatening to sue day care centers that put pictures of Disney characters on their walls with the initial marketing strategy for Barney the Dinosaur, which called for giving free copies of Barney videos to day care centers. By exposing children to Barney in day care centers, the purveyors of the purple pest created a demand for Barney videos in the children's homes.

This is similar to the practice of handing out free samples of cigarettes on school playgrounds, but the economic model is a little different. Unlike cigarettes, intellectual property is costly to produce but increasingly inexpensive to reproduce. Exploiting this fact doesn't call for a new paradigm. All it takes is new coefficients in an old formula.

What Shapiro and Varian have to say about standards arises from the concepts of networks and positive feedback. Macintosh users form a network. People whose evening entertainment or business success depends on the VHS video format form a network. When networks compete, the larger the network, the greater the support it receives. Software developers see greater return from Windows applications than from Macintosh applications. This is positive feedback, which makes the strong stronger and the weak weaker.

Networks can cooperate by standardizing and interconnecting. When efforts to achieve these goals fail, standards wars break out. The authors offer practical advice on how to wage standards wars and explore in detail the two basic strategies: evolution and revolution. An evolutionary strategy, like the introduction of color television, offers a migration path from existing networks to the new one. A revolutionary strategy, like the introduction of the Macintosh, attracts users to a new network by offering performance that is unavailable within the old network.

The authors offer practical advice, not magic formulas. They direct your attention to existing principles and invite you to apply them to your own situation. Their main message is "don't reinvent the wheel." If you have entrepreneurial interests, be sure to read this book.


Remediation: Understanding New Media by Jay David Bolter and Richard Grusin (MIT, Cambridge MA, 1998, 295pp, ISBN 0-262-02452-7, 800-356-0343, mitpress.mit.edu, $32.50)

Bolter and Grusin are professors at Georgia Tech. In a way, their book, like the one by Shapiro and Varian, is about not reinventing the wheel.

Media mediate. A medieval illuminated manuscript calls attention to the underlying process of mediation by combining art and text on the page and using art to represent some of the text, as in initial block capitals. Many websites today also call attention to the underlying mediation process. The authors call this hypermediation.

The opposite of calling attention to mediation is trying to hide the medium. The authors call this transparent immediacy. Early examples of this technique exist as well. A modern example is virtual reality.

McLuhan said that a new medium turns the old medium into an art form. This hints at what Bolter and Grusin call remediation, the process by which new visual media pay homage to, rival, and refashion older media such as perspective painting, photography, motion pictures, and television - just as these older media remediated their own predecessors. Remediation balances hypermediation against transparent immediacy.

The authors move from theory to examples like computer games, virtual reality, and the web. Because media so deeply affect our personal and cultural identities, the authors apply their theories of remediation to our own perceptions and thought. The two opposing forces of hypermediation and transparent immediacy give rise to the networked self and the virtual self.

Like McLuhan's Understanding Media, this is a theoretical book with interesting examples drawn from modern life. And like McLuhan's book, it tries to assess the effects of the processes it describes on our consciousness and our society. I don't know if this book is as important as McLuhan's, but it certainly is interesting.


Photoshop

Adobe Photoshop 5 for Windows (Adobe Systems, San Jose CA, www.adobe.com, $995.00 ($199 upgrade))

Photoshop is the most powerful and most widely used software package for manipulating bitmapped images, that is, images made up of essentially unrelated pixels. Scanned photographs, for example, are often the raw material on which Photoshop performs its magic.

Until now, Photoshop has presented intimidating barriers to the casual user. It is not known for its user friendliness. The current version, however, has taken big steps in the right direction. Its history palette, layer handling, "magnetic" selection tools, and macro facility (actions) make it easier for fumble-fingered, not very visual users like me to use Photoshop confidently.

I reviewed the Windows version, but Adobe claims that the product behaves virtually identically in its Macintosh and Unix versions. They didn't send me copies of either of those, however, so I can't say one way or the other.

Even though Photoshop 5 tries to be user friendly, and even though it includes an excellent manual, online help, a guided tour, and a tutorial, I think most casual users will appreciate the guidance of a good book. I review two such books in the remainder of this column.


Adobe Photoshop 5 Classroom in a Book (Macmillan (Adobe Press), Indianapolis IN, 1998, 441pp, ISBN 1-56830-466-8, www.mcp.com, $45.00)

I have reviewed other books in Adobe's Classroom in a Book series (see Micro Review, Dec 97, for example), and I have always found them to be of high quality. Like the others, this one provides a good overview of the product's main features.

The key elements of becoming a competent Photoshop user are familiarity with the work area, experience with the tools, and understanding of input and output file formats. The book methodically covers these areas in a series of thirteen lessons, each of which should take about an hour. As a bonus, the book includes a brief tutorial for Adobe ImageReady 1.0, a product currently bundled with Photoshop and designed to help you prepare images for the web.

The CD that accompanies the book contains the files for the lessons. For each lesson there is a starting file and an ending file, so you can see where the lesson is headed. If you have a big enough screen, you can keep the ending file in view as you manipulate your working copy of the starting file. I find this helpful, because I tend to get lost in tutorials. I pursue a side investigation, and when I come back to where I was, I may miss a step. Between the history palette and the ending file, I can usually get back on track quickly.

While the book is methodical and comprehensive, its coverage is not deep, and it gives only the briefest idea of the theories and artistic strategies underlying the tasks it takes you through. For these I found another book extremely helpful.


Inside Adobe Photoshop 5 by Gary David Bouton and Barbara Bouton (Macmillan (New Riders), Indianapolis IN, 1998, 768pp, ISBN 1-56205-884-3, www.mcp.com, $44.99)

Like the Adobe book, this one is also justly famous for its earlier editions. It too takes the form of a tutorial, but it contains many more lessons and vastly more material about the underlying theories and issues.

The authors chose an excellent organization for their book, and they write clear, straightforward prose. I like the mixture of tutorials, task-oriented procedures, and reference material. The excellent graphics and clean page layout make the book a pleasure to look at.

Unlike the Adobe book, which you will go through once and never look at again, this book will remain your guide as you continue to develop into a skilled Photoshop user. I like it a lot, and I recommend it highly.

Tuesday, October 27, 1998

Working on the Web

This article appears in slightly different form in the September/October 1998 issue of IEEE Micro © 1998 IEEE.

Everything in this column relates in some way to working on the web. I have been working with website design issues recently, and it has been a frustrating experience. I attribute this to several factors: 
  • Industry innovation is far out in front of standardization.
  • New tools and new ideas arise quickly, then disappear just as they become familiar.
  • Many tools are primitive, bug-ridden, and ill-documented -- the results of brutal competition and impossible schedules.
  • Efficient operation requires integration of operating systems, browsers, communication hardware and software, transmission protocols, and Internet service -- all from a varied array of suppliers.
  • I am my own system integrator.

Too Many Pieces

If the above seems overly critical and pessimistic, I hope the following story puts my remarks in perspective.
 
Two years ago my Internet service provider (ISP) was C2Net, which at the time had an office three blocks from mine. I dialed into their system for my Internet connection and maintained a Unix shell account there. I used SoftQuad's HoTMetal Pro to build a website on my local PC, then transferred the files to the C2Net machine via FTP. I used the Eudora email program on my local PC, interacting over the Internet connection with C2Net's POP and SMTP servers. I was pretty much a one-stop shopper.
 
As a backup I acquired another Internet connection via a local call to the Microsoft Network (MSN). If, as sometimes happened, I had trouble dialing into the C2Net modem bank, I connected to the Internet via MSN. This gave me direct access to the C2Net POP and SMTP servers and access via Telnet to my shell account.

This worked well until C2Net decided to get out of the ISP business about a year ago. They arranged a smooth transition of my website, shell account, and electronic mail to my current ISP, Infonex. Unfortunately, Infonex is 400 miles from my office, and they have no local access number, so my MSN backup became my only path to the Internet. Fortunately, it is a highly reliable path. I should have been suspicious when everything went so smoothly.

Electronic junk mail (spam) is a growing problem. It has motivated a large number of security measures, and one of them affected me. Last year, Infonex informed me that I could no longer use their SMTP server for sending mail, because I connect to the Internet via MSN.

Fortunately, MSN had recently installed an SMTP server, and I started to use it, giving me an unusual, but logical configuration. I was sending Infonex mail via the MSN SMTP server and receiving it via the Infonex POP server, using Qualcomm's Eudora email client for both. This worked well for about a year. Then, recently, I once again became a casualty of the anti-spam wars.

The MSN SMTP server had become a favorite tool of spammers, so MSN instituted a secure procedure to prevent its unauthorized use. Unfortunately, the procedure also prevented me, an authorized user, from sending email, because Eudora does not have that protocol in its repertoire. In fact, it appears to be a Microsoft proprietary protocol.
 
I am not usually as critical of Microsoft as many of my colleagues are, but I am disappointed by the way Microsoft handled this situation. They sent no notice of the change, and they sent no error indications. The server simply accepted my mail and appears to have thrown it away. It took me two days to figure out why nobody was replying to my messages. Then after listening to baffled mumblings from Microsoft's technical support staff for another two days, I finally reached someone who gave me a workaround. "We've had this for a week," he said.
 
The workaround was a temporary alternate SMTP server. It worked for another week, then started doing the same thing. This time it took me only six hours to notice that my mail wasn't arriving at its destination. MSN technical support assured me that their SMTP server was working properly, and it did in fact work from my MSN account. I couldn't get Eudora (or Microsoft's Outlook Express) to send Infonex mail, and the MSN folks informed me that they didn't feel obliged to help. Two days later, without notice, it started to work again. The delayed mail arrived in a bunch. Perhaps the mail that disappeared a few weeks ago will show up some day too.
 
One advantage of the piecemeal approach is that not everything fails at once. My MSN Internet connection has trouble passing email through its own SMTP server, but it handles my web publishing FTP transfers without difficulty. I use HoTMetaL Pro 4 (Micro Review, Oct 97) to manage those transfers. I still like HoTMetaL Pro 4, though I have run into a few problems. I won't go into all I've found out about its strengths and weaknesses, because a new version is imminent. I expect the new version to support XML, and I look forward to reviewing it when it appears.
 

Books

The above story illustrates how many and varied are the things that can go wrong. Your best protection is an informed guide. Most of us can't rely on a system administrator for every problem. The next best thing is usually a good reference work. Nowadays, you can look for information on many useful websites, but a good book is often a better choice. It may not be as current, but it is usually better edited, better organized, and more carefully checked.

 Web Site Engineering by Thomas A. Powell et al. (Prentice Hall, Upper Saddle River NJ, 1998, 334pp, ISBN 0-13-650920-7, (800) 382-3419, www.phptr.com, $39.95)

Thomas Powell is a computer science instructor at UC San Diego, the developer of a web publishing certificate program, and an independent Internet consultant. His book is the first I've seen that restates well known principles of engineering and software development management in the context of website development.

Website designers -- if they take a systematic approach at all -- frequently apply principles of document design to building their sites. Those principles are helpful, but they don't address the most important problems that designers of large, active sites face.

Powell looks at a complex website as a piece of software designed to run in a variety of configurations -- not all of which the designer can anticipate. Software design principles have not made software design easy, but they have made it manageable. Powell expects his design principles to have the same effect.

Powell recognizes that websites, unlike most software, depend for their success on their look and feel and on their content. Traditional software development methodologies don't address those aspects effectively. Websites must also function in an environment that is more complex and unpredictable than traditional client/server configurations. Powell outlines methods for assessing the characteristics and capabilities of each user's environment and deciding dynamically what files, formatting, and client-side scripts to send into that environment.
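
Powell's book doesn't reduce to a single snippet, but a hypothetical sketch may make the idea concrete. The following Python routine (the profiles, page names, and rules are invented for illustration, not taken from the book) derives a crude capability profile from a browser's User-Agent header and chooses which page to send:

def capability_profile(user_agent):
    # Very rough capability guesses based on the User-Agent string.
    ua = user_agent.lower()
    profile = {"frames": True, "javascript": True, "css": False}
    if "mozilla/4" in ua or "msie 4" in ua:
        profile["css"] = True              # 4.x browsers handle style sheets
    if "lynx" in ua:
        profile["frames"] = False          # text-only client
        profile["javascript"] = False
    return profile

def choose_page(user_agent):
    # Decide dynamically which files and formatting to send.
    profile = capability_profile(user_agent)
    if profile["css"] and profile["javascript"]:
        return "home_dynamic.html"         # styled page plus client-side scripts
    if profile["frames"]:
        return "home_frames.html"
    return "home_basic.html"               # plain HTML fallback

print(choose_page("Mozilla/4.0 (compatible; MSIE 4.01; Windows 98)"))

A real site would check many more characteristics; the point is simply that the decision happens per request, not once at design time.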

Powell's approach starts with traditional project management steps: defining the problem, exploring the concept and analyzing its feasibility, then formalizing the requirements analysis and specification. These become the head of a cascade of project lifecycle steps: prototyping, implementation and module testing, system integration and testing, deployment, and maintenance.

Engineering entails measurement. A designer following Powell's methods must establish measurable criteria for success. Correlating site activity with objective measures of business success is difficult. Powell offers suggestions, but if he had a good way to do this, he wouldn't need to write books for a living.

While focusing on process, Powell also provides practical advice on how to address specific aspects of design and implementation. Nonetheless, this is not really a reference book, as its skimpy index emphasizes. Plan to read the book once to get a good overview of how to approach website design, then turn to other books for detailed help.


 Dynamic HTML: The Definitive Reference by Danny Goodman (O'Reilly, Sebastopol CA, 1998, 1096pp, ISBN 1-56592-494-0, (800) 998-9938, www.oreilly.com, $39.95)

Danny Goodman is a well-known and well-respected author of computer books. His Complete HyperCard Handbook (Bantam, 1987) was justifiably famous and sold enormously well. His JavaScript Bible has recently come out in its third edition (IDG, 1998). His books have won prestigious awards.

Goodman wrote this enormous tome because he couldn't keep straight all of the changing standards, undocumented features, and browser incompatibilities and idiosyncrasies. And if he, totally immersed in this material, can't keep it straight, what chance do the rest of us have?

The first thing Goodman explains is that there is no such thing as dynamic HTML (DHTML) -- or rather that there is a large, ill-defined collection of concepts, standards, and browser features that fall under that heading. In many cases the various elements are battlegrounds for commercial competition -- mainly involving Microsoft and Netscape. Another of Goodman's objectives for the book is to present the elements of DHTML in a non-partisan way.

Goodman devotes the first part of his book -- less than 200 pages -- to an explanation of how the Netscape and Microsoft approaches differ. He explains how to develop applications that run on both browsers.

The remainder of the book -- over 800 pages -- is reference material, including nearly 100 pages of indexes. It provides complete documentation of every feature of HTML, the document object model (DOM), style sheets, and JavaScript. It identifies undocumented features, features that behave differently on Netscape and Microsoft browsers, features that behave differently from the way they are supposed to, and features that don't behave logically or consistently at all.

If you plan to produce professional websites with dynamic content, you need a reference like this one.


 HTML 4 for Dummies Quick Reference by Deborah S. Ray & Eric J. Ray (IDG, Foster City CA, 1998, 240pp, ISBN 0-7645-0332-4, www.idgbooks.com, $14.99)

I have to say at the outset that I know Eric and Deb Ray and have the highest personal regard for them, so you have to take all the wonderful things I say about their books with a grain of salt. Still, they are all true.

The Rays have produced several books about HTML 4. Their Mastering HTML 4.0 (Sybex, 1997) is a piece of engineering in its own right -- almost too heavy and thick to lift with one hand, yet opening easily to any page and lying nearly flat. It looks wonderful, and the prose is easy to read. Someday I'd like to read it. My only reservation is that it might have come out too early to be authoritative.

They aimed their Dummies 101: HTML 4 (IDG, 1998) at beginning to intermediate website designers. If you work your way through it, you can wind up with a professional looking personal site.

The quick reference aims to serve all HTML 4 users. Its explanations are accessible to beginners, but they are sufficiently thorough to satisfy advanced users.

Elizabeth Castro's Visual Quickstart Guide: HTML for the World Wide Web (Peachpit, 1996) is an outstanding example of purely task-oriented documentation (see Micro Review, Aug 96). The Rays aim at task orientation in their quick reference, but they blend it with explanatory material. Not only can you find a step-by-step procedure for the task you're trying to accomplish, but you can also read the relevant background material. This kind of just-in-time exposition helps you learn and remember facts and general principles that you might not absorb as easily in another context.

The book is small and light, with a spiral binding that makes it lie flat at any page. Unfortunately, the small pages have forced the Rays to reduce their screen shots to an almost unreadable size. The text, however, is clear and easy to read. Navigation and orientation aids make the book easy to use. A little more attention to the index, however, would have made the book an even better tool.

I think the best thing about this quick reference is the blend of task orientation and exposition. If you keep it by your side while you work on your website, it will answer your questions and help you deepen your knowledge. I recommend it.


 XML: Principles, Tools, and Techniques - World Wide Web Journal, v.2, no. 4, Winter 1997, Dan Connolly, ed. (O'Reilly, Sebastopol CA, 1997, 258pp, ISBN 1-56592-349-9, (800) 998-9938, www.oreilly.com, $29.95)

Pundits have described the Extensible Markup Language (XML) as SGML lite or heavy-duty HTML. In fact, it is a subset of the Standard Generalized Markup Language (SGML), surrounded by a collection of web-oriented tools and standards. Unlike SGML, which made a lot of noise but few inroads into mainline applications, XML is creeping, relatively silently, into the underpinnings of basic PC and web-based tools.

It's almost too late to talk about this issue of the World Wide Web Journal. When it appeared about a year ago, it gathered in one place the relevant XML specifications and technical papers. Many of these are now a little dated. The expository papers, on the other hand, provide a great deal of insight into the underlying ideas.

Perhaps you don't feel a need to learn anything about XML right now. Buy this issue of the World Wide Web Journal and keep it on your shelf. When you suddenly find yourself needing a roadmap to the new technology, you'll know where to look.

Saturday, June 27, 1998

Year 2000, Windows 98, Help!

This article appears in slightly different form in the May/June 1998 issue of IEEE Micro © 1998 IEEE.

This column deals with three subjects: an authoritative book about the Year 2000 problem, an introduction to Windows 98, and the ultimate tool for creating online help systems.


Time Bomb 2000 -- What the Year 2000 Computer Crisis Means to You by Edward Yourdon and Jennifer Yourdon (Prentice Hall, Upper Saddle River NJ, 1998, 442pp, ISBN 0-13-095284, www.phptr.com, $19.95)

Ed Yourdon is a world famous and highly regarded specialist in the design of software systems. His daughter Jennifer is a financial analyst with an investment bank. They have written an informative but widely accessible account of why there will be growing problems over the next year and a half, culminating in widespread difficulties in the weeks and months starting at 12:00:01 am, 1/1/00. These problems are known collectively as "the Y2K problem."

The problem arises from the fact that computer systems store and process large numbers of dates. A simple accounting system must keep track of when each transaction is posted, invoiced, due, and paid. A credit card system needs to keep track of the dates of issue and expiration for each card number in its files. An inventory control system needs to keep track of when each item is stocked and when it must be discarded. Your VCR stores the dates of upcoming programs you want to record.

Because computer systems store large numbers of dates in limited storage space, designers have devised ways to limit the amount of memory space a stored date occupies. One such scheme in widespread use today is to store only the last two digits of the year. Under this scheme, 1998 becomes 98.

I first confronted this problem in 1974, when I worked on a retrieval system for radiology films. Patients supply their name, birth date, and mother's maiden name. The system replies with a list of filing numbers and thumbnail descriptions for that patient's radiology films. A birth date of '77 clearly meant 1877, but what if the system survived until 1980? What would '77 mean then?

We solved this problem by storing the date as a 16-bit binary number, giving us a range of 179 years, which we set arbitrarily to 1850-2029. If the system survives into the next century, we can change the range to a different 179 years, say 1880-2059. Any records with dates of birth in the range 1850-1880 will have to go, but by that time those records will probably be obsolete anyway. Making the change requires a single pass over the database and a change in the contents of one memory location. The memory location is shared by the routines that handle all conversion between internal and external date formats.
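
The original code was not, of course, written in Python, but the scheme is easy to sketch. In the illustration below (the names BASE_DATE, encode, and decode are mine), a date is stored as a 16-bit count of days from a single base date -- 65,536 days is roughly 179 years -- and rebasing the window means changing that one constant:

from datetime import date, timedelta

BASE_DATE = date(1850, 1, 1)   # the one shared constant; change it to rebase the window

def encode(d):
    # Store a date as a 16-bit day count relative to BASE_DATE.
    days = (d - BASE_DATE).days
    if not 0 <= days < 65536:
        raise ValueError("date falls outside the current 179-year window")
    return days

def decode(days):
    # Recover the full date from the stored 16-bit value.
    return BASE_DATE + timedelta(days=days)

# Rebasing to 1880-2059 means setting BASE_DATE to date(1880, 1, 1) and
# making one pass over the database to re-encode the stored values.
print(decode(encode(date(1998, 12, 20))))   # 1998-12-20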

Unfortunately, this ant-like concern for the future was the exception. Grasshoppers designed most business computer systems of the 1970s. These systems store dates as character strings, with two digits dedicated to the year. To obtain the actual year they add 1900. It's easy to see when that will stop working. In fact, some older programs use "impossible" dates like 99 and 00 as special codes to convey other information about the associated record.

Worse yet, the grasshoppers make no distinction between internal and external date formats, and they do not encapsulate date handling. Many remote corners of their systems contain instructions and constants that depend on the date representation. 

I don't think the retrieval system I worked on is still in operation, and I don't believe it had descendants, so all that foresight was wasted. Many grasshopper systems, on the other hand, survive to this day. In fact, designers used the same techniques through the 1980s and early 1990s. Some may still be doing so.

Work on identifying and solving Y2K problems has been going on seriously in the USA since about 1995. In other parts of the world, awareness of the impending problem has been slow to develop. For example, the following reports appeared in the May 3, 1998 London Times (quoted by Declan McCullagh, www.well.com/~declan/politech):
Last week a source at a north London hospital disclosed that an operation had to be postponed because the computer system told doctors that the swabs needed during the surgery were out of stock. In fact there were plenty available.
   
The confusion occurred because the swabs had a use-by date early in the next millennium. Instead of reading the date as 2001, the computer could recognise only the last two digits and believed the date to be 1901.
   
The Action 2000 survey reveals that National Health Service executives are so worried about the implications that almost two-thirds have drawn up contingency plans for a widescale failure of systems.
   
I like to think that the USA is further along, but I have heard people say that their credit cards with expiration years of 00 have been rejected by merchants.

The Yourdons don't spend a lot of time on the technical details of the Y2K problem. They conclude that it is already too late to avoid the problem completely. They focus on the possible effects. They point out that even though many people thought about what would happen to their own programs on 1/1/00, few people until recently thought about the consequences of having so many programs fail simultaneously.

The Yourdons don't foresee catastrophic failures of individual software packages, but they believe that small failures will have a ripple effect as they propagate through our communication, transportation, medical, utilities, governmental and other systems. By the time the ripples make the rounds of these thoroughly interdependent systems, they may grow into a tsunami.

So how long will the tsunami's effects last? The Yourdons say two to three days in most cases, a month in a significant number of cases, and a year or more in a few cases. Starting from those assumptions, the book identifies specific steps to prepare for the kinds of disruptions that are likely to occur.

I think the Yourdons take a pessimistic but believable view of what will happen. They don't predict airplanes falling out of the sky, and they don't just say "don't worry, be happy." Their view is somewhere in between.

It's a short book, and it's easy to read. The preparations they suggest are not costly or difficult. I think it's good insurance.


Introducing Microsoft Windows 98 by Russell Borland (Microsoft, Redmond WA, 1997, ISBN 1-57321-630-6, mspress.microsoft.com, 800-MSPRESS, $19.99)

Russell Borland is one of my favorite computer book authors. His books about Word for Windows 2.0 are models of clarity and great sources of in-depth information (Micro Review, Dec 92).

As I write this, Windows 98, barring government action, is about to appear. I have installed and used pre-release versions, and they are impressive. I don't think it's a good idea to review beta software, so I'll defer further comment until I see the shrink-wrapped version.

Borland's book lives up to the promise of his earlier works. It goes systematically through everything you need to know about Windows 98. It reveals the underlying logic and structure.

Windows 98 will be here for a long time. Reading this book now is a good investment in the future.


RoboHELP 5.5 (Blue Sky Software, 1998, www.blue-sky.com, 800-459-2356, $499.00)

I first reviewed RoboHELP (version 2.6) in October 1994, when help authoring systems were a novelty. I have come back to it from time to time. The product has evolved substantially over the intervening years. It once had serious competition, but now it is the clear leader in help authoring tools.

Online help is rapidly becoming the principal medium for procedural and reference support of software users. Most such help uses Microsoft's WinHelp format. This format is native to the Microsoft Windows operating systems, but software exists to run WinHelp files on other platforms.

Blue Sky implements RoboHELP within the environment of Microsoft Word. This ties RoboHELP to the Windows platform, which limits Blue Sky's potential market share. This becomes less and less of a problem as time passes.

Microsoft, as part of its integration of the web and the desktop, is moving to HTML-based online help. Netscape was the first to announce a scheme for HTML-based help, but they have done little with it. Microsoft announced a similar scheme about the same time as Netscape. They have developed and refined the idea considerably, and tailored it to the Windows environment.

Blue Sky has worked hard to support all of these help formats and has even introduced a cross-platform HTML-based help format of its own. The need to support a rapidly evolving field has led Blue Sky to innovate continuously. As a result, some of their releases have been more complete, consistent, and stable than others. Many users continue to use version 4 and have been reluctant to upgrade. Even though version 5 represented a major step forward in RoboHELP's user interface, many users reported problems with it. Version 5.5, on the other hand, strongly resembles version 5 but is a lot more stable.

RoboHELP version 5.5 differs from version 4 in two major ways: it allows generation of different help formats from a single source file, and it provides a suite of productivity tools through a new window called the RoboHELP explorer. As a separate entity the RoboHELP explorer is not as intimately symbiotic with Microsoft Word as RoboHELP itself is.

The RoboHELP explorer provides drag-and-drop management of a project's contents, index, and related topic structures. Current RoboHELP users will love the new way of handling these structures. People who start out using version 5.5 will be astonished if anyone ever tells them about the byzantine procedures -- involving unlikely footnote symbols -- that these tasks used to require.

The RoboHELP explorer also provides a graphic navigation scheme that clarifies the relationships among topics. With previous versions it was possible to work with a topic but have no idea how to get to it from within the help system. Now you can see instantly which topics point to it and which topics it points to.

Another improvement in version 5.5 is support for large projects with several developers sharing files and using standard source control packages.

One issue many producers of online help systems face is printed documentation. This was once an advantage for Wextech's Doc-To-Help, a RoboHELP competitor, which offers a single-source approach in which the printed document is the master. Earlier versions of RoboHELP had no corresponding feature. RoboHELP 5.5 makes the online help the master and uses a wizard to produce printed documentation. I think using the online help as a master and generating the printed version from it is the more promising approach. I don't know how well it works in real systems.

If you need to generate online help systems and you have access to a Microsoft Windows operating system running Word 95 or Word 97, RoboHELP is your clear choice for a help authoring tool.

Monday, April 27, 1998

Bits and Pieces

This article appears in slightly different form in the March/April 1998 issue of IEEE Micro © 1998 IEEE.

A Personal Note

After many years of working as an independent contractor, I have just joined Oracle Corporation as a full time employee. This column, however, remains an independent activity. As before, the opinions I express here are my own.

I became an employee after so many years of independence partly for personal reasons, but global industry trends also influenced my thinking. I'd like to share that part of my decision making with you.

In the 1980s many companies were caught unaware by global competition. Many suddenly saw their vulnerability to takeovers if they were perceived to be using their assets inefficiently. The result was a massive move toward downsizing and outsourcing. Companies chose to obtain important services from outside vendors rather than from employees.

That pendulum has swung about as far as I think it's going to swing. In some of the leading companies in the computer industry, it's beginning to swing back. These companies have discovered their essential core businesses and competencies. They have come to realize that institutional memory, loyalty, and accountability are more important to them in these core areas than the benefits of using outside vendors.

For most of the last fifteen years I have been an independent vendor of computer-related services, and I've had little difficulty finding plenty of highly remunerative work. I could continue that way for at least another ten years, but the most interesting work at the best companies is moving inside. I think that trend will accelerate. Many independents will see the same trends I do and begin to associate themselves with companies. I decided to beat the rush.

[Postscript: My connection with Oracle was very brief. Shortly afterward, however, the reasons I cite here led me to begin a seven-year association with Documentum, which along the way became a division of EMC.]


Microsoft's Vision

While the US Congress and Justice Department focus on its browser integration and marketing tactics, Microsoft is preparing its final assault against IBM and the mainframe industry. Microsoft's goal is to establish Windows/NT based servers as the platform of choice for distributed electronic commerce. Toward this end, it has devised a multifaceted strategy for supporting distributed objects.


COM and DCOM -- Microsoft's Vision for Distributed Objects by Roger Sessions (Wiley, NY, 1998, 512pp, ISBN 0-471-19381-X, $34.99)

Roger Sessions was one of IBM's lead architects for its support of the Common Object Request Broker Architecture (CORBA) standard. He was also one of the Object Management Group's lead architects on the specification of its Object Persistence Service, one of the CORBA services. Sessions was not a big Microsoft fan when he set out to write this book, but as he began to understand Microsoft's plans, he became a believer.

Sessions believes that Microsoft has built a coherent strategy out of uninspired components. He says:
The main reason Microsoft's Java, COM, and DCOM are so exciting is not because they are wonderful technologies in and of themselves. If we want nothing more than Java programming, Sun offers a better solution. If we want nothing more than distributed components, CORBA is a better programming model than DCOM. What makes Java, COM, and DCOM so exciting is how they feed into a whole vision of how to write distributed applications. This vision includes sophisticated security, an entirely new model for efficient management of shared objects, and an incredible idea of clustered workstation machines, to name but three.
Sessions sees Microsoft's Visual Basic as a big piece of the picture. He predicts a future in which programmers use Java, and to a lesser extent C++, to implement the logic of applications and use Visual Basic to implement the user interface.

Two things make this book useful. Sessions has penetrated the jargon layer to expose the core of Microsoft's strategy, and he has put together a realistic example of sufficient depth to illustrate the essential issues. To read this book you should have access to a programming environment that includes Microsoft's Visual J++ and Visual Basic. If you reproduce the examples as you follow them in the book, you can get a good feeling for how it all goes together.

I recommend this book, but I have to offer one caveat. The publisher has not done a good job of editing. The book contains confusing passages and outright errors. Read it carefully, and trust your own judgment.


Usable Software and Documentation

User and Task Analysis for Interface Design by JoAnn T. Hackos and Janice C. Redish (Wiley, NY, 1998, 508pp, ISBN 0-471-17831-4, $??.??)

JoAnn Hackos and Ginny Redish are experts on usability of software and documentation. Both authors have written other highly regarded books, and both speak to packed audiences at professional society meetings. It was in a conversation after one such presentation that I discovered that Ginny Redish is related to Micro's editor-in-chief, Steve Diamond -- a fact I feel compelled to mention in the interest of full disclosure.

Hackos and Redish have written a practical book. They start from a piece of advice that every designer has heard and most designers try to follow: understand your users and the tasks in which they use your product. This is easy to say but hard to do. Hackos and Redish tell you how to do it. They explain how to observe and interview users at their workplaces and how to record and interpret the data you gather.

I find it telling that the first quarter of the book focuses mainly on justifying the necessary activity. Usability engineering is often hard to sell to managements that already pay plenty of money to programmers and writers. "We're paying top dollar for the best people," they reason, "so they ought to be able to figure that part out for themselves." If you encounter such views -- or if you hold them -- this book can help you understand the strong arguments against them.

While the book is about how to conduct site visits, the visits themselves occupy only two chapters. The bulk of the book deals with advance preparations and post-visit analysis.

Other books about this subject exist, but the authors of this one have applied their own techniques. Their long experience with usability and with technical communication has helped them write an extremely usable, readable, visually appealing book.

Like Hackos' earlier book Managing Your Documentation Projects (Wiley, 1994), this book presents a methodology that requires the efforts and commitment of more than one person to make it work effectively. By all means, read this book. If you agree with its ideas, you'll need to sell them within your organization before you can put them into practice.

Friday, February 27, 1998

Looking Forward

This article appears in slightly different form in the March/April 1998 issue of IEEE Micro © 1998 IEEE.

Everything I've reviewed this time looks forward -- to the year ahead and to the years beyond.


Macworld Expo

I have a sentimental attachment to the Macintosh. I've had Macs continuously since 1984, and for a long time they were the computers I used for most of my work. That began to change in the early 1990s, and today I use Windows-based systems most of the time. My ten-year-old Mac SE/30 isn't good for much of anything any more.

Having served as guest editor for Micro's special issue on PowerPC (October 1994), I've been hoping the line of Macs based on PowerPC would do well. It was beginning to look as though it wouldn't happen without my help, so as part of my year-end shopping spree, I bought one -- directly from the Apple website.

I was surprised how easy it is to spend money on the web. Apple did a good job setting up the Apple Store (www.apple.com). I like to know all the options when I shop. I want someone to answer my questions as they come up, but I don't want anyone hovering over me or rushing me. Other websites I visited -- Micron, Broderbund, HP -- are helpful, but the Apple Store supports this shopping style especially well. I recommend it to anyone who wants to buy an Apple product.

As a proud new Mac owner, I thought I'd better head for the annual Macworld expo at Moscone Center in San Francisco to see what's new. I got there bright and early to catch Steve Jobs' keynote. He put on a good show.

Jobs is mellower than he was in the insanely great days, and he seems to focus more on business issues. The surprise announcement that everyone speculated about beforehand was a $45 million quarterly profit -- about 3% of gross revenues, but a lot better than the previously projected loss. The meat of Jobs' presentation, though, was mostly about software, and the most exciting part was Microsoft's contribution: Office98 and Internet Explorer 4.0.

Office98 hasn't yet appeared for Windows, so Mac users will have the latest software for a change. The expected boos and hisses greeted the Microsoft representative who came to the stage to demonstrate the product, but by the time he left, he had the audience cheering wildly. The product has the look and feel of a Mac application, and the features he showed off were spectacular. I haven't had a chance to work with the actual product yet, so I can't say how good it really is.

Internet Explorer 4.0 for the Mac is ready, and Microsoft gave away copies on CD to anyone who came to their booth. Mac OS 8.1, which is about to appear, integrates closely with Internet Explorer. That should add a new twist to the browser wars.

Oracle, another Apple partner, also figured prominently in the keynote presentation. Oracle has ported its main business applications to Java, while Apple has put Java support directly into version 8.1 of the operating system. This means that business users are no longer forced to rule out Apple as a potential supplier for their computer systems. Whether Apple will actually regain lost market share in the business sector remains to be seen.

After the keynote, I toured the exhibit floor. It wasn't as crowded and daunting as it has been in the past, but there were plenty of people and plenty of interesting exhibits. The greatest emphasis, not surprisingly, was on tools for multimedia and web design.

I don't know if you should run out to buy a Macintosh, but I'm glad I did.


Books about Trends

I read three interesting books that seek to make sense of our technological society and help us see where it's headed. Two of the authors are enthusiastic about where technology is taking us. The third sees growing reaction to and refutation of some of technological society's underlying assumptions.


Release 2.0 -- A Design for Living in the Digital Age by Esther Dyson (Broadway Books, NY, 1997, 318pp, ISBN 0-7679-0011-1, $25.00)

For more than twenty years, Esther Dyson has lived and worked at the center of the personal computer and telecommunication revolutions. She is a journalist, conference organizer, analyst, entrepreneur, and informal ambassador. Because they respect and trust her, heads of industry and government seek her advice and counsel. She doesn't have to knock on doors to find out what's happening -- people who know what's happening compete for her attention.

Release 2.0 is a thoroughly accessible book. Most teenagers can probably read it without difficulty. It is, nonetheless, an informative book. Most readers -- even the most technologically aware -- will find new information or unexpected viewpoints.

Dyson perceives and seeks to fill a need for guidance about "the Internet and our roles as citizens, rule makers, and community members." You can think of her book as a seventh grade social studies textbook for the digital age. Fortunately, Dyson is better informed, better organized, more intelligent, and more free from outside pressure than the typical social studies textbook author. Her book is lively and interesting.

You can also think of Dyson's book as an attempt at large-scale personal mentoring. The most gratifying compliment I ever received about my books came from an aspiring young programmer who said "I felt like the author was sitting beside me when I read Inside BASIC Games." I think many young readers will feel that way about Dyson's book. She writes in the first person, talks about her personal life and history, and, most important, makes herself vulnerable. She displays what she explicitly encourages: courage, openness, disclosure, identity.

Esther Dyson has strong opinions. At the same time, she exhibits detached objectivity and fairness -- not just in this book, but in her other public pronouncements. When she talks about subjects like questionable content, unsolicited electronic mail, encryption, or Microsoft -- topics that often generate more heat than light -- she sidesteps the rhetoric and gets to the issues. Then she states her opinions.

In laying out a design for living in the digital age, Dyson addresses nine areas: communities, work, education, governance, intellectual property, content control, privacy, anonymity, and security. We have all heard a great deal about these subjects. Dyson gives overviews of the issues, talks about promising approaches to the associated problems, and shows how it all might work in 2004. Dyson's scenarios for 2004 are pretty lame science fiction, but they are concrete and well grounded in technological reality. They communicate in a way that facts alone can't.

Dyson closes her book with a set of design rules for living. I won't list them out of context. Many were good rules before anyone ever heard of the Internet -- "always make new mistakes," for example. Some have entirely new significance. For example, you've probably received messages warning you about the Good Times virus or asking you to send postcards to dying children. The normally sensible people who forward such messages need to remember Dyson's rule: use your own judgment.

This book is an easy read. Everyone should read it.


The Death of Distance -- How the Communications Revolution Will Change Our Lives by Frances Cairncross (Harvard Business School Press, Boston, 1997, 320pp, ISBN 0-87584-806-0, 888-500-1016, $24.95)

Frances Cairncross is a senior editor on the staff of the Economist, where she has worked since 1984. In recent years she has specialized in communications and media.

The Death of Distance explores the consequences of the premise that geography, borders, and time zones will soon be nearly irrelevant to the way we conduct our personal and business lives. Cairncross starts with thirty pithy predictions (she calls them developments to watch) and spends the rest of the book exploring them in the contexts of telephone, television, the Internet, commerce, competition, regulation, society, culture, people, government, and countries.

While this book seems to cover the same ground as Esther Dyson's, the two books are quite different. Dyson concerns herself with how individual readers can live in the digital age. Cairncross focuses mostly on technologies, business strategies, and government action. Dyson maps the surface of the digital age. Cairncross takes core samples at interesting places.

Cairncross's predictions are interesting and largely believable -- especially since many of them have already begun to come true. For example, she believes that companies will adopt the model of Hollywood studios -- bringing in top stars at high prices for specific projects. I think she's right about that. She predicts a worldwide leveling of wages, and that too has already begun to happen. She predicts the rise of English as a global second language and the simultaneous strengthening of less widespread languages and cultures. Again, these things are happening already.

I don't entirely agree with Cairncross's prediction that email will lead to better writing skills. Email is asynchronous, that is, non-interactive, and it is written. The fact that it is asynchronous won't go away, so perhaps that will force people to express themselves more clearly and completely. The requirement that it be written is already disappearing. People who dislike written communication may soon be able to communicate electronically by voice and gesture instead. In any event, being forced to do something doesn't guarantee excellence -- or even improvement.   

Cairncross's predictions about company size ring true: more minnows with scarce resources but global reach; more giants offering high quality local services. On the other hand, her predictions about social and political issues seem naive to me. I think that the outcomes she predicts will instead be influences -- competing with many other social and technological influences to produce outcomes that no one can yet predict. For example, she predicts that we will have little true privacy and little unsolved crime. I say maybe so, maybe not.

This reservation aside, I think Cairncross has written an important book. It is well researched and thorough, and it covers the ground. I recommend it.


The Resurgence of the Real -- Body, Nature, and Place in a Hypermodern World by Charlene Spretnak (Addison-Wesley, Reading MA, 1997, 286pp, ISBN 0-201-53419-3, $22.00)

Charlene Spretnak comes out of a very different tradition from the other two authors featured in this column, but she is just as deep a thinker. Over the last twenty years she has written about the women's spirituality movement, green politics, and a cosmological basis for comparative religion. In each case she found an overarching framework to help herself and her readers understand the subject. These frameworks, she claims, are not her inventions. Rather, their inherent coherence, wisdom, and healing potential led her to discover and embrace them.

In The Resurgence of the Real, Spretnak applies this same technique to various social trends of the 1990s. In doing so she provides a strong counterpoint to the other two books in this column. While the disembodied citizen of cyberspace represents the ultimate evolution of rationality, Spretnak sees growing signs of reaction to this and other manifestations of modernity's mechanistic world view.

Spretnak begins her analysis as follows. I've shortened it for reasons of space, so imagine that the text is sprinkled with ellipses:
We are told that the world is shrinking, that vast distance has been conquered by computer and fax, and that the Earth is now a global village. It feels, however, as if distancing and disconnection are shaping modern life. If anything is shrinking, it is the fullness of being experienced by the modern self.

For most people today the web of friends, nearby family members, and community relationships is a shrunken fragment of what previous generations experienced. "Leisure time" is now spent at a second, low paying job or in a numbed state of recuperation alone in front of a television.
This sounds pretty depressing, but Spretnak is an optimist. She immediately cites signs of hope: alternative medical therapies, complexity theory, and independence movements. These are examples of the resurgence, respectively, of body, nature, and place, the components of what Spretnak calls the real.

This is an academic book, and, not being an academic myself, I jumped immediately to the appendix, in which Spretnak spells out the system of beliefs, assumptions, and ideologies that social scientists call modernity. She calls the appendix Modernity is to Us as Water Is to a Fish. She means that we don't even notice -- let alone question -- these ideas, because they seem natural to us. In my case at least, she's pretty much right about that.
 
Spretnak spends a great deal of the book exploring the roots of modernity and the absorption or marginalization of schools of thought that opposed it. It's a fascinating story.

Her final chapter, Embracing the Real, contains a utopian, or at least optimistic, vision of the future in the form of a time-traveler story. William Morris, an actual nineteenth century writer who resisted modernity and who wrote time-traveler stories, visits the world of 2024, where he is a hero. Body, nature, and place figure prominently in this world. Computers do not.

Morris is fascinated by two shallow arches, side by side, leading only into the brick wall of a school building. The inscription over them says "Abandon All Hope, Ye Who Enter Here." His guide explains "Those are our Bill Gates. The faculty designed them as an irreverent monument to computerized education -- which was actually used in elementary schools back in the 1990s!"

I don't know if I agree with everything Spretnak says, but this is a good book. I'm glad I read it, and I hope you read it too.