Why CD-ROM Games Won't Work (Yet)

Note written December 27th, 1997:

This essay has interesting historical value. In it, I predicted the future course of CD-ROM technology with some accuracy. I failed to anticipate the increase in speed and the decrease in price, but that didn’t seem to affect my prediction badly.

I once had an argument with a friend over CD-ROM. He was all fired up with enthusiasm for the technology. Can you blame him? Here we have a technology that stores 550 megabytes of information on a platter that costs $1.50 to manufacture. Think about that. A typical CD stores roughly a thousand times more information than a typical floppy disk. That’s three orders of magnitude! Can you imagine the revolutionary impact of a thousand-fold improvement on any other aspect of computing technology? For example, suppose we could increase processor speed by three orders of magnitude, up to, say, 10,000 megahertz. What could you do with that kind of speed? Or suppose that RAM sizes jumped up to a gigabyte? Or what if the price of a typical home computer system fell from $3000 to $3? That would change things, wouldn’t it?

These were the kind of thoughts going through my friend’s mind as we talked. I was the pessimist who saw lots of problems. My friend declared that, within two years, compact disk technology would "blow everything else right out of the water."

That was in 1983.

True, we have seen progress. CD-ROMs are now readily available, and we have our first entertainment CD-ROMs courtesy of Activision. NEC has released a videogame machine that includes a CD-ROM drive as a peripheral, and is already shipping some CD-ROM titles. The Headstart computer has been released with a CD-ROM drive built in. Next year we will see the commercial release of CD-I, a box with a CD-ROM drive, a 68000, and a megabyte of RAM. Both IBM and Commodore are rumored to have impressive CD-ROM hardware coming Real Soon Now. It would appear that the long-delayed CD-ROM tidal wave is about to inundate us.

A number of software publishers and developers have been preparing for this day. Electronic Arts, Sierra, Cinemaware, and Software Toolworks are all rumored to have made major investments in CD-ROM technology. New development houses have been tooling up to create the new software required for CD-ROM. Industry wisdom has reached the point of certainty: CD-ROM is the Next Big Thing. CD-ROM will revolutionize the entertainment software industry. CD-ROM will blow everything else right out of the water.

I disagree. I think that there are fundamental constraints on this technology that will hobble it for years to come. The three constraints I see are access time, development cost, and data intensity.

Access Time
Access time is a well-understood problem. Track seek times are measured in seconds: worst-case access times run as high as five seconds, and average access times are still in excess of one second. Once the track has been acquired, transfer speeds are still little better than those of floppy disks.

Now, one might think that such performance, while regrettable, remains acceptable. After all, a slow floppy disk drive is still usable. The problem with this thinking is that it ignores the vastly greater size of the CD-ROM. The whole point and purpose of the technology is to provide megabytes of information. Sucking an ocean of information through a skinny straw intended for a glassful is not a useful exercise.
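To put numbers on that straw, here is a minimal back-of-the-envelope calculation. It takes the nominal 150-kilobytes-per-second transfer rate of a single-speed drive as an assumption -- a generous figure next to the floppy-class speeds just described:

    #include <stdio.h>

    int main(void)
    {
        /* Assumed single-speed CD-ROM transfer rate of ~150 KB/s;
           the 550-megabyte capacity figure comes from the text. */
        double capacity_kb = 550.0 * 1024.0;
        double rate_kb_s   = 150.0;

        double seconds = capacity_kb / rate_kb_s;
        printf("Reading the whole disc: %.0f seconds (%.1f hours)\n",
               seconds, seconds / 3600.0);
        return 0;
    }

Over an hour just to stream the disc once, before a single seek is counted.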

The problem is particularly pronounced with games. If you need some obscure bit of data from a CD-ROM database, waiting a few seconds for it is still faster than looking it up manually. But a game is not measured against such standards; it is measured against other games, and previous games were never slower than CD-ROM games -- quite the opposite. And the user isn’t required to play the game in the first place. If the delays intrude into the enjoyment of the game -- and they surely will -- then the user won’t enjoy the experience.

CD-ROM programmers have responded to this problem with a series of clever innovations that dramatically reduce effective access time. I say "effective" access time because these are software techniques that work around the limitations of the hardware, not fundamental improvements in the performance of the hardware. The basic approaches all involve crafty organization of the data on the CD-ROM to ensure that the chunks of data most likely to be called for next are immediately adjacent to the current track position. The result is software that appears to respond to the user with barely perceptible delays.
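To make the scheme concrete, here is a minimal sketch in C of the kind of layout pass involved. The chunk names and the "most likely next chunk" table are invented for illustration, not taken from any actual CD-ROM toolkit:

    #include <stdio.h>

    #define NCHUNKS 5

    /* Hypothetical scene chunks and the chunk most likely to be
       needed after each one -- invented numbers for illustration. */
    const char *name[NCHUNKS] = { "intro", "castle", "forest", "battle", "ending" };
    int most_likely_next[NCHUNKS] = { 1, 3, 2, 4, -1 };  /* -1: terminal */

    int main(void)
    {
        int placed[NCHUNKS] = { 0 };
        int chunk = 0;                        /* start from the opening scene */

        printf("Proposed disc order:\n");
        while (chunk != -1 && !placed[chunk]) {
            printf("  %s\n", name[chunk]);    /* lay this chunk down next */
            placed[chunk] = 1;
            chunk = most_likely_next[chunk];  /* follow the expected path */
        }
        /* Chunks off the expected path get appended afterward --
           and reaching them at play time costs a long seek. */
        for (chunk = 0; chunk < NCHUNKS; chunk++)
            if (!placed[chunk])
                printf("  %s   (off the expected path; long seek)\n", name[chunk]);
        return 0;
    }

Notice what the layout pass demands as input: a single predicted path through the material.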

The problem with such schemes is that they distort the design of the game. In effect, these schemes transform a CD-ROM from a random-access medium to a serial-access one. The designers insist that random access is retained, but the smooth performance is only obtained when the predesignated serial path is followed. In other words, if you play the game in serial fashion, like a story, it works great; deviate from the intended path and the whole show gums up. The game designer who is aware of this problem will lean towards a more serialized product, with larger chunks of static data and less branching.

Now, this would be acceptable if you wanted to tell stories in the first place. But people don’t need computers with CD-ROM drives to experience stories. They need VCRs -- and they already have those, thank you very much.

Development Costs
The second problem arises from the huge costs of developing entertainment software for the CD-ROM medium. Now, software costs have been rising steadily since Day One. Currently, a top-of-the-line product will cost $100,000 or more to create. That covers the programming, artwork, sound effects, animation, and music. But CD-ROM is another story entirely. You’ve got to come up with 550 megabytes of entertainment to fill this baby; where you gonna get it? Program code will cost you something like $500K per megabyte; artwork will cost maybe $10K per megabyte. Sound, in large volumes, is cheaper, but I don’t have good estimates for its cost. It is likely that CD-ROMs will be filled with digitized video shot by film crews with actors and cameras (some work along these lines has already been done at several development houses). However, even this approach costs a great deal of money. It is very likely that a full-scale CD-ROM product will cost well over a million dollars to create.

The economics just don’t support such development costs. To earn back a development investment of a million dollars, you’d need to sell several hundred thousand units. That is at the upper end of what’s been sold in the disk-based world -- and the installed base of CD-ROM drives is unlikely to approach the installed base of floppy drives for quite a few years.
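For what it’s worth, the arithmetic runs in a few lines. The per-megabyte figures come from the paragraphs above; the $5-per-unit developer margin is purely my own illustrative assumption:

    #include <stdio.h>

    int main(void)
    {
        /* Per-megabyte figures from the text; the margin per unit is
           an assumption for illustration, not a publisher's number. */
        double capacity_mb     = 550.0;
        double art_per_mb      = 10e3;   /* ~$10K per megabyte of artwork */
        double code_per_mb     = 500e3;  /* ~$500K per megabyte of code */
        double margin_per_unit = 5.0;    /* assumed net to the developer */

        /* Filling the disc with artwork alone, never mind code: */
        printf("Artwork to fill 550 MB: $%.0f\n", capacity_mb * art_per_mb);

        /* Code is ruled out immediately: */
        printf("One megabyte of code:   $%.0f\n", code_per_mb);

        /* Break-even on a $1M development budget: */
        printf("Units to recoup $1M:    %.0f\n", 1e6 / margin_per_unit);
        return 0;
    }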

Data Intensity
The major argument against CD-ROM games is neither technical nor economic; it’s theoretical. CD-ROM runs against the grain of good game design because it’s a data-intensive technology, not a process-intensive one. Interactivity, the essence of the game experience, springs from process intensity, not data intensity. Data can support and enhance the gaming experience, but it plays a secondary role. Processing is the core of a game, and CD-ROM does not enhance processing one whit. You can’t interact with data. You can’t play with it. You can look at it or listen to it; that’s all.

see "Process Intensity"

This is an abstruse point that I have had difficulty impressing upon people. The concept of process intensity is a rarefied one, and some people are all too ready to reject what they don’t understand. To make matters worse, I have done a lousy job explaining it, partly because I don’t fully grasp the concept myself. So let me try again. This time I shall use a more concrete approach.

Suppose I presented you with a movie on videotape. Glorious graphics and sound it has, with magnificent acting, brilliant directing, fabulous set design, and so forth. But this videotape has one restriction: it only works on a player with no pause, no fast forward, and no rewind. You may only watch it straight through. It would still be a great movie, but its interactivity would be nil.

Now suppose I permit you to watch this videotape on a regular VCR with the normal controls. The interactivity of the experience would increase slightly, because now you could go back and review good scenes, stop the action to consider it, or play with it in any fashion you desire. Of course, the amount of interactivity is still low, because your options are so limited. The experience is not very malleable. Besides, the videotape is so slow to rewind or fast forward that the effective interactivity is very low. We might call it a "90% serial access device."

Now suppose that I gave you the same movie on optical media. The constraints are basically the same as with the videotape. It is still primarily a serial access device. You still watch the movie with the option to jump around. There is one small change: the access times to jump around are lower than the videotape’s. Now, if the access times were zero, then we would have a random access device, but they are not. This might be called a "60% serial access device." The potential for interactivity is higher than with the videotape, because you can jump around faster. But we are still talking about a low-interactivity device.

OK, now let’s take the next step in the progression. What if you could actually change the movie? That is, what if you could change not only the viewing sequence, but also the actual content? Clearly, this would be much more interactive than merely being able to skip around.

A simple and dumb way to provide this capability would be to anticipate likely variations, film them along with the rest of the movie, and devise a scheme for permitting you to choose between options. This idea was first implemented more than twenty years ago. You could do it with a CD-ROM; it would be even more interactive than previous schemes.
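In data-structure terms, such a scheme is nothing more than a table of canned clips with branch indices. Here is a minimal sketch, with scene names and branches invented for illustration:

    #include <stdio.h>

    /* One pre-filmed scene and the two scenes the player may jump to
       next. All of this is invented example data. */
    struct Scene {
        const char *clip;   /* which filmed clip to seek to and play */
        int branch[2];      /* the player's two choices; -1 means the end */
    };

    struct Scene movie[] = {
        { "hero-at-the-door", {  1,  2 } },
        { "hero-enters",      {  3,  3 } },
        { "hero-walks-away",  {  3,  3 } },
        { "the-end",          { -1, -1 } },
    };

    int main(void)
    {
        int s = 0;
        while (s != -1) {
            printf("play clip: %s\n", movie[s].clip);
            s = movie[s].branch[0];  /* a real game reads the joystick here */
        }
        return 0;
    }

Every experience the player can have is already sitting in that table.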

But there would always be limitations on this method. There are only so many scenes that you can film, and only so many minutes of action that the CD-ROM can hold. So the interactivity, while better than earlier schemes, would still be limited.

A better way to let you interact with the movie would be to generate everything on the fly. Suppose that the scenes of the movie could be algorithmically generated. Suppose that the characters could be slapped together on command. Suppose that their appearances, personalities, dialogue, and so on were all computed at run-time in direct response to the actions of the player. This would be interactive! If the player wants to take the action off in some new direction, the algorithm could follow him more readily because it’s not tied to any mass of data on the CD-ROM.

Whoops! We just passed out of the realm of data and into the world of algorithm. You can’t process algorithms with a CD-ROM. You need a processor to do that. And a CD-ROM won’t do a damn thing to help you with that problem.
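The distinction fits in a dozen lines of code. In the sketch below, the canned lines and the mood formula are invented purely to illustrate the contrast; the point is where the response comes from, not the particulars:

    #include <stdio.h>

    /* Data-intensive: the response is fetched, not computed. Three
       canned lines stand in for 550 megabytes of stored material. */
    const char *canned[3] = { "Hello.", "Go away.", "Nice weather." };

    const char *data_driven(int situation)
    {
        return canned[situation % 3];   /* can only replay what was stored */
    }

    /* Process-intensive: the response is computed at run-time from the
       character's (invented) personality and the player's action. */
    void process_driven(int friendliness, int player_was_rude)
    {
        int mood = friendliness - 5 * player_was_rude;   /* toy formula */
        if (mood > 0)
            printf("The character smiles (mood %d) and helps you.\n", mood);
        else
            printf("The character scowls (mood %d) and bars the door.\n", mood);
    }

    int main(void)
    {
        printf("%s\n", data_driven(1));   /* same line no matter what you did */
        process_driven(3, 1);             /* reacts to what you actually did */
        process_driven(8, 0);
        return 0;
    }

The first function needs a big disc; the second needs a processor. Only the second can respond to something the player has never done before.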

Thus, the CD-ROM represents a diversion from the path towards the very best games. It drags us away from algorithmic approaches and pulls us towards more data-intensive strategies. It therefore represents a wrong turn in game design.

On the other hand...
This is not to say that CD-ROM games will fail miserably. Here’s my guess as to the most likely future scenario for the technology:

The installed base of CD-ROM drives will continue to expand over the next few years, largely because of their undeniable utility for storing huge special-purpose databases. Likely sales to such a small installed base will not be sufficient to justify the development costs. Nevertheless, a large number of publishers will invest a great deal of money in CD-ROM titles in anticipation of being well-placed for what they hope to be a major new market. These publishers will release some impressive titles starting in 1991.

CD-I will hit the streets at about the same time, accompanied by much fanfare and excitement. The stock of entertainment titles will be adequate and sales will be acceptable. The novelty value of the games will provide an initial surge of enthusiasm. However, the steep price will prevent CD-I from achieving high sales volumes.

For the next few years, the market will be driven more by promise than product. Some publishers will continue to pour money into developing titles whose sales do not justify development costs. Consumers will become disillusioned with products that don’t quite deliver the entertainment value they had expected, but will peg their hopes on the new product due out Real Soon Now.

By 1995, the shine will be off the apple. The marketplace will have decided that CD entertainment software offers much sizzle and little steak. The technology won’t die out the way laser disks died out. With a firm foothold as a useful tool for serious applications, and a dedicated cadre of enthusiasts, it will hang in there. Instead of dying, it will lapse into a state of slow growth.

I think the experience of the national consumer telecommunications networks such as GEnie and CompuServe provides us with our best example of the likely growth of CD entertainment software. After an initial burst of enthusiasm, the networks settled down to the slow and painstaking task of building a market. They’ve not enjoyed the spectacular growth that home computers had in 1983, or that desktop publishing had in the late 80s. But they’ve continued to grow year by year. I think that the 1990s will be for CD entertainment software what the 1980s were for consumer telecommunications. By the turn of the century, CD-ROM, CD-I, and CD-otherwise will be well-established, profitable, and abundant. The compact disk may well have supplanted the floppy disk as the primary medium for selling entertainment software (assuming ISDN hasn’t already bumped off the floppy disk).

The important observation is that compact disk technology is not going to blow everything else out of the water. The CD is an impressive technology, but it has fundamental limitations (access time, development cost, and data intensity) that will prevent it from running away with the marketplace. This revolution will take another decade to mature.

To make it happen, though, designers will have to get over their initial infatuation with all that data and start to design games in which the CD-ROM plays a supporting role instead of hogging center stage. This will be impossible at first; everyone will want games that show off the capabilities of the technology. Once we get over that phase, then we can start to use the technology effectively.

I realize I’m going way out on a limb here, predicting the development of a major technology over the course of a decade. I’m one of the few people with more than a decade of experience in the industry, so I suppose that qualifies me to make a fool of myself.

Some Spectacular Failures
For those technological optimists who are ready to dismiss my predictions as fogeyisms, I’d like to offer a couple of cautionary tales.

Does anybody remember bubble memory? This technology, developed in the 1960s, relied on the admirable idea of moving the bits of data through the medium rather than moving the physical medium itself: tiny magnetic field coils push streams of magnetic bubbles through a substrate, and each bubble stores one bit of information. The technology boasted several advantages over floppy disks and hard disks. Bubbles are more robust than rotating media because there are no moving parts, and they promised lower power requirements, smaller size, and lower cost. Moreover, because they are a miniaturizable technology, bubbles promised a steep learning curve with large payoffs once production became established. By contrast, rotating media were a mature technology with little apparent potential for further improvement. By the late 1970s, the industry wisdom was that rotating media were archaic and would soon be replaced by bubbles.

It didn’t work out that way. Rotating media continued to improve year after year, always staying one jump ahead of bubbles. There was such a huge installed base of rotating media that the industry could afford to pour millions of dollars into research to improve the media, the drives, the heads, and the interface electronics. With all that R&D money, a fundamentally inferior technology was able to stay ahead of bubbles.

My other example is the laser disk arcade game. The first of these, Dragon’s Lair, burst upon the scene in the summer of 1983. It was a sensation. With magnificent graphics and animation created by Don Bluth, the game was a smash hit. All the arcade game companies frantically commissioned laser disk games, and by 1984 a number of these were on the market -- where they bombed. The second-generation laser disk games were much superior to Dragon’s Lair, yet they were still market failures. The value of the technology lay in its novelty; once that novelty was expended, people recognized that the game play just wasn’t there. The second-generation games played better than Dragon’s Lair, but not well enough to recoup their enormous development costs. Laser disk games have been all but abandoned in the arcades.

On this point, I’d like to lay claim to having called this shot correctly. In the summer of 1983, when everybody else went gaga over laser disk games, I was the only naysayer on the planet. Old Fogey Crawford harrumphed that the laser disk was a data-intensive technology, and so could not provide worthwhile game play. The marketing sharpies and technical whizzes at Atari smiled in patronizing humor at Old Man Crawford’s silly theories, and made their decisions based on solid market research and proven technological capability. Well, they were wrong and I was right. Nyaah, nyaah, nyaah!