Yes, boys and girls, the big new thing in game design is not some new graphics chip or some snazzy new storage medium with 33 zillion bytes of space: it's mathematics (Yay!)

OK, I know that some of you may be less than thrilled by the prospect, so let me explain. The trigger point for this essay was a book I read recently called Capitalism & Arithmetic: the New Math of the 15th Century. The book shows how arithmetic developed in the 15th century as a response to the growing mercantilism of the day. Once people started trading lots of things in lots of different currencies, it became more important to figure prices and costs correctly. Yet the mechanisms for such calculations were woefully inadequate. In the 14th century, most people were using Roman numerals and abaci for calculations, a truly tedious approach that was only useful for simple calculations involving just addition and subtraction. The big developments of the 15th century were 1) the widespread acceptance of the Arabic/Hindu numeral system; and 2) the development of practical methods for multiplication and division. Here is where I came across one of the most startling quotes in the book. It was found in a letter from a German father to his college-bound son. The son had written asking where he should go to school. The father wrote back, "If you want to be just a normal everyday businessman, then any of the colleges in Germany will do fine, but if you really want to master the fine points of business, then you’ll need multiplication and division, and for that, you must go to college in Italy."

Now, it’s tempting for us to laugh at the lack of sophistication of these people. I mean, rully, dahling, any 6th grader can multiply and divide. They sure were dumb back then.

However, let me point out a few things. There were many different methods for multiplication and division, each one most applicable to a different class of problem. However, different methods consumed different amounts of paper, and paper was an expensive commodity in those days, so you had to be well-versed in all the different methods. I hold a master’s degree in physics and am quite well versed in math, but I had to struggle to understand some of the rather convoluted methods these people used.

But there’s another, larger lesson to be learned here: mathematics has taken an increasing role in our lives. A thousand years ago, nobody needed math for anything. It was purely an academic discipline. But as trade picked up, a need for basic arithmetic developed, and by the 15th century that need was so great that most businessmen had to learn it. By the 16th century basic arithmetic (addition, subtraction, multiplication, and division) was commonplace among businessmen. Numeracy, the general knowledge of mathematics, increased over the centuries. By the 19th century, for example, all military officers were required to learn math up through trigonometry, and in fact trig was considered standard fare at the college level. By the early 20th century, the increasing demand for scientists and engineers had promoted calculus to the level of common college material. Nowadays, of course, the brighter high school students study calculus.

Technology has done much to increase numeracy. When I was an undergraduate student in physics, we used slide rules for all our calculations. (I still have my father’s slide rule and I still know how to use it.) However, such methods required a great deal of mental preparation. I knew all sorts of tricks for performing calculations in my head or boosting the utility of my slide rule. Then came electronic calculators. At a stroke, a tremendous amount of expertise went down the drain. Within two years all the undergraduates had them and suddenly numerical calculations were much simpler. It changed the way we thought, the way we approached calculations. Taking a standard deviation, previously a hideously tedious task, became ridiculously simple with my HP-45. I could do correlation coefficients in a matter of minutes! Suddenly a huge range of computational opportunities lay at my feet. But to take advantage of these opportunities, I had to increase my grasp of math. What good is it to take a correlation coefficient if you don’t know what one is?
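For readers who have forgotten what a correlation coefficient even is, it's nothing exotic: four-function arithmetic plus a square root, repeated over a list of numbers. Here's a minimal sketch in Python (my own illustration of the standard formulas, not anything specific to the HP-45):

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def std_dev(xs):
    # Population standard deviation: root of the mean squared deviation.
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def correlation(xs, ys):
    # Pearson's r: covariance divided by the product of standard deviations.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (std_dev(xs) * std_dev(ys))

print(correlation([1, 2, 3, 4], [2, 4, 6, 8]))  # perfectly correlated: 1.0
```

Tedious by hand, trivial by machine; that's exactly the shift the calculator made.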

Technology continues to advance numeracy. Nowadays, there are literally millions of people out there using spreadsheets; do you realize how much math can go into a typical spreadsheet? Sure, lotsa spreadsheets use little more than simple sums, but it’s surprising how many users, once they discover all the functions built into the software, start using those functions. Some people build impressive spreadsheets with all sorts of mathematical complexity. And look at all the other mathematical calculations that are part of our everyday lives. How many times a day do you calculate tips at 15% or sales tax at 8%? If your marginal tax rate is 35% and it costs $800 a month to rent a home and $1200 a month in interest costs to buy it, should you buy or rent? If you’re earning $65K and your boss promises you a 5% pay raise, how much is that? If your credit card has an interest rate on unpaid bills of 2% per month, is that good? Do you see how much this stuff has permeated our lives?
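To make the point concrete, here's the arithmetic behind a few of those everyday questions, sketched in Python using the numbers from the paragraph above (the rent-vs-buy line is deliberately simplified to the single factor mentioned; a real decision involves many more):

```python
# 5% raise on a $65K salary
salary = 65_000
raise_amount = salary * 0.05                    # $3,250

# 2% per month compounds to well over 24% a year
monthly_rate = 0.02
annual_rate = (1 + monthly_rate) ** 12 - 1      # about 26.8% per year

# Rent vs. buy: if mortgage interest is deductible at the 35% marginal
# rate, the after-tax cost of that $1200 interest drops below the rent.
rent = 800
interest = 1_200
tax_rate = 0.35
after_tax_interest = interest * (1 - tax_rate)  # $780, vs. $800 rent

print(raise_amount, round(annual_rate * 100, 1), after_tax_interest)
```

None of this is beyond a pocket calculator; the point is how routinely such calculations now come up.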

This may surprise those of you who have read some of the doomsday books bewailing the diminution of numeracy in our society. These books claim that people just don’t have the mathematical skills that they once enjoyed. In some ways, this is true. For example, I have noticed that many people have a problem with scientific notation. Jeez, I learned that in high school, as part of the standard coursework that all college-bound students took. But perhaps scientific notation isn’t as critical in a calculator-rich society. (Does anybody remember "casting out nines"?) I don’t know. I do know that mathematical requirements change in two ways with time: first, they shift upwards in complexity, and second, they penetrate wider parts of society.

"What does any of this have to do with game design?" you ask in consternation. (Many people often ask this question when I start talking.) My point for game designers is that we are now entering a new period in which mathematics will play a larger role than before.

Quick, let me digress again.

Since the dawn of computer game design, far back in the misty ruins of time, the primary design issues have been technical ones, or, more specifically, programming issues. In the early days, everything hung on how clever you were with assembly language code. More recently, assembly language has given way to C, and the gnarliest programming talents have become less useful. With each passing year, programming games has been done at a higher and higher level, and designers have been able to give attention to larger or more involved problems. One of the unnoticed aspects of the smashing success of Doom is that the spiffy 3D texture-mapped graphics engine that it uses relies on some fairly heavy-duty math.

The use of math can only gain ground. Remember, during the 8-bit days, you could only add and subtract; multiplication and division required separate software routines and were much slower. With the advent of the 16-bit processors, hardware integer multiplication and division became standard, and the ability of programs to take advantage of math took a giant leap forward. Nowadays, we’re seeing floating point hardware built into the chips, and general-purpose arithmetic is suddenly becoming much easier. This may be one of the most significant developments of the big new CPUs. Sure, they’re faster and they load more bits, but the big qualitative change is their ability to do real arithmetic.
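For those who never had to write one, an 8-bit-era multiply routine was typically a shift-and-add loop: the CPU could only add, subtract, and shift, so multiplication meant one pass per bit of the multiplier. Here's the technique rendered in Python for readability (a sketch of the general method, not any particular machine's routine):

```python
def shift_and_add_multiply(a, b):
    """Multiply two non-negative integers using only addition and
    bit shifts, the way 8-bit CPUs without a multiply instruction had to."""
    product = 0
    while b:
        if b & 1:        # low bit of multiplier set: add in the multiplicand
            product += a
        a <<= 1          # shift multiplicand left (doubles it)
        b >>= 1          # shift multiplier right to examine the next bit
    return product

print(shift_and_add_multiply(13, 11))  # 143
```

Eight or sixteen iterations of that loop per multiply, in software, is why arithmetic-heavy designs were simply off the table in those days.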

But there’s an even more important reason why math will play a larger role in our designs: we’ll need it. Just how do you think that we’ll be able to put together big complex interactive storytelling systems? So far, we’ve been able to run role-playing games and graphic adventures with simple-minded boolean logic. But let’s face it, folks, you can only accomplish so much with single-bit calculations. It’s funny, it seems that the whole thrust of everything else has been for higher and higher resolutions. We wanted more color resolution (more bits/pixel in our screens), so we went from black and white to 16 colors to 256 colors. We wanted more spatial resolution, so we went from 320x200 to 640x480. We wanted more aural resolution, so we went from simple beeps on the PC speaker to digitized sound effects and music. The march of computer games has been one of adding more and more bits of resolution in everything except our logic. When it comes to logic, we’re still poking along with the same ancient one-bit technology that our great-grandads used on the Atari VCS.

And what is the multi-bit extension of boolean logic? Arithmetic! Don’t you think it’s high time that we brought our logic into at least the 1980s?

How will we use arithmetic? It’s obvious for such things as flight simulators and 3D texture-mapped graphics; but it’s also necessary for such things as interactive storytelling. For example, if you’re going to simulate a character’s emotional reactions to a situation, do you really expect that reaction to be handled as a simple boolean on-off switch? Will your characters be one-bit, black-and-white characters, or will they have multi-bit shades of gray? Will their responses to emotionally complex situations be simple IF-THEN relationships, or will they be more probabilistically calculated?
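To illustrate the contrast, here is a toy character model in Python. The model itself is entirely my own invention for illustration, not a claim about any shipping game: a boolean character flips a one-bit switch, while an arithmetic character blends multi-bit shades of gray into a graded, probabilistic response:

```python
import random

# One-bit logic: the character is either provoked or not.
def boolean_reaction(insulted):
    return "attack" if insulted else "ignore"

# Multi-bit logic: anger and fear are continuous values in [0, 1],
# combined arithmetically, and the response is drawn probabilistically
# rather than switched on or off.
def arithmetic_reaction(anger, fear, rng=random.random):
    hostility = anger * (1.0 - 0.5 * fear)  # fear tempers anger
    if rng() < hostility:
        return "attack"
    elif hostility > 0.3:
        return "glare"
    return "ignore"
```

With anger at 0.9 and fear at 0.8, hostility comes out to 0.54, so the character attacks only about half the time, and glares otherwise. Nothing here is beyond four-function algebra, yet the behavior is already far richer than any IF-THEN switch.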

Note that I am not arguing for high-powered stuff. We really don’t need to talk about differential equations, integral calculus, tensors, or that other stuff. For now, simple four-function algebra is all we need to do some really impressive stuff. We can get snazzy later.

This raises another point: technology progresses much faster than art. When we contemplate the astounding pace of improvement in microcomputer technology over the last twenty years, we are tempted to sweep it all together into a glowing fog of technological optimism. But if you think about it, there have been just two fundamental technological developments affecting microcomputers over the last twenty years: the decreasing size of integrated circuit elements and the improvements in disk drive heads and media. We might also include the compact disk technology, but it was created more than twenty years ago and all we’ve seen in the last twenty years has been the hooking up of I/O circuitry to the medium. Everything else has been followup or secondary-level technology. And consider further that both of these fundamental improvements were primarily matters of refining a technology. Integrated circuits and disk drives are 1960s technologies; we’ve spent the last thirty years just touching them up.

Compare all this dramatic growth with the rate of improvement of our appreciation of the technology. Consider, for example, the idea of the spreadsheet. We had the hardware for spreadsheets in the 1950s, but it wasn’t until 1979 that somebody actually thought of the idea. Telecommunications systems such as CompuServe and the Internet all existed in the early 1980s, yet none of this really caught fire until the 1990s.

And look at games as a medium rather than a technology. Sure, we’ve seen dramatic leaps in the technological side of games, but what about the design side? I think we all have to admit to ourselves that our current products are little more than rehashed versions of designs dating back to the 1970s. Doom is just a souped-up version of Combat! for the Atari VCS. Myst is just a snazzy version of Zork. I suppose it’s unfair to use the term "just" in these comparisons; after all, a lot of impressive work went into both products. But the fact remains that, in design terms, we really haven’t made that much progress.

That’s because technology progresses on a simpler basis than art. To get from an 8008 to a Pentium, all you have to do is decrease the channel size. That’s one dimension of improvement. It takes a lot of work, mind you, but it’s still basically a single-goal task. But to go from Pong to interactive storytelling is another matter entirely. To do this, we have to integrate a much larger range of ideas. This process is necessarily slow and clumsy.

Consider the artistic versus technological development of other media. The basic hardware of music-making was pretty much nailed down by 1700 AD, yet it took another 150 years to develop the theory for how to use all that hardware effectively. The technology of moviemaking was in place by 1900, with sound coming in around 1930 and color around 1945, yet the art of moviemaking has greatly lagged the technology. They didn’t really start to get a good grip on plain old silent moviemaking until the 1920s; good talkies became commonplace in the late 30s; and the full panoply of moviemaking that we see today has developed all the more slowly. Compare one of the James Bond movies of the 1960s with one from the 80s and you’ll see what I mean. Even though the technology hasn’t changed much, the art has changed dramatically.

So we must accept that game design will advance much more slowly than the technology. It will only advance as we integrate a wider variety of disciplines into the overall art of interactive entertainment design. One of those disciplines, I am certain, will be mathematics.