With modern consoles offering such realistic graphics, it's easy to forget how cartoonishly blocky games were in the 8-bit era. In his new book, Creating Q*Bert and Other Classic Arcade Games, legendary game designer and programmer Warren Davis recalls his days imagining and designing some of the biggest hits ever to grace an arcade. In this excerpt, he describes the industry's technological leap from 8- to 12-bit graphics.
Published by Santa Monica Press, 2021.
I was particularly interested in the A-Squared video digitizer, a new product for the Amiga computer unlike anything I'd seen before. Let's unpack that slowly.
The Amiga was a home computer that could display 4,096 colors, with eight-bit stereo sound. It could do things with images that the IBM PC of the day simply could not. We had one at Williams partly because of its capabilities, and partly because of our own Jack Haeger, who had been the art director for the Amiga design team.
Video digitization is the process of grabbing a video image from a camera or videotape and converting it into data that a computer system can use. A full-color photograph might contain millions of different colors. Even though the Amiga could display only 4,096 colors, that was enough for an image on its monitor to look almost perfectly photographic.
Our video game system at the time displayed only 16 colors. Photographic imagery simply wasn't possible at that level, but everyone in the video game industry knew that would change. As memory became cheaper, we knew it would soon be possible to build a 256-color system. In fact, our hardware designer, Mark Loffredo, was already thinking about a new hardware system when I started looking into digitized video.
Let's discuss color resolution for a second. You know you want to. (If you don't, feel free to skip the next few paragraphs.) The number of colors a computer system can display is called its color resolution, and it's all tied to memory. Our video game system could display 16 colors, but artists weren't locked into any particular set of colors. The hardware used a palette: artists could choose from a wide range of colors, but only 16 of them could be stored in the palette at any given time. Those palette entries could even be changed while the game was running, which enabled a technique used in old video games called "color cycling."
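The idea behind color cycling can be sketched in a few lines: the pixels in screen memory never change; rotating a run of palette entries is what animates the image. This is a minimal illustration, not Williams code; the 16-entry palette and the choice of entries 8-11 as the animated run are assumptions for the example.

```python
# Palette-driven color cycling: rotate a segment of palette entries by one
# position each frame. Pixels referencing those entries appear to animate
# even though screen memory is untouched.

def cycle_palette(palette, start, end):
    """Return a new palette with entries [start, end) rotated by one."""
    segment = palette[start:end]
    return palette[:start] + segment[-1:] + segment[:-1] + palette[end:]

# A hypothetical 16-entry palette (values stand in for 12-bit RGB colors);
# entries 8-11 might hold the blues of a waterfall effect.
palette = list(range(16))
frame1 = cycle_palette(palette, 8, 12)
```

Calling `cycle_palette` once per frame loops the segment endlessly, which is why waterfalls and marquee lights were cheap to animate on palette hardware.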
For every pixel location on the screen, the hardware had to know which of the 16 palette colors to display there. This collection of memory was called screen memory. It takes 4 bits to represent 16 numbers, so at 4 bits per pixel, 1 byte of memory can hold 2 pixels. To support 256 colors, you would need 8 bits to represent each color index. That's 8 bits per pixel.
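The memory arithmetic above is easy to make concrete. The 320x240 resolution here is purely illustrative (the actual Williams hardware resolution isn't stated in this excerpt); what matters is the ratio between 4 and 8 bits per pixel.

```python
# Screen memory required at different color resolutions.
# bits_per_pixel = bits needed to index one palette entry per pixel.

def screen_memory_bytes(width, height, bits_per_pixel):
    """Total bytes of screen memory for a packed-pixel display."""
    return width * height * bits_per_pixel // 8

mem_16  = screen_memory_bytes(320, 240, 4)  # 16 colors  -> 4 bits/pixel
mem_256 = screen_memory_bytes(320, 240, 8)  # 256 colors -> 8 bits/pixel
```

Doubling the color index from 4 to 8 bits doubles the screen memory, which is exactly the cost that management had to be convinced to absorb.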
In other words, you would need twice as much screen memory to display 256 colors as you did for 16. Memory wasn't cheap, and game manufacturers wanted to keep costs down as much as possible. Memory prices had to drop before management would approve doubling the screen memory.
Today we take for granted 24-bit-per-pixel color resolution, which allows up to 16,777,216 colors and true photographic quality. Back then, 256 colors seemed like a luxury. Even though that didn't approach the Amiga's 4,096 colors, I was convinced it could produce close to photo-realistic images. The idea of having movie-quality images in a video game excited me, so I pitched management on the advantages of getting a head start on this technology, and they bought me a digitizer to play with.
The Amiga's digitizer was crude. Very crude. It consisted of a piece of hardware that plugged into the Amiga on one end and a black-and-white camera on the other. The camera had to be mounted on a tripod. You pointed it at your subject, then placed a motorized color wheel between the camera and the subject. The color wheel was a circular piece of plastic divided into quarters of different colors.
When you started the capture, the motor turned the color wheel slowly, and in about thirty to forty seconds you had a full-color image of your subject. "Full color" on the Amiga meant 4 bits each of red, green, and blue, for a total of 4,096 possible colors.
It's hard to convey now just how exciting this was. It felt like something out of a science fiction novel. It wasn't so much how it worked as the potential it represented. The Amiga digitizer wasn't practical for production, and the time it took to grab each image made the process mind-numbingly slow, but just having the ability to produce 12-bit images at all let me start exploring algorithms for color reduction.
Color reduction is the process of taking an image with a large number of colors and finding a smaller set of colors that represents it well. If you could do that, each pixel of the image could be stored as an index pointing to one of the colors in a palette. With a 256-color palette, each index would fit into a single byte.
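Indexed color is simple enough to show directly. A tiny sketch, with a toy four-entry palette standing in for a full 256-entry one:

```python
# Indexed color: each pixel is a one-byte index into a palette of RGB
# colors, rather than an RGB value itself.

palette = [(0, 0, 0), (255, 0, 0), (0, 255, 0), (0, 0, 255)]  # up to 256 entries

# Screen memory becomes one byte per pixel.
pixels = bytes([0, 1, 1, 3, 2, 0])

# Display hardware resolves each index through the palette on the way out.
decoded = [palette[i] for i in pixels]
```

The payoff is the memory ratio: a 24-bit RGB pixel takes three bytes, while an indexed pixel takes one, at the cost of limiting each image to the colors in its palette.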
I needed an algorithm that could pick the best 256 colors out of the thousands in a digitized image. Since there was no internet back then, I scoured academic journals and technical magazines for research in this area, and I found some: a number of papers had been written on the subject, each with a different approach. Over the next few weeks I implemented several of these methods to generate 256-color palettes. Some gave better results than others. Images that were inherently monochromatic looked best, since many of the palette's colors could be allotted to different shades of a single color.
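The excerpt doesn't say which methods Davis implemented, but one of the simplest approaches in that literature is the "popularity" algorithm: keep the N most frequent colors, then map every pixel to its nearest kept color. A toy sketch of that idea, not Davis's actual code:

```python
# Popularity-based color reduction: build a palette from the most common
# colors, then snap each pixel to its nearest palette entry.
from collections import Counter

def popularity_palette(pixels, n):
    """Pick the n most frequent colors as the palette."""
    return [color for color, _ in Counter(pixels).most_common(n)]

def nearest(color, palette):
    """Closest palette entry by squared RGB distance."""
    return min(palette, key=lambda p: sum((a - b) ** 2 for a, b in zip(color, p)))

# Toy image: mostly dark gray, some red, a few off-grays.
pixels = [(10, 10, 10)] * 50 + [(200, 0, 0)] * 30 + [(12, 12, 12)] * 5
pal = popularity_palette(pixels, 2)
reduced = [nearest(c, pal) for c in pixels]
```

The popularity approach fails badly on images with many small but important color regions, which is one reason later methods such as median-cut quantization generally give better results.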
Meanwhile, Loffredo was working on his hardware. His plan was to support multiple circuit boards that could be inserted into slots as needed, much like a PC. A single board gave you one surface plane to draw on. A second board gave you two planes, foreground and background. With enough planes, each scrolling horizontally at a slightly different rate, you could create the illusion of depth in a game.
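That scrolling trick is what's now called parallax scrolling, and the core of it is one multiplication per plane. A minimal sketch, with the per-plane speed factors chosen purely for illustration:

```python
# Parallax scrolling: each plane scrolls at a fraction of the camera's
# horizontal movement, so "nearer" planes appear to move faster.

def plane_offsets(camera_x, speeds):
    """Horizontal pixel offset for each plane given the camera position."""
    return [int(camera_x * s) for s in speeds]

# background, midground, foreground (illustrative speed factors)
offsets = plane_offsets(100, [0.25, 0.5, 1.0])
```

With the foreground locked to the camera and distant planes trailing behind it, a flat stack of 2D planes reads as a scene with depth.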
Then the day came when Eugene Jarvis, having completed his degree, returned to Williams to head up the video department. This was big news, and most people were excited about it. I know I was, because the video department was still without a strong leader. Given his already legendary status at Williams, Eugene was the perfect person to take the lead, partly because he had strong ideas about where to take the department and partly because management had faith in him. Unlike anyone else, who would have had to convince management to go along with an idea, Eugene had carte blanche in their eyes. He told management what we needed to do, and they made sure he, and we, had the resources to do it.
Unfortunately, this meant Loffredo's hardware system was toast. Eugene had his own ideas, and everyone jumped on board. He wanted to build a system around a new chip from Texas Instruments, the 34010, which had graphics functionality at its core. Normally, the graphics portion of the hardware would have no direct connection to the CPU; the 34010's capabilities eliminated the need for a separate graphics co-processor.
However, the speed of its graphics functions, while well suited to light graphics work such as spreadsheets and word processors, was certainly not fast enough for pushing pixels the way we needed. So Mark Loffredo went back to the drawing board to design a custom chip for the new system.
Then a new piece of hardware arrived in the marketplace that signaled the next generation of video digitizing. The ICB was developed by a group within AT&T called the EPICenter, which eventually split from AT&T and became Truevision. The ICB was one of three boards they offered, the others being the VDA and the TARGA. The ICB came with a piece of software called TIPS that let you do some minor editing on captured images. The boards were designed to plug into an internal slot on an IBM PC running DOS, the PC's original text-based operating system. Where was Windows? At the time of the boards' release, Windows 1.0 existed but was not widely used or accepted; it wasn't until version 3.0 arrived in 1990 that Windows became popular.
Truevision created the file format for its TARGA series of boards, which is why the TGA file format is still around today. The ICB was a huge leap forward from the Amiga digitizer: you could use a color video camera, and the time to grab a frame was greatly reduced. It stored colors as 16 bits instead of 12. This meant 5 bits each of red, green, and blue, the same as our game hardware used, and resulted in a true-color image of up to 32,768 colors. Palette reduction would still be a crucial part of the process. The Truevision boards also came with a Software Development Kit, which allowed me to write my own software to control the board and tailor it to my specific needs. This was amazing. I was so excited that my head was spinning.
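The "5 bits each of red, green, and blue" layout is commonly known as RGB555: fifteen color bits packed into a 16-bit word, leaving one bit unused. A small sketch of the packing arithmetic (the exact bit order on the Williams hardware is an assumption here; only the 5-5-5 split is stated in the text):

```python
# Pack 8-bit-per-channel RGB into a 15-bit 5-5-5 value in a 16-bit word,
# and unpack it again. Each channel keeps only its top 5 bits.

def pack_rgb555(r, g, b):
    """Pack 8-bit R, G, B into a 16-bit word as 0RRRRRGGGGGBBBBB."""
    return ((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3)

def unpack_rgb555(word):
    """Recover the 5-bit channel values (0-31 each)."""
    return ((word >> 10) & 31, (word >> 5) & 31, word & 31)

white = pack_rgb555(255, 255, 255)   # all fifteen color bits set
colors = 2 ** 15                     # 32,768 distinct colors
```

Five bits per channel gives 32 levels each, and 32 x 32 x 32 is exactly the 32,768 colors mentioned above.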
I don't think anyone making video games in those days could help but think about the future. The speed and memory limitations we were forced to work under were temporary. If the video game industry was not just a fad, and we believed it wasn't, then we were at the forefront of a new form of storytelling. Maybe this was especially true for me because of my interest in making movies, but my experience in the game industry made me think about what might be possible. The holy grail for me was interactive movies. The idea of telling a story in which the player was not a passive viewer but an active participant was very compelling, and people were already experimenting with it. Text adventures, Zork and the rest of the Infocom games, were probably the earliest examples. I didn't know if the technology needed to achieve my goal of fully interactive movies with film-quality graphics would ever exist. But these visions of the future were just thoughts in my head. It's nice to dream, but at some point you have to come back to reality. And if you don't take the one step in front of you, you can be sure you'll never reach your ultimate destination.
Diving into the task, I began to learn the board's capabilities and limitations. The first iteration of my software let you grab a single image from either a live camera or a videotape. I added a few different color reduction methods so you could find the best palette for that image. Then, since all the images of an animation needed a consistent look, I added the ability to find the best palette for a group of images. Those early boards lacked a chroma key function, so artists had to erase the background of each image manually; I made some tools to help them do that.
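The chroma key step those boards were missing is conceptually simple: flag every pixel close to a known backdrop color as transparent. A toy sketch of the idea, not the tools Davis built; the key color and threshold are illustrative:

```python
# Naive chroma key: pixels within a color-distance threshold of the
# backdrop color become transparent (None); everything else is kept.

def chroma_key(pixels, key, threshold=30):
    """Replace near-key pixels with None (transparent)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [None if dist2(p, key) <= threshold ** 2 else p for p in pixels]

# A green-screen backdrop pixel, a near-green pixel, and a skin tone.
frame = [(0, 255, 0), (0, 250, 5), (180, 60, 40)]
keyed = chroma_key(frame, key=(0, 255, 0))
```

Real keying has to cope with shadows, color spill, and soft edges, which is why erasing backgrounds by hand remained tedious even with helper tools.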
This was far from what I ultimately wanted: a system where we could point a camera at live actors and almost instantly have an animation of their performance running on our game hardware. But it was a start.