One of my pet peeves when gamers discuss game consoles is the “#-bit graphics” phrase and argument. We have all used it to try and convey some idea of what a game looks like, but the better approach would be to define graphics by era or generation rather than by “bits.”

In other words, BITS DO NOT EQUAL GRAPHICS

Here is a quick example. Behold, the graphics of a game running on hardware powered by a 16-bit CPU (but an 8-bit VDP):

[Screenshot: a TI-99/4A game running in the Classic99 emulator]

Granted, I love the TI-99/4A, but if we are talking bits and graphics, it oddly does not look like a Sega Genesis or SNES game. If bits really equaled graphics, something would be terribly wrong here!

When I was a kid, everyone talked about bits. I have a few vivid memories of arguing, or hearing others argue, over which systems were better because of their bits. I even remember reasoning out with a friend that the Atari 2600 must have been 4-bit because the NES was 8-bit (we were 5-6 years old at the time). I also remember a big deal being made out of bits with PCs too, when it came to why a 386 was better than a 286 (there were other, more valid reasons, but bits would come up). Little did I know at the time that bits were a marketing ploy, invented as the easiest way to convey the difference between game systems when the 16-bitters rolled around, because we didn’t have a clue how to convince Mom and Dad why they should spend a lot of hard-earned cash on a new game console. “What’s wrong with the one you have? It’s just fine!” we would hear. The problem is that in reality, the whole issue of bits is much more complicated than that.

This video is a fine example of the same game concept coming to a bunch of different hardware platforms, most of which happen to use 8-bit primary components. I should mention that the artist(s) behind a game make an enormous difference, but I want to focus more on the hardware.

Yes, it is true that processors have a bit width: the amount of data they can process in one cycle, or move across the bus at a time, is defined in bits. But different parts of a processor can operate at different bit widths. The famous Motorola 68000, used in many arcade games, the Sega Genesis, the Atari Jaguar and other machines, had 32-bit internal registers but moved data externally over a 16-bit bus. If that is confusing, here is how I explained it when I trained people on computers years ago: the PC is a heart and the bits are blood cells. Every heartbeat is a clock cycle, and the bit width is how many blood cells can be pumped per beat. Sure, it is simplifying things, but it conveys the general idea. If I had two processors that were equal in every feature except the bits, the one with the higher bit width would be more efficient and faster. But the wider one would need software written specifically for it to take advantage of that width. For a long time, Windows has been available in 32-bit and 64-bit flavors, and the transition to 64-bit CPUs (which all new PCs use and have used for some years now) took some time. Still, if you are reading this on a PC built within the past 10 years, chances are you have a 64-bit CPU in your box.
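
To make that concrete, here is a toy sketch in C (my own illustration, not code from any real console) that does the same job one byte at a time and then eight bytes at a time. Both paths arrive at the identical answer; the wide one just takes fewer trips through the loop, which is the sense in which more bits buy you speed, not looks:

```c
/* Toy illustration: the same XOR checksum computed over a narrow
   (8-bit) path and a wide (64-bit) path. Same result either way;
   the wide path simply touches memory in bigger gulps. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* 8-bit style: one byte per step, like a narrow data bus. */
uint8_t checksum_8bit(const uint8_t *data, size_t len) {
    uint8_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum ^= data[i];
    return sum;
}

/* 64-bit style: eight bytes per step, then fold the byte lanes together.
   Assumes len is a multiple of 8 to keep the sketch short. */
uint8_t checksum_64bit(const uint8_t *data, size_t len) {
    uint64_t wide = 0;
    for (size_t i = 0; i < len; i += 8) {
        uint64_t chunk;
        memcpy(&chunk, data + i, 8); /* one 64-bit load instead of eight 8-bit loads */
        wide ^= chunk;
    }
    /* Fold the eight byte lanes down to a single byte. */
    wide ^= wide >> 32;
    wide ^= wide >> 16;
    wide ^= wide >> 8;
    return (uint8_t)wide;
}

int main(void) {
    uint8_t buf[64];
    for (int i = 0; i < 64; i++) buf[i] = (uint8_t)(i * 37);
    printf("8-bit path:  %02x\n", checksum_8bit(buf, 64));
    printf("64-bit path: %02x\n", checksum_64bit(buf, 64));
    return 0; /* same answer; the wide path just got there in fewer steps */
}
```

Notice the 64-bit version had to be written differently to exploit the wider path, which is exactly the “software written specifically for it” point above.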

The general idea is that bits have to do with speed more than they do with appearance. This is contrary to what kids of the 80s/90s understood, when marketing departments were constantly beating the importance of bits and graphics into our heads. But they were blowing smoke up our you-know-whats. All else being equal, a 1024-bit processor will be better than a 64-bit one, since much more information can be processed in the same amount of time. But that does not define a boundary on what the graphics can look like.

Tell me, what “bits” are the Nintendo Wii U, the Xbox One and the PlayStation 4? You can even take a step back to last generation: what “bits” were those? Notice that no company brags about what bits their systems are anymore. The PS2/Dreamcast battle is the last time I heard it brought up; then the original Xbox came along with its weak 32-bit CPU and a decent 128-bit GPU, and with that, the bits discussion ended for new systems. No systems are named after their bits anymore, and it is not printed right on the console like it was with the Sega Genesis, the Nintendo 64, the Jaguar, etc. Look up spec sheets for the newest consoles and that info is generally not mentioned or revealed by the manufacturers. Do you know why? Because players would read that a lot of the components in contemporary systems are 64-bit. It may come as a shock to some people, but the CPUs in the PS4 and the Xbox One are 64-bit; if each generation were higher in “bits,” they would all be pure 512-bit consoles by now. But they aren’t. Much like the original Xbox’s 32-bit CPU, this would cause a lot of confusion if it were pushed heavily to a generation raised on the “bits = graphics” idea, so manufacturers don’t even bother to mention it anymore. To do so would admit that they were pushing a bit of BS to get you to buy; from a marketing perspective, it is much easier and smarter to just ignore the issue than to try to explain to everyone that they were hoodwinked. Shake your fist and curse Sega in the early 90s for being the ones to really push that in their Genesis marketing. Here are a few commercials from the time (the last one is in Dutch, I think, but you can hear them drop the “16-bit” line several times):

If you still aren’t convinced, then let’s assume for a moment that you are right: that there is a clearly defined level of bit graphics, such as “64-bit graphics.” How come these new systems sporting 64-bit CPUs are LIGHT-YEARS beyond what an Atari Jaguar or Nintendo 64 can do? Also, please explain the PC graphics cards on Newegg.com: a search there shows a couple of hundred currently available cards that are all listed as “64-bit.” Are these all supposed to produce something that looks like it is straight out of the mid-90s, or can they do better? Oh, they would be better? WHY IS THAT, SENATOR???

Because bits aren’t the most important piece in the system. They do matter to a degree, but again, they have more to do with the speed at which information is processed within the system.

Any computer system has many components. Typically the CPU, the RAM and the GPU are the parts most people know about (terms like hard drive or modem don’t matter in this discussion). But there are other pieces to the puzzle as well. One that is rarely mentioned is the bus. If you look at a motherboard, buses are the tiny lines you see connecting the different microchips together. They have speeds measured in MHz/GHz and widths measured in bits; RAM does too. Then there are features you can cram into a microchip that don’t have a whole lot to do with bits. Cores, for example, have been a buzzword for a long time; Hyper-Threading too. GPUs have stream processors, hundreds of them, plus many other components that help draw graphics in certain ways: pixel shaders, RAMDACs, VRAM and a bunch of other stuff that have little or nothing to do with bits, aside from needing a wider bus to squeeze everything out a little faster.
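
To see why bus width and clock speed matter together, here is a quick back-of-the-envelope sketch in C. The formula (width in bytes times clock) is the standard theoretical peak assuming one transfer per cycle; the specific widths and clock speeds below are illustrative numbers I picked, not official specs for any console:

```c
/* Back-of-the-envelope peak bus bandwidth: (width in bits / 8) bytes
   per cycle, times the clock in MHz, gives MB/s (10^6 bytes/s).
   Figures below are rough illustrations, not real spec-sheet values. */
#include <stdio.h>

static double bandwidth_mb_per_s(int bus_width_bits, double clock_mhz) {
    /* assumes one transfer per clock cycle, for simplicity */
    return (bus_width_bits / 8.0) * clock_mhz;
}

int main(void) {
    printf("16-bit bus @ 7.6 MHz : %7.1f MB/s\n", bandwidth_mb_per_s(16, 7.6));
    printf("64-bit bus @ 26.6 MHz: %7.1f MB/s\n", bandwidth_mb_per_s(64, 26.6));
    printf("64-bit bus @ 100 MHz : %7.1f MB/s\n", bandwidth_mb_per_s(64, 100.0));
    return 0;
}
```

Double the width or double the clock and you double the theoretical bandwidth; neither number by itself tells you anything about what the picture looks like.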

The architecture of a game system is very important but not well understood by most gamers. Architecture is how all of these components are tied together and how they operate. There are various ways to design a computer, and it does not have to fit the notion you are used to, either. x86-64 PCs are just one way; Macs are another; Android tablets have a ton of variations; and game consoles almost always shuffle and process data in their own ways. All of these factors affect how games can potentially look on a system. Bits are just one component of it all.

The Texas Instruments 99/4A was the first home computer to use a 16-bit processor, but no one would compare it to a Genesis. The TurboGrafx-16 took some heat for promoting itself as a 16-bit system while having an 8-bit CPU alongside a 16-bit video display processor. The Atari 2600 had a bunch of 8-bit components, but it could not do things that other 8-bit systems like the NES or SMS could do; then again, if you compare Combat to Solaris, you probably wouldn’t know they were running on the same hardware. The Atari Jaguar is always one that people argue about. It had a 16-bit CPU, a 32-bit GPU with two 64-bit chips embedded in it that could directly manipulate the screen, a 32-bit sound chip, a 64-bit bus and 64-bit RAM chips. The architecture was very odd, though. It used a unified memory bus (as the Xbox 360 later did, although the 360 has better cache memory to avoid some of the issues such a bus can have), so you could get more out of it by running everything in parallel, or you could completely ignore chips like the 16-bit Motorola 68k (some coders experimenting on the Jaguar have found performance boosts from not using that chip, since it slows everything down when it has to use the bus). But you can’t explain that in an ad. It had enough 64-bit components that Atari could market it as a 64-bit system, and they took a page from Sega and tried to use that as a way to get people to buy. It worked for the Genesis but not for the Jaguar; and looking at sales of the N64, that gimmick didn’t win over more consumers at the end of the day compared to the PS1, either.

(Jump to the 2:00 mark for the game. The Jaguar was made for pushing 2D graphics; the main hardware was finished in 1992, and it was released in 1993 while Atari scraped enough cash together for a launch. With proper coding and art it could pull off some great stuff, like the NATIVE demo above. It is silly to believe that it and the Nintendo 64, released in 1996, should have similar graphics capabilities. The same goes for the PlayStation, which had a couple of extra years of hardware development time and a much bigger budget behind it. If Sony or Nintendo had made a system that couldn’t produce better-than-Jaguar graphics by 1995/96, they would rightly have been called incompetent.)

If I designed an 8-bit system with today’s technology, I could blow away anything a system from the 80s could do, even if I kept it all 8-bit, because I could run it at a far higher clock speed, with much more memory and a few other bells and whistles that wouldn’t have existed then. It would be an 8-bit system, but the graphics it produced would go way outside the presumed “8-bit graphics” realm. An 8-bit CPU running at 1 GHz would crush the old 6502, which ran at 1.79 MHz. A GPU/VDP running at today’s speeds, with some modern features but limited to an 8-bit bandwidth, could produce much more color, higher resolution and more effects than any processor made at the time. Many 8-bit systems shipped with only a few kilobytes of RAM; 4 GB would seem like overkill, and even 4 MB in an 80s-era system would have made a huge difference. There is one area where bits are genuinely helpful: 3D calculations, where 32-bit math is far more efficient. But 3D isn’t impossible on an 8-bit system (here is a demo showing some 3D; another one showing raycasting on an Atari 8-bit computer). Recall that the NES stock hardware wasn’t all that great: to produce many of the games people still remember fondly, like Castlevania III, it needed extra hardware in the cartridge to overcome some of the system’s limitations.
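
Why is 32-bit math so much more efficient on a wider CPU? Because an 8-bit processor has to break every 32-bit operation into byte-sized pieces and chain them together with carries. Here is a small C sketch of my own (illustrative, not actual 6502 code) showing a 32-bit addition done the way an 8-bit chip would do it: four add-with-carry steps instead of one instruction.

```c
/* Illustration of 32-bit addition using only 8-bit operations plus an
   explicit carry, the way a 6502 would chain ADC instructions.
   My own sketch, not real 6502 code. */
#include <stdint.h>
#include <stdio.h>

/* Numbers are stored little-endian as four separate bytes each. */
void add32_on_8bit(const uint8_t a[4], const uint8_t b[4], uint8_t out[4]) {
    unsigned carry = 0;
    for (int i = 0; i < 4; i++) {           /* four passes instead of one add */
        unsigned sum = a[i] + b[i] + carry; /* 8-bit add with carry in */
        out[i] = (uint8_t)(sum & 0xFF);
        carry  = sum >> 8;                  /* carry out, into the next byte */
    }
}

int main(void) {
    /* 0x0001FFFF (131,071) + 0x00000001 (1): the carry has to ripple
       through two full bytes to reach the right answer. */
    uint8_t a[4] = {0xFF, 0xFF, 0x01, 0x00};
    uint8_t b[4] = {0x01, 0x00, 0x00, 0x00};
    uint8_t r[4];
    add32_on_8bit(a, b, r);
    printf("result = 0x%02X%02X%02X%02X\n", r[3], r[2], r[1], r[0]);
    return 0; /* prints 0x00020000 = 131,072 */
}
```

A 32-bit CPU does all of that in a single add. Multiply the overhead across the thousands of coordinate calculations per frame that 3D requires and you can see why wider math helps there.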

It is 2014. We need to bury the stupid argument over which old 90s system is better based solely on bits, and the “I thought X system should have been able to produce better graphics because of the bits” nonsense along with it. It is far more interesting to watch developers create games that push the boundaries of what a system was designed to do than to quibble over such minor things.

 
