No one pushes a harmless CPU and graphics card to the very brink quite like a serious gamer. It’s a point of pride with many players that their gaming system can skillfully walk the line between bleeding-edge game play and suddenly melting processors. (It’s one thing to have an advanced cooling system, but it’s something else completely when you feel compelled to keep a fire extinguisher close at hand.)
Gamers will go to extreme lengths just to eke out a better refresh rate or a few more frames per second, and they’re quick to take advantage of the latest technology as it is released. Over the last 30 years there have been some significant advances in CPU and graphics technology, with the occasional plateau as well. Together, the gaming industry and gamers have pushed the evolution of the common CPU and its associated graphics cards to the point that it’s hard to believe how far we’ve really come.
The Genesis of Gaming PCs
The Commodore 64 was introduced in January 1982 with a CPU that clocked in at roughly 1 megahertz and featured a wholesome 64 kilobytes of RAM. The 16 colors on the 320×200 display definitely don’t seem like much today, but they were enough to play the likes of Q*Bert, Pole Position, and Dig Dug. The games were simplistic but fun, and 30 years later we still talk about them and play them on a variety of systems. These specs seem minuscule by today’s standards, but they were enough to make the Commodore 64 the best-selling single personal computer model of all time.
For the next decade, things really didn’t get that much faster. Intel released successive x86 processors throughout the ’80s and early ’90s, which made some strides and created an environment where games like SimCity and Doom could start changing the way we thought about games. We finally had enough power in these computers to start creating an immersive experience, and these games would set the tone for the industry for the next twenty years.
Major CPU Advances
The Pentium Pro started to really increase the performance of the average gaming computer, and we saw CPUs begin to reach much higher clock speeds. Between ’95 and ’99 there was consistent growth in processor speeds, but it was around the turn of the millennium, with the launch of AMD’s Athlon (K7) and the Pentium 4, that we started to see some major advances in capabilities.
In the gaming world, Doom led to Quake, Unreal, and Half-Life, and real-time strategy games began to take off. We had the power to render millions of polygons for a complete first-person experience, and well-balanced, addictive games like StarCraft began to make competitive (and professional) gaming possible.
Jumps and Plateaus in Gaming PCs
Between 2000 and 2004 there was a major jump in technology, and the Pentium 4 more than doubled its clock speed, from 1.5 GHz at launch to 3.06 GHz. This happened in time for the push toward online gaming and MMORPGs like World of Warcraft, as well as some new developments in the FPS world. Serious gamers could really push the graphics and frame rates of their PCs to get an edge on the competition, and huge guilds of virtual adventurers could raid a massive dungeon without experiencing undue amounts of lag.
Once we hit this point, though, the actual clock speeds of these processors plateaued for a while. This wasn’t for lack of technological advances; rather, CPU makers had shifted their focus to dual- and quad-core processing. Of course, that didn’t stop serious gamers from overclocking their new Core i7, pushing it to a nice, warm 4.2 GHz to really enhance their Modern Warfare 3 experience.
An open question to the gamers out there: which graphics card and processor do you prefer for your gaming PC?