Most long-time tech readers will remember that fateful summer day in 2006 when Intel finally brought an end to the era of the much-maligned Pentium 4 and Pentium D processors, unleashing the Core 2 family that turned the tables of CPU warfare. Prior to this, AMD had held the performance and value-for-money crown for three years while Intel continued to release iteration after iteration of their NetBurst-based processors, hoping to fend off the smaller underdog with clock speed alone. While this no doubt worked just fine for mass orders to OEMs, many DIY builders were left feeling unsatisfied, to say the least. I am an optimist, however, and strongly believe that bad patches don't last forever; in the case of a huge and profitable chip giant like Intel, that belief proved well founded. Indeed, it truly seemed as though AMD were caught napping amidst their acquisition of ATi.
Most will also remember that the 24 months that followed were just as dire, as AMD/ATi pulled together new products such as the Phenom X3/X4 range and the Radeon HD 2900 series that proved too little, too late. Combined with falling confidence, plummeting share prices and the economic downturn that was soon to follow, things just weren't looking great at all. Now, however, nearly 18 months after the release of the incredibly successful Radeon HD 4800 series and almost a year since the launch of the affordable Phenom II processor lineup, the doomsday picture I just painted seems like nothing more than a distant memory. The fight is far from over, though: there is no getting around the fact that Intel still hold the performance crown, and with their disciplined “Tick-Tock” product development strategy, there isn't a lot stopping them from maintaining that position. For this reason, it's imperative that AMD do not rely solely on ramping up clock speeds and selling units at next-to-non-existent profit margins, but aggressively develop new processor architectures in order to keep up. How much does any of this really affect the gamer, though? Let's discuss.
The majority of today's games are inherently graphics card dependent. I'm not suggesting that pairing a Celeron 430 1.80GHz processor with a top-end graphics card will yield results similar to the same card paired with a Core i7 975 3.33GHz processor, but within reason, the fluidity of the latest and greatest games depends mostly on the graphics card. Factor in the falling price of TFT monitors and the corresponding rise in native resolutions, and that graphics dependence is exacerbated further. So what we're wondering is this: just how fast does the processor in a gaming rig need to be in order to fuel a top-of-the-line Radeon HD 5870 1GB GDDR5 graphics card? Would anyone notice if the same card were paired with AMD's best, despite its shortcomings against the Intel Core i7? It's no secret that the Core i7 does more work per clock cycle and also benefits from Hyper-Threading whenever an application harnesses it. In certain scenarios, a Core i7 920 can arguably outpace an AMD Phenom II X4 even with as much as a 667MHz clock speed deficit. Given the traits of most games these days, would the difference between the two be so prominent?
So what does it take to put together a fine gaming machine for today and the future without breaking the bank? Can the Human Resistance fend off the evils of Skynet and still have enough change left to stop by the local watering hole? Who dares, wins, is all we have to say, so please join us as we pitch AMD's finest against the mighty Core i7 platform in a classic head-to-head gaming shootout.