Even the best-equipped PC can’t deliver performance if its “heart”—the processor (or CPU)—isn’t strong enough. No matter how much RAM you have or how powerful your graphics card is, the processor is what ultimately determines the overall speed of a PC.
In this blog post, we’ll take a quick peek behind the scenes of the latest generation of CPUs from both Intel and AMD, and explain how they’re still managing to make their hardware faster and more power-efficient than ever before. We’ll also give you a quick overview of both CPU platforms.
It’s Not About Pure Performance Anymore
Remember the performance race? From the dark ages of computing up until about 2005, CPU competitors Intel and AMD tried to one-up each other (and themselves) by incrementally increasing the CPU clock frequency. Eventually, that approach hit its limits: the processors were simply getting too hot and consumed far too much power, to the point where CPUs were on the brink of becoming unstable. A good example is the last generation of Pentium 4 processors, which consumed 100+ watts and required advanced cooling techniques to work properly. PCs at the time were loud power hogs, and mobile PCs with the (lower-voltage) Pentium 4 mobile processor lasted for about 1½ hours on the go. It wasn’t just about raw clock speed, though: the architectures and instruction sets (remember the MMX hype?) improved quite a bit as well. But it was always about performance. At one point, both competitors hit a brick wall: the manufacturing technology of the day made it physically impossible to fit more transistors onto a CPU die, which in turn limited how far the architecture could be improved.
To advance performance further and decrease power consumption, both Intel and AMD began adding more cores running at lower clock frequencies. Unfortunately, it took quite a bit of time for software developers to catch up: in the early days of multi-core processors, usually only very demanding applications (games, video rendering suites) were capable of delivering increased speeds. Fortunately, the industry adapted, and now most tools that require performance are capable of making use of two or even four cores.
These days, even the most basic consumer laptops and desktops have CPUs with two (“Duo”) or four (“Quad”) cores. The more advanced CPUs by AMD and Intel now have six or even eight cores; AMD’s Bulldozer platform scales up to 12 or even 16 cores. Performance increases because operating systems and applications are becoming better at balancing the load across all cores. Instead of one core running at 3.6 GHz, you’ll get more power out of a quad-core CPU running at 2.6 GHz per core—of course, that’s only the case if your software has been thoroughly optimized to run on several cores. Software that hasn’t been optimized for multiple cores will perform better on the 3.6 GHz single-core chip (assuming the architecture and other factors are comparable between these CPUs).
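What “optimized for multiple cores” means in practice is that the software divides its work into chunks that can run in parallel. As a loose illustration (not from the original post, and using a made-up workload), here’s a minimal Python sketch that spreads a CPU-bound computation across all available cores:

```python
# Sketch: splitting a CPU-bound task across cores with Python's
# standard-library multiprocessing module. The workload (a sum of
# squares) is purely illustrative.
from multiprocessing import Pool, cpu_count

def sum_of_squares(bounds):
    """Compute the sum of i*i for i in [lo, hi) - one chunk of the work."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=None):
    """Split the range [0, n) into one chunk per worker and combine results."""
    workers = workers or cpu_count()
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs the remainder
    with Pool(workers) as pool:
        # Each chunk runs in its own process, so each can occupy its own core.
        return sum(pool.map(sum_of_squares, chunks))

if __name__ == "__main__":
    # The parallel result matches the plain single-process loop.
    assert parallel_sum_of_squares(100_000) == sum(i * i for i in range(100_000))
```

A single-threaded program, by contrast, runs the whole loop on one core and leaves the others idle, which is exactly why it benefits from a higher clock speed rather than from more cores.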
And with that, the never-ending race between Intel and AMD came to an abrupt end.
An Overview of Today’s CPU World
Sandy Bridge And Beyond
“Sandy Bridge”, Intel’s new CPU architecture, was introduced in 2010 and widened the technological gap between AMD and Intel even further. Intel increased performance thanks to more cores, Hyper-Threading (which presents each physical core to the operating system as two logical cores), and an increase in processor cache. You can spot Sandy Bridge processors by their names: a Core i3-530, for example, is a first-generation Core i-processor, while Sandy Bridge chips sport four-digit model numbers, such as the Core i3-2100. Sandy Bridge owes its success mainly to its very compact design: the CPU, graphics processor, and RAM controller have been brought together on one chip, which increases performance drastically. Second, an enhanced “Turbo Boost” feature raises the clock speed whenever you’re running resource-intensive applications such as games or video editing suites; in such cases, a Sandy Bridge chip increases its frequency from, say, 2.8 GHz to 3.4 GHz, and clocks back down automatically to save power. And although the built-in GPU (e.g. the Intel HD 3000) is powerful enough for full-HD video, it’s not quite capable of running newer games. That’s why, in many cases, you’ll find both the Intel HD 3000 and a dedicated graphics chip (for example, an NVIDIA GeForce or AMD Radeon) built into your system.
What’s up with AMD?
AMD is obviously still a big player in the CPU business, but its market share and technological lead have diminished in recent years. Its recent “Fusion” chips, aimed at low-end PCs, integrate the CPU and GPU on a single chip, though the platform never really caught on. Last year, AMD introduced a new generation of processors based on the “Bulldozer” architecture, sporting between 6 and 16 cores and advanced core distribution techniques. The new CPUs are all dubbed “AMD FX” and offer performance that’s almost on par with Intel’s Core i-processors, yet at a lower price.
Which CPU is the Best?
Right now, there’s no easy way to answer this. But let me put it another way: since the early days of Intel’s Core i-platform, processors have been far too powerful to be bogged down by “normal” applications—by which I mean web browsing, e-mail, (light) photo editing, or file management. If those are the only tasks you perform on your PC, you don’t need to spend much time worrying about your next CPU; they’re all fast enough. Even netbook processors handle these basic tasks just fine. If you’re a gamer or depend on heavy virtualization, a mid-range to high-end processor such as a Core i5-2500, a Core i7-2600K, or even AMD’s FX-8100 might be worth considering. My advice, though: stay away from the “Extreme Edition” CPUs. They offer only a marginal performance benefit over the regular chips, but cost a whole lot more.
If you really need more performance for high-end scenarios in which raw CPU power is critical (audio editing, video editing, some games), I suggest you wait for the second generation of Bulldozer processors or Intel’s “Ivy Bridge”, both slated for release this year.