Intel Ivy Bridge graphics processors review
By Loyd Case | PC World | Published: 17:47, 23 April 2012
The most impressive feature of Intel's new Ivy Bridge CPU is the graphics portion of the chip. The HD 4000 GPU built into these processors is a huge improvement over the chips found in current-generation Sandy Bridge products, and is fast enough to make inexpensive entry-level graphics cards obsolete. You'll still want a good discrete graphics card for serious gaming, but our benchmarks show that there's just no reason to buy a $50 graphics card anymore.
This stands in contrast to the processor's general compute performance, which is just slightly faster than current Intel CPUs. (For more, take a look at our testing of overall Ivy Bridge performance.) Ivy Bridge is more energy-efficient, which will be especially useful in laptops, but the most noticeable change in performance will be felt when you run 3D graphics applications.
Features, speeds and feeds
At first blush the Ivy Bridge GPU might seem as if it would run slower than the Intel HD 3000 technology built into Intel's Sandy Bridge processors. The Sandy Bridge GPU runs at a base clock frequency of 850MHz, with a Turbo Boost clock as high as 1350MHz. Ivy Bridge’s HD 4000 GPU, on the other hand, is 200MHz slower, operating at a base clock speed of 650MHz, with a Turbo Boost clock of 1150MHz.
Intel has made enhancements to the GPU engine to improve efficiency, but other factors help to mitigate the clock-rate differential, too. First, the new HD 4000 GPU contains 16 execution units, versus the 12 built into Sandy Bridge. Second, Ivy Bridge supports DDR3-1600 memory, as opposed to the Sandy Bridge memory controller, which officially supports only DDR3-1333. The extra execution units give Ivy Bridge 33 percent more parallel compute power, and the faster memory raises its potential throughput.
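To put numbers on that memory advantage, here is a back-of-envelope calculation of peak theoretical bandwidth for the two dual-channel configurations. This is a simple sketch based on standard DDR3 figures (8 bytes per channel per transfer), not a measurement from our test systems:

```python
# Back-of-envelope peak memory bandwidth for dual-channel DDR3.
# "DDR3-1600" means 1600 megatransfers per second; each channel
# moves 8 bytes per transfer.

def peak_bandwidth_gbps(mt_per_s, channels=2, bytes_per_transfer=8):
    """Peak theoretical bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return mt_per_s * 1e6 * channels * bytes_per_transfer / 1e9

sandy = peak_bandwidth_gbps(1333)  # DDR3-1333 -> ~21.3 GB/s
ivy = peak_bandwidth_gbps(1600)    # DDR3-1600 -> 25.6 GB/s
print(f"Sandy Bridge: {sandy:.1f} GB/s, Ivy Bridge: {ivy:.1f} GB/s")
print(f"Bandwidth gain: {100 * (ivy / sandy - 1):.0f}%")  # ~20%
```

Roughly 20 percent more raw bandwidth to feed the GPU, before accounting for real-world controller efficiency.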
Before diving into performance comparisons, it’s worthwhile to examine additional Ivy Bridge GPU feature changes.
- Full support for Microsoft’s DirectX 11 API, including hardware tessellation and GPU compute: Intel claims that GPU-compute applications will run exclusively on the Ivy Bridge GPU if so instructed, and GPU-compute tasks won't be offloaded to the CPU.
- Two texture units are present, as opposed to a single texture unit in Sandy Bridge.
- The new compute shader enables greater data parallelism, and full support for Shader Model 5, which is required for DirectX 11.
- A shared L3 cache is built into the GPU core itself, which reduces the need to fetch data from the ring bus and the CPU cache.
- Support is included for up to three simultaneous displays (DVI, HDMI, or DisplayPort).
- HDMI 1.4a, including high-bit-rate audio, is supported.
- Quick Sync video is improved, and includes better support for Blu-ray stereoscopic 3D.
Those feature additions, along with the internal rearchitecting of the actual compute units, suggest that Ivy Bridge 3D graphics performance should be a clear step up from Sandy Bridge. Let’s take a look at actual benchmarks.
We ran benchmarks in several configurations, but performed all of them on a common platform:
- Gigabyte Z77-UD3 motherboard
- Frame buffer memory for Intel HD Graphics set to 1GB in the system BIOS
- 8GB DDR3-1600 memory for Core i7-3770K, DDR3-1333 for Core i7-2600K
- 1TB, 7200-rpm Western Digital hard drive
- 750W Antec High Current Pro power supply
We also followed certain procedures:
- We ran all game benchmarks at 1080p resolution. We also set Unigine Heaven at 1080p, with normal tessellation enabled. We ran 3DMark Vantage and 3DMark 2011 in their “performance” mode.
- We ran game benchmarks on Ivy Bridge in DirectX 11 mode when available, but also ran the same games in DX9/10 modes for comparison.
- We used an XFX Radeon HD 6450 - an entry-level, DirectX 11-capable graphics card costing roughly $40 - with the Ivy Bridge system, to get a feel for how the Intel HD 4000 might compare to an entry-level, discrete graphics card.
First up is 3DMark 2011, a DirectX 11-only graphics benchmark.
The Radeon HD 6450 fell far behind the Intel HD 4000. It was no contest, really.
Now it’s on to another DirectX 11 synthetic test, Unigine Heaven 2.5.
Here, the Intel GPU actually lapped the Radeon HD 6450, achieving over double the frame rate at 1080p, with normal tessellation set. It’s not a very high number, to be sure, but the results from both 3DMark 2011 and Unigine Heaven are solid proof that the Intel HD 4000 is DX11-capable.
Next, let’s look at 3DMark Vantage, a DirectX 10 synthetic test.
The Core i7-3770K has a 100MHz frequency advantage over the Core i7-2600K, but the GPU cores run at the same frequency across all CPUs with the same GPU core. So the 2600K's GPU still clocks at 850MHz and the 3770K’s HD 4000 GPU still runs at 650MHz. However, the Ivy Bridge GPU posted a score that’s almost 900 points higher. Meanwhile, the Radeon HD 6450 kept chugging along in a distant third.
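One rough way to see why the lower-clocked HD 4000 can still pull ahead is to multiply execution units by turbo clock. This sketch deliberately ignores Ivy Bridge's per-unit architectural improvements and memory-bandwidth gains, so it understates the real gap; the unit counts and clocks are the ones quoted earlier in this review:

```python
# Rough shader-throughput comparison: execution units x turbo clock.
# Ignores per-EU efficiency gains in Ivy Bridge, so this is a floor,
# not a prediction of benchmark scores.

hd3000_eus, hd3000_turbo_mhz = 12, 1350  # Sandy Bridge HD 3000
hd4000_eus, hd4000_turbo_mhz = 16, 1150  # Ivy Bridge HD 4000

hd3000 = hd3000_eus * hd3000_turbo_mhz
hd4000 = hd4000_eus * hd4000_turbo_mhz
print(f"EU-MHz: HD 3000 = {hd3000}, HD 4000 = {hd4000}")
print(f"HD 4000 advantage: {100 * (hd4000 / hd3000 - 1):.0f}%")  # ~14%
```

Even by this crude measure, the HD 4000 has about 14 percent more aggregate shader throughput despite its clock deficit.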
Synthetic benchmarks are fine, but how did Intel's graphics technology fare on the real games in our tests? Read on.
Some of the games we used support only DirectX 9 or 10. For the games that support DX11, we tried to run Ivy Bridge in both DX11 and DX9/10 modes. That wasn't always possible, however; for example, Dirt 3 at its medium detail preset ran in DX9 mode. Below, we show all the results where applicable.
We ran all games at 1920-by-1080-pixel resolution, at medium detail levels.
Shogun 2 looked to be almost playable on the 3770K in DX9 mode; the result was still faster than what we saw from the Radeon HD 6450 running in DX11 mode. Meanwhile, the Sandy Bridge GPU just couldn’t keep up. Shogun 2 is very graphics intensive, even at medium settings, so you’ll need to dial back the graphics detail and resolution if you want to hit consistent frame rates above 30 fps.
Although labeled DX11, both the Radeon HD 6450 and the Ivy Bridge GPU likely ran in DX9 mode in this test, though it’s hard to be sure. Whatever the API used, Dirt 3 seemed playable with Ivy Bridge at 1080p and medium detail levels--and the Ivy Bridge GPU was considerably faster than the other two.
Just Cause 2 created a stir when it first shipped because it was DirectX 10 only, and it hammered most of the GPUs of the day. At medium settings and 1080p, it still wasn't really playable on any of the GPUs we tested, so you’ll want to dial back the settings even more. However, the Core i7-3770K once again took the crown by a substantial margin.
Metro 2033 is a huge performance hog, and hitting nearly 15 frames per second in its very demanding, built-in benchmark is actually pretty remarkable. Still, you’ll want to dial back to lower detail levels and probably reduce the resolution to make this game fully playable on the HD 4000.
In DirectX 10 mode, the 3770K reached a frame rate just short of 24 fps, but actual gameplay at 1080p with medium detail levels seemed to be fairly smooth. DirectX 11 mode was another story--you’ll probably want to avoid running in DX11 mode with this game on the HD 4000.
Stalker: Call of Pripyat is an aging title these days, but its DirectX 11 benchmark remains quite demanding. In DX11 mode, the HD 4000 still outpaced the HD 3000 and Radeon GPUs. In DirectX 10 mode, it got about 25 percent more performance.
We ran only a single media-transcoding test, so the results are by no means definitive.
Both results were repeatable, but the minor differences really signal a dead heat, with a slight edge going to the older Sandy Bridge GPU. This may be a case where raw clock frequency gave the older GPU an advantage. Or perhaps applications need to be tweaked a little to make full use of the Ivy Bridge video engine. We’ll have to withhold any final conclusion until we find a few more applications to test.
Entry-level GPUs are dead
AMD and Nvidia built a healthy business out of selling $40-to-$70 graphics cards, offering improved performance and compatibility with modern games. However, the Intel HD 3000 built into Sandy Bridge suggested that these low-end cards had limited utility, and the HD 4000 component of Ivy Bridge pretty much puts the nail in the coffin.
Serious gamers will want midrange or high-end graphics cards for the smoothest frame rates at high detail settings. At the same time, though, laptop users with Intel graphics may be able to look forward to reasonable performance in today’s games, if they’re willing to sacrifice a little eye candy. Smaller laptop displays aren’t conducive to showing off all the capabilities of modern game engines anyway, so dialing back detail levels may not matter.