Radeon HD 2400 XT And 2600 XT Review From Guru3d.com
AMD has now launched no fewer than five desktop graphics cards. In the high-end range you'll find the 80nm Radeon HD 2900 XT, a product that is still being discussed a lot, for sure. But there are four other products as well: ATI is also launching an all-new Radeon HD 2400 and 2600 series (low- and mid-range). These products are based on the newer 65nm fabrication process.
Let's have a brief overview of the five new graphics cards and on the next page we'll go a little deeper into the silicon that is Radeon HD 2000.
Radeon HD 2900 XT
Radeon HD 2600 XT
Radeon HD 2600 Pro
Radeon HD 2400 XT
Radeon HD 2400 Pro
Obviously, for all products we are talking about the next-gen product series here, and that means 100% DirectX 10 compatible hardware. So you'll understand that we'll also talk a little about DX10, the unified shader architecture with those lovely stream processors, a new function on the 2000 series called tessellation and, last but not least, Avivo HD.
So what we'll do today is go deep into the technology that is the Radeon HD 2400 and 2600. We'll have an overview of the product line-up and the features, run a good number of gaming performance tests and, after the 2900 XT UVD debacle, test Avivo HD with a self-developed performance test with the help of a Blu-ray and an HD-DVD title.
In total we'll test three cards today, the Radeon HD 2400 XT 256MB (ATI reference), the Radeon HD 2600 XT GDDR3 256MB and the Radeon HD 2600 XT GDDR4.
So, to give you a quick overview of AMD's Radeon product line: basically we see three new product series:
ATI Radeon HD 2400; a value series
ATI Radeon HD 2600; mainstream performance segment
ATI Radeon HD 2900; enthusiast segment
Today is obviously all about the Radeon HD 2400 and 2600 series. All these products are Vista DX10 compatible and the entire 2400/2600 series is made on the all new 65nm fabrication process.
Under codename RV630, ATI developed the Radeon HD 2600 and it'll become available in two (Pro and XT) models. The value-targeted RV610-based products will carry the ATI Radeon HD 2400 name with two models; Pro and XT again.
Both RV610 and RV630 support PCIe 2.0 for increased bandwidth. Native support for CrossFire remains, as with current ATI Radeon X1650 XT and X1950 Pro products. Compared to the R600 (HD 2900 XT), AMD is manufacturing RV610 and RV630 on a 65nm manufacturing process as it's on a quest for low power consumption and our review today will show that's exactly the case. Expect RV610 products to consume around 25 to 35 watts. RV630 requires more power nearing 75 Watts.
Radeon HD 2400 Pro & XT
The Radeon HD 2400 series will be the cheapest DirectX 10 compatible product. It'll also include ATI Avivo HD technology for HD video playback and, get this: it has built-in audio that it can transmit over your HDMI connector. This is a new function on the entire HD 2000 series.
The graphics core itself has 180 million transistors, which yield 40 stream (unified shader) processors inside that core. There will be a Pro and an XT version of these cards, clocked at 525 and 700 MHz on the core respectively. For all cards we can tell you that the shader domain runs at the same speed as the core. We tried to unlock this but it's a no-go; they run synchronously.
                                      HD 2400 Pro    HD 2400 XT
Core clock                            525 MHz        700 MHz
# of transistors                      180 million    180 million
Stream Processing Units               40             40
Math processing rate (Multiply-Add)   n/a            n/a
Pixel processing rate                 n/a            n/a
Triangle processing rate              n/a            n/a
Power consumption (peak)              25-35 W        25-35 W
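The math processing rate can be estimated from the figures in the text, under the common assumption that each stream processor retires one multiply-add (two FLOPs) per clock, with the shader domain at the core clock as stated above; the helper name below is ours, not AMD's:

```python
def madd_gflops(stream_processors, core_mhz, flops_per_sp_per_clock=2):
    """Theoretical multiply-add throughput in GFLOPS.

    Assumes each stream processor retires one MADD (2 FLOPs) per clock,
    and that the shader domain runs at the core clock (as it does here).
    """
    return stream_processors * flops_per_sp_per_clock * core_mhz / 1000.0

# HD 2400 Pro: 40 SPs at 525 MHz; HD 2400 XT: 40 SPs at 700 MHz
print(madd_gflops(40, 525))   # 42.0 GFLOPS
print(madd_gflops(40, 700))   # 56.0 GFLOPS
```

The same arithmetic applied to the HD 2600 XT (120 SPs at 800 MHz) gives 192 GFLOPS, which shows just how far apart the two series sit.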
Memory-wise it can't get any cheaper: you are looking at a 64-bit memory bus. These cards will come in 128/256MB GDDR2 configurations, and some board partners may opt for 256MB GDDR3. The Pro will have its memory clocked at 400-500 MHz, while the XT model will run at 700-800 MHz (x2) memory clock frequencies.
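Those clocks translate into peak bandwidth with simple arithmetic; a small sketch (the function name is ours, and the x2 factor is the double-data-rate multiplier quoted above):

```python
def mem_bandwidth_gbs(bus_width_bits, mem_clock_mhz, transfers_per_clock=2):
    """Peak memory bandwidth in GB/s for DDR-style memory (2 transfers per clock)."""
    return bus_width_bits / 8 * mem_clock_mhz * transfers_per_clock / 1000.0

# HD 2400 XT: 64-bit bus at 700 MHz (x2)
print(mem_bandwidth_gbs(64, 700))    # 11.2 GB/s
# For comparison, a 128-bit bus (as on the 2600 series) at the same clock
print(mem_bandwidth_gbs(128, 700))   # 22.4 GB/s
```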
Radeon HD 2600 Pro & XT
The HD 2600 series is probably what you guys will buy the most. It's again a fully DX10 ready product and can do everything the 2400 series can, yet a tad better and faster. Full 1080p HD decoding? Not an issue, seriously not an issue at all! We see the Avivo HD technology for hardware HD video processing with 5.1 audio over HDMI; we'll explain this a bit better in the coming pages. Here's also where the CrossFire fun starts. Have a look at some photos and you'll see the recently introduced CrossFire connectors (bridged just like NVIDIA's SLI connector). You insert two of these cards in a compatible mainboard, apply the two CrossFire bridges, enable it in the Catalyst driver and you are home free.
The HD 2600 series will again be released in two models, a Pro and an XT version, each with several sub-models in 256 and 512MB configurations.
The GPU core has 390 million transistors, which is a friggin' lot for a mid-range product. We see a good number of shader processors: 120 Stream Processing Units. From the numbers the cards look very interesting. Clock speeds are high, very high. The Pro is to be clocked at 600 MHz and the XT at an amazing 800 MHz. Among board partners and models this can and will vary a little, but 800 MHz for a mainstream graphics card with such a large number of transistors is a record in itself.
As mentioned in a previous chart, there will also be a Gemini version of the XT model. We're a bit unsure what it entails, but it is presumably either a CrossFire package or a dual-GPU based card, as when you look up the word Gemini in the dictionary, you find the following:
A constellation in the Northern Hemisphere containing the stars Castor and Pollux. Also called Twins.
The third sign of the zodiac in astrology. Also called Twins.
The memory, then. I was really hoping to see AMD be the first to go for a 256-bit wide memory bus, but unfortunately, just like the competition, they are sticking to 128-bit. This is where the cards will hurt the most.
                                      HD 2600 Pro       HD 2600 XT
Core clock                            600 MHz           800 MHz
# of transistors                      390 million       390 million
Stream Processing Units               120               120
Math processing rate (Multiply-Add)   n/a               n/a
Pixel processing rate                 n/a               n/a
Triangle processing rate              n/a               n/a
Memory configurations                 128/256MB DDR2    256/512MB GDDR3/GDDR4
Power consumption (peak)              up to ~75 W       up to ~75 W
AMD has tagged HD onto the product name to designate the entire lineup's Avivo HD technology, and for good reason.
I've been preaching for a while now that we see the living room entertainment coming to the PC more and more, in a very fast fashion. One of the most popular things we've noticed here in Europe has to be HDTV and everything related to it. The trend really started last year already and with the help of Blu-ray and HD-DVD it's coming in faster and quite frankly, thank God for that, as watching content in HD is simply breathtaking.
What exactly do acronyms like HDMI and HDCP mean? The HDTV market continues to heat up, and who has not heard terms like HD Ready? Let's run through some of them. HDTV stands for High Definition Television; the current image standard is known as Standard Definition. The high definition format uses up to 1080 lines to make up the picture you see on your TV, compared to 576 for the current standard. HD will also be broadcast in widescreen 16:9 format rather than the conventional 4:3 format. This makes for a truly cinematic experience.
Very blunt: HD = More lines = more pixels = better picture quality
In simple terms the image you will see with HD will have vastly improved image detail and color reproduction.
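The "more lines = more pixels" arithmetic above is easy to verify; taking the 1080-line HD figure against the 576-line standard, with typical 1920x1080 and 720x576 (PAL) frame sizes:

```python
# Pixels per frame: 1080-line HD (1920x1080) vs. 576-line SD (720x576 PAL)
hd_pixels = 1920 * 1080   # 2,073,600 pixels
sd_pixels = 720 * 576     # 414,720 pixels

print(hd_pixels / sd_pixels)   # 5.0 -> five times the picture information
```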
HDMI means High-Definition Multimedia Interface. It is a new kind of digital audio and video connector intended to replace all connectors currently used by DVD players, TV sets and video monitors. The big idea here is that we should all use a single cable instead of several when connecting, for example, your DVD player to your TV set. Interesting fact: HDMI is similar to DVI with three exceptions. HDMI is a much smaller connector (it pretty much looks like a USB connector), HDMI utilizes copy protection called HDCP (High-bandwidth Digital Content Protection) and, finally, HDMI carries multi-channel digital audio. HDMI, like DVI, is all-digital, therefore picture quality is "perfect" from source to display.
HDMI also implements that copy-protection mechanism, HDCP (High-bandwidth Digital Content Protection). First off: from now on, all Series 2000 cards are HDCP compatible, as the much-needed crypto chip is embedded into the core logic of the card. Why the need for it? Well, under Vista, when you want to play back HDCP-protected content (movies) on your monitor, the resolution can be dumbed down, or worse, if your monitor, content and graphics card cannot complete an HDCP (content protection) handshake.
It's like this: your screen will go black during playback if you do not have an HDCP crypto chip working on the graphics card. Close to the cooler you'll notice a small EEPROM slash CryptoROM doing that magic for you. Galaxy included it on these boards. Mind you, if you like to play back protected media files with an HDCP-ready graphics card, you'll also need an HDCP-compatible monitor. (Hey, don't look so angry. Don't shoot the messenger!)
So, an HD Ready television will have either a DVI (Digital Visual Interface) or an HDMI (High-Definition Multimedia Interface) input. Both connections provide exceptional quality. HDMI is often referred to as the digital SCART cable as it also carries audio; DVI supplies picture only, so separate cables are needed for audio. Both HDMI and DVI support HDCP (High-bandwidth Digital Content Protection), which will be a requirement for protected content.
Why does Avivo HD matter to you?
Well, just look at the latest trend of HTPCs, Home Theater PCs, things like Media Center PCs. It's exactly this kind of thing I am talking about. This is the future of media playback, and the PC is going to play an important role in it.
The key advantages of Avivo HD technology are twofold.
First and foremost, to offload the CPU by letting the GPU take over a huge chunk of the workload while being more energy efficient. HDTV decoding can be very demanding for a CPU: these media files can easily peak at 20 Mbit/sec, as HD streams offer high-resolution playback at 1280x720p or even 1920x1080p without frame drops or image quality loss.
By offloading the bigger part of that task to the graphics core, you give the CPU way more headroom to do other things, which keeps your PC running normally.
Make no mistake though: as our tests will prove, any modern CPU is quite capable of doing the same job just fine. But the combination of factors offers you stutter-free, high-quality and high-resolution media playback. All standard HDTV resolutions are of course supported, among them the obvious 480p, 720p and 1080i modes, and now also 1080p (p = progressive, i = interlaced).
The new HD 2000 series will also offer you HD noise reduction, which is a great feature with older converted films. And this is where we land at the second advantage of Avivo, Image Quality.
Avivo HD offers a large number of options that'll increase the IQ of playback. Obviously AMD has some interesting filters available in the Avivo HD suite, like advanced de-interlacing, which can greatly improve image quality while playing back that HD-DVD, MPEG-2, VC-1 or H.264 file (just some examples). Aside from that, things like colour corrections should not be forgotten. All major media streams are supported by AMD with Avivo HD. And yes, High Definition H.264 acceleration, which will eventually become a big, new and preferred standard, is also supported.
AMD has upgraded Avivo with a new Universal Video Decoder, also known as UVD, and a new Advanced Video Processor, or AVP. UVD actually made its debut in the OEM-exclusive RV550 GPU core. It provides hardware acceleration of the H.264 and VC-1 high-definition video formats used by Blu-ray and HD-DVD, expanding on the previous generation's Avivo implementation to include hardware bitstream processing and entropy decode functions. Hardware acceleration of frequency transform, pixel prediction and deblocking functions remains supported, as with first-generation Avivo processing. The AVP, meanwhile, allows the GPU to apply these video processing functions while keeping power consumption low.
UVD is a feature only present in HD 2400 and 2600 series, not 2900.
To give you an idea of how intensely big one frame of 1920x1080 is at a framerate of 24 frames per second, click on the two example images above. Load them up, and realize that your graphics card is displaying that kind of content 24 times per second, while enhancing it in real time.
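The raw numbers behind that statement can be worked out directly, assuming an uncompressed frame at 24-bit colour (3 bytes per pixel):

```python
# Raw (uncompressed) size of one 1920x1080 frame at 3 bytes per pixel,
# and the resulting data rate at 24 frames per second.
bytes_per_frame = 1920 * 1080 * 3          # 6,220,800 bytes
mb_per_frame = bytes_per_frame / 1024**2
mb_per_second = mb_per_frame * 24

print(round(mb_per_frame, 1))    # 5.9 MB per frame
print(round(mb_per_second, 1))   # 142.4 MB/s of raw picture data
```

That roughly 142 MB/s of raw picture data is what the codec compresses down to the 10-25 Mbit/sec streams measured later in this review.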
Testing HD decoding performance
Here at Guru3D we have developed a new test. As most of you know, we were pretty much surprised after posting our HD 2900 XT review that there was no UVD engine present, while roughly the entire world believed it had that engine. To prevent such situations we decided to develop our own decoding test, in which we can measure CPU utilization very precisely during HD playback.
The most important bitstream for decoding HD content is VC-1. Armed with both an HD-DVD and a Blu-ray drive, we'll decode 140-second clips from two movies. The clips are identical in each benchmark run. Every half a second we measure the CPU load and record it. After the 140 seconds we'll have an average, minimum and maximum CPU load to observe. We'll focus on the average CPU load.
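The sampling loop just described can be sketched as follows; `sample_load` and its parameters are hypothetical names of ours, and on a real system something like the third-party `psutil.cpu_percent()` would serve as the probe:

```python
import time
from statistics import mean

def sample_load(probe, duration_s=140.0, interval_s=0.5, sleep=time.sleep):
    """Poll probe() every interval_s seconds for duration_s seconds,
    then return the (average, minimum, maximum) of all readings."""
    readings = []
    for _ in range(int(duration_s / interval_s)):  # 280 samples over 140 s
        sleep(interval_s)
        readings.append(probe())
    return mean(readings), min(readings), max(readings)
```

The `sleep` parameter is injectable purely so the loop can be exercised without waiting 140 real seconds.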
Let's see what that looks like:
Above we can see a GeForce 8600 GT decoding an HD-DVD VC-1 stream; in this case it's decoding the movie The Bourne Supremacy.
On the Blu-ray side we look at a VC-1 stream of the movie Blood Diamond; exceptionally good (low) CPU load.
Let's measure with a couple of HD ready cards and have a look at the outcome.
So first and foremost, it is not a requirement to have your graphics card decode any given HD stream; it's preferred though. In the charts below you can, for example, see PowerDVD decoding an HD stream on the CPU. On both movies this causes the highest CPU utilization. Quite honestly, that's really not even half bad. We did, however, use a 2.93 GHz Core 2 Duo X6800 processor in this system. After testing that as well, changing to a cheaper E6600 would only result in slightly higher utilization (we're talking 2-3% here).
For the GeForce cards we used NVIDIA's NZONE ForceWare 158.45 driver; don't use 158.24, as HD decoding simply does not work properly with it. On ATI's side we used a beta 22.214.171.124-rc2_48912 driver.
Let's have a peek how well things scale.
Above you can see the movie Blood Diamond being decoded on Blu-ray.
Mind you that the number you see is the average CPU utilization during the 140 seconds of decoding; thus lower means better. As you can see, the 2400 and 2600 products decode HD VC-1 streams like they're on dope or something: only a 5% CPU load. That's exceptionally good. All the way to the right you can see the clip decoded on the CPU with PowerDVD, where we spot a CPU load of 30%.
This movie however has an average bitrate of 10-15 Mbit/sec. Let's move on to the next chart.
Here we see, on HD-DVD, the movie The Bourne Supremacy. Again, exceptional performance from the 2400 and 2600 series; this HD bitstream was crazy, with 25 Mbit/sec and higher being no exception.
As the results now show, clearly the HD 2900 XT does not have UVD. The GeForce 8800 doesn't do bitstream processing either, yet seems to deal with decoding a tad better. Looking at the 8600 series, however, we see that they keep your CPU nicely chilled, though not as well as the UVD-equipped Radeon HD 2400 and 2600 cards from ATI; those are just amazing.
The reason behind this is VC-1. NVIDIA's new bitstream processor (BSP) does not support it, which is silly. It does H.264 (CABAC/CAVLC) but not VC-1. Therefore NVIDIA remains smack in the middle; ATI definitely is the king of HD decoding today.
So for the 2400 and 2600 series, the entire process of bitstream processing, frequency transform and pixel prediction, through deblocking, up to post-processing (up/downscaling, de-interlacing, colour correction) and display is managed by Avivo HD on the 2400/2600 graphics cards, and that includes Blu-ray content with a 40 Mbit/sec bitrate.
We already mentioned this briefly, yet it's imperative that you understand that the entire HD 2000 series of cards offers HDMI connectivity with the help of a DVI adapter, and again, all cards have support for HDCP. Unlike current HDMI implementations on PCIe graphics cards, the HD 2400, 2600 and 2900 integrate (secondary) audio functionality into the GPU. Instead of passing a PCM or Dolby Digital signal from onboard audio or a sound card, RV610- and RV630-based graphics cards can output audio directly over the HDMI connector, removing the need for a separate sound card. The card outputs 16-bit PCM stereo sound or compressed AC3 5.1 multi-channel audio streams such as Dolby Digital and DTS. A pretty sexy feature, especially for those who use their PC as an HTPC and connect HDMI to an HDMI receiver.
By the way, do not be mistaken: for your add-in board (your X-Fi or whatever), the system S/PDIF output is not tied up by routing it to the graphics card. It's a completely secondary process, so you retain full functionality of your primary sound card.
So with the Series 2000 you'll receive a DVI-to-HDMI adapter (a board partner option though) which, make no mistake here, will carry sound over HDMI. That's unlike current DVI-HDMI adapters and cables, which do not carry sound. Fantastic if you are watching a Blu-ray movie: simply connect HDMI to your HDTV for PCM sound, or connect it through a Dolby TrueHD receiver and get that sound lovin' going on through that receiver of yours. All with one simple cable.
Here we can see the DVI-to-HDMI dongle that is supplied with the HD 2000 series Radeon graphics cards. It'll be included with all Series 2000 graphics cards. A really clever solution, to be honest: it saves manufacturers from building a separate SKU specifically with an HDMI connector. Plug it in and let the fun begin.
We'll now show you some tests we have done on the overall power consumption of the PC, looking at it from a performance-versus-wattage point of view. Power consumption of the 2400/2600 products is really not bad. Our test system is a Core 2 Duo X6800 Extreme processor, an nForce 680i SLI mainboard, a passive water-cooling solution on the CPU, 2GB memory, a DVD-ROM and a WD Raptor drive. Have a look:
System Under load
The methodology is simple: we look at the peak wattage during a 3DMark05 session with hefty IQ settings to verify power consumption. It's a good load test as both GPU and CPU are utilized really hard here. Please do understand that you are not looking at the power consumption of the graphics card, but the consumption of the entire PC.
We simply place a wattage meter between the PSU and the power socket. It's not the most objective way to test, as you have to consider PSU efficiency as well, but it's the best we can do.
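As a back-of-the-envelope illustration of that efficiency caveat: the wall reading includes PSU conversion losses, so the components actually draw less DC power than the meter shows. The helper and the ~80% figure below are our assumptions, not measured values:

```python
def dc_draw_watts(wall_watts, psu_efficiency=0.80):
    """Estimate the DC power the components actually draw, given the
    wattage measured at the wall and an assumed PSU efficiency."""
    return wall_watts * psu_efficiency

# A 250 W wall reading with a typical ~80% efficient PSU of the era
print(dc_draw_watts(250))   # roughly 200 W delivered to the system
```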
In my view:
Radeon HD 2400 XT requires you to have a 350 watt power supply unit; really, 20 amps on the 12-volt rail should be sufficient.
Radeon HD 2600 XT isn't that far off either: get a 350 to 450 watt power supply unit with 25 amps on the 12-volt rail and it should be sufficient.
If you have dough to spend and opt for the guru path of righteousness by doubling up to two cards in your system -> CrossFire, then you should end up with a 450 watt or better PSU with 30 amps on the 12-volt rail.
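Those amperage recommendations translate to 12-volt rail wattage by simple multiplication (the helper name is purely illustrative):

```python
def rail_watts(amps, volts=12):
    """Power available on a PSU rail: W = A x V."""
    return amps * volts

print(rail_watts(20))   # 240 W on the 12 V rail (HD 2400 XT recommendation)
print(rail_watts(30))   # 360 W on the 12 V rail (CrossFire recommendation)
```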
It's weird, isn't it? Both the 2400 and 2600 product series are a little hard to position. The bigger issue with graphics cards these days is that they have surpassed their primary function, gaming. Yes, graphics cards have obtained much more functionality, and this is what we are facing in the future: we'll see more technology adapt and be merged together.
Let's analyze: it's fair to say that both the 2400 and 2600 cards do not live up to our initial expectations from a 3D performance point of view, no sir. Performance-wise there's also just a huge gap between the 2900 and 2600 series. AMD has not been able to kick the current performance crown holder in the mid-range segment off its lofty chair, and that's annoying. Honestly, it would be intensely good for the consumer graphics card industry if, for example, we'd have a new performance leader bi-yearly. This is the reality though.
This is why I say that both the 2400 and 2600 series are refreshing, yet unfortunately they are not the mid-range top performers we have all been hoping for. Their 3D rendering capabilities are sufficient; sufficient for the money you have to pay. Realistically, compared to the HD 2600 XT, the GeForce 8600 GT gives NVIDIA a good lead. But the thing here is that all cards tested today are cheaper than that 8600 GT.