In late 2010, four years after becoming part of Advanced Micro Devices, ATI re-branded its Radeon line of graphics processing units. The Northern Islands cards are the first AMD-branded Radeons on the market, and they're also the second generation of DirectX 11 cards from the company. I recently upgraded to an AMD Radeon HD6570 from an nVidia GeForce 9400GT that chose to go kaput, and I have to say, this card rewards those willing to seek out its strengths.
Let's start with a bit of background. The Northern Islands series of GPUs, sold as the HD6XXX series of AMD Radeons, are the second-generation DirectX 11 GPUs from ATI, built on TSMC's 40nm process. Collectively codenamed Northern Islands, the family actually consists of the Turks, Caicos, Cayman, Barts and Antilles GPUs. They support Direct3D 11, Shader Model 5.0, OpenCL 1.1 and OpenGL 4.1. In addition, much like nVidia's PureVideo, AMD supports accelerated H.264 and Xvid decoding on these cards thanks to its Universal Video Decoder (UVD), which has been bumped to version 3. Cards with HDMI support the HDMI 1.4a spec. As for the upcoming Windows 8: it is supposed to feature DirectX 11.1, and it is currently unknown whether DirectX 11.1 will impose additional hardware requirements on top of DirectX 11 (remember, a point release has raised the bar before – DirectX 10.1, which shipped with Windows Vista Service Pack 1, did require newer hardware than DirectX 10). Obviously, Windows 8 itself will work just fine with DirectX 11 cards; it's the games that will require the full gamut of DirectX 11.1 features.
AMD Radeon HD6570
The AMD Radeon HD6570 features the Turks GPU, released on February 7th, 2011 – already a known quantity by fast-moving GPU standards. Anyway, this card is marketed as the top entry-level card of the Northern Islands series. It's actually quite a bit more, as we'll see later, but let's look at the card first.
This card comes in a full-height, single-slot package. Like most modern cards, it plugs into a PCIe x16 slot, but it does not need an external PCIe power connector. Considering its performance, this is quite surprising. Its current-generation nVidia counterpart is the GeForce GT440, but by all benchmarks, the GT440 draws more power, generates more heat and performs slightly worse than the 6570. This card has a rather small heat sink, but it does have a fan – it is actively, not passively, cooled.
I bought the card from XFX, which sells it at its stock configuration. This card is priced fairly low, and isn't the kind of high-end card that warrants buying factory-overclocked editions from vendors like ASUS or MSI. On the other hand, with a 650MHz core clock and a 1GHz memory clock, this card lends itself well to overclocking. Consider this: this card and the HD6670 feature the exact same GPU, but the 6670 is clocked at 800MHz and priced $20 higher (in India, you can expect the 6670 to cost 2 grand more than the 6570). The 6570 is available in 1GB and 2GB versions, whereas the 6670 comes in a 1GB version only. The one I bought features a gig of GDDR5 memory – the 6570 also comes in a GDDR3 version. The GDDR5 version draws about a watt more power, but hey, if you can overclock your 6570 to 800MHz, you have a 6670! Trust me, the clocks are the only difference between the 6570 and the 6670. The ATI Catalyst Control Center caps the core clock at 750MHz, and I was able to achieve a stable overclock at that frequency. Considering that even the 6670 can be overclocked by around 25-30MHz, you could theoretically use third-party tools to take the 6570 from 650 all the way to 830MHz. And that is a massive FPS boost in games, let me tell you. I'd say the 6570 is a very sweet deal at 5 grand in Indian rupees.
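Since theoretical fill rates scale linearly with the core clock, a quick back-of-the-envelope calculation shows what that overclock buys you. This is a minimal sketch using the HD6570's published fill rates; the 800MHz target is simply the 6670's stock clock:

```python
# Back-of-the-envelope: theoretical fill rates scale linearly with core clock.
# Stock HD6570 figures (650 MHz core, 5.2 GP/s pixel, 15.6 GT/s texel) are
# spec-sheet values; 800 MHz is the HD6670's stock clock.
def scaled_fill_rate(stock_rate, stock_clock_mhz, target_clock_mhz):
    """Scale a fill rate proportionally to a new core clock."""
    return stock_rate * target_clock_mhz / stock_clock_mhz

pixel_fill = scaled_fill_rate(5.2, 650, 800)   # gigapixels/s at 800 MHz
texel_fill = scaled_fill_rate(15.6, 650, 800)  # gigatexels/s at 800 MHz
print(f"{pixel_fill:.1f} GP/s, {texel_fill:.1f} GT/s")  # → 6.4 GP/s, 19.2 GT/s
```

In other words, a 650-to-800MHz overclock is a 23% bump in raw throughput – which is exactly why the overclocked 6570 performs like a 6670.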
As for heat dissipation: it's almost winter here, and I've got a pretty cramped stock “baby cabinet” with a 600 Watt CoolerMaster PSU tucked away in a corner with no fan – in other words, the worst possible case for airflow – and the card keeps a stable 60 degrees Celsius under load. At idle, it sits anywhere from around 45 to 50 degrees. That's cool by any standard, because the recommended maximum is 85 degrees and the absolute maximum is 105 degrees. This means you have a lot of headroom.
The card draws around 10 watts of power at idle, but can spike to 70 watts at full load, so make sure you have a PSU capable of handling it – if you have a stock 400 Watt PSU, ditch it for an Antec or CoolerMaster. 400 Watts is the bare minimum, and you need the unit to be pretty efficient. Because the card has no external power connector, it draws all of that power through the motherboard's PCIe slot, which is specified to deliver up to 75 watts.
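Here's a quick sanity check on that power budget. The 10W/70W figures are from my own measurements above; the ~250W estimate for the rest of a mainstream system (CPU, board, RAM, drives) is my assumption, not a measured number:

```python
# Power-budget sanity check. Card idle/peak figures (10 W / 70 W) are from
# the review; the 250 W rest-of-system figure is an assumed estimate for a
# typical mainstream build of this era.
PCIE_X16_SLOT_LIMIT_W = 75   # PCIe spec: max power deliverable via the slot
card_idle_w, card_peak_w = 10, 70
assert card_peak_w <= PCIE_X16_SLOT_LIMIT_W  # card can run off the slot alone

psu_rating_w = 400           # the "bare minimum" PSU
rest_of_system_w = 250       # assumed: CPU, motherboard, RAM, drives, fans
headroom_w = psu_rating_w - (rest_of_system_w + card_peak_w)
print(f"PSU headroom at full load: {headroom_w} W")  # → 80 W
```

Roughly 80 watts of margin on a 400 Watt unit is workable but tight – which is why I say the PSU needs to be a quality, efficient one rather than a no-name stock unit.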
On the technical side, the GPU has 716 million transistors on a 118 square mm die, a pixel fill rate of 5.2 gigapixels per second, and a texture fill rate of 15.6 gigatexels per second. Memory bandwidth is 64GB/s in the GDDR5 version and 28.8GB/s in the GDDR3 version (which, for a price difference of about 200 rupees, doesn't make much sense to buy). Both operate on a 128-bit-wide memory bus.
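Those bandwidth figures fall straight out of the bus width and the memory's effective transfer rate. A minimal sketch – note the 900MHz GDDR3 clock is my inference from the 28.8GB/s figure, not a number from the box:

```python
# How the quoted bandwidth numbers fall out of bus width and memory clock.
# GDDR5 transfers 4 bits per pin per command clock (quad data rate), so a
# 1 GHz clock gives 4 GT/s; GDDR3 is double data rate, so an assumed
# 900 MHz clock gives 1.8 GT/s.
def bandwidth_gb_s(bus_width_bits, effective_gt_s):
    """Peak memory bandwidth: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * effective_gt_s

gddr5 = bandwidth_gb_s(128, 4.0)  # 1 GHz x 4 = 4 GT/s effective
gddr3 = bandwidth_gb_s(128, 1.8)  # 900 MHz x 2 = 1.8 GT/s effective
print(gddr5, gddr3)  # → 64.0 28.8
```

The 2.2x bandwidth gap is the real argument for the GDDR5 version: same GPU, same bus width, but more than double the memory throughput for a trivial price difference.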
As for connectors, the card comes with three – one each of VGA, DVI-D with dual-link support, and HDMI with 7.1-channel surround sound output. It's also built with support for stereoscopic 3D output.
Unlike my previous card (the GeForce 9400GT), this one isn't merely a card for digital media – it's about as expensive as an HTPC card, but it's too darn good to be relegated to hardware-assisted Blu-ray playback. Don't get me wrong – AMD Avivo HD is really good and works really well. But if all you want is a card to power a gigantic 1080p display, you can get away with something lesser.
On the other hand, if you're building a multi-monitor setup for digital video creation, read on. Avivo HD not only decodes digital video but encodes it too, so you'll strike gold with this card in a studio environment. In fact, this card works in Mac Pros as well, so you can use it on both Windows PCs and Macs, where Adobe Premiere – and, to some extent, Photoshop – will be able to take advantage of its exceptional image processing capabilities.
If you have a low-resolution monitor (I'm talking a maximum of 1366x768), you can max out your games' settings and get around 30-35fps in almost all of them. If you have a 1440x900 display, you'll want to overclock the card a tad, and then you'll be able to max out most games. If you have a full HD monitor – well, this card isn't a dedicated gamer's card, but you'll still get 25-30fps if you tone the settings down to medium.
This card excels at anisotropic texture filtering and full-screen antialiasing. Yes, you read that right. Anisotropic filtering and antialiasing performance have been a strength of ATI Radeons since the legendary HD4870, but with this card, the penalty for enabling them is less than 1fps at 1366x768. I tried this with Crysis Warhead – which, with its ample foliage, is a very edge-heavy game – and at Enthusiast (the maximum possible) settings, I got around 25fps. That is remarkable for a card this cheap.
However, there's one game that refuses to give any graphics card a good day, and that is Microsoft Flight Simulator X. Even with a couple of unofficial mods to improve rendering, I could get only 8-9fps over densely populated areas and landscapes and at airports (it actually dropped to less than 1fps over Hong Kong). This is at 1024x768, with all the settings topped out.
Overall, I tried out a couple of old games (Need For Speed: Most Wanted and Carbon), Crysis, Crysis Warhead and Crysis 2, Call Of Duty 4: Modern Warfare and Modern Warfare 2 (in fact, I bought this card because of the impending Modern Warfare 3 release), along with F1 2010. I could max out the settings everywhere except in Crysis 2; the first-person shooters played well at around 25-30fps, and the NFS titles sat comfortably around the 80fps mark. This was without overclocking. In the course of my research before buying this card, I read that H.A.W.X 2 is playable at around 58-60fps at 1280x1024, so this card is definitely good for gaming.
Compared with the last-generation DirectX 10 cards, this one beats the GeForce 8600GT and 9600GT by a reasonable margin and competes well with the 9800GT and the GT330.
Operating System Support
The Catalyst drivers support the card on Linux and Windows, with complete OpenGL 4.1 support on both and, on Windows, DirectX 11 support as well. The card's elaborate packaging says FreeBSD is supported, but there are no Catalyst drivers for FreeBSD. FreeBSD does have access to the open-source Radeon driver (like Linux), but as of now that driver has major problems with the Northern Islands cards and is, for all practical purposes, unusable.
Mac OS X has supported this card since the 10.6.8 release.
The 6570 is a good card. In Kolkata, it cost me INR 5050 inclusive of taxes, and after overclocking, it's giving me the performance of a card that should cost a ton more. So should you buy it? If you're a serious gamer, no. If you're a casual gamer, yes – especially if you still want your games to run well. If all you want is to watch HD movies, then no; but if you rip Blu-rays, do extensive Photoshop work, or work in a studio environment with full HD monitors, then yes. In short, if all you do is game, buy something better. But if you're a power user who plays games once in a while and wants to thoroughly enjoy the experience, go buy this right now.