For the review today we are using Nvidia ForceWare 344.60 and AMD Catalyst 14.11.2 beta drivers. All of the AMD and Nvidia hardware in our reviews used these drivers, tested between 18th and 24th November. We are using one of our test rigs, supplied by DINOPC and built to our specifications.
If you want to read more about this, or are interested in buying the same KitGuru Test Rig, check out our article with links on this page. We are using an Asus PB287Q 4K monitor and an Apple 30-inch Cinema HD monitor for this review.
Comparison cards:
Gigabyte GTX980 G1 Gaming (1,228MHz core / 1,753MHz memory)
Inno3D GTX980 ‘iChill Herculez X4 Air Boss Ultra' (1,266MHz core / 1,753MHz memory)
MSI GTX980 Gaming 4G (1,216MHz core / 1,753MHz memory)
Asus GTX980 Strix (1,178MHz core / 1,753MHz memory)
Asus GTX780 Ti Direct CU II OC (954MHz core / 1,750MHz memory)
Nvidia GTX980 Reference (1,126MHz core / 1,753MHz memory)
Sapphire R9 290X Tri-X OC (1,040MHz core / 1,300MHz memory)
Palit GTX970 Jetstream OC (1,152MHz core / 1,753MHz memory)
MSI GTX970 Gaming 4G (1,140MHz core / 1,753MHz memory)
Palit GTX780 6GB (902MHz core / 1,502MHz memory)
Asus GTX970 Strix OC (1,114MHz core / 1,753MHz memory)
Asus R9 290 Direct CU II OC (1,000MHz core / 1,260MHz memory)
OCUK GTX970 ‘Nvidia 970 Cooler Edition' (1,051MHz core / 1,753MHz memory)
Software:
Windows 7 Enterprise 64 bit
Unigine Heaven Benchmark
Unigine Valley Benchmark
3DMark Vantage
3DMark 11
3DMark
Fraps Professional
Steam Client
FurMark
Games:
Grid AutoSport
Tomb Raider
Metro Last Light Redux
Thief (2014)
All the latest BIOS updates and drivers are used during testing. We generally test under real world conditions, meaning KitGuru runs each game across five closely matched runs and then takes the median of the results to get an accurate figure. If we use scripted benchmarks, they are mentioned on the relevant page.
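As a rough illustration of this methodology, the snippet below takes the median of five per-run average FPS figures; using the median rather than the mean means a single outlier run does not skew the reported number. The FPS values here are invented for the example, not KitGuru's actual data.

```python
from statistics import median

# Five closely matched benchmark runs for one game/card combination.
# These FPS values are illustrative only.
runs_fps = [61.8, 62.3, 61.5, 62.0, 61.9]

# The median ignores outlier runs instead of letting them drag the figure up or down.
result = median(runs_fps)
print(round(result, 1))
```

If one run had stuttered badly (say 48.0 FPS), the median of the five runs would barely move, whereas a plain mean would drop noticeably.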
Game descriptions edited courtesy of Wikipedia.
Please, for the love of god, change the overclocking page. The boost clock in GPU-z is misleading, either take it from Afterburner sensors, or open the Sensors tab in GPU-z to show the real boost clock.
Really :-S
“the only card in the world with coil whine reduction tech” …
i bought a Club3D R9 280 card about 4 months ago that already has coil whine reduction technology !
When are the 8GB 970/980 cards releasing? With the current dev process, games are requiring high VRAM due to the consoles' shared RAM, and I only see this getting worse – so future-proofing with 8GB seems like the smartest move at the moment.
That seems like a stupid move at the moment. By the time any game fully utilises 8GB of VRAM, the entire card will be worthless. IMO it will take at least another year or two before we see a PC game requiring 8GB of VRAM, and by then GDDR5 will be obsolete. HBM is right around the corner.
I would have said exactly that 6 months ago. But then I played Shadow of Mordor on a 4GB card… And let's face it, AMD aren't competing at the moment, so going for one of their 6GB cards isn't a great option either.
Silent ‘Infinity Black Edition'??? WTF!? The correct name is GTX 970 EXOC Black Edition.
Just ordered mine….Wasn’t called the infinity though.
Zardon: You don't mention in the test whether it uses Nvidia reference mounting holes for the GPU cooler. So does it?
As I wanted to smack a Raijintek Morpheus GPU cooler on it.
That's the wrong card. You're looking at this:
http://www.overclockers.co.uk/showproduct.php?prodid=GX-006-GX&groupid=701&catid=1914&subcat=1010
when you need to be looking at this:
http://www.overclockers.co.uk/showproduct.php?prodid=GX-010-GX
Did you order the wrong card?
http://www.overclockers.co.uk/showproduct.php?prodid=GX-006-GX&groupid=701&catid=1914&subcat=1010
It’s this one:
http://www.overclockers.co.uk/showproduct.php?prodid=GX-010-GX
It’s mentioned in regards to the 970 and the coil whine issues it’s been found to sometimes suffer from.
I was being a doofus.
I meant I didn’t order the infinity.
I ordered the EXOC black edition.
The Infinity was £50 more for only a slight increase in overclock…
Although I found out today that the Infinity is dropping its price this week…dammit. lol.
No guarantee there will be an 8GB version. I wouldn’t wait for something that might never appear.
4GB is gonna be fine for the next few years or so.
Only game I know that claimed to use more was Mordor…and that runs fine on 4GB of VRAM with no apparent loss to quality.
Did you notice a difference? 4GB worked fine for me…ran like silk and looked good with the texture pack.
Mind you, it looked great on 3GB as well…didn't notice any difference.
Yeah…it's called super glue lol
Aaaaaahhhh OMG OMG!!! I can't believe it, I didn't know there was a modded version of the black edition. F*ck, I bought the card from Germany, and if I had been aware of this version I would have bought it from you in England.
Shame as it’s not just a clock increase.
That's a pity; when you look at the additional changes made, it's worth a few quid more. Still, I'm sure the regular black edition is a pretty good card.
“Although found out today that the infinity is dropping it’s price this week”
where did you get that info from and how much?
cheers
Yes. It plays OK on 3/4GB cards that I’ve seen, but you get big drops and stuttering. The annoying thing is that the textures don’t seem to actually make much of a difference to the aesthetics.
FYI: http://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-to-reveal-geforce-gtx-970980-with-8gb-of-memory-in-november-of-december/
It looks pretty much certain, and the reveal should be about now; I was just wondering whether I'd missed it. I would like 4GB to be fine for the next few years, but looking at the current releases I'm not sure I believe it.
This is all assuming 1080p too – you absolutely need more than 4GB for 4K; even Watch Dogs uses more on the lowest settings. Right now, SoM recommends 6GB of VRAM and you need at least 4GB to be assured of no stuttering; CoD: AW recommends 4GB, The Evil Within recommends 4GB, and you absolutely need 4GB for Unity – all at 1080p. This is not a few years into the future, it is now. And we're not talking about buying low-end cards here: if you're buying the best card on the market, or at least one of the most expensive (a 980), there's no way you should be only just hitting the ‘recommended' specs, or even falling below them, when you buy.
No… It also has some superglue on the coils and is black.
It was on the overclockers forums from the designer of the card…
We are talking about this one, with a pre-order price of £299.99, due out Friday: http://www.overclockers.co.uk/showproduct.php?prodid=GX-010-GX&groupid=701&catid=1914&subcat=1010 ?
That isn't all that was done. Yes, glue was used as a buffer between the inductors and the PCB to reduce vibration, but the components themselves were changed as well.
Yeah…I know.
As I said above it was an extra £50 from the card I ordered…
I did make a mistake in my comment, I thought the price was dropping on the Infinity…but the price on the card I ordered went up.
Wasn’t worth the extra money for me.
Especially if the rumours of an upcoming 8GB version are true.
I was joking…but it isn't worth the price difference. My card (the EXOC black edition) is overclocked to 1,500MHz and still runs under 60 degrees…hardly any coil whine either.
All black does look a bit better.
So glad I went with this relatively unknown brand.
Can you comment at all on the fact that they have one fan still running at idle, as opposed to MSI/Asus having theirs completely turned off? I thought this card looked flawless until I read that part. Someone on the OcUK website actually commented that the fan won't go below 30%, which is apparently around 1,100rpm.
It isn’t misleading.
Then obviously, you got no clue what you’re talking about, in which case, you shouldn’t comment.
How is it misleading then? It’s the correct boost clock because it’s the same one on the manufacturer website. Are you trying to tell me you know better than the manufacturer?
As I wrote, you've got no clue then. The boost clock that GPU-z is showing is not correct. I have the MSI variant of this card, and in that tab it is showing 1,403MHz, but the actual boost clock is 1,510MHz, which is read through the sensor tab and MSI Afterburner.
The boost clock it displays is just a guesstimate the program is making, and it is clearly wrong.