Today we look at one of the most exciting graphics cards to hit our labs this year – the new Asus STRIX Gaming GTX 980 Ti, featuring the latest DirectCU III cooling system. This three-fan cooler uses a new patented 'wing-blade' fan design to deliver greater air pressure and cooling capability. The DCIII cooler is equipped with two beefy 10mm direct-touch copper heatpipes. Interestingly, the Asus GTX 980 Ti Strix is built with a fully automated production process and incorporates a 14 phase (12+2) Super Alloy Power II power system for maximum reliability. With the highest 'out of the box' overclock speeds to grace our labs yet, is this the high end GPU you should be shortlisting for a new system build this summer?
Asus have adopted a new way of building their graphics cards – opting for 100% automation to remove human error from the production line. Their production process is now flux-free. The 12+2 phase Super Alloy Power II system is designed for maximum reliability and quality.
The Super Alloy II MOS is one of Asus' talking points – they say the DrMOS design lowers temperatures and increases power efficiency. The Super Alloy II capacitors incorporated throughout the design are rated for a lifespan 2.5 times longer than traditional capacitors – Asus claim an extra 90,000 hours.
One of the most irritating problems with graphics cards in the last year has been a buzzing sound, classed as 'coil whine' by many. Asus incorporate concrete alloy chokes on the GTX 980 Ti Strix, which are said to remove all traces of coil whine and buzzing under heavy load. We will test that claim later in the review.
GPU | GeForce GTX 960 | GeForce GTX 970 | GeForce GTX 980 | GeForce GTX 980 Ti | GeForce GTX Titan X |
Streaming Multiprocessors | 8 | 13 | 16 | 22 | 24 |
CUDA Cores | 1024 | 1664 | 2048 | 2816 | 3072 |
Base Clock | 1126 MHz | 1050 MHz | 1126 MHz | 1000 MHz | 1000 MHz |
GPU Boost Clock | 1178 MHz | 1178 MHz | 1216 MHz | 1075 MHz | 1076 MHz |
Total Video Memory | 2GB | 4GB | 4GB | 6GB | 12GB |
Texture Units | 64 | 104 | 128 | 176 | 192 |
Texture Fill-rate | 72.1 Gigatexels/s | 109.2 Gigatexels/s | 144.1 Gigatexels/s | 176 Gigatexels/s | 192 Gigatexels/s |
Memory Clock | 7010 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 7000 MHz |
Memory Bandwidth | 112.16 GB/s | 224 GB/s | 224 GB/s | 336.5 GB/s | 336.5 GB/s |
Bus Width | 128-bit | 256-bit | 256-bit | 384-bit | 384-bit |
ROPs | 32 | 56 | 64 | 96 | 96 |
Manufacturing Process | 28nm | 28nm | 28nm | 28nm | 28nm |
TDP | 120 watts | 145 watts | 165 watts | 250 watts | 250 watts |
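The fill-rate and bandwidth rows above are derived figures rather than independent specs; as a quick sanity check, here is a minimal Python sketch using the table's own numbers (the helper names are ours, not Nvidia's):

```python
# Recompute two derived rows of the spec table from first principles.

def texture_fillrate_gtps(base_clock_mhz: float, texture_units: int) -> float:
    """Texture fill-rate in Gigatexels/s: base clock (GHz) x texture units."""
    return base_clock_mhz / 1000.0 * texture_units

def memory_bandwidth_gbs(bus_width_bits: int, effective_gbps: float) -> float:
    """Memory bandwidth in GB/s: bus width in bytes x effective data rate."""
    return bus_width_bits / 8 * effective_gbps

# GTX 980 Ti: 1000 MHz base clock, 176 texture units, 384-bit bus, 7 Gbps GDDR5
print(texture_fillrate_gtps(1000, 176))  # 176.0 - matches the table
print(memory_bandwidth_gbs(384, 7.0))    # 336.0 - the table's 336.5 GB/s uses
                                         # the exact 7,012 Mbps effective rate
```

The same two formulas reproduce every card in the table, e.g. the GTX 960's 112.16 GB/s from its 128-bit bus and 7,010 MHz memory.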
The Nvidia GTX 980 Ti ships with 2816 CUDA cores and 22 SM units. The memory subsystem consists of six 64-bit memory controllers (384-bit in total) paired with 6GB of GDDR5 memory.
The Asus Strix Gaming GTX 980 Ti DirectCU III has received a clock boost over Nvidia's reference card, with final speeds set at 1,216 MHz (core) / 1,317 MHz (boost). The memory is also overclocked, from default speeds of 1,753 MHz (7 Gbps effective) to 1,800 MHz (7.2 Gbps effective) – this is actually one of the few GTX 980 Ti cards we have reviewed to receive a memory boost.
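GDDR5 moves data at four times its command clock, which is where those 'effective' speeds come from; the bandwidth gain from the factory memory overclock can be worked out directly (a rough sketch – the function names are ours):

```python
# GDDR5 is quad-pumped: effective data rate = 4 x the memory command clock.

def effective_gbps(command_clock_mhz: float) -> float:
    """Effective GDDR5 data rate in Gbps from the command clock in MHz."""
    return command_clock_mhz * 4 / 1000.0

def bandwidth_gbs(bus_width_bits: int, command_clock_mhz: float) -> float:
    """Total memory bandwidth in GB/s over the given bus width."""
    return bus_width_bits / 8 * effective_gbps(command_clock_mhz)

stock = bandwidth_gbs(384, 1753)  # ~336.6 GB/s at 7.012 Gbps effective
oc = bandwidth_gbs(384, 1800)     # 345.6 GB/s at 7.2 Gbps effective
print(f"{stock:.1f} GB/s -> {oc:.1f} GB/s ({(oc / stock - 1) * 100:.1f}% uplift)")
```

So the 1,753 MHz to 1,800 MHz factory overclock is worth roughly 2.7% more raw memory bandwidth over the reference card.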
still laughing at the r9 295×2 performance on the Witcher 3 hahahahahahahahaha (sorry).
Well I suspect that if AMD were to ever work out their driver-level Crossfire support for W3, the 295×2 would likely at least trade blows with the Titan Z. As it is, it looks just like I would expect a single reference 290X to look, because as far as W3 is concerned, that’s what it is.
It’s an Nvidia showcase title, I don’t expect any better.
Amazing card.
Good deal faster than Fury X on 4K, smokes it on 1440p which is my preferred resolution, good price, premium quality components used throughout the card, more quiet than Fury X.
Good test KitGuru.
They sacrificed some temps to make it more quiet. I fully support that !
i just agree with kitguru.
it’s a dual chip card (so basically crossfire) and it can’t even run past 45 fps at 1440p – that is piss-poor performance, even their single chip cards beat it
i am getting this and a hyper 612 pwm to upgrade my currently crippled rig
i7 2600
GTX 560 from msi that broke, so a gt 730 1gb ddr3 64-bit from msi is the temp gpu
16GB ram
TBs of HDD
etc
having put up with infernal stock and laptop coolers running at 6000 rpm (100 rps, or 10 ms/rev)
i won’t mind the extra fan speed for cooling… 3000 rpm seems to be the max i can tolerate from a non-optimised fan design, but with this i think i can go to hell with the fans… although 2-3000 rpm at max should be enough – that should be like 50 degrees, right?
SIDENOTE: because i have a micro atx mb it does mean the graphics card can only use the top pci slot so it means the 612 would almost touch the asus but hey more cooling right…
cool the cpu and backside of the gpu with 1 fan (+case)
in another review the temps were 80-ish ‘stock’ and 70-ish oc with a more aggressive fan
so that means it would be well under 65 with ‘stock’ clocks and more fan speed
40 fps @1440p is disastrous for that card. AMD are slowly killing themselves because of poor drivers and optimisation. If you look at Tomb Raider, which is an AMD showcase title (TressFX’s first outing), Nvidia perform very well there. If Nvidia can get their cards to perform well on AMD-biased titles, then why can’t AMD do the same on Nvidia-biased titles?
Because tech like TressFX is made open, deliberately, by AMD. When Tomb Raider came out and debuted TressFX, for a week or two Nvidia fans were screaming and moaning that it didn’t work properly on their cards, until Nvidia fixed it in drivers, and lo and behold, it suddenly worked BETTER on Nvidia cards. It was easy for them because TressFX is open, Nvidia picked up the base code and fixed it up in their drivers.
AMD can’t do that with Nvidia-biased titles because Gameworks features (such as Hairworks) are a black-box. Nvidia doesn’t open that stuff up, they lock it up.
By the way, the new Catalyst driver adds Crossfire support for Witcher 3, so I would expect the R9-295×2 to start kicking ass again at that game.
I’m definitely getting two of these. Nice review and great looking/performing card.
Either AMD need to start shutting Nvidia out or Nvidia need to start sharing more……it is unfair on the gamers.
This is what AMD fans have been saying for a while now. AMD simply can’t afford to try to shut Nvidia out, even if they wanted to – one failed attempt could be disastrously expensive for them. And their open approach tends to benefit all gamers when it’s successful. Nvidia on the other hand has piles of money to spend when Huang’s not swimming around in it like Scrooge McDuck, and a veritable army of devotees who would rather a feature not exist if it’s not an Nvidia exclusive. In my opinion, TressFX is superior to HairWorks (which is just “tessellate the **** out of it” written into code) and is advantageous because it works really, really well on all platforms, but Nvidia has 75% (ish) of the market, so game companies tend to do (and use) what they say.
why 4 gigabytes on the gpu? the gtx 980 ti should get 6 gigabytes, right?
Yes 1440p is great – I have a triple 1440p setup – Dual 980ti Strixii 🙂
Me too – They are in the post (hopefully)