Tomb Raider received much acclaim from critics, who praised the graphics, the gameplay and Camilla Luddington’s performance as Lara, with many agreeing that the game is a solid and much-needed reboot of the franchise. Much criticism went to the addition of multiplayer, which many felt was unnecessary. Tomb Raider went on to sell one million copies within forty-eight hours of its release, and has sold 3.4 million copies worldwide so far. (Wikipedia)
We test at 3840×2160 (4K) with the ‘ULTIMATE’ image profile selected. We normally reduce the image quality profile to ‘ULTRA’ at this resolution, but we decided to keep it at the highest image quality possible.
At Ultra HD 4K, the differential between the reference GTX 980 Ti and the Fury X is almost completely negated. The Fury X responds well to overclocking, pushing the final performance figures to Titan X levels.
There is still a part of me that wonders: if they had just used regular memory instead of HBM, would it have been cheaper?
AMD have recently said that they are working on the coil whine, as it has been noted to be a bit of an issue – if I find the article I read, I’ll post the link.
Thank you for a well-written article with great information.
What I learned: 980 Ti > Fury X > 980.
Can’t wait for DX12!
That way you would have ended up with another 290X. And we already have plenty of those, don’t we?
The fact is the Fury X is a very exciting piece of engineering. I have personally had two of them for a week now, and I have yet to even begin to fiddle with everything they can offer 😉
I think if its price were cut to $600 it would be a great option. I believe AMD didn’t do that in the first place due to pride alone.
As always, KitGuru, thanks for another great review.
Fury seems to be hitting the 4GB limit and stuttering to unplayable levels. 4GB just isn’t enough memory for AAA gaming now, and it certainly isn’t enough for a flagship card billed as a future-proof 4K gaming solution.
No 28nm GPU needed HBM1’s bandwidth paired with its measly 4GB of VRAM; the HBM bandwidth is wasted, and 4GB of VRAM isn’t worthy of 4K gaming.
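For context on where 4GB actually goes at 4K, a back-of-the-envelope sketch (my own illustrative figures, not measurements from the review): the render targets themselves are surprisingly small, so it is high-resolution texture assets and streaming that push some titles past 4GB, not the framebuffer.

```python
# Back-of-the-envelope VRAM arithmetic (illustrative, not measured).
# A 4K render target is tiny relative to a 4GB card; textures dominate.

def mb(num_bytes):
    """Convert bytes to mebibytes."""
    return num_bytes / (1024 ** 2)

width, height = 3840, 2160          # Ultra HD 4K
bytes_per_pixel = 4                 # 32-bit RGBA

framebuffer = width * height * bytes_per_pixel
print(f"one 4K colour buffer: {mb(framebuffer):.0f} MB")   # ~32 MB

# Even a deferred renderer with, hypothetically, five 4K-sized targets
# (G-buffer + depth) stays well under a quarter of a gigabyte:
gbuffer_total = framebuffer * 5
print(f"five 4K targets: {mb(gbuffer_total):.0f} MB")      # ~158 MB
```

The remaining gigabytes are texture pools, meshes and streaming caches, which is why "enough VRAM for 4K" depends far more on a game's asset quality settings than on the output resolution itself.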
Gamers are much better off with a GTX 980 Ti, especially a heavily overclocked, custom-cooled one from one of Nvidia’s many partners.
https://www.youtube.com/watch?t=226&v=8hnuj1OZAJs
That was a very good read. I will always have a soft spot for AMD, but the R9 Fury X, try as it might, just doesn’t quite live up to the hype or expectations. With Nvidia pretty much showing their full hand in March, it’s nothing short of disappointing to see the performance crown remain with Nvidia three months down the line despite AMD’s best offering. If you’re still gaming at 1080p and 1440p, there really is only one winner here.
Still going to stick with AMD; I’m not going to give my money to a company that has to pay off game devs to make their cards look better.
Fury is essentially a double-sized Tonga chip with a tweak (hence the HDMI 1.4a and DP 1.2a). I think that with regular GDDR5 it wouldn’t be fast enough to compete.
LOL, AMD paid millions to developers to use Mantle and make their cards look better.
You’re an AMDuped hypocrite who doesn’t know what the hell he’s talking about.
At 4K, AMD Fury X CF won the prize! It left Nvidia Titan X SLI in the dust – read the new reviews from DigitalStorm.com and TweakTown.com.
How many games use Mantle compared to Nvidia GameWorks again?
Mantle is open. GameWorks is closed.
Stupid question, AMD hypocrite – go buy a watt-sucking AMD Rebrandeon and figure out why your question is stupid.
Not a VRAM problem when other cards with 4GB are not stuttering at 4K.
http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Grand-Theft-Auto-V
And apparently Nvidia are running lower IQ settings, at least in BF4.
http://forums.overclockers.co.uk/showthread.php?t=18679713
Who are you trying to fool, AMD tool?
Mantle was a bug-infested beta AMD API and was never open.
GameWorks is a set of development tools that Nvidia invested in and owns. Why should they share it with AMD? Nobody is stopping lazy AMD from making their own.
DX12 = Mantle. AMD is not lazy. The intention is for Mantle to be open-sourced.
Great review, well worth the wait – too many FUD reviews up for this card on launch day. Always trust KitGuru to be honest. Thanks – JS.
A great read this morning, thanks Allan. I love the methodology, and all the tests are on the same graphics driver. A few of the reviews on launch day had poor test methods, with GPUs tested on three or four different drivers! And a few, like TTL’s, seemed just like a pat on the back for AMD, with many contradictions throughout, rather than a proper review – quite easy to spot. I always read KitGuru for honesty, and this just shows you cut through the crap to get to the facts. Well done.
They already have a v2 water-cooled version out there in the wild. You can tell them apart by the sticker: the v1 cooler is the one used here, and the v2 has a silver/chromatic logo.
Cheers!
Have a link for that statement? Because anyone could say that Nvidia has paid even more, going back to the days of TWIMTBP titles versus AMD’s Gaming Evolved.
No matter how you slice it, Nvidia has burned more dough on “helping” developers than AMD.
Also, we love blanket statements, don’t we? 🙂
Cheers!
You can’t compare other cards with HBM1; apparently Fury isn’t able to use all of its 4GB of VRAM and becomes a stuttering mess. Watch the videos – Fury is a FUBAR flagship that needed more VRAM.
Maybe DX12 will utilise HBM a lot better. I struggle to believe AMD have released a useless card, lol. Maybe they have, who knows. I am waiting for Strix 980 Ti SLI. I think AMD are no longer the best option for low prices; it seems Nvidia have taken that crown along with the best performers. It seems AMD is a pointless option for both CPU and GPU.
That’s what AMD fanatics say; that’s NOT what MS says. The intention of AMD’s GCN-only Mantle was to gain an advantage for their watt-sucking ReBrandeons before DX12, which will float all GPU boats. After wasting millions on over-pumped Mantle, debt-laden AMD dumped it for NOTHING, ZERO, ZILCH, NIX, NIL.
Maybe.
So you really have no argument, I see. Glad to know.
“racistmalaysian, 18 hours ago: Dx12 = mantle. AMD is not lazy. The intention is Mantle to be open sourced.”
Your assumptions are out of date and wrong, unfortunately. However, you can write patches and submit them for upstream Vulkan inclusion…
Care of Ryan Smith, July 2, 2015:
http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/12
“….I wanted to quickly touch upon the state of Mantle now that AMD has given us a bit more insight into what’s going on.
With the Vulkan project having inherited and extended Mantle, Mantle’s external development is at an end for AMD. AMD has already told us in the past that they are essentially taking it back inside, and will be using it as a platform for testing future API developments.
Externally then AMD has now thrown all of their weight behind Vulkan and DirectX 12, telling developers that future games should use those APIs and not Mantle.
In the meantime there is the question of what happens to existing Mantle games. So far there are about half a dozen games that support the API, and for these games Mantle is the only low-level API available to them. Should Mantle disappear, then these games would no longer be able to render at such a low-level.
The situation then is that in discussing the performance results of the R9 Fury X with Mantle, AMD has confirmed that while they are not outright dropping Mantle support, they have ceased all further Mantle optimization.
Of particular note, the Mantle driver has not been optimized at all for GCN 1.2, which includes not just R9 Fury X, but R9 285, R9 380, and the Carrizo APU as well.
Mantle titles will probably still work on these products – and for the record we can’t get Civilization: Beyond Earth to play nicely with the R9 285 via Mantle – but performance is another matter.
Mantle is essentially deprecated at this point, and while AMD isn’t going out of their way to break backwards compatibility they aren’t going to put resources into helping it either. The experiment that is Mantle has come to an end…..”
KitGuru, please make ********** Win 10 benchmarks!!! Worst review ever. So useless.
Sure.
Great review – are you blind?!
They did benchmarks on Win 7! Who cares about Windows 7? So useless. This is the only review where I see the Fury X behind the GTX 980 Ti in Witcher 3 and Metro.
Do you even use Win 7?!
Win 7 tells you everything… who uses Win 7 for gaming?
Why so bad at 1440p – how can DX11 on Win 7 feed 4096 GCN cores? It can’t… please just redo your bad review on Win 10.
DX12 won’t do anything extra. All it’s meant to do is give deeper access levels and remove a lot of the “going through the CPU” overhead for programs. DX12 is not some kind of magical fix-all. All it is going to do is reduce CPU load and INCREASE GPU LOAD (temps, TDP, etc. as well) for games that are coded in it. THAT’S IT. Maybe there’ll be some extra tech, but aside from that… nothing.
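The comment above is roughly right about where the gains come from. Purely as an illustration – a toy Python model with made-up cost numbers, not anything from a real graphics driver – the shift is from per-draw driver validation on one thread to cheap, parallel command recording with state validated up front:

```python
# Toy model of API submission overhead (hypothetical cost units).
# "dx11_style": one thread, heavy per-draw driver work.
# "dx12_style": state validated up front, cheap recording spread over threads.

from concurrent.futures import ThreadPoolExecutor

DRAWS = 10_000

def dx11_style(draws):
    # The driver re-validates state on every single draw call.
    cost = 0
    for _ in range(draws):
        cost += 10          # 10 hypothetical units of per-draw overhead
    return cost

def dx12_style(draws, threads=4):
    # Validation done once in pipeline objects; recording costs 1 unit
    # per draw and can happen on several threads at once.
    def record(chunk):
        return sum(1 for _ in range(chunk))
    with ThreadPoolExecutor(threads) as pool:
        per_thread = draws // threads
        return sum(pool.map(record, [per_thread] * threads))

print(dx11_style(DRAWS))    # 100000 units of CPU-side work
print(dx12_style(DRAWS))    # 10000 units, recorded in parallel
```

The GPU still has to execute every draw, which is why lower driver overhead shifts the bottleneck onto the GPU (hence the higher temps and TDP mentioned above) rather than making anything free.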
Now I see why AMD didn’t give you a sample. Garbage review. Only 5 games benched? Not a single win for Fury X? Yeah, whatever.
12 year old kids on the internet. Ugh.