
Visiontek Radeon R9 Fury X 4GB Review

The R9 Fury X 4GB is a positive move forward for AMD, as it demonstrates their ability to produce small, powerful graphics cards – able to fit inside very compact cases and system builds. HBM certainly seems to be tightly ingrained in the future of graphics cards, even though this first iteration ships with a rather major caveat – the 4GB limit of the stacked memory.

Right now, in the real world, a 4GB frame buffer isn't really an obstacle to playable frame rates at 4K. I did notice many of our readers on Facebook were holding back from a GTX 980 purchase, claiming that 4GB isn't future-proof. When the GTX 980 Ti launched, that extra 2GB of memory seemed to appease a wide audience. AMD, and the Fury X, are likely to suffer from the same ‘4GB is just not enough' mindset.

Within the last week I have reviewed the AMD R9 390 (review HERE) and R9 390X (review HERE), both of which ship with 8GB of GDDR5 memory. While undoubtedly reassuring on paper, it is clear to us that the 390 series Hawaii core runs out of horsepower at 4K before the 8GB of memory can be fully utilised.

While a head to head battle is going to be very game specific, we can see that the R9 Fury X struggles to compete directly against Nvidia's GTX 980 Ti. At 1440p the performance differential is heavily in favour of the GTX 980 Ti, but at Ultra HD 4K the gap tightens considerably.

Most of the time the GTX 980 Ti claims the top position, although the Fury X delivers fantastic results in The Witcher 3, Grand Theft Auto V and Metro Last Light Redux at 4K, and when manually overclocked it did manage to outperform the reference GTX 980 Ti in Metro Last Light.

Unfortunately for AMD, the reference GTX 980 Ti is only part of the problem. There are many modified partner cards from MSI, Gigabyte and ASUS that bring substantial clock increases to the table, as well as superlative cooling systems. Our results showed that the monster Gigabyte GTX 980 Ti G1 Gaming leaves the R9 Fury X for dead, even when the AMD card is manually tweaked to its absolute limit.

Overclocking results from the R9 Fury X are a little disappointing. I had hoped that incorporating a liquid cooling system would pay dividends when pushing the card past reference speeds. I could only manage an extra 80MHz from the Fiji core, and while a software bug allowed the memory sliders to be adjusted, doing so immediately caused blue screens on our test system.
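To put that figure in context, here is a quick back-of-envelope sketch of the headroom involved, assuming the Fury X's published 1050MHz reference core clock (a figure not restated in this review):

```python
# Rough overclocking headroom check for the Fury X result described above.
# Assumes the published 1050MHz reference core clock; the +80MHz offset is
# the maximum stable increase we achieved in testing.
reference_clock_mhz = 1050
stable_offset_mhz = 80

overclocked_mhz = reference_clock_mhz + stable_offset_mhz
headroom_pct = 100 * stable_offset_mhz / reference_clock_mhz

print(f"{overclocked_mhz}MHz core, {headroom_pct:.1f}% over reference")
# -> 1130MHz core, 7.6% over reference
```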

One of my biggest concerns after testing the Fury X relates to noise. There is no doubt that the Cooler Master ‘all in one' solution is proficient, and the fan has been designed to emit very little noise, even when loaded. Most of the noise our meter picked up was actually emitted by the pump, along with related coil whine. As we detailed (with links) on the acoustics page, this doesn't seem to be an isolated case either, which does raise some concerns. While we have heard that some early samples given to the press were at fault, our own sample is from the retail production line.

When I factor everything in, it is exciting to see AMD becoming competitive again in some areas. The R9 Fury X is not the GTX 980 Ti killer that many people had expected, and I do have some rather worrying concerns about the cooling system and coil whine. We are also disappointed to see that AMD have decided NOT to incorporate HDMI 2.0 support into the Fury X – meaning many people gaming on large televisions are stuck with 30Hz at 4K. This is far from ideal.

The custom-designed Nvidia GTX 980 Ti partner cards remain our recommended high-end choice.

I asked one of KitGuru's Senior Editors (Motherboards and Processors), Luke Hill, to give a second opinion on the R9 Fury X:

While AMD's newest single-GPU flagship is not the GTX 980 Ti-killer many were hoping for, it does manage to restore the red team's competitiveness in the high-end GPU market. I feel that some of the technological innovations, especially the first widespread application of HBM, and continued use (leading on from the R9 295 X2) of an AIO water-cooling unit are positive points which lay foundations for the future of graphics cards.

As a fan of the benefits that monitors with synchronising technologies (G-Sync and FreeSync) add to the gaming experience, I feel that the R9 Fury X deserves some credit in this department. Many current 1440p FreeSync monitors seem to be hitting a sweet-spot with gamers, and some are considerably cheaper than similar G-Sync offerings. With many of these monitors featuring a lower boundary of 35 or 40 FPS for FreeSync to operate, the R9 Fury X seems to be AMD's first single-GPU card that can consistently push minimum frame rates above those 35/40 FPS limits in many of today's games at their highest image settings. Many will deem that a positive outcome of AMD's work.

However, from a raw performance standpoint, the R9 Fury X's performance forces it into a position where the card needs to undercut Nvidia's GTX 980 Ti on price. A quick look at OverclockersUK shows a ~5% price benefit for the cheapest R9 Fury X against the cheapest GTX 980 Ti, which happens to be a reference model. That slight saving and the benefit of a water-cooling system may be enough to persuade some buyers, although it's quite an ask given the GTX 980 Ti's overall performance lead. Once we enter the £550+ region, AMD's flagship and its limited overclocking potential will struggle to compete against heavily-overclocked, custom-cooled GTX 980 Ti boards.

My colleague Allan and I both agree that if AMD can drop the price closer to £449.99 inc. VAT and fix the coil whine and pump noise, the R9 Fury X has a logical place in the market.

You can buy AMD R9 Fury X partner cards from Overclockers UK HERE. Prices currently range between £509.99 and £649.99 inc. VAT.
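As a quick sanity check on the price cut we suggest above, here is a sketch of the arithmetic using only figures quoted on this page (the cheapest £509.99 listing and our suggested £449.99 target, both inc. VAT):

```python
# Implied discount if the cheapest R9 Fury X listing dropped to our
# suggested target price; both figures are taken from this article.
cheapest_listing_gbp = 509.99
suggested_target_gbp = 449.99

cut_gbp = cheapest_listing_gbp - suggested_target_gbp
cut_pct = 100 * cut_gbp / cheapest_listing_gbp

print(f"£{cut_gbp:.2f} cut, roughly {cut_pct:.0f}% off the cheapest listing")
# -> £60.00 cut, roughly 12% off the cheapest listing
```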

Our American readers can get the Visiontek R9 Fury X 4GB sample from Amazon in the US for $682.17 with free shipping HERE.

Discuss on our Facebook page, over HERE.

Pros:

  • runs cool under load.
  • super quiet fan.
  • physically small, ideal for specific system builds.
  • HBM is likely the future of GPU design.
  • AMD's greatest single GPU card yet.
  • strong 4K performance.

Cons:

  • pump noise.
  • radiator takes up more space in case.
  • coil whine is intrusive.
  • minimal overclocking potential at this stage.
  • the GTX 980 Ti is still the faster board.
  • struggles to compete at 1440p.
  • no HDMI 2.0 support.

KitGuru says: The R9 Fury X is a daring first step for AMD into the world of HBM. While on paper the card looks to be ‘all dominating', in the real world it struggles to compete against the Nvidia GTX 980 Ti. We also have some concerns over pump noise and associated coil whine.
WORTH CONSIDERING


Rating: 7.5.


36 comments

  1. There is still a part of me that wonders: if they had just used regular memory instead of HBM, would it have been cheaper?

  2. AMD have recently said that they are working on the coil whine, as it has been noted to be a bit of an issue – if I find the article I read I'll post the link.

  3. Thank you for a well-written article with great information.

  4. What I learned: 980 Ti > Fury X > 980.

    Can't wait for DX12!

  5. That way you would have ended up with another 290X. And we already have plenty of those, don't we?
    The fact is, the Fury X is a very exciting piece of engineering. I have personally had two of them for a week now, and I am yet to even begin to fiddle with everything they can offer 😉

  6. I think if its price was cut to $600 it would be a great option… I believe AMD didn't do that in the first place purely out of pride.

  7. As always KitGuru, thanks for another great review.

    Fury seems to be hitting the 4GB limit and stuttering to unplayable levels. The 4GB Fury just doesn't have enough memory for AAA gaming now, and it sure isn't enough to be called a future-proof 4K gaming flagship.

    No 28nm GPU needed HBM1's bandwidth paired with a measly 4GB of VRAM; the HBM bandwidth is wasted and the 4GB of VRAM isn't worthy of 4K gaming.

    Gamers are much better off with a GTX 980 Ti, especially a heavily-overclocked, custom-cooled one from one of Nvidia's many partners.

    https://www.youtube.com/watch?t=226&v=8hnuj1OZAJs

  8. That was a very good read. I will always have a soft spot for AMD, but the R9 Fury X, try as it might, just doesn't quite live up to the hype or expectations. With Nvidia pretty much showing their full hand in March, it's nothing short of disappointing to see the performance crown remain with Nvidia three months down the line despite AMD's best offering. If you're still gaming at 1080p or 1440p there really is only one winner here.

  9. Still going to stick with AMD; not going to give my money to a company that has to pay off game devs to make their cards look better.

  10. Fury is essentially a double-sized Tonga chip with a tweak (hence the HDMI 1.4a and DP 1.2a). I think with regular GDDR5 it wouldn't have been fast enough to compete.

  11. LOL, AMD paid millions to developers to use Mantle and make their cards look better.

    You're an AMDuped hypocrite who doesn't know what the hell he's talking about.

  12. At 4K, AMD Fury X CF won the prize! It left Nvidia Titan X SLI in the dust – read the new reviews from Digitalstorm.com and teaktown.com.

  13. How many games use Mantle compared to Nvidia Gameworks again?

  14. racistmalaysian

    Mantle is open. GameWorks is closed.

  15. Stupid question AMD hypocrite, go buy a watt sucking AMD Rebrandeon and figure out why your question is stupid.

  16. Not a VRAM problem when other cards with 4GB are not stuttering at 4k.

    http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Grand-Theft-Auto-V

    And apparently nvidia are running lower IQ settings, at least in BF4.

    http://forums.overclockers.co.uk/showthread.php?t=18679713

  17. Who are you trying to fool, AMD tool.

    Mantle was a bug-infested beta AMD API and never open.

    GameWorks is a set of development tools that Nvidia invested in and owns – why should they share it with AMD? Nobody is stopping lazy AMD from making their own.

  18. racistmalaysian

    Dx12 = mantle. AMD is not lazy. The intention is Mantle to be open sourced.

  19. great review, well worth the wait – too many fud reviews up for this card on launch day. Always trust KitGuru to be honest. Thanks – JS.

  20. A great read this morning, thanks Allan. I love the methodology and that all the tests are on the same graphics driver. A few of the reviews on launch day had poor test methods, with GPUs using 3 or 4 different drivers! And a few, like TTL, seemed more like a pat on the back for AMD, with many contradictions throughout, rather than a proper review – quite easy to spot. Always read KitGuru for honesty, and this just shows you cut through the crap to get to the facts. Well done.

  21. Francisco Andrés

    They already have a v2 watercooled version out there in the wild. You can compare them by the sticker they have. The v1 cooler is the one used here and the v2 is a silver/chromatic logo.

    Cheers!

  22. Francisco Andrés

    Have a link with that statement? Because anyone could say that nVidia has paid even more since the times of TWIMTBP titles vs GameEvolved from AMD.

    No matter how you slice it, nVidia has burned more dough on “”””””helping”””””” developers than AMD.

    Also, we love blanket statements, don’t we? 🙂

    Cheers!

  23. Can’t compare other cards with HBM1, apparently Fury isn’t able to use all its 4GB of VRAM and becomes a Stuttering mess. Watch the videos Fury is a FUBAR Flagship that needed more VRAM.

  24. Maybe dx12 will utilize HBM a lot better. I struggle to believe AMD have released a useless card lol. Maybe they have who knows. I am waiting for the strix 980 ti sli. I think AMD are no longer the best option for low prices, it seems NVIDIA have taken that crown along with the best performers. It seems AMD is a pointless option for CPU and GPU

  25. That what AMD Fanatics say, that’s NOT what MS says. The intention of AMD GCN only mantle was to gain an advantage for their watt sucking ReBrandeons before DX12 which will float all GPU boats. After wasting millions on over pumped mantle debt laden AMD dumped their mantle for NOTHING, ZERO, ZILCH, NIX, NIL.

  26. maybe

  27. So you really have no argument, I see. Glad to know.

  28. “racistmalaysian 18 hours ago
    Dx12 = mantle. AMD is not lazy. The intention is Mantle to be open sourced.”

    your assumptions are out of date and wrong, unfortunately; however, you can write patches and submit them for upstream Vulkan inclusion…

    care of Ryan Smith on July 2, 2015

    http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/12

    “….I wanted to quickly touch upon the state of Mantle now that AMD has given us a bit more insight into what’s going on.

    With the Vulkan project having inherited and extended Mantle, Mantle’s external development is at an end for AMD. AMD has already told us in the past that they are essentially taking it back inside, and will be using it as a platform for testing future API developments.

    Externally then AMD has now thrown all of their weight behind Vulkan and DirectX 12, telling developers that future games should use those APIs and not Mantle.

    In the meantime there is the question of what happens to existing Mantle games. So far there are about half a dozen games that support the API, and for these games Mantle is the only low-level API available to them. Should Mantle disappear, then these games would no longer be able to render at such a low-level.

    The situation then is that in discussing the performance results of the R9 Fury X with Mantle, AMD has confirmed that while they are not outright dropping Mantle support, they have ceased all further Mantle optimization.

    Of particular note, the Mantle driver has not been optimized at all for GCN 1.2, which includes not just R9 Fury X, but R9 285, R9 380, and the Carrizo APU as well.

    Mantle titles will probably still work on these products – and for the record we can’t get Civilization: Beyond Earth to play nicely with the R9 285 via Mantle – but performance is another matter.

    Mantle is essentially deprecated at this point, and while AMD isn’t going out of their way to break backwards compatibility they aren’t going to put resources into helping it either. The experiment that is Mantle has come to an end…..”

  29. KitGURU please make ********** WIN 10 benchmarks!!! Worst review ever. So useless.

  30. Sure.
    great review – are you blind?!
    They did benchmarks on Win 7!.. who cares about Windows 7… so useless. The only review where I see Fury X behind the GTX 980 Ti in Witcher 3 and Metro.

    Do you even use win 7?!

  31. Win7 tells you everything… who uses Win7 for gaming?
    Why so bad at 1440p – how can DX11 on Win7 feed 4096 GCN cores? It can't… just please redo your bad review on Win10.

  32. DX12 won’t do anything extra. All it’s meant to do is give deeper access levels and remove a lot of “going through the CPU” for programs. DX12 is not some kind of magical “fix all” thing. All it is going to do is reduce CPU load and INCREASE GPU LOAD (temps, TDP, etc as well) for games that will be coded in is. THAT’S IT. Maybe there’ll be some extra tech, but aside from that… nothing.


  34. Now I see why AMD didn’t give you a sample. Garbage review. Only 5 games benched? Not a single win for Fury X? Yeah, whatever.


  36. 12 year old kids on the internet. Ugh.