
AMD’s Arctic Islands GPUs to offer double the performance per watt of Fiji

Next year is going to be an interesting one for graphics cards: Nvidia has Pascal up its sleeve, and AMD is expected to launch its new Arctic Islands GPUs, which will apparently offer double the performance per watt of the current Fiji graphics cards. Fiji was already a substantial step forward of its own, with Fury offering 1.5x the performance per watt of the R9 290X.

The Arctic Islands claim was made by none other than AMD CEO Lisa Su in last week's earnings call: “We are also focused on delivering our next generation GPUs in 2016 which is going to improve performance per watt by two times compared to our current offerings, based on design and architectural enhancements as well as advanced FinFET process technology.”
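Taken at face value, those two figures compound: if Fury offers roughly 1.5x the performance per watt of the R9 290X and Arctic Islands doubles Fiji's efficiency, the new flagship would land at roughly 1.5 × 2 = 3x the R9 290X's performance per watt. That said, Su's “current offerings” wording leaves some room for interpretation as to the exact baseline.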

[Image: AMD Radeon Fiji GPU]

We don't know a great deal about the Arctic Islands series of GPUs at this point, though the flagship is apparently codenamed ‘Greenland’ and will feature HBM2, bringing 1TB/s of memory bandwidth. Rumours also point to 18 billion transistors on next year's flagship card from AMD, a big jump compared to the 8.9 billion currently found on Fiji GPUs.

We have reported on Greenland in the past, and it is currently rumoured that AMD has already taped the chip out, though we don't know when to expect more official information.


KitGuru Says: We're hearing big promises from both Nvidia and AMD about next year's GPUs. It will be interesting to see how things stack up when the new architectures roll in. Are any of you waiting on next year's GPU releases before upgrading? What sort of performance are you hoping for? Personally, I'd like to see single GPUs finally able to handle high frame rates at 4K.



23 comments

  1. Ooh, this looks promising! They showed that HBM was quite an efficiency improvement, so HBM2… dang, can't wait to see the results!

  2. valgarlienheart .

    I keep getting tempted to get a Fury, but the sensible option is to wait until next year.

  3. I can't wait to see what Nvidia will show us with Pascal, A72/Denver and 16nm FinFET+ for Tegra.
    And I'm even more interested in what Volta, 10nm, LPDDR5, Wide I/O 3 and UFS 3 will give us.

  4. I have always bought Nvidia since the Riva TNT, but competition is always good for consumers.

    Also, I would like to see other players enter the gaming GPU industry: Intel, Matrox, etc.

  5. N7 Manchild Elite

    AMDEAD

  6. Very sensible. Next year won't be your standard 20-30% jump; it will be 50%+. That's what we need as a minimum for 4K gaming, although 1440p is probably going to be the standard for the next few years, until budget cards can handle 4K at 60fps.

  7. 18 billion transistors will result in a big GPU for sure. Hope yields at TSMC's 16nm won't be an issue.

  8. 4K at over 60Hz is what's required, but the hardware isn't there yet. I'd love to see it, though it's going to take some time for 4K to become available in every house.

  9. Note she says “current offerings”, not Fiji specifically. That means 2x perf/watt could be measured against the R9 390X, which, although not power hungry and performing quite nicely, is nowhere near the perf/watt of the Fury series.

  10. Fury is not a current offering?

  11. True, I may have just misread the quote to mean something else :P. I read it as “up to 2x perf/watt”.

  12. The exact numbers are not clear yet. Depending on the focus, the upcoming cards could be much faster, consume less power, or a mix of both. Performance per watt is a bit misleading as a metric, because the figure can be calculated in so many ways.

    Both AMD and Nvidia have shown their ‘creative calculating’ with such numbers. We should wait till the actual cards are released. I agree with your prediction for resolutions; my personal favourite is WQHD at 30 to 34 inches. I will aim for a 21:9 monitor in 2016. Those 3440×1440 curved displays should be manageable with the new generation of GPUs from both vendors.

  13. girls make a shitload these days with those cam sites, bet you have a quad titan-x running in your system with your hourly strip-… paycheck

  14. I also bought only Nvidia from the Riva days, but I switched to AMD with the 3870 and have stuck with them since. Part of the reason was that I didn't see enough people buying ATi/AMD to fund them well enough to keep Nvidia honest.

    I highly recommend that anyone who can buy AMD stuff do so, as Nvidia having over 75% of new GPU sales is not healthy for competition. Don't rely on other people to do it; they won't, which is why, after being Nvidia-only for over 10 years, I'm now mostly AMD-only.

  15. The really ironic thing about all the N7s knocking AMD is that EA (the makers of Mass Effect) love AMD and were big pushers of Mantle and GCN, and the N7 colours are AMD's, right down to the very N7-looking Fury reference cards.

  16. That's just stupid. You buy what you think is good and worth your money. I'm not saying ATi/AMD is not good, but it is up to them to create products that people will buy. Lower the power consumption and temps and they'd be as good as Nvidia, if not better.

  17. valgarlienheart .

    I agree, I feel the same way. AMD do so much good for the industry with open standards and such that I want to support them, while Nvidia have shoddy practices at best. That's not to say I won't buy Nvidia products (I have a 970 in the GF's machine), but if the situation allows I'll get an AMD. Also, I won't buy another Nvidia product unless they start supporting FreeSync.

    Nvidia are just out to separate people from their hard-earned cash by any means necessary.

  18. With over 80% of all high-performance cards sold being NV, I don't see how that is even relevant anymore. AMD is going the way of the dodo; they are half-dead already, so why bother?

  19. I buy what I need. Unfortunately, every Autodesk and Adobe package uses CUDA for 3D physics/video/image-processing acceleration. If software houses used more OpenCL, I would go AMD all the way, since their cards are cheaper and really not that bad at GPU computing.

  20. Damn, that's crazy considering the GeForce 5 and 7 series were such garbage, and from the HD4800 through the HD7900 series AMD mopped the floor with desktop NV cards in price/performance, while also making money with crypto-currency. Even if you were only gaming, how in the world was it possible to buy a GeForce 4 or 5 during the 9500 Pro/9700 Pro/9800 Pro/9800 XT era? You need to evaluate your GPU purchase from scratch every new generation; that's the only way to keep competition alive. Buying blind based on brand name isn't going to provide competition.

  21. You assume AMD will never make good products worth buying? That is already false right now: aside from the 980 Ti/Titan X, AMD offers better value for the money in desktop gaming than NV. Since most gamers don't buy $650 cards, having AMD provide better graphics cards in the $100-500 space is vital for the rest of the market.

    If it wasn't for the 980 Ti's stellar overclocking, this would have been one of the closest generations in a long time. Let's not forget the insane pounding AMD gave NV when NV had to drop prices on the GTX 280 and 780 to fight the HD4870 and R9 290/290X. We need competition, and if you just want to blindly buy NV every generation based on brand name rather than the tech, that's 100% your choice, but a lot of gamers don't want to pay extra for similar FPS simply for NV's marketing image. Once I put a GPU inside my system, I don't care if it's NV or AMD. It puts out X FPS and it cost me Y.

  22. Correct me if I'm wrong, but AMD's GPUs are far superior in professional software compared to Nvidia cards (CUDA aside).
    The lack of OpenCL support, however, is disturbing, but not surprising when you take into account that Nvidia bribes big companies to write their code exclusively for CUDA… and most companies just go with the cash offers (sadly) and don't want to bother with more open-source possibilities.

    AMD, however, recently presented an option that should make it simpler for developers to get code written for CUDA running on AMD hardware as well.

  23. Still won't be enough for 4K; I'll be sticking with 1080p 120Hz for the foreseeable future. Might go 1440p next. 4K means you need two GPUs, and with the issues multi-GPU setups have, I don't think I want to do that. Plus, I prefer a high refresh rate.