
AMD: Installed base of GCN GPUs is over 100 million units

Advanced Micro Devices has been using its latest GCN [Graphics Core Next] architecture for its graphics processing units for over three years now and will release at least two new GCN-based product families. Without a doubt, GCN is one of the most successful GPU technologies ever, and apparently one of the most widely used graphics processing architectures too. According to AMD, more than 100 million GPUs worldwide are GCN-based.

“With Graphics Core Next, we provided an architecture to make it easier […] for programmers to unlock the value of our graphics, both for visualization and compute,” said Mark Papermaster, chief technology officer at AMD, at the company’s financial analyst day. “Really a ground-breaking architecture!”

Different incarnations of AMD’s GCN architecture power Radeon graphics cards, accelerated processing units, various embedded application processors, as well as the high-performance systems-on-chip for Microsoft’s Xbox One and Sony’s PlayStation 4 game consoles. In fact, game consoles alone account for well over 30 million GCN-based graphics processors.

“And you can see the adoptions. It is, of course, our discrete graphics business, but it is much more than that,” added Mr. Papermaster. “It is in the game consoles, you see it in the Mac Pro, the iMac 5K, leading workstation designs, server installations. It is scalable, from mobile applications to high-performance computing applications. It has got the installed base of over 100 million for developers to leverage. This has been a very successful architecture!”


So far, AMD has released three iterations of its GCN architecture. Next month the company intends to unveil new GPUs based on the fourth-generation Graphics Core Next – GCN 1.3 – design. Sometime next year AMD plans to introduce graphics processing units based on the GCN 1.4 architecture, which are projected to deliver twice the performance-per-watt of today’s GPUs.


KitGuru Says: 100 million graphics processing units is clearly a lot. Imagine how many GCN GPUs AMD would have sold if its accelerated processing units were more competitive in general purpose workloads that rely on x86 performance…



20 comments

  1. You are slow, AMD, that’s the problem. Despite being in all the consoles, you are very slow.

  2. I wouldn’t say they are slow, except in performance CPUs. But who is fast all the time, really?

  3. Irishgamer Gamer

    Will say it again: does any gamer give two fecks about power consumption? It’s speed, noise and heat. I accept that more heat probably means more power, but it’s not the first, second or third question you ask yourself when buying a card.

  4. So nVidia knocking out their flagship cards based on the GK110 for almost 3 years is lightning fast?

  5. mclarenfan1968

    Quite amusing that the press have made up numbers like GCN 1.0, 1.1, 1.2, 1.3, 1.4 etc. There is no such nomenclature. It has always been GCN, GCN 2nd generation (GCN2), GCN 3rd generation (GCN3). To say 1.x would imply they are supplementary extensions to the first GCN, which they clearly are not; you only need to look at the ISA and uArch changes to know it.

  6. I, personally, never cared about power consumption; I’m all about power.

  7. It doesn’t really matter what you call it, but it’s common to put version numbers… your argument would mean they should name it 1.0.1 and 1.0.2 etc., but then you just always have a useless 0 in between, so it doesn’t seem practical.

  8. Also, a big change to GCN would mean it goes from 1.x to 2.0, so the numbering system used by “the press” seems to make the most sense.

  9. mclarenfan1968

    It’s not what makes sense to you that is important; it is what AMD themselves call it in their developer-facing ISA documentation. Plus, the changes are big.

  10. mclarenfan1968

    No, it does matter; the changes are big. Perhaps you are not aware of the extent of the changes and their implications when it comes to designing them in RTL.

  11. Performance and price are always the most important for me… but power consumption can really add up over time and cost a lot of money, so you should think about that too when looking at the price. Let’s say you’re a hardcore gamer (quite likely when you buy a high-end GPU) and play 8 hours a day (a demanding game where the GPU is easily at constant full load); then the difference between 400W and 200W means 584 kWh per year. Say you keep your GPU for 2 years, then it’s over 1,000 kWh. Depending on where you live, you might pay only $0.10 per kWh, but in Germany, for example, it’s around $0.30, so that would be roughly $300 in additional power cost (see the sketch after the comments).

  12. well, AMD themselves always call it GCN, so the press has to give it version numbers so it’s clear what they’re talking about

  13. mclarenfan1968

    You don’t seem to read what is posted. AMD have a scheme: it’s GCN, GCN2 and GCN3, very clear to anyone who reads with attention. The press makes up its own numbers due to lack of knowledge or pure laziness, and the rest merely parrot it to imply the changes are minor when they are actually significant.

    Here http://amd-dev.wpengine.netdna-cdn.com/wordpress/media/2013/07/AMD_GCN3_Instruction_Set_Architecture.pdf

  14. The difference is that you’re talking about developer content, while I’m talking about marketing content for consumers… AMD only uses the term GCN for consumers and then lists its features (like TrueAudio: yes or no).

    but anyway, yes, the press could take the same naming scheme that is used for developers instead of taking the marketing scheme and extending it themselves..

  15. Well, it is the job of the “press” (such as this) to make AMD look marginal and Intel look larger than life. Just scan the ads on this page and see how many of them do NOT have the Intel logo. Now ask yourself how this site and many others like it can survive without “sponsorship”. It is no wonder most of the tech rags have been bought out by Microsoft and Intel and constantly spew nonsense to their advantage.

  16. Same here. If a customer is willing to shell out $600 for a video card, why would he be concerned that the card will increase the power bill by $2.00? I am very, very happy with my R9 290X.


  17. Someone who games 8 hours a day is not a hardcore gamer but an addict in his mother’s basement, without friends, a full-time job, a social life, or responsibilities outside of gaming!

    I think power usage comes into play if both cards are similar in features and performance. If I can buy a card that uses 50-100W less power and both are otherwise similar, I’ll take the more power-efficient one. But in the US, when the R9 290/290X sells for $220-280 and a GTX 970 is $320, the choice becomes much harder. I would also take a 500W R9 295X2 for $600 over a GTX 980 for $550, because the former is 60% faster.

  18. What? The R9 290X is as fast as GK110, for $150 less. You might want to look up reviews from the last 12 months. Today a $280-300 R9 290X means anyone who purchased a $650 780 or a $700 780 Ti overspent/wasted money.
    https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/29.html

    Where AMD majorly dropped the ball was its brand image: shipping reference coolers on 7970/290-series cards instead of sending open-air-cooled, cool and quiet, factory pre-overclocked cards to reviewers like NV always does. AMD also messed up pretty badly by allowing Maxwell to dominate the marketplace for so long, starting with the GTX 750/750 Ti as early as Feb 2014. Having said that, Maxwell hardly revolutionized the landscape. The only truly impressive card is the Titan X as far as performance goes, but its price is laughable really. Waiting for GM200 6GB (980 Ti) and R9 390 to shake up the GPU landscape. 🙂 If not, Pascal and 14nm next year we go! This entire tail-end 28nm generation sounds boring anyway.

  19. I was referring to people complaining that it has taken so long for AMD to replace the 290, when nVidia are working to the same timescale, just a couple of months ahead.
    I own a 290X Vapor-X and have yet to see it struggle with a game @ 1080p. What I don’t get is why everybody is so hung up on the reference cooler; I have never seen a reference 290 in the metal.

  20. I’m glad you chimed in to explain this very important detail. Our kids today really don’t look at this or care. However, us adult gamers, who love power gaming rigs and video cards, do in fact make informed choices when it comes to paying the electric bill. I used to not care either in the early 2000s, but today, as energy has become much more expensive, that $0.10 per kWh makes an enormous difference over a year or two. It can practically buy ANOTHER VIDEO CARD. No kidding, you are totally right. In fact, after doing my own Energy Star updates to the house, I would like to see the program (Energy Star) rolled out to target both AMD and Nvidia and make them compliant with an energy consumption specification. People don’t understand that gaming on these things is like running a microwave continuously at half power in most cases. That’s a lot of energy use when playing a game. And we’re also going to see the problem get worse with DirectX 12: DX12, as you already know, is highly efficient at utilizing all cores and GPU pipelines, thus increasing energy consumption by a large factor. I am an advocate at this point that we need to start pushing Energy Star programs onto Intel, Nvidia and AMD. I’m glad AMD is taking this into its own hands, but we need to do better than we already are right now. I’m sure you agree this can get out of hand, and a lot of gamers’ parents probably aren’t aware of the fine details of how a simple software API such as DX12 can cause increased energy costs on their computer. If they knew, you’d have parents who are green people all over these three companies.
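For anyone checking the arithmetic in comment 11, here is a minimal back-of-the-envelope sketch of that electricity-cost estimate. The wattages, hours per day and prices per kWh are the commenter’s assumed figures, not measurements:

```python
# Hypothetical check of the electricity-cost estimate in comment 11.
# All inputs (wattages, hours per day, prices) are the commenter's assumptions.

def annual_energy_kwh(power_watts: float, hours_per_day: float = 8.0) -> float:
    """Energy drawn per year (in kWh) at a constant load."""
    return power_watts * hours_per_day * 365 / 1000.0

def extra_cost(high_w: float, low_w: float, price_per_kwh: float, years: float = 2.0) -> float:
    """Additional electricity cost of the higher-power card over `years`."""
    delta_kwh = (annual_energy_kwh(high_w) - annual_energy_kwh(low_w)) * years
    return delta_kwh * price_per_kwh

if __name__ == "__main__":
    # 400 W vs 200 W card at 8 hours of full load per day:
    print(annual_energy_kwh(400) - annual_energy_kwh(200))   # 584.0 kWh per year
    # Two years at ~$0.30/kWh (the German price quoted in the comment):
    print(extra_cost(400, 200, price_per_kwh=0.30))          # ~350.4
```

Under those assumptions the two-year difference is about 1,168 kWh, or roughly $350 at German prices, which the comment rounds down to about $300.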