
Is Intel about to start using Radeon tech for future iGPUs?

Earlier this year, Intel cut its workforce by around 11 percent, pushing some 12,000 people out of the company. As it turns out, a heck of a lot of graphics engineers may have been caught in the line of fire back in April, which has since spurred an apparent deal between Intel and AMD, with Intel said to be licensing Radeon GPU technology for future integrated graphics.

The story originated back in May, when HardOCP's Kyle Bennett reported that multiple sources had confirmed the Radeon Technologies Group was in talks with Intel about supplying graphics technology for future CPUs. With a significant number of graphics engineers pushed out of the door, Intel reportedly wanted to hand more of the heavy lifting over to AMD and RTG.


Things fell quiet on this front for a few months but just a few hours ago, Bennett returned to the HardOCP discussion thread on this topic to say: “The licensing deal between AMD and Intel is signed and done for putting AMD GPU tech into Intel's iGPU”. This was followed up with: “Intel in no way wants this to be public”.

Obviously, this is quite the bombshell if true, but it also gives us some potential insight into the Radeon Technologies Group and how it is being run. It almost sounds like RTG's leadership is distancing itself from AMD's core CPU team, putting itself in a better position to break away from AMD entirely and return to ‘the ATI days'. It has been speculated that these recent graphics deals, with Apple and now potentially Intel, could be building towards an acquisition or spin-off.

In preparation for this story, KitGuru spoke with AMD PR representative Joseph Cowell, who told us that he “can't comment on rumours”. Unfortunately, this appears to be the best statement we can get out of AMD right now, so nothing has been truly confirmed or denied at the time of publishing.


KitGuru Says: While a lot of this story is based on anonymous sources for the time being, it is certainly an interesting take on the direction things are heading at AMD and the Radeon Technologies Group. We will be keeping a close eye on this story for any future developments.



27 comments

  1. From an earlier Far East leak (https://www.ptt.cc/bbs/PC_Shopping/M.1479306781.A.82D.html), a rough translation of one paragraph: ‘Finally, on the GPU: AMD now talks about “AMD” and about “Radeon” separately, so the GPU part may become independent and split from the CPU business in future. In fact, the GPU side could have gone independent early on. Ah, I remember when it was ATI…'

  2. Well, this would be a good boost if true.

  3. For which brand? If Intel takes over RTG, this isn't about IGPs; it becomes Intel versus nVidia on the graphics side, and AMD will no longer play a graphics role. If Zen can't bring AMD back, they have nothing to fall back on. It's a suicidal gamble.

    If AMD kept RTG and allowed their IGPs into Intel APUs/SoCs, it would add sales for AMD thanks to Intel's vast IGP market, and improve support for AMD graphics if it's in all Intel processors with graphics. Plus, most consoles have AMD in them anyway, so the transition for ports isn't unique.

    In this manner, nVidia is in a bad way.

  4. So far, the only part of AMD tech involved would be their GPU software and firmware, not actual silicon copies of AMD APUs. That's a big difference from, say, grafting RX 460 cores onto an Intel CPU. Intel and AMD have MANY cross-licensing agreements like this. No big deal, and I doubt it will bring Intel integrated graphics up to AMD standards.
    Then again, this could indeed be another attempt by AMD to make needed cash from their extensive patent portfolio and genuinely bump Intel's IG up. Or it could be a rumour to put pressure on NVidia to up their graphics tech with Intel. Who knows?

  5. To be honest, Intel's GPU cores are actually more powerful than both nVidia's and AMD's.

    Even the first few generations of GT2 graphics with just 16 cores were managing to give playable framerates in games at low settings/low res. A GeForce 210 also has 16 cores, but can handle barely more than Windows 10 desktop acceleration (opening/closing the start menu chokes it up and makes it lag).

    The 48-core Iris models reach up towards the 320/384-core R7 240/250s. Not bad for something that comes free with a CPU and has less than 1/6th of the cores. Not to mention how long it's taken for GPUs to really reach 1500+ MHz, while the HD 4000 in my work PC's 3770K will run happily at 1700MHz and beyond.

    Put a stack of 1,000+ Intel GPU cores together and it would easily kick ass.

  6. I'm OK with this if RTG can be a completely separate company, go back to the ATI days and focus only on graphics tech… I can see RTG cards once again being a buyable option… same with AMD… if AMD goes back to only CPU tech and development, I can see AMD CPUs once again competing with Intel on all fronts…

  7. Nikolas Karampelas

    the number of cores means nothing; the only thing that matters is how they perform.
    Just adding cores doesn't mean you add performance. You may add performance, but you also add thermal and electrical load, and from some point on the scaling starts to degrade: 1 core = 1 point of performance becomes 1 core = 0.5 points of performance, and so on (random numbers, just as an example; see the sketch after this comment).

    Also, you don't just put 1,000+ cores on silicon and magic happens. You need to find a way to make them work together, with enough bandwidth in the interconnects to support them, which is easy to say but hard as hell. Until recently it was just about impossible to efficiently have more than a two-digit number of cores working together (I don't remember the exact number, but I read somewhere they had a breakthrough and can now link more cores).
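A minimal sketch of the diminishing-returns scaling described in the comment above, using Amdahl's law; the 10% serial fraction and the core counts are purely illustrative assumptions, not measured figures for any real GPU:

```python
# Amdahl's law: overall speedup is capped by the serial fraction
# of the workload, however many parallel cores are added.
def amdahl_speedup(cores: int, serial_fraction: float = 0.10) -> float:
    """Theoretical speedup of a workload spread over `cores` units."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for n in (16, 48, 384, 1024, 4096):
    s = amdahl_speedup(n)
    # The per-core contribution shrinks as the core count grows,
    # which is the "1 core = 0.5 points" effect described above.
    print(f"{n:5d} cores -> {s:6.2f}x speedup ({s / n:.3f}x per core)")
```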

  8. If you are going by that metric, the highest-end Mali cores from ARM are the most powerful.

  9. I don't think this is real; not once has it come up in an earnings conference call, and I would expect something of this magnitude to get at least a passing comment. The Apple thing has no weight: Apple avoided Intel for years and would most likely be using Raven Ridge APUs in their laptops. It also makes no sense for AMD to license RTG tech to Intel, since AMD's integrated graphics have always been a significant advantage of theirs, so why give a competitor something that would hurt the licensor? Then there is the idea of Intel acquiring RTG, which is just never going to happen: a significant amount of AMD's revenue comes from graphics and semi-custom SoCs, and the development of their processors is tied to the development of their graphics to begin with.

  10. If this happens… I always thought AMD wanted to do it to get a TB3 license for external graphics with AMD CPUs…


  12. No mystery here. This is a patent licensing agreement similar to the one that Intel has had with Nvidia since 2011.

    But of course, whereas Nvidia fanboys never claimed that “Nvidia and Intel signed a deal to put Nvidia GPU tech into Intel's iGPU”, the same cannot be said of AMD fanboys.

  13. Very good point

  14. Yeah, an AMD fanboy like Kyle Bennett!

  15. That's where AMD struggles: the Fury X with 4,096 cores is slower than the 980 Ti with 2,816. It comes down to how the front end of the chip spreads the work across the cores, and AMD just concentrating on adding more cores didn't give as good a result as it should have.

  16. Well, so much for it being fake. Charlie said that Kyle invented a fake rumour, and the Intel and AMD collaboration (really an Intel and Radeon Technologies Group collaboration) has now been confirmed:

    https://www.kitguru.net/components/graphic-cards/matthew-wilson/intel-and-amd-are-teaming-up-for-a-cpugpu-bundle/

  17. In which market does Intel compete with nVidia? They are in direct competition with AMD in the CPU/APU market; your argument is flawed.

    Really, it could be argued that AMD has realised that the only area where it is ahead of the competition is iGPUs. Yet by the nature of that market, they can only command the same share as they hold in the CPU/APU market. This deal gives them the opportunity to dominate the iGPU market: if their graphics are in both Intel and AMD CPUs/APUs, that is pretty much 100% of the market.

    This is more a question of the competition between Intel and AMD in the CPU/APU market. It has nothing to do with nVidia.

  18. Intel and Nvidia are major competitors in HPC. Both are trying to get their hardware into AI, autonomous cars, etc., and these are lucrative markets.

  19. Yeah, I see your point; there are developing markets in which nVidia may well compete with Intel. But neither is reliant on these markets; they are plans for the future (probably investing more than the profit realised) rather than needed sources of income.

    I was trying to make the point that this issue has nothing in the slightest to do with nVidia and doesn't affect them in any way. I find it fascinating that everything AMD or nVidia does turns into some sort of trolling session.

    AMD and Intel compete in the market that they both consider their main source of income. They are direct competitors in their biggest markets!

    nVidia could possibly end up being a bigger competitor to Intel, but right now they don't really compete with them at all.

    So yes, your argument is flawed!

  20. You might think it flawed, but I don't consider present-day AMD a competitor to Intel. If Zen is a success, though, things will change; at the moment, Intel will be more wary of Tesla than FirePro. Tesla competes with Xeon Phi quite well in HPC, and it has won contracts in AI and autonomous cars. Nvidia is already winning contracts in these developing markets and Intel is lagging behind.

  21. I do get what you are saying.

    I am pointing out that nVidia NEED the GPU market and Intel NEED the CPU market.

    That is what all their projections are based on; these are the markets they have to concentrate on. As we speak, they are bringing in billions of dollars every year for each of these businesses.

    AMD NEED to become more competitive in both of these markets or split their efforts. Or open up the iGPU market (think outside the box) like this article suggests.

    Yes, I fully accept that the world of IT is constantly moving forward, and maybe in the future nVidia and Intel could be competing head to head in a market each considers its most lucrative.

    But nVidia can't really get involved in the CPU market, as they aren't licensed to have their CPUs run x86 code. Intel have tried producing GPUs and gone quiet on that front (maybe even outsourcing to AMD, as this article says).

    The markets you mention are, I would say, insurance; in the world of IT a market can disappear overnight and you can't have all your eggs in one basket.

    Take the nVidia Shield line, for example. Had gaming in general gone over to streaming, the gaming GPU market would have shrunk dramatically; nVidia protected itself just in case that is the way things went.

    Both these companies will be constantly looking for markets and developments in which they feel they have an advantage. But the way I see it, they are not really in direct competition at the moment.

  22. Even though it's the first sentence of his comment, you don't get it: it's not about the number of cores, it's about performance. GPU architectures are way more diverse than CPU architectures.

    “AMD just concentrating on adding more cores didn't give as good a result as it should have”: and then you shoot yourself in the foot. GPU architectures are complex and different. Also, more raw power doesn't mean more performance when you compare different architectures; with GPUs there is a BIG reliance on the SOFTWARE side (APIs, libraries, etc.), whereas a CPU is the actual main processor of the machine: everything is built from the ground up for that specific architecture, and architectural variations are also required to follow a similar scheme with a set of characteristics that can't be altered.

    Also, you fail to realise something: AMD's solutions are not slower; they have WAY more raw power than Nvidia's. AMD's issue has always been drivers and software integration; that's why, back in the days of bitcoin mining, an AMD HD 7770 performed better than a GTX 680.

  23. Utter Bollocks.

  24. This is ‘UTTER FANTASY BOLLOX', just in case anybody missed the fact this RAVING FUCKING LUNATIC is jabbering on from the depths of a PADDED CELL SOMEWHERE AT THE BOTTOM OF ARKHAM ASYLUM'S CELLAR!
    Clueless, meandering guesswork at best.
    Pathetic.

  25. Yet ANOTHER diatribe of meaningless bullshit drivel.
    Shut up if you have nothing to say of consequence.
    A good indicator for you would be to write it down first and see if it makes any sense to you personally.
    Because you are jabbering on like a weasel on crack cocaine to everybody else, you MUPPET.

  26. ‘I'm o.k.'… More fucking DRIVEL from a self-important nobody piss artist, too drunk to complete a line without spelling mistakes.

  27. Yeah, I know it's about the arch, not just the core count; that's what I was saying. Their drivers and stuff are really not taking advantage of the core count in their GPUs.

    AMD kicks nVidia's ass in bitcoin mining because of poor GPGPU performance on nVidia's part; AMD built GPGPU into their arch, hence why it performs so much better.
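For context on that last exchange: the bitcoin mining workload being discussed is a brute-force double-SHA256 search, a pure integer/bitwise job of the kind GPGPU hardware runs. A minimal Python sketch of what each mining attempt computes; the zeroed block header and the deliberately easy difficulty target are illustrative placeholders, not real chain data:

```python
import hashlib

def mine(header: bytes, target: int, max_nonce: int = 1_000_000):
    """Search for a nonce whose double-SHA256 digest falls below `target`."""
    for nonce in range(max_nonce):
        # Each attempt is two SHA-256 passes over an 80-byte header:
        # pure integer/bitwise work, so raw ALU throughput dominates.
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(4, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
    return None

# A deliberately easy target (top 16 bits zero) so the search finishes quickly.
print(mine(b"\x00" * 76, 1 << 240))
```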