
Apple closes in on own GPU design

Once upon a time, Apple was in the habit of developing an armada of hardware itself. Jobs and Co. loved the idea of being a full-blown design house where each and every component could be unique, stylised and better than everyone else's bits. Apple's drive to mass-market status saw that strategy dumped. KitGuru ponders whether the future will look a lot like the past.

While a lot is made of the fact that Apple is wonderfully unique, when Steve Jobs returned to his baby's helm at the end of 1996 he had two very strong directional ideas in mind.

First, he took a wad of cash from Microsoft to allow the Bill Gates outfit to create a version of Windows which had a strong 'look and feel' of an Apple environment.

Second, he welcomed Intel processors with open arms [sad pun intended] – at least into Apple's desktop systems.

The fact that Microsoft leapt at the chance to copy Apple's interface shows, in retrospect, the kind of illogical glee you might expect from chickens being given the best corn available around the 20th of December. By making Windows more and more like Apple's OS, Microsoft made it easier and easier for PC users to migrate to Apple. One wonders: if Steve Ballmer ever invents a time machine, would he go back and develop Windows as the 'anti-Apple' interface?

Anyway, back on to graphics.

Word reaching KitGuru's Echelon tap says that Apple has now been through many iterations of a next-generation graphics chip design and is into the final straight: finalising the design and beginning to think about when production might take place.

Those in the know might remember Raja 'Buttery Nipple' Koduri and Bob 'Quiet Man' Drebin joining Apple a couple of years back. These guys know a thing or three about creating new graphics architectures from scratch and would provide the ideal core [sad pun, again, intended] around which to build a full-blown development team.

Koduri and Drebin hand it to each other. Credit for Apple's new graphics core, that is.

Why would Apple want its own architecture?

Well, as portability becomes more and more widespread, the demand for high-end CPUs drops off considerably. Creating a new CPU from scratch is a mammoth engineering effort, and all you will end up with is a me-too product that has to fight tooth and nail with Intel, AMD, ARM and the rest.

Graphics allows you to innovate in new and unusual ways – plus you can easily offload production to places like TSMC.

While nVidia and AMD are bemoaning Microsoft's total indifference to developing DirectX beyond its present '11' incarnation, open graphics platforms allow for a lot more flexibility – IF you can be certain what kind of hardware you are targeting. With Apple, there is no choice of hardware, so it will be a very fixed, stable environment – perfect for introducing a bunch of new and unusual features and methodologies in a GPU.

A GPU that can be radically different from everything else in the market, because of Apple's 1984-style control of its environment.

All your graphics chips are belong to me

KitGuru says: Would Apple use its new 'GPU' simply for PCs? Not likely. With generations of new iPads to follow and Apple's eyes all over cinema and home consoles, you can bet that any new graphics chip worth its salt will find itself used far and wide across the Apple range.

Comments below or in the KitGuru forums.



2 comments

  1. So Apple might become huge in graphics without ever building a single graphics card to sell elsewhere? It's like India being able to produce more rice than anyone else on the planet, and then eating it all themselves 🙂

  2. If anyone can do it, they can. Let's be honest, they already do it with the CPU in a partnership with IBM. A GPU like Intel HD wouldn't be hard.

    Something like the HD 6990 or GTX 580 is a different story.