
How would ARM finance the purchase of nVidia?

One of the most persistent rumours of the past 5 days has been that nVidia might be in play on the market. The market for desktop graphics has not suddenly become significantly bigger over the past 3 months, yet nVidia's share price has been steadily inflating. Overall buoyancy in the market, or does someone know something? KitGuru investigates.

If you'd told industry folk that AMD was going to buy ATI in 2006, they would have laughed in your face. AMD merging with nVidia was always on the cards, and the subject of many board-level meetings behind closed doors. The ATI purchase came completely out of left field and surprised everyone.

We have just searched the interwibble for ‘ARM buys nVidia’ and there's nothing. There are 3 possibilities. First: It's not happening. Second: It's been rumoured, but no one gives it any credibility. Third: It's in the process of happening, but is such a well-guarded secret that the combined secret police of Santa Clara, USA and Cambridge, UK have been told to clamp down on any rumours, or even rumours of rumours.

As KitGuru already reported on 14th November, nVidia's revenues have been very flat for a number of years. While integrated graphics booms (and will boom even more with Sandy Bridge and Fusion), the dedicated graphics market is finite. With game developers now fixated on the 1080p console market, while a cartel (led by Apple and Dell) seemingly limits the production of 30″ panels with 2560×1600 resolution, the demand for high-end graphics grunt is not what it was in yesteryear. Improvements to SLI and CrossFire have also meant that if you really, positively, definitely need more frames per second, then simply adding another GTX 460 or Radeon HD 6850 will quench your GPU thirst.

The boom time is coming in mobile devices, in a broader range of household products, and in places we never imagined we would see graphics. You can't access those markets with a GPU alone. You need a CPU and you need an infrastructure (chipsets, protocols and what have you).

Born from the ashes of the UK's top computer company of the 1980s (Acorn), ARM makes its real money from developing cool stuff (intellectual property etc) and then licensing it to a massive ensemble of customers. Although everyone has heard of ARM, it doesn't seem to want to position itself as ‘hero’. That mantle is left to its customers. With more IP and more licensing deals, ARM would grow even more. Indeed, one look at ARM's financials is quite an eye-opener:

Strong Arm or Arm Strong - KitGuru is lost as to which pun fits best. But you get the idea.

ARM has its hand inside most of the 4 billion phones that are presently connected on the planet, as well as the Apple iPhone and iPad products. The connection with Apple is strong since, alongside Acorn and VLSI, Steve Jobs' outfit was responsible for getting the company off the ground.

OK, enough with the background. What would be the mechanics/challenges that such a takeover would entail?

How Much?
First question has to be “How much is nVidia worth?”. Strangely, the answer to that question would probably depend on the answer to question two: who runs it. ARM and nVidia have very different business models, but almost identical market caps of around $8 billion. When ARM was subject to buyout speculation earlier in the year, the figure quoted was $8 billion (although its share price was hovering around $10). If nVidia's board and major shareholders backed the deal, could it be bought for $10 billion? That depends on the share price. If it skyrockets, then the deal becomes much more difficult. A sharp enough spike and the deal could cost $15 billion or more. Where would they get that money from? Now THAT's a good question.
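To see how those figures hang together, here's a back-of-the-envelope sketch in Python. The market cap, premium and spike values are purely illustrative assumptions drawn from the speculation above, not reported deal terms:

```python
# Rough acquisition-cost arithmetic: all inputs are illustrative assumptions
# taken from the speculation above, not from any filing or reported terms.

def deal_cost(market_cap: float, premium: float) -> float:
    """Total price if a buyer pays the given premium over the market cap."""
    return market_cap * (1 + premium)

nvidia_cap = 8e9  # assumed ~$8bn market cap, as quoted above

# A fairly typical ~25% takeover premium lands near the $10bn figure:
print(f"${deal_cost(nvidia_cap, 0.25) / 1e9:.0f}bn")        # -> $10bn

# If speculation inflates the share price (and hence the cap) by ~50% first,
# the same premium pushes the deal toward the $15bn mark:
print(f"${deal_cost(nvidia_cap * 1.5, 0.25) / 1e9:.0f}bn")  # -> $15bn
```

In other words, the jump from $10 billion to $15 billion needs nothing more exotic than the share price running up before an offer lands, which is exactly why a sharp spike would make the deal so much harder.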

Who Runs It?
It's little secret that the biggest stumbling block to other company takeovers involving nVidia has been Jen-Hsun Huang's insistence that he be the main man after the companies come together. If the role of CEO were on offer, then maybe the deal could be pushed through at the lowest possible price.

When?
Tough one. Probably the best indicator would be share price. nVidia's has been rising steadily since August. That could be (a) down to the launch of the GTX 460, or (b) a sign that some people believe the company is ripe for a takeover and wanted to make sure they had a position. If we see a sharp spike in the share price over the next 8 weeks, then that would be a very powerful indicator.

Post Merger
If it happened, then the world would become a very strange place for both companies for at least a year. With around 2,000 people working for ARM and almost 6,000 working for nVidia, there would be a substantial number of lay-offs. All those years spent implementing SAP etc would suddenly pay off, as you could dump a load of accounts, human resources and other paper-pushers by standardising onto one central admin core. Do you need two sets of sales people and two sets of support/IT/admin? Same for buildings. You might not lose 1,500 jobs within 3 years, but it could be something like that. Whether the heads that roll are green or blue would depend a lot on who gets the top jobs. Wonder how many of ARM's current senior team will still be there in 24 months' time? We're prepared to go out on a limb and say that, judging from his LinkedIn contacts with nVidia folk, Antonio Viana's position would be rosier than most. But this is all just speculation, for now.

Winning in a new, faster-moving & greener regime could come down to who has the better contacts

KitGuru says: Maybe we're wrong, but it does not seem possible – in the long term – for a graphics-only business to survive against Intel and AMD. The market is simply not big enough. ARM buying nVidia would really put the cat among the pigeons on a global scale and make licensing out nVidia's technology much easier. Watch this space!

Comments below or in the KitGuru forums.



7 comments

  1. Interesting indeed – having in mind the “battle” between Atom and ARM processors in notepads/netbooks/tablets (and the rest of the small personal devices – PDA term, anyone? 😉)…
    Put Android OS in the play and you have a nice Intel/M$ contender… 😀

  2. Mergers are natural for business. It's gonna be an oligopoly for processors, not a duopoly.

  3. “If you’d told industry folk that AMD was going to buy ATI in 2006, they would have laughed in your face”

    http://www.theinquirer.net/inquirer/news/1024447/i-convinced-amd-ati

    I didn’t think it was all that funny, but people did laugh. I was right though. 🙂

    -Charlie

  4. No idea if you’re the real Minneapolitan deal… but ‘welcome’ if you are 🙂

  5. This story only analyses the mechanics of how a takeover would occur, and not actually why. It also doesn’t relate how the rumour came about.

    Well, the story is completely absurd. ARM operates independently of any of its customers, and NVIDIA is one of those customers. NVIDIA can quite happily sell Tegra into the CE markets, and seem to have made the connections to proliferate into a wide variety of companies and devices with Tegra2. ARM is NOT – in any way – built to sell into the CE market via OEMs and ODMs, since ARM does not – and will not – get into selling SoCs. ARM’s primary business is selling to semiconductor companies – fabless and “fabbed” alike – of any description, including Intel (who pay ARM lots of money in royalties via ARM11 and continuing XScale sales – I kid you not!)

    ARM has its own graphics outfit – they’ve spent millions penetrating the mobile graphics market, and will continue to do so – and has no need to purchase NVIDIA on that front.

    This article simply doesn’t live in the real world, and if ARM makes any acquisitions, it will be more intelligent than the one proposed here!

  6. Actually, many takeovers don't make sense, and if anyone tells me Intel spending $8 billion on McAfee was a great move, they need slapped. AMD buying ATI was seen at the time as a rash, strange decision, but for a CPU developer to exist and grow now, they need GPU elements. The APU is the future, and neither Intel nor Nvidia is in a strong position. Larrabee is a disaster and Nvidia have no CPU expertise. While it may very well not happen, it would be a very good move for both Nvidia and ARM, especially for mobile supremacy.

  7. I’d agree that Intel wasted $8bn on McAfee trying to prop up the PC software market.
    AMD/ATI’s merger, however, was a meeting of minds that did make sense, even if it didn’t seem so at the time (by virtue of it being the first of its kind). The reason it made sense was that each company sold chip products direct to the consumer. ARM is not in that marketplace, so the melding of ARM and NVIDIA does not make any sense.
    NVIDIA does have a CPU architecture – ARM – and they are putting it to good use. They will no doubt continue to work with the ARM architecture in future, and would be quite comfortable rolling out awesome products not only for the mobile world but for the desktop too. NVIDIA don’t need Intel to survive in the post-PC era, and they have actually been quite prescient in developing Tegra – they saw the future even if Intel didn’t.
    This concept of an “APU” is a fallacy – the mobile world has long been the leader in silicon integration – especially of CPUs and GPUs – and will continue to be so. The sheer variety of SoCs available means you can have whatever mixture of CPU/GPU you please – I wonder how limited the choice will be with these APUs?
    The PC hasn’t changed much in 18 years, while the rest of the world has moved on, and its pathetic attempts at catching up amount to integrating just a couple of PC components together. NVIDIA has got it right, and already Tegra2 is a superior integration of functionality to any APU, and I dare say the devices into which it will be integrated are vastly more useful on a day-to-day basis than any desk-bound PC.

    I pity the old order of computing, but at least NVIDIA seems to have seen it coming.