
Interview: AMD claims huge software advantage over Intel

Much of KitGuru's unique content focuses on the hardware that enthusiasts use to fulfil their dreams. But there's something to be said for the software too. In the same way that a pair of near-identical F1 cars can finish miles apart, depending on the ability of the driver and the nature of the test, computer hardware tests can be won and lost with drivers. KitGuru caught up with everyone's favourite Pink Floyd fan, Terry Makedon, to find out how the Fusion-class battle is being waged in the software trenches.

The idea of having everything you need integrated onto the mainboard is far from new. Indeed, in an industry full of ironies and misplaced research budgets, maybe it's not that surprising that Intel itself is credited with producing the first video graphics controller board when it launched the iSBX 275 in 1983. Two years later, Commodore's Amiga had a separate chip for graphics, which we guess makes it the first GPU. In 1987, IBM offered the idea of a separate graphics card that could be added to an existing system to improve its graphics.

Complete systems integrated into one stylish unit? Dedicated hardware for graphics? It will never catch on.

From these uncertain beginnings, things progressed relatively slowly for 16 years, until nVidia's marketing team landed the GeForce 256 in 1999 and the world became a much more competitive place.

More recently, Intel has been exploring ways to create massive arrays of x86-type processing units which deal with graphics in much the same way as any other data, while at the same time ATI/AMD has been working out new ways to bring multiple GPUs inside a single chip, next to multiple CPU cores.

Right now, Intel's challenger for the integrated arena of tomorrow is Sandy Bridge, while for AMD we have Accelerated Processing Units (APUs) like the A8-3850.

For AMD, these APU chips are Fusion-class processors. When you create an entire system based on an APU, you are experiencing the AMD VISION platform.

World's first shot of a second-generation Fusion chip – as brought to you by KitGuru and pepper pots, many eons ago

In the same way that a Formula 1 car's maximum speed is ‘zero’ until a driver steps into the cockpit, these new integrated solutions need their software and drivers in order to achieve maximum performance. This is the area in which AMD believes it has Intel beaten. On the right track and in the right conditions, either combination could become world champion in a specific set of tests.

On to the Q&A with AMD's well-known public personality Terry Makedon. KitGuru had some very specific questions and, for a change, we're going to break with tradition and publish the whole thing in its original order, so you can see exactly where Terry is going.

Traditionally, we don’t think of CPUs needing software drivers – what’s the difference between a driver for an AMD Fusion APU and the old-style GPU/chipset drivers?
Terry: Every piece of hardware in a PC requires drivers. It’s just that some are more visible or important than others. The driver for the AMD Fusion APU (Accelerated Processing Unit) is literally the same as the traditional AMD Catalyst driver for our AMD Radeon GPUs. Since an APU has full discrete-level GPU functionality in it, our GPU drivers need to support both discrete GPUs and the graphics found within an APU.

We’ve heard you on a recent conference call saying that, while AMD Fusion hardware is impressive, you have a significant advantage over Intel on the drivers/software side. Care to clarify/augment?
Terry: Since our APU drivers (now called AMD VISION Engine Software) are built on the foundation of our AMD Catalyst drivers, we continue to update our software every month. Releasing Microsoft WHQL-certified drivers every month is something we have been doing for many years now. We continually roll out free updates with performance improvements, bug fixes and even new features. Our pedigree in driver development is well documented and, as such, we feel it will continue to give us an advantage on the APU side as well.

AMD's Terry Makedon loves classic cars, fine music and travelling. Don't we all?

There has been a lot of press coverage recently about benchmarks. Specifically, Nigel Dessau seems to have been really vocal about the way older benchmarks do not consider the advantages of shared workload between CPU and GPU. What is a fair test for today’s AMD Fusion-class systems?
Terry: I am a fan of real-world usage scenarios. For me it means how the system will behave when it is doing real workloads. I have never used OCR, nor do I care how my system performs in it – so why would I use that as a benchmark? I believe the fairest test is a combination of what the majority of people do on their PCs – web browsing, HD video viewing and converting, gaming, USB photo transfer speeds, etc.

Once upon a time, a system powered by AMD CrossFire technology required a huge external cable. Later on, it became a neat internal bridge. Now the original vision for AMD Fusion – where multiple graphics and CPU cores co-exist in a single chip – has become a reality, so how has the driver effort been forced to evolve?
Terry: A lot of work was needed to re-architect the driver for what we call asymmetric AMD CrossFire technology. Traditionally, each GPU would render alternate frames, so GPU1 would do frames 1, 3, 5, 7, 9 and so on, and GPU2 would do frames 2, 4, 6, 8 and so on. Now, with our new Dual Graphics feature (which lets a GPU and an APU work together to render games), that limitation is gone – we can do asymmetric rendering, so for example GPU1 does frames 1, 2, 3, 5, 6, 7, 9, 10, 11 and so on, while APU1 does frames 4, 8, 12 and so on. This was the biggest change required in our driver effort to support this.
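To make that frame split concrete, here is a minimal sketch in Python of a fixed-weight frame scheduler of the kind Terry describes. The 3:1 ratio, the function name and the round-robin logic are our own illustrative assumptions – a real Dual Graphics driver balances the load dynamically according to how quickly each device finishes its frames.

```python
# Illustrative sketch only: a fixed-weight scheduler that reproduces the
# frame split from Terry's example. A real driver balances dynamically.

def assign_frames(num_frames, gpu_weight=3, apu_weight=1):
    """Split frame numbers between a discrete GPU and an APU.

    With the default 3:1 weighting, the GPU renders three frames in
    every cycle of four and the APU renders the fourth.
    """
    gpu_frames, apu_frames = [], []
    cycle = gpu_weight + apu_weight
    for frame in range(1, num_frames + 1):
        # The first gpu_weight slots of each cycle go to the faster
        # discrete GPU; the remaining slots go to the APU's graphics.
        if (frame - 1) % cycle < gpu_weight:
            gpu_frames.append(frame)
        else:
            apu_frames.append(frame)
    return gpu_frames, apu_frames

gpu, apu = assign_frames(12)
print("GPU:", gpu)  # GPU: [1, 2, 3, 5, 6, 7, 9, 10, 11]
print("APU:", apu)  # APU: [4, 8, 12]
```

Contrast this with classic alternate-frame rendering, which is simply the special case where both weights are equal.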

With continued hardware development and even more integration, do you see a time when driver-type processes are actually embedded in updatable microcode on the chips themselves?
Terry: Maybe, way out in the future, chips will be smart enough to auto-update themselves and correct their own bugs. In the meantime, due to the way operating systems work (i.e. they manage the memory, rather than letting the GPU driver manage it), drivers will remain a separate component that lets the hardware communicate with the OS.

There is a lot of new language around AMD Fusion-class products. What are the key terms that regular users MUST get their heads around if they are to make an intelligent decision in store?
Terry: We have attempted to simplify terminology to its bare minimum. The AMD VISION Engine is the main thing that enables the APU to do its great video/graphics/compute ‘stuff’. It is composed of AMD Radeon Cores, the AMD Video Accelerator and AMD VISION Engine Software. I am a fan of keeping things simple, so at its most fundamental level, if it has an AMD VISION Engine you are pretty well set.

From KitGuru's own testing, even a low-cost Fusion system with a Radeon 6670 card can pump through benchmarks in a very effective way. When you see this in action, you have to remind yourself that CrossFire is taking place between the graphics component within the APU and the add-in graphics card itself. When SLI first launched, a specific connector was required between your 6800 graphics cards in order to allow direct communication fast enough for this kind of technology to work. Now it all happens through standard system buses.

KitGuru says: No matter how hard AMD tries, it is really up against it in a direct fight with Intel. The Sandy Bridge architecture was stunning, Intel has its die shrinks ready to go in 2012 and the next architectural improvements are well underway. Against that, the world is a very graphical place, so AMD and nVidia have certain advantages over traditional CPU companies when it comes to moving and abusing pixels. Either way, 2012 is shaping up to be a much better year for customers on a budget who just want to buy a single machine that's pretty good at everything – without breaking the bank or ramping up a huge electricity bill.

We thank Terry for his openness and time – always a pleasure to chat with the Catalyst Maker.


5 comments

  1. Excellent. He is a fun guy. Miss him on the forums.

  2. If they had the hardware to compete with Intel they would dominate. But I have high hopes for Bulldozer to take on the 2500K and 2600K.

  3. Lovely surprise to see a picture of the Amiga there 🙂 Wonder what Amigas we’d be using if it was still alive?

  4. The idea of custom chips for graphics is much older than the Amiga. The Amiga was the third step in an evolution starting with the Atari 2600 gaming console. In 1979 the Atari 800 also had custom graphics (and sound) chips. All three were designed by teams around the late Jay Miner.

  5. Yes, first I had an Amiga 1000 when it first came out. I believe Jay Miner was the original creator of the custom chips in Commodore’s Amiga 1000. Great machine – loved Electronic Arts’ Deluxe Paint. It took IBM and Microsoft so many years to surpass it!!! Yes, if it had been number one there is no telling where we would be today.