
Asus GTX580 Direct CU II Review – the ultimate 580?

Futuremark released 3DMark Vantage on April 28, 2008. It is a benchmark based on DirectX 10 and therefore runs only under Windows Vista (Service Pack 1 is stated as a requirement) and Windows 7. This is the first edition in which the feature-restricted, free-of-charge version cannot be used an unlimited number of times. Testing was carried out at 1280×1024 with the Performance preset.

The GTX580 from Asus walks away with the Performance preset test in 3DMark Vantage, scoring 25,660 points – 4,000 points more than the reference GTX570.



32 comments

  1. Lol Asus are insane. love it, even if I could never afford it

  2. Some of the best reviews on the net here, without a doubt. awesome work.

  3. Man, that got me hard.

  4. Well thank god it's not another AMD review! Great job on the testing. That card is amazing looking and in performance. Can't see it anywhere for sale. Not that I can afford it mind you, but I'm curious.

  5. Pinnacle of power

    Read it and weep AMD ! these cards are kicking ass all over the world 🙂

  6. I would sell my left kidney for this card, what an amazing bit of work from ASUS. well showcased on KG, great review.

  7. I love the metal coolers, this is way out of my league, but i love the metal cooler on the msi card you reviewed today too. so much nicer looking than plastic crap. (HIS blue cooler springs to mind!)

  8. Something interesting has happened to me since November. I generally opted out of AMD hardware, not because it's poor or bad, but because Nvidia have always been better.

    Reading all the reviews here lately and especially the FLeX 6870 review has really made me want to get into 3 or even 4 screen setups. I know a stunning card like this ASUS board will do it, if you end up with two of them, but the cards alone would cost one thousand quid. Two HD6870’s and 3 24 inch screens would cost the same as two of these cards.

    It's horses for courses and I'm still not sold on AMD drivers, but it's got me thinking. At least I have till the summer before I work out what to do. (Getting married soon!)

  9. Absolutely love it. its Asus insanity again. engineering overload 🙂 great job.

  10. Nice to see you reviewing some Nvidia hardware again. Always enjoyed reading your reviews of the new hardware. This is certainly one hell of a sexy card, but the price is a little too rich for my blood. Still, great to read and a good highlight on ASUS, as it's so often Sapphire!

  11. good lord, what a monster of a card. My 5670 is shaking.

  12. Imagine four of these in SLI, how good would that be !

  13. Lovely bit of work from Asus there. Totally overkill, but we all love that, right?

  14. I still think these cards are overpriced, not just the Asus one, but the 580s in general.

  15. The backplate is mounted with screws in a reverse position, that’s unusual.

  16. Nice cooler design, wonder how it would compare against the arctic cooling extreme.

  17. I see a problem, it’s really too powerful for a single screen, but to use it across 3 you need two of them.

  18. I will be ordering this when I can find one.

  19. We all love the attention to detail, but I would love to know how many of these really do sell, a few thousand worldwide maybe? How many did the ares sell?

  20. Awesome! I love Asus designs

  21. I'd still rather have an HD6970

  22. I like my games but I see no real need for this power. I've still got a 5870 and at 1920×1200 it's brilliant. I'd rather have a new CPU and a PSU 🙂

  23. Why no direct comparison with a standard GTX580 ?

  24. Price, link, availability?

  25. @ Madone.

    Why would you compare results against a reference GTX580? the clock speeds are basically the same.

  26. It says in the article: only SLI for 3-screen gaming, so you would need two of them.

  27. One question … how did you do the stability testing on your overclocking? I have one and … is that 973MHz core and 4700MHz memory FurMark stable? I think that 1.125v is not enough for those clocks … at least, not to pass FurMark 🙂 I'm really curious how you did the stability testing 🙂

  28. Bear in mind every card will be slightly different. How far can you push yours? I generally test with Unigine for artifacting; it's a great stress test and it's easy to see when issues occur.

  29. I only test with FurMark … and mine is topping out at 961MHz core, 4335MHz memory … at 1.15v …

    That's why I'm kind of doubting the 1.125v will stand up to FurMark … but that's me 🙂
    Running 3DMark 2010 it will do 976MHz without crashing, and 4450MHz on the memory … but that's not 100% stable, because in the FurMark benchmark … once it gets to 98% the fans stop and the graphics driver stops responding …

    But yes, I know not all cards are equal 🙂 I was just thinking about the testing method … because that 4700MHz memory clock is huge … it's 350MHz more than mine … not even talking about the core because it's "not that high" …

  30. Interesting, I've had two and I get similar results with both cards. Is your memory artifacting in games, or is it just FurMark stress testing?

  31. Nope … it just crashes the graphics driver … no artifacts whatsoever. I can run it at 961MHz and 4315MHz with zero problems … more than that … caboom 😛

    It simply quits the application … sometimes I get a reboot, other times it just crashes Explorer and the graphics driver 😡

  32. FurMark has its own issues, as stated over at Guru3D; it's been shunned!

    “Note: As of lately, there has been a lot of discussion using FurMark as stress test to measure power load. Furmark is so malicious on the GPU that it does not represent an objective power draw compared to really hefty gaming. If we take a very-harsh-on-the-GPU gaming title, then measure power consumption and then compare the very same with Furmark, the power consumption can be 50 to 100W higher on a high-end graphics card solely because of FurMark.

    After long deliberation we decided to move away from FurMark and are now using a game like application which stresses the GPU 100% yet is much more representable of power consumption and heat levels coming from the GPU. We however are not disclosing what application that is as we do not want AMD/NVIDIA to ‘optimize & monitor’ our stress test whatsoever, for our objective reasons of course”.

    Original link: http://www.guru3d.com/article/liquid-cooling-overclocking-geforce-gtx-580-danger-den/8

    Worth a note I thought.