Will GTX570 be caught in new app detection cheat?

Over the years, there have been many famous ‘app detection' revelations. Probably the most famous was revealed by Beyond3D, when genius uber-geeks discovered that changing the name of the 3DMark 2003 executable file made a huge difference to the performance of nVidia cards. Looking at recent documents posted to the KitGuru forum, we have another cause for investigation on our hands. If it's a lie, then it is a very clever one, and we will work hard to find out who perpetrated it. If it's the truth, then it's a very worrying development. KitGuru powers up a robotic Watson and takes it for a walk on the Image Quality moors to see if we can uncover any truth implicating a modern-day Moriarty (of either persuasion).

Quick backgrounder on image quality, so we all know what we're talking about.

Journalists benchmark graphics cards so you, the public, can make a buying decision. So far, so simple.

The first rule is to use the right tests. There is no absolute right and wrong, but if you remember how far Bob Beamon jumped at the Mexico City Olympics in 1968, you will recall that his record lasted for 23 years, helped by the fact that it was set at high altitude. That's a case where someone ended up with an advantage, without any malice whatsoever. All the jumpers were at the same altitude, and the jump was amazing anyway.

When Ben Johnson took gold for Canada at the 1988 Olympics in Seoul, he ran straight past favourite Carl Lewis and appeared to take the world record for 100 metres with a stunning 9.79 seconds. He had used steroids. No one else was on drugs. The gold medal and world record were stripped.

So, two key things to look for with benchmarking are:

  1. Is the test itself fair for everyone?
    For example, does the test lean on calls/functions that only one card handles well – and which are not found in the majority of games that a consumer is likely to play? Every year, 3DMark is late because one GPU vendor or the other is late, and Futuremark doesn't want to go live unless both sides have had a fair shout at delivering next-gen hardware.
  2. Is everyone doing the same amount of work?
    Are short-cuts being taken which mean that the test will automatically score higher on one card than another? If an improvement works in all games/apps, then that is obviously a benefit. However, detecting an application and finding a way to increase your scores by dropping image quality (in such a way that gamers can see it) is downright sneaky. You could buy a card thinking it's faster than it really is, because its driver has been tweaked for a benchmark. A quick sanity check for this is sketched below.
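
As a rough illustration of the sanity check we mean, here is a minimal sketch (the file names and frame rates are made-up placeholders) that compares average FPS between a run under the original executable name and a run under a renamed copy of the same program.

```python
# Minimal sketch: compare benchmark results between the original executable
# name and a renamed copy. A large gap hints at app-specific driver behaviour.
# File names and frame rates below are hypothetical placeholders.

def percentage_gap(original_fps: float, renamed_fps: float) -> float:
    """How much faster the original-name run is, in percent."""
    return (original_fps - renamed_fps) / renamed_fps * 100.0

# Average FPS over three runs of each configuration (made-up numbers).
hawx_fps = sum([82.1, 81.7, 82.4]) / 3    # executable left as HAWX.exe
hacks_fps = sum([75.9, 76.3, 75.8]) / 3   # same program renamed to HACKS.exe

gap = percentage_gap(hawx_fps, hacks_fps)
print(f"Original-name run is {gap:.1f}% faster than the renamed run")

# Anything well beyond run-to-run noise deserves an image quality comparison.
if gap > 3.0:
    print("Worth comparing screenshots for image quality differences.")
```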

The arguments around ‘1' and ‘2' focus on things like ‘If nVidia has more tessellation capability than AMD, then what level of tessellation should be tested?' and ‘If you want the result of a lighting calculation, where one side stores all possible outcomes in a table and looks them up, while the other GPU calculates the value as it goes – then which is correct?'. The second came up a few years back in Quake, where it became clear that nVidia's look-up approach was faster than its calculation ability – so both companies took different paths to the same result. The tessellation argument will probably be settled by Summer 2011, when a lot more DX11 games are out and using that feature. We'll then know what constitutes a ‘reasonable load' in a game.

Anti-Aliasing (AA)
This weird and wonderful technique uses calculated blurring to make images seem smoother. Totally counter-intuitive, but it works a treat. Why counter-intuitive? Because, in the real world of limitless resolution/detail, when we want to draw the best line possible, we normally try to avoid blurring at all.
Proper AA can also use a ton of memory. At the simplest level, instead of looking at one dot and asking ‘is it black or is it white?', you sample several dots in a region and set levels of grey. Up close, an individual pixel can look messy, but the overall effect is that the lines themselves appear smoother. Which is a good thing. There are different ways to do the sampling, and there are also different levels. For most modern gaming benchmarks, people tend to set good quality AA to 4x.
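
For the code-minded, here is a tiny toy sketch of that ‘sample dots in a region and set levels of grey' idea: an ideal black/white edge is supersampled at 16 points per pixel and averaged down to grey levels. It is purely illustrative and nothing like real MSAA hardware.

```python
# Toy supersampling AA: render a diagonal black/white edge at 4x4 sub-samples
# per pixel, then average the sub-samples into a single grey level per pixel.
import numpy as np

WIDTH, HEIGHT, SUBS = 16, 16, 4   # tiny image, 4x4 = 16 sub-samples per pixel

def coverage(x: float, y: float) -> float:
    """Ideal scene: white above the line y = 0.6 * x, black below."""
    return 1.0 if y > 0.6 * x else 0.0

aliased = np.zeros((HEIGHT, WIDTH))
antialiased = np.zeros((HEIGHT, WIDTH))

for py in range(HEIGHT):
    for px in range(WIDTH):
        # One sample at the pixel centre: a hard black-or-white decision.
        aliased[py, px] = coverage(px + 0.5, py + 0.5)
        # Many sub-samples averaged: intermediate greys along the edge.
        samples = [
            coverage(px + (i + 0.5) / SUBS, py + (j + 0.5) / SUBS)
            for i in range(SUBS) for j in range(SUBS)
        ]
        antialiased[py, px] = sum(samples) / len(samples)

# The antialiased edge contains fractional greys; the aliased one is all 0/1.
print("distinct grey levels, aliased:     ", np.unique(aliased).size)
print("distinct grey levels, antialiased: ", np.unique(antialiased).size)
```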

Telling the difference
It's harder to see in a still image, but when a game is running, poorer quality AA will make it seem that the edges of things are ‘crawling'. It can have a negative effect on your ability to play. For example, if you're running past trees, looking for enemies to shoot, your eyes are really sensitive to small movements. When a branch twitches, is it an enemy preparing to shoot, or is your graphics card failing to deliver decent AA?
In a still, it is easiest to tell when you create a simple animation of one image on top of the other and just swap between them (see below).
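
Building that kind of flip animation takes only a few lines. Here is a minimal sketch using the Pillow imaging library, assuming you have the two screenshots saved as hawx.png and hacks.png (hypothetical names).

```python
# Sketch: build a two-frame flip animation from the screenshots so any AA
# difference shows up as 'crawling' edges. File names are hypothetical.
from PIL import Image  # Pillow

frame_a = Image.open("hawx.png").convert("RGB")
frame_b = Image.open("hacks.png").convert("RGB")

# Save a GIF that swaps between the two shots every 800 ms, looping forever.
frame_a.save(
    "aa_compare.gif",
    save_all=True,
    append_images=[frame_b],
    duration=800,
    loop=0,
)
print("Wrote aa_compare.gif - let it cycle a few times and watch the edges.")
```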

‘Evidence' posted on the KitGuru forum
High end cards are often tested using 30″ panels with a 2560×1600 resolution. We have had a lot of ‘documents' posted recently that appear to show a difference in AA quality on second-generation nVidia cards like the GTX570 and GTX580. Lowering the image quality when running AA can give a card a boost. The material we've seen seems to show a boost of 8%. That's significant. But is it real? We're going to go deep into the validity of these claims, and we invite nVidia to confirm categorically that press drivers for the GTX570 (launching next week) will NOT include a lowering of AA quality to get a boost in games like HAWX.

Following the launch of the awesome GTX460, nVidia's Fermi series really seems to have come into its own and it would be a real shame if something strange was being done in the driver. We're firmly camped in the ‘let us hope not' area. Plus, we're happy to test this exhaustively and report the truth.

Is there a difference and is the difference real?
OK, so enough preamble, what is it we're really talking about? Below is a simple animated GIF that (according to the posts we have received) seems to show that anti-aliasing image quality (IQ) drops off for the GTX570/580 if the driver detects that HAWX (commonly used for benchmarking) is running. Looking along the plane's edge, one of the images definitely appears to show more detail. How was this achieved? By renaming the application, we're being told. It could all be an elaborate ruse, but if benchmarks are being detected so that image quality can be dropped and benchmark scores raised, then that's pretty serious stuff. The animated GIF will take a few seconds to load. We recommend that you let it roll through a few times and you'll see that more detailed sampling seems to be done when the same program is called HACKS rather than HAWX.

If these shots tell the whole truth and you had to write this ‘logic' into a sentence (that anyone could understand), it would say “If you're being asked to run a game that's commonly used as a benchmark, then do less work”. Have a look and tell us if you can see less sampling when the drivers detect HAWX rather than HACKS.

Better quality sampling seems to be happening when it's NOT the standard HAWX benchmark program, but is this real? We're inviting opinion from nVidia and will do some of our own testing to check what's been posted.
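
As a first pass at that testing, a simple per-pixel difference between the two screenshots helps confirm the effect is concentrated along geometry edges rather than being compression noise. A rough sketch follows, again with hypothetical file names and assuming both shots share the same resolution.

```python
# Sketch: quantify where the two screenshots differ, to back up what the eye
# sees in the flip animation. File names are hypothetical placeholders and
# both images are assumed to have the same resolution.
import numpy as np
from PIL import Image

hawx = np.asarray(Image.open("hawx.png").convert("L"), dtype=np.int16)
hacks = np.asarray(Image.open("hacks.png").convert("L"), dtype=np.int16)

diff = np.abs(hawx - hacks)      # per-pixel brightness difference
changed = diff > 8               # ignore tiny compression/encoding noise

print(f"Pixels that differ noticeably: {changed.mean() * 100:.2f}%")
print(f"Largest single-pixel difference: {diff.max()} / 255")

# Differences clustered along geometry edges (like the plane's wing) point at
# a change in AA sampling rather than random noise spread across the frame.
```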

KitGuru says: This is a very dramatic story. It is something that we have not seen for quite some time. If it's a hoax, then we're going to make sure everyone knows that this is a forgery. If it's real, then it raises serious questions. Either way, we will know shortly. In the meantime, we invite Nick Stam or someone from the GeForce driver team to contact us with an explanation and we will make updates/evolve the story accordingly. Let's all search for the truth. We've heard that it's out there.

TRUTH UPDATE: Please click here for a reply from nVidia on this subject

Comment below or in THIS existing thread on the KitGuru forum.



10 comments

  1. When will real proof be available? The Inquirer has a story which claims AMD have dropped quality. This all seems very bizarre.

  2. What seems bizarre? AMD would drop quality? Nvidia? The only real proof you need is to try it yourself. If you have the hardware. I personally can’t be bothered.

  3. Where is the AMD comparison? Do they not know what they are writing, or is this AMD propaganda?

  4. Did someone open the asylum doors earlier? It’s like a freak show of drunken posters tonight.

  5. “KitGuru says: This is a very dramatic story. It is something that we have not seen for quite some time.”

    This made me laugh. How about AMD/ATI lowering IQ from HD4xxx to HD5xxx to HD6xxx? Messing with AF quality, causing texture shimmering, banding and, last but not least, using the FP16 demotion trick… and not only in one game.

    http://blogs.nvidia.com/ntersect/2010/11/testing-nvidia-vs-amd-image-quality.html

  6. Hi Everybody,

    What is being experienced is not an “Antialiasing cheat” but rather a HawX bug that is fixed by our driver using an application specific profile.

    In a nutshell, the HawX application requests the highest possible AA “sample quality” at a particular AA level from our driver. Without our driver fix, the game would be running 16xCSAA instead of standard 4xAA when you select 4xAA in-game. It runs the proper 4xAA with the driver fix. You defeat the fix by changing the .exe name, causing it to run at 16xCSAA.

    You may remember that Geforce 8800 introduced Coverage Sampling AA (CSAA) technology, which added higher quality AA using little extra storage. Prior to 8800 GTX and CSAA, there was only one “sample quality level” for each AA level, so if an application requested four AA samples, the hardware performed standard 4xAA. However, with 8800 GTX GPUs onwards, our drivers expose additional sample quality levels for various standard AA levels which correspond to our CSAA modes at a given standard AA level.

    The “sample quality level” feature was the outcome of discussions with Microsoft and game developers. It allowed CSAA to be exposed in the current DirectX framework without major changes. Game developers would be able to take advantage of CSAA with minor tweaks in their code.

    Unfortunately, HawX requests the highest quality level for 4xAA, but does not give the user the explicit ability to set CSAA levels in their control panel. Without the driver profile fix, 16xCSAA is applied instead of standard 4xAA. Recall that 16xCSAA uses 4 color/Z samples like 4xAA, but also adds 12 coverage samples. (You can read more about CSAA in our GeForce 8800 Tech Briefs on our Website).

    When you rename the HawX.exe to something else, the driver profile bits are ignored, and 16xCSAA is used. Thus the modest performance slowdown and higher quality AA as shown in the images.

    To use “standard” 4xAA in a renamed HawX executable, you should select any level of anti-aliasing in the game, then go into the NVIDIA Control Panel and set 4xAA for “Antialiasing Setting” and turn on “Enhance the application setting” for the “Antialiasing mode”.

    Nick Stam, NVIDIA
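
For readers who want to picture the behaviour Nick describes, a rough conceptual sketch follows: the game requests 4 colour samples at the highest available ‘sample quality level', the driver maps that request to an AA mode, and a per-application profile overrides the answer for HawX.exe. This is purely illustrative, not NVIDIA driver code; the mode table and quality-level numbers are made up to mirror his explanation.

```python
# Conceptual sketch of the behaviour described above: the game asks for
# 4 colour samples at the highest available "quality level", and the driver
# maps that to an AA mode. A per-application profile can override the result.
# Purely illustrative - not NVIDIA driver code; names and values are made up.

# (colour samples, quality level) -> AA mode actually run (illustrative table)
AA_MODES = {
    (4, 0): "4xMSAA",     # plain 4xAA: 4 colour/Z samples, no coverage samples
    (4, 1): "8xCSAA",     # 4 colour/Z samples + extra coverage samples
    (4, 2): "16xCSAA",    # 4 colour/Z samples + 12 coverage samples
}

# App profile: for HawX.exe, treat a request for the highest quality level
# at 4 samples as plain 4xAA (the "fix" described in the comment above).
APP_PROFILES = {"hawx.exe": {(4, 2): "4xMSAA"}}

def resolve_aa(exe_name: str, samples: int, quality: int) -> str:
    """Return the AA mode the 'driver' would run for this request."""
    profile = APP_PROFILES.get(exe_name.lower(), {})
    return profile.get((samples, quality), AA_MODES[(samples, quality)])

# The game always requests the highest quality level for 4 samples:
print(resolve_aa("HawX.exe", 4, 2))    # -> 4xMSAA  (profile applies)
print(resolve_aa("HACKS.exe", 4, 2))   # -> 16xCSAA (profile bypassed by rename)
```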

  7. Thanks for your reply Nick.
    We have created a new story for you – to clear this up.
    In addition to your reply, we have also explained the context and linked both stories.
    On the plus side, it’s nice for your driver team to know that your 16xCSAA is noticeably better than 4xAA 🙂

  8. AA is not a type of calculated blurring! At least not multisampling nor supersampling nor other multiple sample systems. Neither are information construction methods like MLAA that guess at the results of higher sampling. Only edge-blurring AA methods are blurs, and it’s obvious that, although these can reduce aliasing and so broadly come under the umbrella of antialiasing strategies, they are a different branch to the proper AA methods.

    Blurring adds no information to a scene, only reduces high frequency changes by spreading the information around, reducing contrast and detail. Antialiasing reduces contrast by averaging a set of samples from within the space (pixel), so the pixel value is a better representation of the information within that space. MLAA attempts to predict/fake what averaging a set of samples would look like without actually sampling.

    The key concept here is information density, or sampling rate. How many bits of info about surfaces and materials falling within a pixel’s boundaries are you using to construct your pixel data? If you are increasing information within a pixel, you’ll get antialiasing. If you average information across multiple pixels, you’ll get blurring.
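
To make that distinction concrete, here is a toy sketch contrasting the two approaches: averaging extra samples taken inside each pixel (antialiasing) versus averaging finished pixels with their neighbours (blurring). Purely illustrative.

```python
# Toy contrast: averaging sub-pixel samples (antialiasing) versus averaging
# neighbouring pixels (blurring). Reuses the ideal black/white edge idea
# from the sketch earlier in the article. Purely illustrative.
import numpy as np

W, H, SUBS = 16, 16, 4

def coverage(x: float, y: float) -> float:
    """Ideal scene: white above the line y = 0.6 * x, black below."""
    return 1.0 if y > 0.6 * x else 0.0

# Antialiased: each pixel averages 4x4 samples taken inside its own footprint,
# so the grey level reflects genuine sub-pixel information.
aa = np.array([[np.mean([coverage(px + (i + 0.5) / SUBS, py + (j + 0.5) / SUBS)
                         for i in range(SUBS) for j in range(SUBS)])
                for px in range(W)] for py in range(H)])

# Blurred: take one hard sample per pixel, then average each pixel with its
# neighbours afterwards - no new information about the edge is added.
hard = np.array([[coverage(px + 0.5, py + 0.5) for px in range(W)] for py in range(H)])
blur = np.zeros_like(hard)
for y in range(H):
    for x in range(W):
        blur[y, x] = hard[max(0, y - 1):y + 2, max(0, x - 1):x + 2].mean()

# Both images contain greys, but they are not the same image.
print("mean absolute difference between the two results:",
      round(float(np.abs(aa - blur).mean()), 3))
```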

  9. So does the max (16x CSAA) AA only occur when nVidia cards are used, and the selected amount (4x) when AMD cards are used? If so, then it’s fine to “adjust” the IQ. If it is the same for both cards though, then nVidia has found an exploit that they can explain away.

  10. Yet in his blog posted on nvidia.com, Nick Stam states that Nvidia is “ensuring the user experiences the image quality intended by the game developer” and “NVIDIA will not hide optimizations that trade off image quality to obtain faster frame rates.” The HAWX example and response directly contradict these statements.

    It is very convenient for Nick to state that there is a “bug” in HAWX, implying that the game developer did not intend to request the highest possible AA quality. Nvidia has been encouraging developers to use these AA modes for some time, per their developer documentation (referenced below). Was the “bug” to follow Nvidia’s advice?

    http://developer.nvidia.com/object/coverage-sampled-aa.html

    CSAA Guidelines
    Performance
    CSAA performance is generally very similar to that of typical MSAA performance, given the same color/z/stencil sample count. For example, both the 8x and 16x CSAA modes (which use 4 color/z/stencil samples) perform similarly or identical to 4x MSAA.
    CSAA is also extremely efficient in terms of storage, as the coverage samples require very little memory.