
nVidia’s Nick Stam addresses HAWX ‘cheating’ allegations

A few days ago, a forum member posted information regarding image quality concerns with the popular flight combat game Tom Clancy's H.A.W.X. There has been much debate about this across the net over the weekend. KitGuru followed up on the forum post to clarify exactly what was being noticed.

As we discussed in our article covering the points raised in the KitGuru forum, there appeared to be a difference in image quality levels, and those differences appeared to give an ~8% boost to the GTX570/580 cards. We were clear that our own testing was to happen next week, and we invited nVidia to come back to us on the subject if there were any inaccuracies or misunderstandings.

Big thanks to nVidia for taking our concerns seriously.

nVidia's Nick Stam has addressed these issues himself, responding directly to KitGuru. Nick is nVidia's Technical Marketing Director; his team provides technical support to web and print tech media, industry analysts, and business partners, and also produces reviewer guides and technical whitepapers.

We felt it was important to highlight his reply to our readers so he can clear up any concerns directly.

Hi Everybody,

What is being experienced is not an “Antialiasing cheat” but rather a HawX bug that is fixed by our driver using an application specific profile.

In a nutshell, the HawX application requests the highest possible AA “sample quality” at a particular AA level from our driver. Without our driver fix, the game would be running 16xCSAA instead of standard 4xAA when you select 4xAA in-game. It runs the proper 4xAA with the driver fix. You defeat the fix by changing the .exe name, causing it to run at 16xCSAA.

You may remember that Geforce 8800 introduced Coverage Sampling AA (CSAA) technology, which added higher quality AA using little extra storage. Prior to 8800 GTX and CSAA, there was only one “sample quality level” for each AA level, so if an application requested four AA samples, the hardware performed standard 4xAA. However, with 8800 GTX GPUs onwards, our drivers expose additional sample quality levels for various standard AA levels which correspond to our CSAA modes at a given standard AA level.

The “sample quality level” feature was the outcome of discussions with Microsoft and game developers. It allowed CSAA to be exposed in the current DirectX framework without major changes. Game developers would be able to take advantage of CSAA with minor tweaks in their code.

Unfortunately, HawX requests the highest quality level for 4xAA, but does not give the user the explicit ability to set CSAA levels in their control panel. Without the driver profile fix, 16xCSAA is applied instead of standard 4xAA. Recall that 16xCSAA uses 4 color/Z samples like 4xAA, but also adds 12 coverage samples. (You can read more about CSAA in our GeForce 8800 Tech Briefs on our Website).

When you rename the HawX.exe to something else, the driver profile bits are ignored, and 16xCSAA is used. Thus the modest performance slowdown and higher quality AA as shown in the images.

To use “standard” 4xAA in a renamed HawX executable, you should select any level of anti-aliasing in the game, then go into the NVIDIA Control Panel and set 4xAA for “Antialiasing Setting” and turn on “Enhance the application setting” for the “Antialiasing mode”.

Nick Stam, NVIDIA
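
To make the mechanism Nick describes a little more concrete, here is a minimal, hypothetical Direct3D 10 sketch. This is not HAWX's actual code, and the mapping of quality levels to CSAA modes is an assumption on our part based on nVidia's public CSAA documentation; it simply shows how asking the driver for the highest reported "sample quality" at 4 samples can land an application in a CSAA mode:

    // Hypothetical sketch: not HAWX's actual code. The quality-to-CSAA
    // mapping described in the comments is driver-dependent and assumed here.
    #include <d3d10.h>

    DXGI_SAMPLE_DESC Pick4xAA(ID3D10Device* device, DXGI_FORMAT format)
    {
        DXGI_SAMPLE_DESC desc = { 4, 0 };   // Count = 4, Quality = 0 -> standard 4x MSAA

        UINT qualityLevels = 0;
        if (SUCCEEDED(device->CheckMultisampleQualityLevels(format, 4, &qualityLevels))
            && qualityLevels > 0)
        {
            // Request the *highest* reported quality for 4 samples. On GeForce
            // 8800-class and later drivers, the upper quality levels at Count = 4
            // correspond to CSAA modes (16xCSAA = 4 colour/Z + 12 coverage samples),
            // so this ends up selecting 16xCSAA rather than standard 4xAA.
            desc.Quality = qualityLevels - 1;
        }
        return desc;
    }

With the HawX.exe profile active, the driver effectively treats such a request as Quality = 0, i.e. plain 4xAA; rename the executable and the profile no longer applies, so the 16xCSAA path above is what actually runs.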


In summary

  • Was there a performance boost for the GTX570/580 cards when the HAWX application was detected? Yes.
  • Was it specifically coded into the driver as an application detection? Yes.
  • Was it done to cheat on performance? No.
  • Was it done to overcome an issue being experienced in the game by nVidia cards? Yes.
  • In overcoming the coding issue, was a side effect an ~8% performance boost in some benchmarks? Yes.
  • Was the 8% faster score delivered with the 4xAA originally set by the user? Yes.

So, there you have it. We really appreciate Nick's quick and honest response, which we have – as promised – published in full.

KitGuru readers should feel free to flag any issues you see like this in our forums or to our editorial team. We will investigate all of them and publish any that we believe have merit (screen shots and system specs always appreciated).

KitGuru says: BIG THANKS to our forum members for raising it and BIG THANKS to Nick for addressing it. Hopefully this puts the matter to bed!



9 comments

  1. So, much ado about nothing, then!

  2. So, much ado about nothing, then!

  3. Cool story, bro.

  4. How sneaky is that?
    Isn’t Hawx a TWIMTBP title?
    Does this issue also affect AMD cards?

    So influence the developers to build in a “bug” that will reduce your competitor’s performance, then build a fix into your driver, and when found out you can legitimately claim it was a bug fix even though this was planned all along to impair AMD cards… very cunning/deceitful… call me cynical.

  5. I don’t think you are cynical. It’s nice to see guys like Nick over here helping their cause, and it all ‘makes sense’. The only problem I have is that this was kept so quiet, and it doesn’t quite gel together for me. Something certainly seems fishy somewhere. This isn’t a new game, and for them to be running a botched AA method to fix a bug, rather than getting the developer to fix it, seems underhanded to me.

  6. Yet in his blog posted on nvidia.com, Nick Stam states that Nvidia is “ensuring the user experiences the image quality intended by the game developer” and “NVIDIA will not hide optimizations that trade off image quality to obtain faster frame rates.” The HAWX example and response directly contradict these statements.

    It is very convenient for Nick to state that there is a “bug” in HAWX, implying that the game developer did not intend to request the highest possible AA quality. Nvidia has been encouraging developers to use these AA modes for some time, per their developer documentation (referenced below). Was the “bug” to follow Nvidia’s advice?

    http://developer.nvidia.com/object/coverage-sampled-aa.html

    CSAA Guidelines
    Performance
    CSAA performance is generally very similar to that of typical MSAA performance, given the same color/z/stencil sample count. For example, both the 8x and 16x CSAA modes (which use 4 color/z/stencil samples) perform similarly or identical to 4x MSAA.
    CSAA is also extremely efficient in terms of storage, as the coverage samples require very little memory.

  7. [quote]It is very convenient for Nick to state that there is a “bug” in HAWX, implying that the game developer did not intend to request the highest possible AA quality. Nvidia has been encouraging developers to use these AA modes for some time, per their developer documentation (referenced below). Was the “bug” to follow Nvidia’s advice?[/quote]

    When the game is set to 16xCSAA, then the mode used is 16xCSAA.
    When the game is set to 4x, then it should only be 4x, not 16xCSAA.

    The reason for the issue is that 16xCSAA is exposed as the highest quality form of 4x, being a 4+12 mode.

  8. Also, you obviously haven’t played JC2; the 16xCSAA mode causes more than an 8% performance hit on some cards.

  9. Why is it being done in secret? Please publish a list of games that are optimized by the drivers! I can’t see the difference, except for that 8% 🙂 I like that. I would like to see a list of games somewhere in the driver’s setup – then I could check it. Something like the producer’s settings for a particular title. However, doing it in secret is cheating, whatever you say.