
nVidia lowering IQ in Tom Clancy H.A.W.X.?

It seems there is much discussion online about nVidia possibly lowering the driver image quality (IQ) in Tom Clancy H.A.W.X. to increase benchmark scores. This is nothing new, as HardOCP reported a similar issue in October with the HAWX 2 benchmark.

Sadly, it appears this affects the actual game itself, according to reports on our forum. We haven't had the chance to test this ourselves, but there are images and details in this forum thread which may interest our readers. If you get time over the weekend to test this out yourselves, please post in our forums and share your findings. It would be great if this could be confirmed or denied by our readers.

KitGuru says: Dirty tricks or something less sinister? Share your findings.



9 comments

  1. Why rename the file to HAWX.EXE? That file belongs to “Tom Clancy H.A.W.X”,
    not “Tom Clancy H.A.W.X 2”.
    The executable file in Tom Clancy H.A.W.X 2 is supposed to be HAWX2.EXE.

  2. Yeah, I think it is. But this is HAWX 1.

  3. What does this have to do with HAWX 2? HAWX 1 is the game being compared, but it is very old and no longer appears in reviews, not even under DX9.

  4. Oh noes, it’s attack of the idiot Nvidia fanbois..

    You muppet, HAWX is DX10.1 capable. And this clearly says HAWX in the title. Stop sniffing the glue.

  5. Capable or not, where was DX10.1 actually used? This is nothing more than cheap and bad AMD propaganda. Things are being mixed together here to no purpose.

  6. Moss, best just sign off man, you are making no sense at all and are certainly not helping Nvidia if that is your intention. I fail to see how posting information on a possible forced image quality reduction, and asking people to check it out themselves and share findings, is bad and ‘cheap AMD propaganda’. I never understood this blind and insanely passionate following for a graphics card company. All other tech sectors seem to have more level-headed and sane-sounding people discussing potential issues.

  7. You do not understand the relationship to “Will GTX570 be caught in new app detection cheat?” It should be obvious that this targets NVIDIA, since 1) the settings indicate it and 2) no comparison with AMD is made. We can say a lot, but claims are not always true. And this is something that has been claimed for many years. H.A.W.X is an NVIDIA-sponsored game, so why should they reduce the quality if they cooperate with Ubisoft? Have you ever asked that?
    AMD has used application detection for years, so what’s new? Please explain.

  8. Hi Everybody,

    What is being experienced is not an “Antialiasing cheat” but rather a HawX bug that is fixed by our driver using an application specific profile.

    In a nutshell, the HawX application requests the highest possible AA “sample quality” at a particular AA level from our driver. Without our driver fix, the game would be running 16xCSAA instead of standard 4xAA when you select 4xAA in-game. It runs the proper 4xAA with the driver fix. You defeat the fix by changing the .exe name, causing it to run at 16xCSAA.

    You may remember that Geforce 8800 introduced Coverage Sampling AA (CSAA) technology, which added higher quality AA using little extra storage. Prior to 8800 GTX and CSAA, there was only one “sample quality level” for each AA level, so if an application requested four AA samples, the hardware performed standard 4xAA. However, with 8800 GTX GPUs onwards, our drivers expose additional sample quality levels for various standard AA levels which correspond to our CSAA modes at a given standard AA level.

    The “sample quality level” feature was the outcome of discussions with Microsoft and game developers. It allowed CSAA to be exposed in the current DirectX framework without major changes. Game developers would be able to take advantage of CSAA with minor tweaks in their code.

    Unfortunately, HawX requests the highest quality level for 4xAA, but does not give the user the explicit ability to set CSAA levels in their control panel. Without the driver profile fix, 16xCSAA is applied instead of standard 4xAA. Recall that 16xCSAA uses 4 color/Z samples like 4xAA, but also adds 12 coverage samples. (You can read more about CSAA in our GeForce 8800 Tech Briefs on our Website).

    When you rename the HawX.exe to something else, the driver profile bits are ignored, and 16xCSAA is used. Thus the modest performance slowdown and higher quality AA as shown in the images.

    To use “standard” 4xAA in a renamed HawX executable, you should select any level of anti-aliasing in the game, then go into the NVIDIA Control Panel and set 4xAA for “Antialiasing Setting” and turn on “Enhance the application setting” for the “Antialiasing mode”.

    Nick Stam, NVIDIA

  9. Yet in his blog posted on nvidia.com, Nick Stam states that Nvidia is “ensuring the user experiences the image quality intended by the game developer” and “NVIDIA will not hide optimizations that trade off image quality to obtain faster frame rates.” The HAWX example and response directly contradicts these statements.

    It is very convenient for Nick to state that there is a “bug” in HAWX, implying that the game developer did not intend to request the highest possible AA quality. Nvidia has been encouraging developers to use these AA modes for some time, per their developer documentation (referenced below). Was the “bug” to follow Nvidia’s advice?

    http://developer.nvidia.com/object/coverage-sampled-aa.html

    CSAA Guidelines
    Performance
    CSAA performance is generally very similar to that of typical MSAA performance, given the same color/z/stencil sample count. For example, both the 8x and 16x CSAA modes (which use 4 color/z/stencil samples) perform similarly or identical to 4x MSAA.
    CSAA is also extremely efficient in terms of storage, as the coverage samples require very little memory.
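The storage claim in the CSAA guidelines above (4 color/Z samples plus cheap coverage samples) can be sketched with back-of-the-envelope arithmetic. The bit counts below are illustrative assumptions for a 32-bit color, 32-bit depth framebuffer and roughly one bit per coverage sample, not NVIDIA's actual memory layout:

```python
# Rough per-pixel framebuffer cost: 4xMSAA vs 16xCSAA.
# Illustrative numbers only; real GPU memory layouts are more complex.

def per_pixel_bits(color_z_samples: int, coverage_samples: int = 0,
                   color_bits: int = 32, depth_bits: int = 32) -> int:
    """Approximate storage per pixel.

    Each full color/Z sample stores a color value and a depth value.
    Each extra CSAA coverage sample is assumed to cost ~1 bit of
    coverage information (hence 'very little memory').
    """
    full_samples = color_z_samples * (color_bits + depth_bits)
    coverage = coverage_samples  # assumption: ~1 bit per coverage sample
    return full_samples + coverage

msaa_4x = per_pixel_bits(4)                        # standard 4xAA
csaa_16x = per_pixel_bits(4, coverage_samples=12)  # 16xCSAA: 4 color/Z + 12 coverage

print(msaa_4x)   # 256 bits per pixel
print(csaa_16x)  # 268 bits per pixel, under ~5% more storage
```

Under these assumptions, 16xCSAA costs only a few percent more memory than plain 4xMSAA, which is consistent with the "similar performance, little extra storage" description in the guidelines.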