When Intel revealed the specifications of its Arc A-series mobile GPUs, many commented on the relatively low ‘graphics clock’ figures. As it turns out, Intel reports clock speeds differently from Nvidia and AMD.
In an interview with HotHardware (via Tom's Hardware), Intel's Tom Petersen explained the ‘graphics clock’ metric in more detail. It is not equivalent to the base clock on an AMD or Nvidia GPU; rather, it represents the average clock speed in a TDP-constrained environment. In practice, it is roughly the minimum clock speed a user should see while running applications.
Taking the A370M's 1,550MHz rating as an example, that figure is the minimum clock laptop users should see at the GPU's 35W minimum TDP; A370M configurations running at 50W will likely clock higher. The graphics clock also varies with the application. According to Petersen, a lighter game such as CS:GO will let the card push itself above 2,000MHz.
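To make the distinction concrete, the short sketch below models the difference between reporting a TDP-constrained average (Intel's approach, as Petersen describes it) and reporting a peak boost figure (the convention on Nvidia and AMD cards). The clock samples are entirely hypothetical, for illustration only, and are not measurements from any Arc GPU.

```python
# Toy model of clock-speed reporting conventions. All sample values
# below are made up for illustration; this is not real Arc telemetry.

# Hypothetical clock samples (MHz) over a TDP-constrained gaming session:
# light scenes let the GPU boost, heavy scenes pull it back towards its floor.
clock_samples_mhz = [1550, 1580, 1620, 2050, 2150,
                     1550, 1560, 1555, 1600, 1550]

# An Intel-style 'graphics clock' corresponds to an average under the TDP
# constraint; Intel rates the worst case, so a mixed workload like this one
# can average higher than the figure on the spec sheet.
graphics_clock = sum(clock_samples_mhz) / len(clock_samples_mhz)

# A boost-style figure reports the peak the silicon touches when power allows.
boost_clock = max(clock_samples_mhz)

print(f"Graphics-clock-style figure: {graphics_clock:.0f} MHz")
print(f"Boost-style figure:          {boost_clock} MHz")
```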
Although Petersen's explanation sounds plausible, it doesn't match what we've seen in the A370M analyses from Benchmark Lab and Digital Testings. Neither test shows the A370M going above 1,550MHz, let alone 2,000MHz or more. Both include CS:GO benchmark runs, yet even there the clock never passed the 1,515MHz mark.
However, there are also cases like the GPU-Z screenshot spotted by VideoCardz on Zhihu, showing an A350M running at a 2,200MHz GPU clock, a reading GPU-Z's developer confirmed to be accurate.
KitGuru says: How did you interpret the graphics clock figure in the Arc A-series spec table? Do you think Intel's approach to reporting clock speeds is better than the one used by Nvidia and AMD?