
Intel Arc A770 Limited Edition Review

Rating: 5.5.

It has felt like an eternity since we first had official confirmation of Intel's plan to enter the discrete GPU market, but today the wait is over and we can present our review of the Arc A770 Limited Edition. Intel is claiming a clear performance advantage over the RTX 3060, and at a competitive price of $349. Is this the shake-up the GPU market has been crying out for? We find out today.

This review will focus on the Intel Arc A770 Limited Edition graphics card, but we do also have a day-1 review of the Arc A750 Limited Edition that you can find over HERE.

We've taken a look at the Xe HPG architecture in the past, but to remind ourselves of what we are looking at today, above we can see a comparison of the core specs of the A750 and A770. Both SKUs are built on Intel's ACM-G10 silicon, but the A750 is shaved down slightly. The A770 uses the full die with 32 Xe cores, where each core offers 16 vector engines and each vector engine houses 8 FP32 ALUs, for a grand total of 4096.

Each Xe core is accompanied by a Ray Tracing Unit, while we also find 224 TMUs and 112 ROPs. A 256-bit memory interface is used for both the A750 and A770, but where the A750 uses 16Gbps GDDR6 for 512GB/s of total memory bandwidth, the A770 steps up to 17.5Gbps GDDR6 for 560GB/s. Intel also rates the A770 for a 2100MHz graphics clock and 225W total board power, something we look at closely in this review using our in-depth power testing methodology.
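For anyone who wants to sanity-check those spec figures, both the shader count and the memory bandwidth fall out of simple arithmetic. Here's a quick sketch using the numbers above:

```python
# Arc A770 FP32 ALU count: Xe cores x vector engines x ALUs per engine.
xe_cores = 32
vector_engines_per_core = 16
fp32_alus_per_engine = 8
print(xe_cores * vector_engines_per_core * fp32_alus_per_engine)  # 4096

# GDDR6 bandwidth: bus width in bytes x effective data rate (Gbps).
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return (bus_width_bits / 8) * data_rate_gbps

print(bandwidth_gb_s(256, 16.0))   # A750: 512.0 GB/s
print(bandwidth_gb_s(256, 17.5))   # A770 16GB: 560.0 GB/s
```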

It's also worth clarifying that today we are reviewing the Limited Edition cards. Despite the name, these are not limited in quantity and you can instead think of the Limited Edition as the equivalent to Nvidia's Founders Edition – boards manufactured by Intel and sold directly to consumers. Intel was firm with us that the Limited Edition models will be available at the announced prices, so $289 for the A750 and $349 for the A770, with the A770 Limited Edition only available with 16GB VRAM. A770 partner cards with 8GB VRAM will start at $329.

The Arc A770 Limited Edition ships in a compact blue box, with the Arc branding positioned prominently on the front. The back of the box is almost entirely plain, apart from a very small section of ‘minimum system requirements' in the bottom left corner.

Inside, a quick start guide and ‘thank you' note are included, as well as a custom cable that connects from the GPU to an internal USB header, used to control the RGB lighting.


We've already seen a positive reaction to the design of Intel's Limited Edition cards from our unboxing video and we have to say Intel has done a great job with the aesthetics here. The card is almost entirely matte black, with no gaudy design elements or aggressive angles, and on the underside of the shroud, we just get a look at the two 90mm axial fans. It's a very simple, but elegant design and it makes a refreshing change from the RGB monsters we have become used to over the last few years.

That's also true in terms of dimensions, as the Limited Edition is just two slots thick and a standard 10.5″ (266.7mm) in length. We can see the RGB lighting strip that runs the length of the card, with the fan rings, Intel Arc logo and backplate also acting as RGB zones.

For a closer look at the RGB lighting do check out the video on the first page of this review, but overall it looks fine to my eye – and if you don't like it, you can always turn it off.

The Intel Arc logo is visible on the front edge of the card, giving us a look at the rest of the shroud. Intel's backplate design is also eye-catching, with its pin-stripe pattern. The backplate itself is made from plastic and doesn't feel particularly sturdy, but it certainly looks the part.

I've not yet disassembled the cards themselves but above we can see the renders supplied by Intel. The cards use a copper vapour chamber and aluminium fin stack, with a total of four flattened 10mm heat pipes. Air from the two fans blows down onto the cooler and escapes through the back and sides of the card.

Power requirements consist of one 8-pin and one 6-pin PCIe connector – no 12VHPWR here. Next to the 6-pin connector is the very small 3-pin header that connects to an internal USB header, used for controlling the RGB lighting.

Meanwhile for display outputs, credit to Intel for coming to market with three DisplayPort 2.0 connectors, the first of any GPU manufacturer to do so, while there's a single HDMI 2.1 port as well.

Driver Notes

  • All AMD GPUs were benchmarked with the public Adrenalin 22.9.1 driver.
  • All Nvidia GPUs were benchmarked with the 516.94 driver.
  • All Intel GPUs were benchmarked with the 101.3433 driver supplied to press.

Test System:

We test using a custom-built system powered by MSI, based on Intel’s Alder Lake platform. You can read more about this system HERE and check out MSI on the CCL webstore HERE.

CPU: Intel Core i9-12900K
Motherboard: MSI MEG Z690 Unify
Memory: 32GB (2x16GB) ADATA XPG Lancer DDR5 6000MHz, CL 40-40-40
Graphics Card: Varies
SSD: 2TB MSI Spatium M480
Chassis: MSI MPG Velox 100P Airflow
CPU Cooler: MSI MEG CoreLiquid S360
Power Supply: Corsair 1200W HX Series Modular 80 Plus Platinum
Operating System: Windows 11 Pro 22H2
Monitor: MSI Optix MPG321UR-QD
Resizable BAR: Enabled for all supported GPUs

Comparison Graphics Cards List

  • AMD RX 6800 XT 16GB
  • Gigabyte RX 6600 XT Gaming OC Pro 8GB
  • Gigabyte RX 6600 Eagle 8GB
  • ASUS RTX 3080 TUF Gaming 10GB
  • Nvidia RTX 3070 FE 8GB
  • Nvidia RTX 3060 Ti FE 8GB
  • Palit RTX 3060 StormX 12GB
  • Nvidia RTX 2060 FE 6GB

All cards were tested at reference specifications.

Software and Games List

  • 3DMark Fire Strike & Fire Strike Ultra (DX11 Synthetic)
  • 3DMark Time Spy & Time Spy Extreme (DX12 Synthetic)
  • Assassin's Creed Valhalla (DX12)
  • Cyberpunk 2077 (DX12)
  • Days Gone (DX11)
  • Dying Light 2 (DX12)
  • Far Cry 6 (DX12)
  • Forza Horizon 5 (DX12)
  • God of War (DX11)
  • Horizon Zero Dawn (DX12)
  • Marvel's Spider-Man Remastered (DX12)
  • Metro Exodus Enhanced Edition (DXR)
  • Red Dead Redemption 2 (DX12)
  • Resident Evil Village (DX12)
  • Total War: Warhammer III (DX11)

We run each benchmark/game three times, and present mean averages in our graphs. We use FrameView to measure average frame rates as well as 1% low values across our three runs.
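For those unfamiliar with how these metrics are derived, the sketch below shows a common way of computing an average frame rate and a 1% low from a list of per-frame times. This illustrates the general approach, not FrameView's exact internal method:

```python
# Compute average FPS and 1% low FPS from per-frame times (milliseconds).
def fps_metrics(frame_times_ms):
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds

    # 1% low: the frame-rate equivalent of the slowest 1% of frames.
    slowest = sorted(frame_times_ms, reverse=True)
    worst = slowest[: max(1, len(slowest) // 100)]
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct_fps

# Example: a run of 10ms frames with a handful of 40ms stutters.
times = [10.0] * 990 + [40.0] * 10
print(fps_metrics(times))  # ~97 FPS average, 25 FPS 1% low
```

It's exactly this gap between the average and the 1% low that makes the frame time problems in certain games so noticeable in play.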

Fire Strike is a showcase DirectX 11 benchmark for modern gaming PCs. Its ambitious real-time graphics are rendered with detail and complexity far beyond other DirectX 11 benchmarks and games. Fire Strike includes two graphics tests, a physics test and a combined test that stresses the CPU and GPU. (UL).

3DMark Time Spy is a DirectX 12 benchmark test for Windows 10 gaming PCs. Time Spy is one of the first DirectX 12 apps to be built the right way from the ground up to fully realize the performance gains that the new API offers. With its pure DirectX 12 engine, which supports new API features like asynchronous compute, explicit multi-adapter, and multi-threading, Time Spy is the ideal test for benchmarking the latest graphics cards. (UL).

3DMark is a strong start for the A770. It outperforms the RTX 3060 Ti in Fire Strike and isn't far off the RTX 3070, closing the gap further in Time Spy and Time Spy Extreme. Let's see how this translates to real-world gaming performance.

Assassin's Creed Valhalla is an action role-playing video game developed by Ubisoft Montreal and published by Ubisoft. It is the twelfth major instalment and the twenty-second release in the Assassin's Creed series, and the successor to 2018's Assassin's Creed Odyssey. The game was released on November 10, 2020, for Microsoft Windows, PlayStation 4, Xbox One, Xbox Series X and Series S, and Stadia, while the PlayStation 5 version was released on November 12. (Wikipedia).

Engine: AnvilNext 2.0. We test using the Ultra High preset, DX12 API.

Assassin's Creed Valhalla gets us up and running with the gaming benchmarks, with the A770 delivering 72FPS on average at 1080p. It's 10% faster than the RTX 3060 here, but still lags significantly behind the RX 6600 XT. At 1440p, it closes the gap on the 6600 XT to just 5%, while it is also 14% faster than the RTX 3060.

Cyberpunk 2077 is a 2020 action role-playing video game developed and published by CD Projekt. The story takes place in Night City, an open world set in the Cyberpunk universe. Players assume the first-person perspective of a customisable mercenary known as V, who can acquire skills in hacking and machinery with options for melee and ranged combat. Cyberpunk 2077 was released for Microsoft Windows, PlayStation 4, Stadia, and Xbox One on 10 December 2020. (Wikipedia).

Engine: REDengine 4. We test using the Ultra preset, DX12 API.

Cyberpunk 2077 brings us to the first of several driver-related issues we will discuss today, with more shown in the video. Frame time consistency is simply very poor at 1080p, resulting in 1% low performance that is significantly worse than even the RTX 2060. Weirdly, this does improve – relatively speaking – at 1440p, but this game with Ultra settings is still too heavy for this calibre of GPU at the higher resolution.

Days Gone is a 2019 action-adventure survival horror video game developed by Bend Studio and published by Sony Interactive Entertainment for the PlayStation 4 and Microsoft Windows. As part of Sony's efforts to bring more of its first-party content to Microsoft Windows following Horizon Zero Dawn, Days Gone released on Windows on May 18, 2021. (Wikipedia).

Engine: Unreal Engine 4. We test using the Very High preset, DX11 API.

Just like Cyberpunk 2077, Days Gone exhibits awful frame times. The average frame rates are relatively strong, but the 1% lows make this a borderline unplayable experience. Once more things do improve at 1440p, and admittedly Days Gone is a DX11 title which Arc GPUs do struggle with (as we see in more detail later), but the behaviour is still very odd.

Dying Light 2: Stay Human is a 2022 action role-playing game developed and published by Techland. The sequel to Dying Light (2015), the game was released on February 4, 2022, for Microsoft Windows, PlayStation 4, PlayStation 5, Xbox One, and Xbox Series X/S. (Wikipedia).

Engine: C-Engine. We test using the High preset, DX12 API.

Dying Light 2 represents a huge win for the Intel Arc GPUs. At 1080p we see a crushing 27% performance advantage over the RTX 3060, while the A770 is 17% faster than the RX 6600 XT as well. At 1440p, both of those margins improve further, with the A770 now outperforming the RTX 3060 by 33%, and the 6600 XT by 28%. It's breathing right down the neck of the RTX 3060 Ti in both instances, which is highly impressive stuff.

Far Cry 6 is a 2021 action-adventure first-person shooter game developed by Ubisoft Toronto and published by Ubisoft. It is the sixth main instalment in the Far Cry series and the successor to 2018's Far Cry 5. The game was released on October 7, 2021, for Microsoft Windows, PlayStation 4, PlayStation 5, Xbox One, Xbox Series X/S, Stadia, and Amazon Luna. (Wikipedia).

Engine: Dunia Engine. We test using the Ultra preset, HD Textures enabled, DX12 API.

As for Far Cry 6, this isn't quite so jaw-dropping as Dying Light 2 but the A770 still does well. It manages to deliver 97FPS on average, putting it 8% ahead of the RTX 3060 at 1080p, though it's 11% slower than the RX 6600 XT. At 1440p, it's now 15% faster than the 3060 and only 2% behind the 6600 XT, so we can see how Arc scales better at the higher resolution.

Forza Horizon 5 is a 2021 racing video game developed by Playground Games and published by Xbox Game Studios. The twelfth main instalment of the Forza series, the game is set in a fictionalised representation of Mexico. It was released on 9 November 2021 for Microsoft Windows, Xbox One, and Xbox Series X/S. (Wikipedia).

Engine: ForzaTech. We test using the Extreme preset, DX12 API.

Forza Horizon 5 is highly demanding using the Extreme preset and is particularly VRAM intensive even at 1080p. We can see the A770 takes a lead of 15% over the A750 here – the biggest margin between the two we will see today – with VRAM a likely factor. At 1440p, the A770 pulls away further and comes in a whopping 28% ahead of the A750, so it does show there are at least some games which benefit from more than 8GB VRAM, even in this performance segment.

God of War is an action-adventure game developed by Santa Monica Studio and published by Sony Interactive Entertainment (SIE). It was released worldwide on April 20, 2018, for the PlayStation 4 with a Microsoft Windows version released on January 14, 2022. (Wikipedia).

Engine: Sony Santa Monica Proprietary. We test using the Ultra preset, DX11 API.

God of War is a DX11 title – an area of performance that we analyse in more detail later on – so I was honestly pleasantly surprised to see the frame rates on offer here from the A770. It's only a couple of frames behind the RTX 3060 and RX 6600 XT at 1080p, and even edges out both of those GPUs as we step up the resolution to 1440p. As a major DX11 title released this year, I would imagine the Intel software team has optimised the driver for God of War, at least to a greater extent than for some of the other examples we will see later in the review.

Horizon Zero Dawn is an action role-playing game developed by Guerrilla Games and published by Sony Interactive Entertainment. The plot follows Aloy, a hunter in a world overrun by machines, who sets out to uncover her past. It was released for the PlayStation 4 in 2017 and Microsoft Windows in 2020. (Wikipedia).

Engine: Decima. We test using the Ultimate Quality preset, DX12 API.

Horizon Zero Dawn is another solid performer for the A770. At 1080p, Intel's fastest GPU delivers 105FPS on average, putting it 13% ahead of the RTX 3060, though it's much closer to the RX 6600 – but still faster. At 1440p it extends its lead over the AMD GPU too, this time with an 11% margin between the two.

Marvel's Spider-Man Remastered is a 2018 action-adventure game developed by Insomniac Games and published by Sony Interactive Entertainment. A remastered version of Marvel's Spider-Man, featuring all previously released downloadable content, was released for the PlayStation 5 in November 2020 and for Microsoft Windows in August 2022. (Wikipedia).

Engine: Insomniac Games Proprietary. We test using the Very High preset, DX12 API.

Marvel's Spider-Man Remastered is – or would be – a new addition to our test suite, but currently exhibits a game-breaking lighting issue on Intel Arc GPUs, as shown in our video.

Red Dead Redemption 2 is a 2018 action-adventure game developed and published by Rockstar Games. The game is the third entry in the Red Dead series and is a prequel to the 2010 game Red Dead Redemption. Red Dead Redemption 2 was released for the PlayStation 4 and Xbox One in October 2018, and for Microsoft Windows and Stadia in November 2019. (Wikipedia).

Engine: Rockstar Advanced Game Engine (RAGE). We test by manually selecting Ultra settings (or High where Ultra is not available), TAA, DX12 API.

Red Dead Redemption 2 is arguably the best case scenario for the Arc A770. At 1080p it delivers 80FPS on average, making it over 30% faster than the RTX 3060, which is really quite something. It's also 12% ahead of the RX 6600 XT, though that increases to 15% at 1440p. At the latter resolution, the A770 is even just edging out the RTX 3060 Ti, so you can see why we find this so exciting.

Resident Evil Village is a survival horror game developed and published by Capcom. The sequel to Resident Evil 7: Biohazard (2017), players control Ethan Winters, who is searching for his kidnapped daughter; after a fateful encounter with Chris Redfield, he finds himself in a village filled with mutant creatures. The game was announced at the PlayStation 5 reveal event in June 2020 and was released on May 7, 2021, for Windows, PlayStation 4, PlayStation 5, Xbox One, Xbox Series X/S and Stadia. (Wikipedia).

Engine: RE Engine. We test using the Max preset, with V-Sync disabled, DX12 API.

Resident Evil Village also delivers the goods for the Arc A770. Hitting just shy of 160FPS at 1080p, it's a very solid 21% faster than the RTX 3060, but dead level with the RX 6600 XT – though that's still impressive considering this is an AMD-sponsored title. It also improves, relatively speaking, at 1440p, coming in 25% ahead of the 3060, though it's only a couple of frames ahead of the 6600 XT.

Total War: Warhammer III is a turn-based strategy and real-time tactics video game developed by Creative Assembly and published by Sega. It is part of the Total War series, and the third to be set in Games Workshop's Warhammer Fantasy fictional universe (following 2016's Total War: Warhammer and 2017's Total War: Warhammer II). The game was announced on February 3, 2021, and was released on February 17, 2022. (Wikipedia).

Engine: TW Engine 3 (Warscape). We test using the Ultra preset, with unlimited video memory enabled, DX11 API.

Total War: Warhammer III should have been the last game on our test list, but unfortunately it is the second title to exhibit game-breaking visual corruption. Intel told us that this issue was meant to be fixed in the previous driver revision, but as you can see in the video, that is not the case.

Three of the twelve games we tested (or hoped to test) today are DX11 titles, and Intel was very upfront with us about the problem posed by games running on DX11 – and older – APIs. As these APIs place a much larger emphasis on the driver itself, an obvious problem given Intel's newcomer status in the GPU market, performance is likely to suffer versus low-level APIs such as DX12 and Vulkan.

To put this to the test, we benchmarked five titles that support DX11, and either DX12 or Vulkan:

It's quite amusing that the first title tested – Battlefield V – delivers such poor frame times using DX12 that, despite a 33% lower average frame rate, DX11 actually offers the better gaming experience here. It's also worth pointing out that I wanted to include Kena: Bridge of Spirits in these benchmarks, but it crashed several times when switching from DX11 to DX12, which goes to show that DX12 isn't a guarantee of good performance for Intel Arc.

Still, DX11 is generally much worse across the board. Some titles, like Control, see fairly similar average frame rates but with a huge reduction to 1% low performance when using DX11. Other games, such as Rainbow Six Siege, get significantly slower across the board.

According to Intel's Tom Petersen, optimising DX11 titles is a task that will last ‘forever' for Intel Arc, and we can see why. It's a major red flag for Intel's first dGPUs.

At KitGuru we have been running our benchmarks with Resizable BAR since the beginning of the year. Intel has again been quite upfront about the fact that Resizable BAR, or Rebar, is essentially a requirement for its Arc GPUs; in a press briefing, Tom Petersen went as far as to say that users who can't enable Rebar should just get an RTX 3060 without even considering Arc!

We tested four games, with Rebar on and then with Rebar off, to take a closer look.

The results quickly show why Intel is more-or-less positioning Rebar as a requirement for Arc GPUs – the 1% lows are near enough cut in half in Assassin's Creed Valhalla, and face significant reductions in both Dying Light 2 and Far Cry 6. Forza Horizon 5 is the worst offender, however, becoming simply unplayable without Rebar.

While clearly a huge issue for those with systems that don't support Rebar, for me this isn’t as big of an issue as the DX11 performance is. My reasoning is that at least the last two motherboard generations from Intel and AMD do support Rebar, and in some cases even further back than that, so a fair number of PC enthusiasts should have a compatible system. And as we will get to, right now I don’t think many people who aren’t enthusiasts will be buying Arc as it’s not ready for the mainstream yet, though the Rebar situation is certainly something to keep in mind.

It's at this part of a GPU review where I usually show a performance summary, giving the average frame rate from across all 12 games tested so we can compare the card in question against the competition. However, I won’t be doing that for Intel Arc just yet.

For one, two of the twelve games that I wanted to test are visually broken, so there's that… But more fundamentally, I just don't think you can really ‘boil down' the experience of using Arc into a single chart. I'd be worried that someone could view such a chart and take the data at face value, which wouldn't tell the whole story and could be misleading. I've found that the experience of using Arc can vary massively from game to game, so I didn't want to distil the performance in a way that doesn't do it justice.

Here we test Cyberpunk 2077, with RT Lighting set to Ultra.

Using Ultra ray traced settings in Cyberpunk 2077 is an absolute GPU killer, but it's very reassuring to see the A770 actually edging out the RTX 3060 at 1080p. Admittedly by only the tiniest of margins, but it is indicative of strong ray tracing performance that bodes well for other easier-to-run RT titles.

Here we test Metro Exodus Enhanced Edition, with the in-game ray tracing effects set to Ultra.

Metro Exodus Enhanced Edition is one such title that runs incredibly well despite vast amounts of ray-traced lighting. At 1080p the A770 is a couple of frames faster than the RTX 3060 Ti but is significantly better in terms of the 1% lows, which is hugely impressive. Even at 1440p the A770 averages over 60FPS, giving it a very slender lead over the 3060 Ti.

Here we test Resident Evil Village, this time testing with the in-game ray tracing effects set to High.

Resident Evil Village doesn't see the A770 matching the RTX 3060 Ti, but it's still a decent chunk faster than the RTX 3060 – by 11% at 1080p and 14% at 1440p, which is not bad going. For a first-generation ray tracing architecture, Intel is looking hugely promising in this department.

Here we present the average clock speed for each graphics card while running Cyberpunk 2077 for 30 minutes. We use GPU-Z to record the GPU core frequency during gameplay. We calculate the average core frequency during the 30-minute run to present here.
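If you want to replicate this at home, GPU-Z can log its sensors to a CSV-style text file, which makes averaging the clock trivial. A minimal sketch is below – note the column header is an assumption, as it can vary between GPU-Z versions, so check the first line of your own log:

```python
# Average the GPU core clock from a GPU-Z sensor log file.
import csv

def average_clock(log_path, column="GPU Clock [MHz]"):  # header name may vary
    clocks = []
    with open(log_path, newline="") as f:
        for raw in csv.DictReader(f):
            # GPU-Z pads its fields with spaces, so strip keys and values.
            row = {k.strip(): (v or "").strip() for k, v in raw.items() if k}
            value = row.get(column, "")
            if value:
                clocks.append(float(value))
    return sum(clocks) / len(clocks)

# e.g. average_clock("gpuz_sensor_log.txt")
```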

With only two Arc cards to test, operating clock speeds don't mean much just yet, but over our 30-minute 4K workload, the A770 hits 2349MHz, putting it about 50MHz slower than the A750, which hit exactly 2400MHz. These figures are a fair bit higher than Intel's rated graphics clock speeds, but the exact frequency will vary depending on the workload in question.

For our temperature testing, we measure the peak GPU core temperature under load. A reading under load comes from running Cyberpunk 2077 for 30 minutes.

Intel's Arc Limited Edition cooler does a proficient job of keeping both GPUs cool, too. Do note the chart above shows both GPU and VRAM temperatures, but both cards were able to keep the GPU temperatures at or around 70C under load. The VRAM on the A750 did run a bit hotter than that of the A770, but neither result is particularly worrisome.

We take our noise measurements with the sound meter positioned 1 foot from the graphics card. I measured the noise floor to be 32dBA, so anything above this level can be attributed to the graphics card. The power supply is passively cooled across the entire power range we tested, while all CPU and system fans were disabled. A reading under load comes from running Cyberpunk 2077 for 30 minutes.

As for noise levels, both Arc GPUs are right in line with the competition. As we'd expect given both SKUs use the same cooler but the A770 is the faster chip, it runs a bit louder, with its fans at about 1800rpm producing 38dBA of noise – making it only a little louder than the RTX 3070 Founders Edition. The A750 runs its fans slightly slower, at about 1600rpm in our testing, producing 36dBA of noise.

While fan noise is absolutely fine, I did notice some coil whine present. This was most noticeable on the A750 but was still audible with the A770, though it wasn't much of a problem while gaming, only in certain game menus with an uncapped frame rate. We provide a sound test of this in the video.

Here we present power draw figures for the graphics card-only, on a per-game basis for the ten games we successfully tested at 1080p. This is measured using Nvidia's Power Capture Analysis Tool, also known as PCAT. You can read more about our updated power draw testing methodology HERE.

Per-Game Results at 1080p:

Click to enlarge.

Power draw for the A770 isn't far off the rated 225W figure. It does vary from game to game, as you can see above, but we see an average power draw of 208.2W at 1080p.


Here we present power draw figures for the graphics card-only, on a per-game basis for the ten games we successfully tested at 1440p. This is measured using Nvidia's Power Capture Analysis Tool, also known as PCAT. You can read more about our updated power draw testing methodology HERE.

Per-Game Results at 1440p:

Click to enlarge.

Power draw does increase across the board at 1440p, now averaging 220.4W across the ten titles we tested. Certain games – like Assassin's Creed Valhalla – come in under that figure, while others, including Resident Evil Village, draw slightly more.

Here we present power draw figures for the graphics card-only, on a per-game basis for the ten games we successfully tested at 2160p (4K). This is measured using Nvidia's Power Capture Analysis Tool, also known as PCAT. You can read more about our updated power draw testing methodology HERE.

Per-Game Results at 2160p (4K):

Click to enlarge.

Lastly, 4K power draw is fairly consistent across the board, with the A770 drawing an average of 227.6W over the ten games we tested.

Using the graphics card-only power draw figures presented earlier in the review, here we present performance per Watt on a per-game basis for the ten games we successfully tested at 1080p.
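The metric itself is straightforward – average frame rate divided by average board power – as the minimal sketch below shows. The FPS figure here is a placeholder for illustration, while 208.2W is our measured 1080p average for the A770:

```python
# Performance per Watt: average FPS divided by average board power draw.
def fps_per_watt(avg_fps, avg_power_w):
    return avg_fps / avg_power_w

# Placeholder FPS paired with our measured 1080p average power figure.
print(round(fps_per_watt(100.0, 208.2), 3))  # ~0.48 FPS per Watt
```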

Per-Game Results at 1080p:

Click to enlarge.

As for performance per Watt, with the A770 drawing about 210W at 1080p, it's not particularly efficient overall. It's also less efficient than the A750 here, and it lags well behind the likes of the RX 6600 and RX 6600 XT.

Using the graphics card-only power draw figures presented earlier in the review, here we present performance per Watt on a per-game basis for the ten games we successfully tested at 1440p.

Per-Game Results at 1440p:

Click to enlarge.

Likewise at 1440p, the A770 is one of the least efficient cards that we've tested for this review. In several games, it offers worse performance per Watt than the Turing-based RTX 2060 from 2019, and even in the best-case scenarios it's behind the likes of the RTX 3060.

Using the graphics card-only power draw figures presented earlier in the review, here we present performance per Watt on a per-game basis for the ten games we successfully tested at 2160p (4K).

Per-Game Results at 2160p (4K):

Click to enlarge.

The picture doesn't change much at 4K, with the A770 still one of the least efficient cards we've tested today. It's probably low on the list of priorities for Arc at this moment, but RDNA 2 is head and shoulders above in terms of overall performance per Watt.

We measure system-wide power draw from the wall while running Cyberpunk 2077 for 30 minutes. We do this at 1080p, 1440p and 2160p (4K) to give you a better idea of total system power draw across a range of resolutions, where CPU power is typically higher at the lower resolutions.

To add to our detailed graphics card-only power draw testing, we also look at power draw of the entire system measured at the wall socket. Power draw for the Arc GPUs increases as we step up in resolution, peaking at 374W for the A770 at 4K. This will vary slightly depending on the game in question, but it's good to know a 1000W PSU isn't required!

For our manual overclocking tests, we used Intel's Arc Control software. Our best results are as below.

Overclocking with Arc Control was surprisingly painless. We increased the power limit to its maximum value of 228W and set the GPU voltage offset to +55mV, with the GPU performance boost set to 16.

This overclock resulted in a 10% boost to the Time Spy GPU score. Resident Evil Village didn't scale as well, but we still observed a 7% increase in the average frame rate, which is better than nothing.

Power draw did increase while overclocked, averaging just over 280W, further reducing the A770's performance per Watt.
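To put a rough number on that, we can compare the overclocked result against stock using figures from this review. This assumes the 7% Resident Evil Village gain and takes our 220.4W stock 1440p average as the baseline, which is an approximation since the overclocked power figure wasn't captured under identical per-game conditions:

```python
# Rough perf-per-Watt change from the overclock, using figures above.
stock_power_w, oc_power_w = 220.4, 280.0  # stock = our 1440p average
perf_gain = 1.07                          # +7% in Resident Evil Village

efficiency_ratio = perf_gain / (oc_power_w / stock_power_w)
print(f"{(efficiency_ratio - 1) * 100:.0f}% perf per Watt")  # about -16%
```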

The arrival of Intel's Arc GPUs marks the first time I have ever reviewed a graphics card not manufactured by AMD or Nvidia. Those two companies have dominated the GPU market for decades, but now there is a third player breaking into the discrete graphics segment. This review is focused on the $349 A770 Limited Edition, but we do also have a day-1 review of the A750 Limited Edition that you can find HERE.

With the cards arriving last week, I've spent the last six days benchmarking a wide variety of different games, engines and APIs to develop a clear picture of exactly what these cards are capable of. Right now, I just can't say with confidence that these cards are ready for the mainstream market. At least, not yet.

Before we get into the issues, it's worth commending Intel for the good they have done. For one, the Limited Edition coolers not only look fantastic, but they run quiet and cool. There's a little bit of coil whine, but for the most part, this is a highly successful first-party design.


In terms of gaming performance, there are a few occasions where the ACM-G10 silicon really shines. I'm talking about performance in games such as Red Dead Redemption 2 and Dying Light 2, where the A770 is significantly faster than the likes of the RX 6600 XT and RTX 3060. In ray-traced workloads too, we even saw the A770 matching the RTX 3060 Ti, which for a first-generation ray tracing architecture is hugely impressive.

So while we got a glimpse of what the A770 can do, talking about the hardware only tells half the story. Right now, the drivers and software are the real problem.

Throughout my testing, I experienced incredibly poor frame times in certain games, visual glitches that affected two of the twelve games I wanted to benchmark, as well as game crashes and even system BSODs. Performance in DX11 titles is also a huge problem for Arc, while Rebar is absolutely essential for any hope of a smooth gaming experience. And I wasn't going out of my way to find problems – I simply set out to benchmark a wide variety of titles, and this was my experience.

The problems are so varied and significant that it is impossible to recommend buying an Arc GPU right now. While the $349 price point for the A770 Limited Edition certainly looks good on paper, multiple RX 6650 XTs are currently selling on Newegg.com for between $300 and $329. That GPU offers broadly similar performance, but on a platform that is head and shoulders above Arc in terms of stability and consistency.

I'd also add a word of caution on the A770 itself. My testing doesn't show a particularly large delta between the A750 and the A770, with the latter card 9% faster on average at 1080p despite being priced 21% higher. The A770 8GB model may well make more sense, but certainly the A750 looks more attractive on paper.
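A quick back-of-the-envelope calculation makes that value gap clear, using the two MSRPs and our measured 9% average performance delta at 1080p:

```python
# Value comparison: A770 16GB vs A750 at their respective MSRPs.
a770_price, a750_price = 349, 289
perf_delta = 1.09  # A770 is ~9% faster on average at 1080p

price_premium = a770_price / a750_price       # ~1.21, i.e. 21% dearer
relative_value = perf_delta / price_premium   # ~0.90
print(f"A770 perf per dollar vs A750: {relative_value:.2f}x")
```

In other words, at MSRP the A770 16GB delivers roughly 10% less performance per dollar than the A750.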

That point is entirely academic at present though, as we are currently not in a position to recommend any of the Arc lineup – Intel has plenty more work to do before we can consider making nuanced recommendations. We remain optimistic that Arc could be a success in the future – as we said, there are certainly glimpses of strong potential here – and we look forward to retesting the A750 and A770 as major driver updates land and hopefully change the picture.

Right now, however, Intel Arc isn't ready for the mainstream market.

The Intel Arc A770 Limited Edition has an MSRP of $349, and Intel is adamant that cards will be in stock and selling for that price on October 12. We're still unclear on UK availability and pricing, but will update this article when we know more.

Discuss on our Facebook page HERE.

Pros

  • Impressive ray tracing performance for a 1st gen architecture.
  • Outperforms the RTX 3060 handily in certain games.
  • Well-designed Limited Edition cooler.
  • Overclocks fairly well.

Cons

  • DX11 performance is woeful compared to DX12 or Vulkan.
  • Resizable BAR support is an absolute must.
  • We experienced numerous crashes, visual glitches and BSODs while testing.
  • Frame times can be incredibly erratic in certain games.
  • Performance isn't much better than the A750, despite a 21% price difference.
  • Overall efficiency is poor compared to RDNA 2.
  • Pricing isn't currently aggressive enough to warrant the significant risk of purchase.

KitGuru says: The GPU market needs another player, and we can definitely see signs of potential for Intel Arc. That said, there's still a ways to go before we can recommend picking up an Arc graphics card.
