The Witcher 3: Wild Hunt (Polish: Wiedźmin 3: Dziki Gon) is an action role-playing video game set in an open world environment, developed by Polish video game developer CD Projekt RED. The Witcher 3: Wild Hunt concludes the story of the witcher Geralt of Rivia, the series’ protagonist, whose story to date has been told in the previous games. Continuing from The Witcher 2, those who sought to use Geralt are now gone, and Geralt seeks to move on with his own life, embarking on a new and personal mission while the world order itself is on the brink of change.
Geralt’s new mission comes in dark times, as the mysterious and otherworldly army known as the Wild Hunt invades the Northern Kingdoms, leaving only blood-soaked earth and fiery ruin in its wake; and it seems the Witcher is the key to stopping their cataclysmic rampage. (Wikipedia).
We test with the highest image quality settings, although I have disabled the Nvidia HairWorks option specifically, as it kills frame rates on many cards. The Graphics preset is set to Ultra and Postprocessing to High.
I have played The Witcher 3 for around 85 hours and have completed the single-player campaign. I tested the game today by playing 4 different save-game stages for 5 minutes each, then averaging the frame rate results for a real-world indication of performance – one of the map sections we tested is among the most demanding in the game, so our results can be considered strictly ‘worst case’. The Witcher 3 is a dynamic world, so it is important to run tests multiple times to remove any discrepancies. Our results below are the averages of these runs.
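As a rough illustration of that averaging (this is not our actual capture tooling – the log file names and the per-frame-time format are assumptions for the example), each save-game run could be reduced to a single reported figure like this:

```python
# Minimal sketch: average FPS per run, then the mean across the four save-game runs.
# Assumes each run was logged as a plain text file of per-frame times in milliseconds
# (the file names below are hypothetical).
from pathlib import Path

def average_fps(frame_times_ms):
    """Average FPS for one run: total frames divided by total seconds."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

run_logs = ["save1.txt", "save2.txt", "save3.txt", "save4.txt"]  # the four save-game stages
per_run = [average_fps([float(v) for v in Path(log).read_text().split()]) for log in run_logs]

print("Per-run average FPS:", [round(fps, 1) for fps in per_run])
print("Reported figure:", round(sum(per_run) / len(per_run), 1))
```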
This is one of the greatest PC games ever released, in my opinion, so I spent around 48 hours in total benchmarking it for this review alone – it should be on your must-have list if you don't have it already.
Performance at 1440p is excellent, especially with these maxed out image quality settings.
still laughing at the r9 295×2 performance on the Witcher 3 hahahahahahahahaha (sorry).
Well I suspect that if AMD were to ever work out their driver-level Crossfire support for W3, the 295×2 would likely at least trade blows with the Titan Z. As it is, it looks just like I would expect a single reference 290X to look, because as far as W3 is concerned, that’s what it is.
It’s an Nvidia showcase title, I don’t expect any better.
Amazing card.
A good deal faster than the Fury X at 4K, smokes it at 1440p which is my preferred resolution, good price, premium-quality components used throughout the card, and quieter than the Fury X.
Good test KitGuru.
They sacrificed some temps to make it quieter. I fully support that!
i just agree with kitguru.
it’s a dual-chip card (so basically CrossFire) and it can’t even run past 45 fps at 1440p. That is piss-poor performance – even their single-chip cards beat it.
i am getting this and a Hyper 612 PWM to upgrade my currently crippled rig:
i7-2600
GTX 560 from MSI that broke, so an MSI GT 730 1GB DDR3 64-bit is the temporary GPU
16GB RAM
TBs of HDD
etc.
having put up with infernal stock and laptop coolers running at 6000 rpm (100 rps, or 10 ms per revolution – see the conversion sketch below),
i won’t mind the extra fan speed for cooling… 3000 rpm seems to be the max i can tolerate from a non-optimised fan design, but with this card i think i can push the fans as hard as i like… although 2,000-3,000 rpm at max should be enough – that should be around 50 degrees, right?
SIDENOTE: because i have a micro-ATX motherboard, the graphics card can only use the top PCIe slot, which means the 612 would almost touch the Asus card – but hey, more cooling, right…
cool the CPU and the backside of the GPU with one fan (plus the case fans)
in another review the temps were 80-ish at ‘stock’ and 70-ish overclocked with a more aggressive fan profile,
so that means it should be well under 65 with ‘stock’ clocks and more fan speed
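A quick sketch of the rpm arithmetic quoted above (plain Python, nothing specific to any particular cooler):

```python
# Convert fan speed in rpm to revolutions per second and milliseconds per revolution.
def rpm_to_rev_time(rpm):
    rps = rpm / 60.0             # revolutions per second
    ms_per_rev = 1000.0 / rps    # milliseconds for one revolution
    return rps, ms_per_rev

print(rpm_to_rev_time(6000))  # (100.0, 10.0) -> 100 rps, 10 ms per revolution
print(rpm_to_rev_time(3000))  # (50.0, 20.0)
```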
40 fps @ 1440p is disastrous for that card; AMD are slowly killing themselves because of poor drivers and optimisation. If you look at Tomb Raider, which is an AMD showcase title (TressFX’s first outing), Nvidia cards perform very well there. If Nvidia can get their cards to perform well on AMD-biased titles, then why can’t AMD do the same on Nvidia-biased titles?
Because tech like TressFX is deliberately made open by AMD. When Tomb Raider came out and debuted TressFX, for a week or two Nvidia fans were screaming and moaning that it didn’t work properly on their cards, until Nvidia fixed it in drivers – and lo and behold, it suddenly worked BETTER on Nvidia cards. It was easy for them because TressFX is open; Nvidia picked up the base code and fixed it up in their drivers.
AMD can’t do that with Nvidia-biased titles because GameWorks features (such as HairWorks) are a black box. Nvidia doesn’t open that stuff up; they lock it up.
By the way, the new Catalyst driver adds Crossfire support for Witcher 3, so I would expect the R9-295×2 to start kicking ass again at that game.
I’m definitely getting two of these. Nice review and great looking/performing card.
Either AMD need to start shutting Nvidia out or Nvidia need to start sharing more… it is unfair on the gamers.
This is what AMD fans have been saying for a while now. AMD simply can’t afford to try to shut Nvidia out, even if they wanted to – one failed attempt could be disastrously expensive for them. And their open approach tends to benefit all gamers when it’s successful. Nvidia, on the other hand, has piles of money to spend when Huang’s not swimming around in it like Scrooge McDuck, and a veritable army of devotees who would rather a feature not exist if it’s not an Nvidia exclusive. In my opinion, TressFX is superior to HairWorks (which is essentially “tessellate the **** out of it” written into code) and has the advantage that it works really, really well on all platforms, but Nvidia has roughly 75% of the market, so game companies tend to do (and use) what they say.
Why does it list 4 gigabytes for the GPU? The GTX 980 Ti should have 6 gigabytes, right?
Yes 1440p is great – I have a triple 1440p setup – Dual 980ti Strixii 🙂
Me too – They are in the post (hopefully)