Our GTX 1080 Ti Founder's Edition sample shipped directly from Nvidia in a box that has become synonymous with the Pascal FE cards.
Nvidia supplies a DisplayPort-to-DVI (dual-link DVI-D) adapter to account for the GTX 1080 Ti's redesigned rear IO. The DVI port has been dropped; DVI displays are now supported via the supplied DP-to-DVI adapter or by converting the onboard HDMI 2.0 connection.
The Founder's Edition cooler used on the GTX 1080 Ti is similar to that found on the GTX 1080. A die-cast aluminium body houses the cooling system, which comprises a copper vapour chamber in contact with the GPU, an aluminium fin array, and a radial fan that forces air towards the rear vent.
The card measures 10.5″ in length, is of standard width for a PCIe expansion card, and conforms to a dual-slot form factor; compatibility is key to the Founder's Edition design logic. Nvidia has created a card that can be installed inside a variety of systems, from a standard ATX-sized chassis to a mini-ITX build with limited cooling space and case airflow.
The rear-exhaust cooling method, as opposed to the internally-venting coolers found on many after-market cards, is designed to dump a significant proportion of the heat outside the chassis, thus reducing the need for strong case airflow to maintain suitable operation.
The backplate design follows the same two-plate split methodology applied to the GTX 1080 Founder's Edition and Titan X Pascal cards. Two thin metal backplates attach separately to the rear of the PCB, allowing them to be removed individually. This lets the single backplate piece residing behind the radial fan be removed to create a small cooling gap when a pair of GTX 1080 Ti cards is installed back-to-back in SLI.
Besides offering a more aesthetically-pleasing appearance than a bare PCB, backplates aid the structural rigidity of the card. They can also passively transfer heat away from the PCB, if designed correctly.
Silver, black, and a hint of green is the go-to style for Nvidia's Founder's Edition cards, and the GTX 1080 Ti continues that trend. The side-mounted ‘GEFORCE GTX' logo is coloured green and features an LED that can be controlled via software, though not with particularly advanced functionality or RGB options.
The angled design of the cooler that Nvidia introduced with its Pascal cards seems to have been well received by enthusiasts, who like the styling and largely two-tone appearance.
Tipping the scales at just over 1kg, much of which can be attributed to the cooler, the GTX 1080 Ti makes it easy to understand the challenge of designing a 250W-capable cooling system within such tight volumetric (not to mention cost) constraints. The GTX 1080 Ti FE weighs the same as both the Titan XP and the GTX 1080 FE, despite the latter using a lower-speed fan (which could be software-limited).
Nvidia's marketing material claims twice the airflow area of the GTX 1080 for the GTX 1080 Ti Founder's Edition cooler. However, this seems largely to be an interpretation of the rear IO venting space gained by removing the DVI port. Don't be misled by claims of 2x the airflow area, as it doesn't translate directly into real-world performance. That's especially true for fluids such as air, which can simply change flow velocity to be forced through smaller cooling gaps, meaning that area isn't the only metric of importance.
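The point about flow velocity follows from simple conservation of volumetric flow rate: for roughly incompressible flow, Q = A × v is held constant, so a smaller vent simply forces the air to move faster. A minimal sketch, using made-up illustrative numbers rather than Nvidia's figures:

```python
# Continuity principle: for (roughly) incompressible flow, the
# volumetric flow rate Q = A * v is conserved, so shrinking the vent
# area raises the air velocity rather than blocking the flow outright.
# All numbers below are hypothetical, chosen purely for illustration.

def velocity_through_vent(flow_rate_m3s: float, area_m2: float) -> float:
    """Mean air velocity (m/s) needed to push a given volumetric
    flow rate through a vent of the given cross-sectional area."""
    return flow_rate_m3s / area_m2

q = 0.01            # hypothetical volumetric flow rate, m^3/s
small_vent = 0.001  # m^2
large_vent = 0.002  # m^2 (the "2x airflow area" case)

print(velocity_through_vent(q, small_vent))  # ~10 m/s
print(velocity_through_vent(q, large_vent))  # ~5 m/s, same air moved
```

This is why doubled vent area doesn't mean doubled cooling: the fan can push a similar volume of air through the smaller opening at higher velocity, and the real performance difference depends on pressure drop, fan curve, and acoustics rather than area alone.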
Power is delivered to the GTX 1080 Ti through a combination of side-mounted 8-pin and 6-pin PCIe power connectors. We expect to see board partners shipping factory-overclocked cards that leverage dual 8-pin PCIe power connectors, or perhaps even a trio for overclocking-geared SKUs.
TDP, which isn't technically power draw but serves as a rough estimate, sits at 250W, and Nvidia recommends pairing the GTX 1080 Ti with a 600W PSU.
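For context, the stock connector configuration comfortably covers that 250W figure. A quick budget sketch using the PCIe specification's per-connector limits (the 75W slot, 75W 6-pin, and 150W 8-pin values are the standard spec numbers, not figures from this review):

```python
# In-spec power delivery for the stock 8-pin + 6-pin configuration,
# using the standard PCIe per-connector limits.

PCIE_SLOT_W = 75    # power available through the x16 slot itself
SIX_PIN_W = 75      # 6-pin PCIe auxiliary connector limit
EIGHT_PIN_W = 150   # 8-pin PCIe auxiliary connector limit

available = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
tdp = 250  # the TDP quoted above

print(available)        # 300 W of in-spec delivery capacity
print(available - tdp)  # 50 W of headroom at stock clocks
```

That modest 50W margin is also why board partners are expected to move to dual 8-pin connectors on overclocking-oriented SKUs, which would lift the in-spec ceiling to 375W.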
A pair of gold fingers provides SLI support for the GTX 1080 Ti. If you have deep pockets and high graphical demands in your games, coupling together a pair of GTX 1080 Ti graphics cards will require a high-bandwidth SLI bridge for the highest level of performance.
More information relating to Nvidia's approach to SLI with Pascal can be found HERE.
Here comes one of the biggest changes for Nvidia's GTX 1080 Ti graphics card – the rear IO. Nvidia opts for a combination of three DisplayPort 1.4-ready connections and an HDMI 2.0b output. The DVI port has been dropped, meaning that you'll have to rely upon an HDMI-to-DVI or DisplayPort-to-DVI (included) adapter. The decision to drop DVI is driven by the real estate it commands on the rear IO plate and Nvidia's view that the space is better allocated to additional cooling ventilation.
As a triple-monitor user for my workstation PC, I am sad to see DVI being dropped. However, I completely understand the decision and like that Nvidia includes an adapter for users with older monitors that don't support HDMI 2.0 and who don't want the Windows headaches introduced by DisplayPort. Expect board partner cards to restore the DVI output. It's also likely that some vendors will switch out one of those DisplayPort connectors in favour of an additional HDMI port that's more convenient for VR users.
One of the main positives I see from switching to a single row of display outputs is liquid-cooling support. The possibility of single-slot liquid-cooled cards has returned, and it doesn't involve chopping off one's DVI port on a £699 graphics card!
Eleven GDDR5X memory chips comprise the 11GB of VRAM. Nvidia has enhanced the power delivery system compared to the GTX 1080 Founder's Edition. A 7-phase control system drives a total of 14 dual-FETs with a rated power capacity of 250 Amps for the GPU.
This is an upgrade over the GTX 1080 both in terms of control channels and FET count, as the GTX 1080 Ti now uses two dual-FETs per control channel, as opposed to one. It is also an upgrade over the Titan X Pascal's power delivery solution, which is likely to translate into slightly higher electrical efficiency or better overclocking.
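The quoted figures fit together with some simple arithmetic; a sketch of how the 7 channels, 14 dual-FETs, and 250A rating relate (the per-channel split is my inference from those numbers, not a figure Nvidia publishes):

```python
# Arithmetic on the power-delivery figures quoted above:
# a 7-channel controller, 14 dual-FETs, and a 250 A rated capacity.

channels = 7
dual_fets = 14
rated_current_a = 250

# Two dual-FETs per control channel, up from one on the GTX 1080.
fets_per_channel = dual_fets // channels

# Average current each channel would carry at the rated limit.
current_per_channel = rated_current_a / channels

print(fets_per_channel)               # 2
print(round(current_per_channel, 1))  # ~35.7 A per channel
```

Spreading the load across more channels and FETs lowers per-component current, which is where the expected efficiency and overclocking benefits come from: conduction losses in each FET scale with the square of the current it carries.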
Thanks for the review. In the concluding remarks you state “it should suffice for 5K usage”. Well, older cards also suffice for 5K usage, just not 5K gaming. I would be very grateful if you would expand on your thoughts in a meaningful way, please. Why aren’t there any 5K gaming tests? 5K monitors have been around for several years now, yet powerful graphics cards are regularly tested with 1080p monitors, as if that means something to anybody.
1080p monitors are the most widely used by an enthusiast audience, and for those moving to a higher resolution, 4K seems the way to go, as a single card can now hold a 60fps minimum with high image-quality settings. 5K would push the demand higher and it’s still very niche. I agree though, I’d like to see a few 5K tests just for shits and giggles.
Well, I would only partially agree, and with the more obvious part. These past ~2 years we have regularly seen 4K gaming tests, although 4K60 was nowhere to be seen, for example. And if a technology is trying to break some barriers, why not test it in quite an obvious way? 5K monitors are great for gaming: no lag, no €5,000 price, etc.
Hmmm
Given that Nvidia has had almost a year to come out with this “new” card, and given its still sky-high price of $700 ($700 is only “cheap” to the people selling these cards), it is an underwhelming release at best. And take a look at the 250W power demand, which will surely surge past 300W when overclocked. How quickly the Nvidia fanboys (and, if the 10 reviews I have seen thus far are any indication, the online PC reviewer establishment as well) forget the endless whining and complaining about the R9 290X’s power and heat performance. In this regard the 1080 Ti is a giant step backward.
So basically, if you have a GTX 1080 right now and are playing at 1080p, there really isn’t any compelling reason to shell out $700 to get a 15% bump in performance. Not to mention you may have to upgrade your power supply and case cooling. And if you look at the performance delta in aggregate, even at 1440p the cost/benefit isn’t stellar either.
These numbers look eerily similar to AMD’s Ryzen 7 gaming numbers. Things don’t get really interesting until you get to 4K. However, right now, if 4K is where you play, this is the best card on the market, at least until Vega shows up. But the fact remains that the 1080 Ti is nothing more than year-old technology in the form of a cut-down Titan. We really need AMD to hit Vega out of the park to get Nvidia off its butt…
I’m an nvidia guy, I bleed green for sure now that the Shield TV is a thing. Back in the days of the 7970, I was all about AMD. We need Vega to be a hit, for nvidia fans and for amd, if it can compete with nvidias high end offerings, it will make competition between the two companies fierce, and in the world where consumer is king, that is a good thing.
Thank you for turning AA off in your 4K benchmarks, which is something most sites don’t do. It’s annoying when they run benchmarks with things like MSAA on at 4K when it isn’t needed, so you can’t get an accurate representation of real-world 4K performance.
Could be worse: they could be telling you 20fps is all you need as it’s ‘super smooth’, as eteknix just did today in their GTX 1080 Ti ‘CPU’ ‘editorial’. Biggest AMD shill piece on Ryzen I have ever read. Last time I am going there!
Thanks for pointing that out. ‘Gaming’ was meant by the word ‘usage’, which has now been updated. With that said, multi-monitor 5K work can command a lot of VRAM too if you use programs such as Photoshop and Lightroom. I haven’t tested with a card other than the 6GB GTX 980 Ti, though, and I haven’t tested if it’s a case of Adobe simply caching data into the VRAM where there is spare capacity rather than a performance enhancement available.
Unfortunately, I had very little time to test this card due to shipping delays. So I wasn’t able to gather data for 5K gaming. Also, the limited appeal (to gamers) of the current handful of 5K monitors meant that it wasn’t a worthwhile compromise to sacrifice the gathering of other performance data in favour of 5K tests.
“AMD Graphics cards were benchmarked with the AMD Crimson Display Driver 16.11.4.”
Uhhhh, why were the AMD cards using drivers from 4 months ago?
I would imagine time is a problem, as Luke already said in another post. There is quite a lot of work involved in these, and I noticed he has also been doing Ryzen reviews in the last two weeks. Guess the guy needs to sleep sometime. Anyone ever tell you your avatar looks like a mafia don?
I’m sorry but that’s how nvidia is rolling
https://www.youtube.com/watch?v=YzSUzMGSMEQ
I don’t see your 35% gain. In some games the frame-rate difference is 4fps, and in others 15 to 20 frames, which amounts to 2 to 20%, not the 35% you are saying.
Ok, ok, I have to admit it is a very fast card; it just squeaks by as a good 4K gaming card, which is pretty impressive. I still say they held onto the card for too long, probably only because, when I had the money to buy it, they were thinking “when the heck is AMD gonna release that Vega thing so we can start cashing in on Ti goodness”…lol
No, no one ever did. Maybe because the ladies like this picture? Anyway, I turned out to be the darkest of the family, considering my brother is blond and my sister is blond with blue eyes. So considering I’ve been called Turkish before, being called south Italian is probably an improvement, since I’m Portuguese. And I look just like my father, except there’s a picture of him when he was 8 and blond, while I was never blond.
Umm, it means something to me. I much prefer maxing out my games at 1080p 144Hz+. So I’m thankful for 1080p benchmarks with super powerful cards.
Of course, but isn’t your 144Hz+ gaming limited by CPU? But yes, my post was over the top, no doubt, I wanted to make a point that we regularly read about Titan X, 10-core i7 CPUs and similar tech, with pleasure, because we’re enthusiasts. So even if I am not going to buy it because of its price/performance I like to see a review. 5K gaming on the other hand isn’t Martian Hyperdrive yet it is easily dismissed because it is not ‘enthusiast’, but ‘niche’, although there really is nothing wrong with high resolution, those are wonderful panels. However, Titan X or a 10-core Intel CPU are not regarded ‘niche’, they are ‘enthusiast’..by now you certainly get my point 🙂
Hi. Which games are you referring to? The gains tend to be more visible at higher resolutions (due to less CPU bottlenecking) and at 4K are more than 30% in Deus Ex, GoW4, GTA V, Metro: LL, ROTTR, Witcher 3, and Total War: Warhammer.
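Part of the disagreement over the percentages above comes down to arithmetic: uplift is computed relative to the baseline card's frame rate, not as an absolute fps delta. A minimal sketch with illustrative numbers (not figures from the review):

```python
# Percentage uplift is relative to the baseline frame rate, so a
# small absolute fps delta can still be a large percentage gain.
# The fps values below are made up for illustration.

def percent_uplift(baseline_fps: float, new_fps: float) -> float:
    """Performance gain of new_fps over baseline_fps, in percent."""
    return (new_fps - baseline_fps) / baseline_fps * 100.0

# The same 4 fps delta at two different baselines:
print(percent_uplift(20.0, 24.0))    # 20.0 -> +4 fps at 20 fps is +20%
print(percent_uplift(100.0, 104.0))  # 4.0  -> +4 fps at 100 fps is +4%
```

So a 4fps gain in a heavy 4K title running near 20fps is a 20% uplift even though the absolute delta looks small, which is why the quoted gains differ so much between resolutions.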
The bulk of the data was gathered at that time, hence why the Nvidia cards (except GTX 1080 Ti, due to its later launch) also use a driver of the same period. We did some internal re-testing of the new Nvidia driver with the Titan X Pascal and found its performance changes to be relatively minor except in Ashes of the Singularity at 4K. So we decided that it was best to retain the older data for comparison as I wasn’t given enough time with the card to do full re-testing of a stack of GPUs.
AMD has been destroyed completely.
Ryzen is not a good processor for gaming and they have no answer for Nvidia.
Total AMD failure!
I really do hope AMD Radeon are able to pull it out of the bag with the RX Vega later this year as from the limited releases we have seen so far it looks promising. Now if they can also replicate the same power savings they have achieved with their latest hardware then Intel AND Nvidia will seriously need to rethink their own tech and pricing structure. It will be a much needed boost to the Desktop market in my opinion.
Sorry, a question (and sorry for my bad English): in The Witcher it says AA (above), but above it says nothing about AA.
I just saw a real review on YouTube that rated this card with real numbers, and they are 21.5% over the 1080, not the claimed 35%. They said you can get 35% from overclocking to the max, no other way. Stock speeds give 21.5%, still good, but not what Nvidia is claiming.
I hope so too, but look at the facts: AMD likes to hype, and Ryzen is a failure at gaming. I have one, and overclocking is a no-go; I tried it and it crashes after one or two days on, and when you reboot it says that overclocking has failed. I hope it’s just a BIOS fix. In my video encoding I’m seeing 50% slower than the 7700 Kaby Lake. From what I know about AMD, this card will be 80% of a 1080. The 1080 Ti is 21 to 35% faster than the 1080; I don’t see AMD matching anywhere near this. The Fury is 50% of the 1080; the most they can do, I think, will be 30 to 40% better, which would still be slower than the 1080. I’m only guessing.
I understand your frustration. IPC for the Ryzen chips is comparable to Intel’s, and OC aside, I think they will and do provide great value for money in today’s market. The BIOS updates you mention will come thick and fast very soon, improving stability and performance. Gaming is an issue for many, but not the issue that the media would have you believe, as a very good FPS is still there, especially at 4K. The problem we have at the moment is that the vast majority of code favours the Intel platform, and while updates will be forthcoming, I trust that newer games will be able to utilise the undisputed abilities of the Ryzen chips off the bat. Now as for RX Vega: AMD advertised a 40% uplift on their CPUs and gave us 52%, and if the same were true for Vega then we are already on to a winner of a GPU. I also believe AMD have always given more “bang for your buck”, so fingers crossed we shall see exactly that later this year.
Hi. The settings screenshots above The Witcher 3 charts show that AA is set to ‘on’ (the third screenshot). The ‘above’ text refers to the tested settings being shown in screenshots above the charts.
Oh, yes, I didn’t see them. Thanks!
Is your mobo made by Asus?
I can’t believe none of those cards can maintain a minimum 60fps at 1080p at all times…