Nvidia to launch GeForce GTX 980 Ti next month, GeForce GTX 980 Metal in the works

Nvidia Corp. has reportedly reconsidered its plans for its rumoured add-in cards powered by the GM200 graphics processing unit. The higher-end GeForce GTX 980 Ti is now expected to be released as early as mid-May. In addition, the GPU developer is working on something called the GeForce GTX 980 Metal.

According to a report from a Chinese web-site, Nvidia has moved the launch window of its GeForce GTX 980 Ti to the 16th – 26th of May. The graphics card will be based on the code-named GM200 graphics chip with 3072 stream processors, 192 texture units and 96 raster operations pipelines, as well as a 384-bit memory interface. The GeForce GTX 980 Ti graphics adapter is projected to carry 6GB of GDDR5 memory onboard, which will be its main difference from the high-end GeForce GTX Titan X, which features 12GB of memory.
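
For context, if the GeForce GTX 980 Ti retains the 7Gbps effective GDDR5 data-rate of the GeForce GTX Titan X – an assumption, since final clocks are unconfirmed – its 384-bit interface works out to the same peak memory bandwidth:

    384 bits ÷ 8 = 48 bytes per transfer; 48 bytes × 7GT/s = 336GB/s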

Nvidia needs the GeForce GTX 980 Ti in order to better compete against the upcoming AMD Radeon R9 390-series from its arch-rival. Final specifications of the GeForce GTX 980 Ti are unknown, as is its price.

[Image: stylized GeForce GTX Titan X]

In addition to the GeForce GTX 980 Ti, Nvidia is also reportedly working on something called the GeForce GTX 980 “metal enhanced version”. It is unknown exactly what this is, but it is highly likely that the graphics adapter is based on a cut-down version of Nvidia’s GM200 GPU. Since nothing particular is known about yields of Nvidia’s largest graphics processor to date, it is impossible to guess the configuration of a cut-down version with disabled processing units.

The price and performance of the GeForce GTX 980 Metal, if it ever sees the light of day, will be higher than those of the GeForce GTX 980, but lower than those of the GeForce GTX Titan X. Pricing of the solution will depend on its performance and availability timeframe.

Nvidia did not comment on the news-story.

KitGuru Says: A graphics card based on a cut-down version of GM200 can easily become a bestseller, especially if it delivers 90 per cent of the GeForce GTX Titan X’s performance at 66 per cent of the price…
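
To put rough numbers on that – purely illustratively, since the 980 Ti’s pricing is unknown – 66 per cent of the Titan X’s $999 launch price is roughly $660, and 0.90 ÷ 0.66 ≈ 1.36, i.e. about 36 per cent more performance per dollar.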

82 comments

  1. The 980 was a fully unlocked chip, so where is the performance gain coming from? Will this just be a binned card for high OC?

  2. Why not both? 🙂

  3. Think McFly Think

    GTX 980 = GM204

    Two variants were mentioned in the article. Both presumed to use GM200.

    “GTX 980 Ti – The graphics card will be based on the code-named GM200 graphics chip with 3072 stream processors, 192 texture units and 96 raster operations pipelines, as well as a 384-bit memory interface.”

    “In addition to the GeForce GTX 980 Ti, Nvidia is also reportedly working on something called the GeForce GTX 980 “metal enhanced version”. It is unknown exactly what this is, but it is highly likely that the graphics adapter is based on a cut-down version of Nvidia’s GM200 GPU.”

  4. Metal is an intriguing thing indeed. Kinda reminds me of the Metal Beyblade series, and if it’s anything as cool as that then I’ll be a fan 😛

  5. Irishgamer Gamer

    The whole video card market sucks at the moment.
    Nvidia have been price gouging for the past year. 600-700 euros for a GTX 980 is just crazy; 1,000 for a Titan X is plain bonkers. What Nvidia are doing is cutting off the hand that feeds them, i.e. their customers, and the whole PC market. Short-term gain, long-term pain. Driving customers to consoles.

    AMD’s 390X challenger looks interesting, but if they don’t have the money to get the 490X out next year, it is already set to fail. Nvidia are sure to reply, and indeed have the money to do so, so a three-month spurt for AMD after its release is not going to save AMD.

    I run both AMD and Nvidia rigs, and indeed will be first in line for the 390X, depending on the price, but something is going to have to give.

  6. AMD isn’t doing much better. The price for the 390 is going to be around 600-700 as well.

    But the market isn’t crashing yet, so the consumer seems happy with it. In the end, you have to upgrade occasionally. I need a new GPU, so my choice is between an older 290 and a modern 380/970. Of course, that’s not the high-end spectrum, which is a relatively niche market anyway.

  7. With the 390X coming in at $600-700 and the 380X (old 290X) at about $300-400, we have a big gap in between; maybe the 980 will become cheaper with the 980 Ti.

  8. Metal a cut-down 980? I thought that was called a 970.
    I find it more plausible that Metal will be an overclocked, more expensive 980, possibly with 8GB of VRAM, rather than a cut-down one.

  9. The problem is that the 380/380X series is a rebrand of the 290X series, and thus not really worth the 10% performance increase. The only true next-gen card is the 390/390X/395×2 series, since it has HBM.
    Also, if rumours are true, the 380/380X will be GCN 1.2 while the 370/370X is GCN 1.3, so yeah, AMD screws up as well.

  10. Maybe it will be bundled with Metal Gear Solid V, as per the Metal theme.

  11. I would have bought a 980 for 400/500 euros, but not for 600+.

    I love both Nvidia and AMD, as I always buy the GPU that suits my needs long-term.

    But man, Nvidia has really been cranking up the prices of their GPUs -_-

    Hope the 390X restores a ton of hope for AMD, as I will buy one if the price is good.

  12. We sadly don’t know jack about how the 380 series is going to perform, or the 390 for that matter 🙁

    Would love to see what AMD has done for the whole 300 series, and I don’t want to see just rebrands with a little bit of OC here and there and a new sticker slapped on.

  13. Adam the Pinnock

    So… this card is released – a spike in sales from those who can afford it, while the rest of us wait to see how it compares to the 390X; the price drops before the 390X is released and closes the gap a bit, so AMD is forced to drive its price down… hopefully there’s a steal to be made here with two competitive cards… I have a feeling they’ll end up quite similar; it just depends on whether HBM is a game-changer. On a side note, it’ll be interesting to see what this does to the 290X’s price tag.

  14. A cut-down of the GM200, which is the Titan X chip, not the GM204, which is the GTX 980.

  15. Oops! My bad. Thanks for the correction.

  16. Hopefully the ‘Metal’ will be a replacement for the 970, but with the 980’s 4 GB of VRAM instead of the 970’s 3.5 + 0.5 GB. That’s something I would consider buying…

  17. Where are the Titan X owners??

    You need to go to the police station and file a complaint, because you just got raped by Nvidia.
    980 Ti £250 – £300 cheaper.
    pffft

  18. The Shiller (D W)

    Like a 970 Ti? 5GB VRAM?

  19. The Shiller (D W)

    I understand that the 980 chip’s memory architecture has 8 channels, each with 0.5GB of VRAM attached. The Titan X chip has 12 memory channels, each with 1GB of VRAM attached. Therefore, I would expect this ‘Metal’ chip to have something less than the 12 full-speed memory channels. From nVidia’s point of view, it is probably meant to compete against the AMD 380X or 390, and they don’t want to call it a 970 Ti.

    BTW, the 980 Ti with 12 full-speed memory channels would be good, because each channel only has to cache 0.5GB of VRAM. There will be fewer cache misses than with 1GB of VRAM attached. A full Titan X starts to lose cache efficiency past 6GB of VRAM use.

    nVidia’s new GPU architecture has a unit/module disablement “feature”. You can think of the 970 as a 7-unit-enabled card, the 980 as an 8-unit-enabled card, and the 980 Ti as a 12-unit-enabled card. Now we need a 10-unit-enabled card.
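
    As a quick sketch of the maths behind those channel counts (the counts and per-channel sizes are the assumptions above, with GDDR5 channels taken as 32 bits wide; Python, purely illustrative):

        # Per-channel memory maths; channel counts/sizes are assumptions
        # from the discussion above, with 32-bit GDDR5 channels.
        def memory_config(channels, gb_per_channel, bits_per_channel=32):
            return {
                "bus_width_bits": channels * bits_per_channel,  # total bus width
                "total_vram_gb": channels * gb_per_channel,     # total VRAM
            }

        print(memory_config(8, 0.5))   # "GTX 980": 256-bit bus, 4GB total
        print(memory_config(12, 1.0))  # "Titan X": 384-bit bus, 12GB total
        print(memory_config(12, 0.5))  # hoped-for "980 Ti": 384-bit bus, 6GB total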

  20. Actually we do. As KeplerNoMore says, the R9 380 will just be an R9 290 that has been enhanced a bit, meaning it’s just a 290 with a slightly higher clock speed and maybe a couple of dozen more cores. Same with the R9 380X: it will be an R9 290X with slight improvements. This is no surprise, and we’re not guessing here; the same was done last generation. The R9 280 was just an HD 7950, and the R9 280X is just an HD 7970 GHz Edition with a ~5% boost from higher clock speed/25-50 more stream processors.

    This is how AMD does things; it was the same in even older generations, with cards like the 7870 being a 6970 with a ~5% enhancement, etc. We even know what the 370 line will be: it’s been revealed that the R9 370 will be the exact same chip as the R9 285, and the R9 370X will be an enhanced version of the 285, essentially a “285X” if you will, using the same Tonga architecture that the R9 285 uses. Exactly the same as how the R9 380 and R9 380X will use the “Hawaii” architecture that the 290 and 290X use.

  21. *upbeat Mexican music plays*

  22. I gotta admit it is a little unnerving; I bought two TITAN Xs for SLI thinking the 980 Ti wouldn’t be out till “after summer” like Nvidia said. This always happens to me though, lol. I bought a 970, then got a 2nd 970 for SLI, only to find that they’re gimped 3.5GB franken-GPUs; sold those and got two Gigabyte G1 Gaming GTX 980s, but they were underwhelming in performance considering the cost, so I sold those too. Then I got two of the GTX 980 Kingpin Editions, and the performance was great, but mine seemed kinda weak at overclocking; still, I was fairly content even with the $800 cost, since I had the best of the best….. *cue GTX TITAN X* Arghh… lol. So I sold those and got the TITAN Xs, because GM200 was too much to pass up. I knew the 980 Ti would be a better deal, but it was months away so….. -___________-

  23. According to AMD’s own boasting, the R9 390X will be roughly halfway between a GTX 980 and a TITAN X in overall GPU power, with the 390 being about the same as, maybe a bit stronger than, the GTX 980. So that gives an overall performance idea; if you laid it out like a chart you’d see:

    R9 290X = 30fps
    GTX 980 = 40fps
    R9 390 = 42fps
    R9 390X = 50 – 52fps
    Titan X = 60fps

    The 980 Ti is still an unknown, since there are no charts or benchmarks etc. yet, but with the Titan X being such a good overclocker it won’t be like the 780 Ti, which was ~5% faster due to having a ~1,100MHz clock speed vs ~900 on the Titan Black. So I’d say a good guess for the 980 Ti on the chart would be 55 – 60fps, somewhere in there.

  24. Adam the Pinnock

    Really nice comparison, it’ll be cool to see how they compare when both models are out! At least it seems like we’ll be getting some ‘affordable’ routes into reasonable 4K performance at pretty good settings.

  25. Nvidia has not said anything about when they were going to launch a new card.

    That being said, it’s been the same for 15 years: each time we get a flagship it’s overpriced as fuck, and we get equivalents costing half as much 6 months later.

  26. Guys, don’t get an AMD card! Nvidia GameWorks is a huge thing coming, and if you haven’t seen the features you should look them up; the stuff is amazing. Watch the video on GameWorks in Borderlands: The Pre-Sequel – it only works on Nvidia cards!

    Also, I’m so hyped for the 980 Ti! Wonder if it’s true that it’s releasing in mid-May, and how long it will be until EVGA makes a custom one.

  27. The Shiller (D W)

    Someone is going to have to come up with some fancy explanation if the 980 Ti doesn’t perform the same as the Titan X, but with better overclockability because of better cooling.

    Plus, I would have put the R9 390X higher than what you have, but that does not matter, because they will all be measured by DX12 in a few months.

  28. The 980 “metal” is a mistranslation. The site is referring to aftermarket solutions on a 980 Ti, not an upgraded 980.

  29. I’ve been playing it on an R9 290X, on 3×1 Eyefinity (with non-matching monitors), and get good enough frame-rates to play. Take your green-tinted glasses off for a change.

  30. Moral of the story: You love to throw away money.

  31. Um, what? “Playing it”? I’m talking about Nvidia GameWorks features, not the game. Did you even read what I wrote before replying? GameWorks is an Nvidia-only feature which is AMAZING. Also, Nvidia can do what Eyefinity does, so not sure why you mentioned that.

  32. GameWorks is presently problematic. It uses up too many resources. Don’t believe me? Download the GameWorks demo toolkit from Nvidia, run the simulations yourself, and see the frame rate and GPU usage associated with it. It’s the same with their amazing voxel lighting tech: it’s amazing, but it kills the GPU. It’s the same issue with AMD and TressFX; it killed performance even though the hair was only enabled on the main character and no one else. DX12 may help a bit, but these are technologies that are in a constant state of change/enhancement. GPU processing power would have to multiply several times before they become common features that people can use. And I’m not saying this as an Nvidia hater – I have 3 Titan X cards.

  33. I have 3 Titan X cards, upgraded from my original 3 Titan cards. Main reason for the upgrade? 12GB VRAM. DSR/supersampling with TXAA is beautiful, but takes a ton of VRAM; COD: Advanced Warfare, for example, uses 11GB. So a card with 6GB VRAM is of no interest to most owners of the Titan X. And this kind of move was/is not unexpected. What do you think the 780 and 780 Ti were after the original Titan came out? Same chip, clocked higher, better OC room, cheaper, but with 3GB VRAM.

  34. 1 – DSR is not working with SLI, right?
    2 – 3x TX with how many LCDs? It’s overkill for a single 1440p panel like the ROG, and using it with 4K still at 60Hz is also odd, so how do you really utilise that amount of VRAM?
    3 – The upcoming WinX with DX12 will double the usable VRAM on SLI, so what’s the point of not grabbing the 980 Ti with faster clocks and more than enough VRAM, like 6+6?

    The TX is really nice, but a single one is still not enough for 4K gaming (usually 30-50 fps at high/ultra), and more than enough for classic FHD.

  35. 1 – DSR does work with SLI. However, DSR doesn’t work with SLI if GSYNC is active. Of the three – GSYNC, DSR, SLI – only two can be active at a time. This is being worked out for an upcoming driver update, according to Nvidia.

    2 – Just 1 LCD. 3 Titan X cards aren’t enough for 144Hz gaming at 1440p in many games if you like anti-aliasing and other graphical goodies. When you enable DSR and high AA, it jumps your VRAM usage up by a ton. I was previously hitting the 6GB VRAM limit on my original Titan cards; that’s why I upgraded.

    3 – This is a valid point to consider. In terms of clock speeds… I get over 1500MHz OC on my Titan X cards anyway, and this is on air (of course, with the fan going at ludicrous speeds). Waiting for Aqua Computer to come out with their actively cooled backplate before I put them under water. Fortunately there is no real restriction on clock speeds, and honestly, if you pay that much money for a Titan X, you can afford a water cooling setup. So the real issue comes back to VRAM. With regards to split-frame rendering, or however they decide to work it out in DX12, there are still a few problems. 1: All games that are currently out won’t benefit from it. 2: I still need to see how exactly it’s implemented.

    If the implementation truly works by stacking the total VRAM of multiple cards, then you are absolutely right that you would be wasting money getting a Titan X over a 980 Ti (for DX12 games only, of course). But that would mean waiting more than half a year for games to start coming out with DX12 support, and assuming everything works perfectly in that regard. I really and sincerely hope that it does work, because then it would encourage developers to cache more data in VRAM and prevent a lot of the streaming stutter issues present even on RAID SSDs (Far Cry 4 being a prime example).

    Honestly, if that’s implemented… I have no idea what I’m going to do with 36GB of total VRAM. I’m just hoping this carries me till the next Titan release. 😛 And until DX12 comes around and potentially changes the game, I get the performance I want.

  36. I love me some shiny shiny, but 3 Titan Xs to play at 1440p seems dumb as hell.

  37. Yes… because playing on the highest-resolution display capable of at least 120Hz is “stupid”. Go back to playing consoles, you 60Hz peasant. #PCMasterRace

  38. The Shiller (D W)

    The R9 280X is just an HD 7970, both with 2048 stream processors. Which is what I have got.

  39. The Shiller (D W)

    Yeah. The offering seems to be no better than buying two previous-generation cards. I am still on an HD 7970 1GHz (280X). The best that I can get from AMD is double that, for double the price.

  40. Well, that’s a fail, because I have a ROG Swift monitor…

  41. GameWorks is a huge piece of s***. I buy Nvidia cards, but this GameWorks they are trying to push is b.s., as almost every game released using GameWorks has crazy optimisation problems: Watch Dogs, AC Unity, Far Cry 4, Arkham Origins, AC IV, COD Ghosts and some others as well.

  42. Damn, friend, I think you need to lay off buying graphics cards for a while. While there’s a lot of bad press surrounding the 970s, it’s been found to be nearly impossible to get them to bottleneck due to memory limitations. You honestly would have been just fine keeping those for quite a while.

    Either way… your money, your call; just giving you my 2 cents.

  43. All Nvidia needs is a GTX 950 Ti with 8GB GDDR5. All AMD needs is an R7 360 with 8GB GDDR5.

  44. Nice TN display, peasant. It makes complete sense that someone who is satisfied with a TN panel wouldn’t understand the benefits of the extra graphical fidelity provided by 3 Titan X cards. They already average 75% load across the 3 cards in GTA 5 at 1440p. Shows how much you know about performance requirements. Then telling me it’s a waste to use these cards at 1440p? When the next resolution bump is 4K at 60Hz? What serious gamer spends 10k on a rig to play at 60Hz…

  45. You mean that proprietary thing made to kill frame-rates for gamers using Intel and AMD graphics solutions? I get very respectable frame-rates on my R9 290X, maxed out. So I think you’re just blowing smoke, like most of Nvidia’s fan-base does. Again, the green-tinted glasses need to come off for a change.

  46. Terrible source, though. A random, no-name, zero-credibility poster.

  47. You would think you would get new standards for future-proofing.

  48. WTH are you talking about? GameWorks is software for video game development. SOFTWARE……

  49. Maybe you should get 4K? Lol, spending money on 3 Titans just for 1440p… I’d take a 4K 60Hz refresh any day over 1440p. Try 4K at 60fps and be amazed.

  50. I don’t even play games that have FPS capped at 60. 60Hz is a complete no, especially with competitive FPS games. If you haven’t played 1440p with DSR and TXAA at 144Hz, you have no idea what you’re missing.

  51. The 7870 performed similarly to the 6970, but it was a different GPU entirely.

  52. >Calling other PC gamers peasants
    >Using anti-aliasing with 1440p
    >Talking about competitive fps games while playing 1440p and high graphics.

    Oh wow. First of all, you’re a fucking idiot for calling anyone who plays on PC a peasant. Secondly, one of the main selling points of higher resolution is the reduced need for AA, as the picture doesn’t get such rough edges as it does at 1080p anyway. Thirdly, not a single actual competitive player plays at anything more than 1080p, with all settings at the lowest possible, to gain maximum… you guessed it… competitive advantage.

    Go back to your cave.

  53. I’m sorry you don’t understand anti-aliasing, peasant.

  54. I’m sorry your mom dropped you head first on a rock when you were a child, approximately 5 years ago.

  55. If it takes being a 5-year-old dropped head-first on a rock to know what anti-aliasing is, I’ll take that over being you. 🙂

  56. 2 Titan Xs in SLI aren’t going to be outdated any time soon!
    There is always something better just around the corner in PC land.
    You bought the 2 highest-end cards, which is the right choice, as they will last. If you had bought 2 mid-range cards then you would have a reason to be pissed off, as they would be useless in 6 months’ time trying to run games that need more than 4GB of VRAM. 😉
    I bought 2 780s (non-Ti cards) back on day-1 release and am still running them fine at 1440p 144Hz, and occasionally use my 4K 60Hz as well. They have lasted all this time and are only now just showing their age because of the 3GB of VRAM; the horsepower/grunt of the cards is still fine for everything I throw at them, it’s just that the 3GB no longer cuts it past 1080p.
    Your Titan Xs will be fine for a long time. You won’t see much of a performance jump over yours in upcoming cards from either company until they migrate to 16nm FinFET or 14nm dies.
    Anyone running older than 780s and the AMD equivalents will finally see a nice reason to upgrade, as the normal 980s weren’t worth it.

  57. DirectX 12 will be interesting though; 6GB on each card will be enough if the rumours are true that DirectX 12 can share the VRAM from multiple cards in a system. You are spot on, though, in regard to VRAM becoming a bit of an issue now. As people move up to 1440p and 4K it becomes even more important to have more VRAM, and I wouldn’t accept anything less than 6GB on my next card(s); 4GB doesn’t cut it at all these days.

  58. I downgraded (or is that upgraded? o.O) from 4K 60Hz to a 1440p ROG Swift. Socius is right; I learned the hard/expensive way. Once you have played at 120/144Hz you can never go back to 60Hz, no matter how good the resolution is. 3x Titan Xs for 1440p is perfect and will last for ages.
    4K at 60Hz sucks; it’s no good for FPS games! Been there, done that.
    I would rather have 144Hz TN than some 60Hz IPS. The ultimate panel for me would be a 30-inch 4K 144Hz IPS with a 3ms or less response time and G-Sync or FreeSync; if someone ever makes it, I’ll be throwing money at the screen. Don’t care if it’s over 1k; it wouldn’t be the first time I’ve spent over 1k on a monitor. I remember paying $1,200 just for an old-school Sony Trinitron CRT back in the day. lol

  59. You still need AA at 1440p; only once you start using 4K can you lower it. Even then, it still looks better to have at least 2x AA at 4K resolution!

  60. I can’t even explain how excited I am for DX12. I’m still not sure how the VRAM stacking will work. It may be a feature introduced in DX12, but it may not be retroactive; it may require something like the new NVLink bridge that’s designed to replace the SLI bridge, meaning maybe only certain future cards can use it. AMD and others have tried split-frame rendering, but it’s problematic. I don’t know; maybe I’ll be pleasantly surprised. There are 3 main things I’m looking forward to:

    – DX12
    – Uptake in usage of shader based AA
    – VR

  61. Yup. I just picked up the Acer XB270HU. Minus a couple of issues (ULMB limited to 100Hz, and no Nvidia 3D Vision) it’s an absolutely amazing unit: 144Hz IPS GSYNC 1440p. They also released a 3440×1440 34″ 21:9 widescreen IPS GSYNC monitor with a curved display; sadly, that’s limited to 75Hz. If it were capable of 144Hz (once DP 1.3 cards come out) it would be an even better option than 4K, imo.

  62. AMD seriously needs to get their new GPUs out. They were supposed to unveil them in March, and nothing. Nvidia now has the Titan X and will be releasing the 980 Ti before they even give us the specs of the 390X, much less release it.

  63. If you have that kind of money, fine.

    Not everyone can afford to spend as much on a gaming PC as on a car.

  64. I don’t use AA at 1080p, and I don’t notice, because I’m playing the game, not stopping to just stare at the screen and find every jagged edge I can. I love my games looking good, but I don’t have $6-7,000 for a gaming rig.

  65. Yes, I love game developers using proprietary features so that gamers without Nvidia cards (out of preference or cost) get a lesser experience…

  66. Yeah, as far as affordable, decent 4K goes, the R9 295×2 is the best if you’re on a budget, but honestly the TITAN X now, or in the near future the 980 Ti or 390X, will be a better option, as you’ll get “close” to 295×2 performance without worrying that games won’t support SLI, which would cut the 295×2’s performance nearly in half. The way I got my comparisons was from AMD’s press-release statements, which say the 390X will be roughly 60% more powerful than the 290X. The 290X is faster than a 780 and almost as fast as the 780 Ti, maybe tied in some games; the GTX 980 is about 20% faster than a 290X. Then you have the GTX TITAN X, which is roughly 50% faster than the GTX 980. So you add the ~20% that the 980 is over the 290X to the 50% of the TITAN X, and you come up with the calculation that the TITAN X is about 70% faster than the R9 290X.

    Then you have to factor in the fact that 90% of the time tech companies’ pre-release bragging is not too accurate; most of the time the product is at least a little bit weaker than what Nvidia or AMD estimates. So a “real” estimate would put the 390X at about 50 – 55% faster than the 290X, and that’s where I get my estimate of the TITAN X being about 15 – 20% faster than the 390X. So as for the best budget option, I’d say the 980 Ti and 390X will be the best “budget 4K” cards. Although I don’t like the rumours about the 980 Ti: it’s been said that it runs on the GM200-301 chip instead of the GM200-400 A1 chip the TITAN X uses, so it might have fewer CUDA cores etc. than we thought. I gotta say, I bought 2 TITAN Xs and I’m not disappointed. I won’t need to upgrade for several years, for sure; I get almost the max 144fps on my ASUS ROG Swift 1440p 144Hz monitor in games like Dragon Age: Inquisition, Crysis 3, Far Cry 4 etc. with max settings and 4x MFAA, and when playing at 4K with max settings I get about 75 – 85fps in those games, which is pretty dang impressive. I even did some testing on Shadow of Mordor with the HD texture pack, and GTA V, using one of the new 5K monitors, and I get pretty much constant 60fps in both games using 2x AA with max settings. And I haven’t even overclocked them yet; I can push them from 1,075MHz to 1,450MHz on air, and I just put them in EK water-cooling blocks along with my CPU, so I should be able to hit 1,500MHz, which will give another 8 – 10fps or so.
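
    (For reference, chaining those relative figures multiplies rather than adds: 1.20 × 1.50 = 1.80, i.e. roughly 80% faster than the R9 290X on those inputs, so the ~70% figure above is, if anything, conservative.)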

  67. Moral of the story: it’s ignorant to assume when you don’t know anything. I was able to send back/sell the first two sets of cards for exactly what I paid for them, and only lost a little on the Kingpin ones, so overall I lost a little on the Kingpin sale but made it up elsewhere; I essentially broke even.

  68. Yeah, I know, but I was able to send back or sell all the cards for about what I paid or even a little more (they were used when I got them), so like I said in the post above to the other guy, I pretty much broke even. And I had all kinds of trouble with the 970s: using 1440p at 144Hz I had stuttering and problems with rendering etc., and at 4K it was pretty much impossible. I suppose the 970 isn’t really the best 4K card, but at 1440p, even at 144Hz, you would expect SLI to be pretty decent; yet I had quite a few problems with several games like GTA V, Shadow of Mordor, Dying Light etc., as they would hit 4 – 4.5GB usage when you turned the settings up to the higher end of the spectrum. And I’m far from the only one who’s had these problems, so no, it hasn’t been found to be nearly impossible. It just depends on a person’s setup and what programs and games they typically use.

  69. Honestly, three of them is a tad overkill for 1440p 144Hz. I’m using two-way TITAN X SLI, and even in some pretty demanding games like Crysis 3, Metro 2033, Dragon Age: Inquisition, Far Cry 4 etc. with max settings and 4x MFAA I get ~125fps average, and pretty frequently hit the 144Hz cap when indoors or in slightly less render-heavy areas. And those results were at stock speeds; I can overclock the cards from 1,075MHz to 1,450MHz on air, and now that I have them both under EK waterblocks in my liquid-cooled PC I can get them up to ~1,500MHz just using the stock Nvidia BIOS, with no voltage mods or anything. That gives roughly an 8 – 10fps increase, so two overclocked TITAN Xs will get you 144fps at 1440p max settings as long as you stick to 4x AA.

    There’s definitely nothing wrong with having three, though; that just means that even in future games you’ll keep that 144Hz, and in games like GTA V and Shadow of Mordor etc. you can pop in all the HD texture packs and maybe even go with 8x AA and still stay near or at 144fps. The TITAN Xs are great cards, much more useful than the GTX 980 imo.

    I gotta say in response to Socius, though, that the ROG Swift is in no way a bad monitor; yeah, it’s TN, but it’s by far the best TN panel I’ve ever seen. I’m definitely going to upgrade mine to something like the XB270HU, or the XB341CK Predator with the 34″ ultra-wide 21:9 curved IPS G-Sync 3440×1440 display, when I can afford to, as going to ~4ms GtG from 1ms is essentially unnoticeable, and the increase in colour quality and whatnot is nice. I just don’t feel comfortable moving to 4K until I can have 144Hz and either IPS or G-Sync, as it just doesn’t feel the same moving from 144Hz + G-Sync or 144Hz IPS to a 4K TN 60Hz panel, even one like the Acer XB280HK 4K 60Hz G-Sync.

  70. That’s my worry about DX12 as well: yeah, you’ll get VRAM stacking, but how many games will support DX12? You won’t see “universal” DX12 support for at least a year or two, most likely, and even among games that support DX12 it’s possible the devs might screw up the VRAM stacking feature. And then there’s the question of how many older games will move to DX12. It’s still a very exciting feature, though; it appears to take all of the great features and functionality of DX11 and combine them with the hardware communication features of Mantle, while simultaneously increasing fps beyond what Mantle even did. Hell, even AMD is jumping ship to DX12 now, swearing by it in their ads/rumours etc. for the new “Zen” CPUs and the R9 300 GPUs.

  71. Soft shadows seem to be the biggest culprit. Even leaving on all the other GameWorks features in games like FC4 (i.e. keeping simulated fur, HBAO+, enhanced god rays etc.), I get a roughly 20fps increase just by going down from GameWorks’ soft shadows to the “Ultra” shadow setting.

  72. More like the BS glasses need to come off you for a change. Nobody in here is going to believe your lies about running 3-way Eyefinity on a single 290X in GameWorks games like Watch Dogs or FC4 and getting anything above a putrid 20-ish fps. Unless the new AMD Zen CPUs and the R9 300 series are as good as claimed, AMD will pretty much sink; they’ve been losing market share more and more for years. Their mediocre, irrelevant products just aren’t being purchased anymore; they get like 20% of the market while Nvidia gets over twice that. Hell, even Intel’s IGPs are selling more.

  73. I guess you enjoy market and technological stagnation, because that’s what Intel and Nvidia have been pushing with their every generation. Let me think: who brought us x86-64? Oh yeah, AMD. Who brought us the dual-core? Oh, AMD again. Without AMD, you wouldn’t have half the great things you love that Intel and Nvidia have appropriated. Intel and Nvidia are market-stagnation corporations.

  74. Ah, I see. I haven’t looked into the 970s in a while; I didn’t realise there were a handful of games now that could trip them up. I’m surprised that they ended up being a bust for you at 1440p in SLI, though; I wonder if you had a bad unit. Either way, I’m hardly the one to tell you to stop spending on hardware; I totally know the feeling of needing to buy the latest and greatest. It’s an addiction for me, but not really something I’m ashamed of or anything. I don’t drink, do drugs, or spend money on anything lavish, really, other than my computer hardware fix, so I’m OK with it.

    Current rig: 5960X, X99M Gaming 5 Micro-ATX, 32GB Corsair Dominator 3,000MHz, 2x Samsung 850 Pro 1TB, 2x R9 295X2 in Quadfire, EVGA 1600 P2 PSU, Corsair Obsidian 350D, with the CPU and GPUs on a full custom loop.

    So ya, I know about excess, hehe.

  75. lmao, who stood on the backs of giants to become what they are? AMD. Who recycles their architectures every single time a new GPU or CPU release is scheduled, instead of ACTUALLY making something new, just renaming the old products over and over? AMD. Who hasn’t made a single new desktop CPU in several years and is still using ancient, pathetic 32nm tech? AMD.

    Not to mention “AMD” never did any of those things you attribute to them – at least not *this* AMD. Those were achievements of the old ATI/AMD, the ones who made the fantastic Phenom and K series etc. so long ago, and the old Radeon GPUs from 10 years ago. How about something AMD was first to in recent years? First to make a compute/gaming hybrid card? Nope… still haven’t done it; Nvidia’s TITAN has been out for a few years by now and fits the bill perfectly. First to make a hyper-threaded or similarly optimised CPU? Nope, Intel was.

    You see, the problem with people like you and those currently running AMD is that their priorities are all screwed up. AMD cares more about being “herpderp first XD” to some arbitrary and pointless milestone than about actually making competitive products and keeping the company in business. If AMD had taken all the time and money it spent developing a half-assed 32nm CPU architecture (Bulldozer/Piledriver/Vishera – just relabels of the same tech) in order to be the first to have a “6-core” or “8-core” CPU, they would’ve had the time and money to make a TRULY quality architecture with 6 or 8 REAL cores, instead of making 3- and 4-core CPUs containing 6/8 “modules” that share cache and resources. This would result in FAR better profits, sales, market share etc., which would allow them to make a new architecture yearly like Intel or Nvidia does, increasing yields and performance gains. This is what AMD used to do back in the good days; they had money and intelligence and made quality products on a regular basis, instead of just taking the 2011 GPUs/CPUs and renaming them, then selling them as the 2012 GPUs/CPUs, then renaming THOSE and re-selling them again as 2013 products etc., because they don’t have the money to make anything new.

    I used to love AMD; I had all-AMD gaming rigs, rendering rigs, home PCs, laptops etc. that used everything from Phenom IIs, FirePros, Opterons, Radeons, K6s and so on. But the facts are just that… facts. And they say that AMD is sinking. I haven’t bought an AMD GPU since the 4000 series, and I haven’t used an AMD CPU since the Phenom days. (Hell, my dual-core Phenom performs better in 90% of tests, programs, games etc. than the FX-6300 “6-core” *cough* 3-core *cough* Bulldozer.)

  76. Because they’re being squeezed like a zit by the anti-competitive tactics of the Intel and Nvidia that you love oh so much. You’re a typical Intel and Nvidia fanboy who thinks that doing one build with AMD “back in the day” means you know exactly what AMD is now – and misses why they’re absolutely necessary to keep Intel from monopolising us with $1,000 Pentium chips, and Nvidia from charging us just as badly as Intel used to.

  77. Nvidia wants to kill off other platforms’ markets by releasing a varied number of chips to suit a large number of customers’ needs. Of course, the bigger piece of the pie will be mainstream users, but the PC takes a big portion of it.

  78. And you are still playing the same games as those console peasants at the end of the day, yet you are spending ten times as much as them for slightly shinier graphics.

    What a fuckin’ tool.

  79. That’s like saying driving a Hyundai is the same as driving a Lamborghini because they are both cars and they both drive on the same road.

  80. If only money mattered…

  81. He’s a walking, talking birth control advertisement.

  82. Aye, it was. I had the Pitcairn-based one (HD 7870, 1280 stream processors) and the Tahiti LE (HD 7870 XT, 1536 stream processors). Ran like a dream; it used the same chip as the 7950, in effect – I think it should have been called the 7930 or something like that. The one I had came with 3GB of GDDR5; I did spot a 4GB one somewhere. Still a great card. I’m currently on a GTX 770 (for LM Prepar3D v2.x, as they seem to favour NV in performance), but the day Lockheed starts favouring both GPU brands, I’m switching back to AMD; same when AMD Zen is out. Can’t wait to go back to the brand that brought me x86 computing (switched from a Mac back in the K6-II days).