
Nvidia updates specifications of GeForce GTX 970 following scandal

After numerous reports claiming that the Nvidia GeForce GTX 970 graphics card cannot use more than 3.5GB of its onboard memory, Nvidia Corp. has updated the official specifications of the graphics solution. Apparently, the difference between the GeForce GTX 970 and 980 is more significant than Nvidia originally said.

According to a report by PCPerspective, which cites Jonah Alben, senior vice president of GPU engineering at Nvidia, the GeForce GTX 970 features 56 raster operations pipelines (ROPs) and 1792KB of L2 cache, not the 64 ROPs and 2MB of cache previously reported. In the GeForce GTX 980, each L2/ROP block communicates directly through a 32-bit portion of the GM204 memory interface with a 512MB section of onboard memory. Since the GTX 970 lacks part of that L2 cache and some of those ROPs, it cannot effectively use one 512MB portion of its 4096MB of onboard GDDR5 memory.
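
The arithmetic behind those figures is straightforward. Here is a minimal back-of-the-envelope sketch in Python; it simply reproduces the partitioning described above, so treat the channel layout as inferred from those numbers rather than an official description:

```python
# GM204 memory partitioning, per the figures above (illustrative only).
BUS_WIDTH_BITS = 256         # full GM204 memory interface
CHANNEL_WIDTH_BITS = 32      # one 32-bit portion per L2/ROP block
MEMORY_PER_CHANNEL_MB = 512  # each portion fronts a 512MB section of GDDR5

channels = BUS_WIDTH_BITS // CHANNEL_WIDTH_BITS   # 8 channels on a full chip
total_mb = channels * MEMORY_PER_CHANNEL_MB       # 8 * 512MB = 4096MB

# The GTX 970 loses one eighth of its L2/ROP blocks (64 -> 56 ROPs,
# 2048KB -> 1792KB L2), so one channel loses its direct L2 path and the
# remaining seven back the full-speed pool.
fast_mb = (channels - 1) * MEMORY_PER_CHANNEL_MB  # 7 * 512MB = 3584MB (3.5GB)
slow_mb = total_mb - fast_mb                      # 512MB (0.5GB)

print(f"{channels} channels, {total_mb}MB total")
print(f"fast pool: {fast_mb}MB, slow pool: {slow_mb}MB")
```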

While GeForce GTX 970 graphics cards do carry 4096MB of memory, it is divided into two pools: one of 3.5GB and one of 0.5GB. The larger, primary pool is given priority and is accessed in the expected pattern. Since the majority of games do not need more than 3.5GB of memory, no problems occur with the GTX 970. However, when more than 3.5GB of memory is required, things get worse. The 0.5GB in the second pool is slower than the 3.5GB pool, but still faster than system memory accessed over the PCI Express bus. As a result, performance may degrade whenever the 0.5GB pool is in use.
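
The two-pool behaviour can be pictured as a simple fill-fast-first policy: allocations land in the 3.5GB pool until it is full, then spill into the 0.5GB pool. The toy model below illustrates that policy only; it is not Nvidia's actual driver logic:

```python
FAST_POOL_MB = 3584  # 3.5GB full-speed pool, filled first
SLOW_POOL_MB = 512   # 0.5GB slower pool, used only on overflow

def place_allocation(request_mb: int, fast_used: int = 0, slow_used: int = 0):
    """Place a VRAM request, preferring the fast pool; returns usage totals."""
    in_fast = min(request_mb, FAST_POOL_MB - fast_used)
    in_slow = min(request_mb - in_fast, SLOW_POOL_MB - slow_used)
    if in_fast + in_slow < request_mb:
        raise MemoryError("out of VRAM")
    return fast_used + in_fast, slow_used + in_slow

print(place_allocation(3072))  # (3072, 0): a 3GB game never touches the slow pool
print(place_allocation(3800))  # (3584, 216): a 3.8GB load spills 216MB into the slow pool
```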

[Image: Nvidia GeForce GTX 970 and GTX 980 PCB assembly]

According to Nvidia, the rather tricky memory sub-system of the GeForce GTX 970 does not negatively affect the board's performance in the majority of cases. While there might be specific instances where performance drops more severely because of this memory hierarchy design, in the vast majority of cases everything should be fine.


KitGuru Says: It is completely unclear why Nvidia decided to advertise 64 ROPs and 4GB of memory for the GeForce GTX 970. The scandal over the incorrect specifications clearly damages Nvidia's reputation.



97 comments

  1. yeah, thanks nVidia, I wanted to buy a 970 but I think it's safer to wait for AMD's turn and get a decent single GPU for 4K gaming, with FreeSync in mind too

  2. Charles Charalambous

    FreeSync is worse than G-Sync. Also, I run two 970s here, use over 3.5GB and still get 60fps with no stutter at 4K… So I'm not sure what the issues are here. Also, AMD's hyper bandwidth will be insanely bottlenecked by PCI-E 3.0. Marketing BS for the sheeple.

  3. RIP Nvidia

  4. Why are you defending a corporation that just cheated you?

  5. On their site the same specs are still available.

  6. Why are you condemning the entirety of NVidia for one single issue? Sure, they fucked up pretty badly here, and it does make you question their future graphics cards, but that doesn't mean a boycott is in order when they are still a dominant force in the market.

    If anything, just set up a class-action lawsuit for false advertising, get some cash back from it and carry on as you were.

  7. That's funny… 970s in SLI can't dominate AMD's old 290X CrossFire at 4K… I think you are the sheeple… Have you actually benchmarked any of AMD's offerings? Do some research. The 970's specs already sucked, and the only reason it gets decent performance is software optimizations, and of course that's at lower resolutions, like 1440p, not 2160p. -AA

  8. You aren't going to be gaming in 4K with a $350 card, and barely so on two of them.

  9. Or just get a 980.

  10. Nvidia is doing just fine. Great, in fact. AMD are the ones who are dying. Their CPUs are due for a refresh but there's no news whatsoever, and on the GPU side they just BARELY keep up with Nvidia in terms of performance, but not in drivers and feature innovation.

  11. I'm not talking about $350, but the 980 won't be much better at 4K. If AMD gets a $500 card ready for 4K gaming, I'll buy it.

  12. It should be a rule of the internet: If you put something out there someone will tear it apart and find all its secrets. If you think you can get away with cutting corners or misleading marketing, just don’t!

    This was probably a case of the marketing and tech departments not communicating well. Regardless of why it happened, this will still affect Nvidia's reputation.

  13. While it is clearly a bad move from NVidia (a credibility loss), and an insult to those who bought the card thinking they had one that could use the full 4GB to its fullest… for my part, I don't really care.

    I didn't buy the 970 (Zotac AMP! Extreme) for its 4GB, because I play on a 1080p HDTV, and I probably won't get to use more than 3.5GB even with downsampling and AA (I probably could, but the framerate would be too low for me to enjoy before it reached 3.5GB).
    And from the benchmarks I saw (not Nvidia's, obviously), the performance degradation between a 980 and a 970 is similar when switching from a setting that uses less than 3.5GB to one that uses more.

    That will not make me want to switch to AMD, because I lost faith in them a long time ago, with their "way too hot" GPUs and un-optimized drivers. It's too bad, because I like how they make their new features non-proprietary and move the industry forward (Mantle -> DX12), unlike NVidia.

    That doesn't mean everyone is in my situation.
    I suppose some people bought a 970 (maybe 2) to play at ultra high settings, because they couldn’t afford a 980 (and frankly the difference in performance isn’t worth the difference in price).

  14. 4K is pretty much only just coming into the picture; the first iterations of 4K-ready GPUs will never be that great either way you look at it.

  15. Seems a really unnecessary lie from Nvidia. People would have bought the 970 anyway, as they buy based on performance per pound. Now they have just given the public a stick to beat them with, despite the product itself being excellent.

  16. Hmm, enjoy the thermal throttling with dual 290Xs causing horrible stuttering. Both sides have their advantages and disadvantages.

  17. 970s are not meant to compete with R9 290X cards; that's for the 780 Ti and 980. The 970 is up against the R9 290 and the 780 cards.

  18. Charles Charalambous

    Funny, kid. I used to have AMD, then I swapped; I would never go back. A friend went with AMD because it was cheaper and is now desperately trying to swap back to Nvidia as soon as he can.

    The 970 specs did not suck. I got these as a 670 replacement and you know what? MGSV 4K 60FPS maxed, Shadow of Mordor 4K 60FPS maxed, AC Unity 2K 2x MSAA 60FPS maxed. Every game runs really well, really quiet, really cool and power efficient. I have no issues with stutter when going over 3.5GB. They also OC amazingly well; I get another 15% out of them while still running below 70C… So I basically have a rig capable of 4K 60fps on every modern game I have thrown at it, with the exception of the single-thread-utilizing FC4. I'm sorry, but you obviously don't know what you are talking about when it comes to the 970s.

  19. Arafat Zahan Kuasha

    nVidia fans are pretty relentless, defending the stupidity of their beloved company. Wow.

  20. Yes… yes… good job, fanboy. I'm waiting here and will laugh too if AMD releases a joke of a graphics card ᕦ( ͡° ͜ʖ ͡°)ᕤ
    Let's wait and see (ง ͠° ͟ل͜ ͡°)ง
    This is interesting, the graphics card war between Nvidia and AMD,
    and the more interesting part is… their fanboys arguing in here. Good job, kid, have a nice day ( ͡° ͜ʖ ͡°)d

  21. Good thing I got two GTX 980s and not 970s 😀

  22. Nvidia is doing just fine, especially when clueless gamers don’t demand a refund from a company that obviously cheated them. GG

  23. yeah, sure, I must be a fanboy 😛 Let's be straight: I'm using an HD 7870 and want to upgrade. I thought about the 970, and maybe it's gonna be a 970, but I'm not in a hurry, so I'll wait for AMD's turn first and see if the new cards are more competitive at 4K than the GTX series. THEN I'll decide, so stick that "fanboy" bullshit up your ass

  24. Actually, driver issues disappeared for everyone I know after the Omega drivers came around. Additionally, even the 980 doesn't have an insane performance gap over the 290X, which is a generation behind. When the 390X rolls around it will most likely stomp the 980 to at LEAST the same degree that the 980 beats the 290X.

    Nvidia and AMD have traded blows pretty closely in the GPU area for a long time. The last few gens Nvidia had far better software, but currently Omega is doing nicely, so AMD might have finally recognized the problem.

  25. Hmmm… well, in fact Shadow of Mordor and MGS are both non-GPU-intensive games. I would have preferred examples like BF4. I wonder if you got those 60 FPS before or after the AC Unity updates.

  26. If I had a GTX 970 I would demand a refund; the card was sold using misleading publicity…

  27. Bottlenecked by PCI Express 3.0?…Explain that better pls…

  28. Charles Charalambous

    Before the update on AC. BF4 I tried on the free weekend thingy and I got 60fps, no problems, no stutter. I just do not like BF4, so I didn't play it past one full day.

  29. Charles Charalambous

    PCI-E 3.0 doesn't have anywhere near the bandwidth the new AMD cards are rumoured to have.

  30. Good job fanboy kid, I salute you ( ͡° ͜ʖ ͡°)d

  31. You're so wrong. If it were that simple: all cards with PCI Express 3.0 get 32GB/sec of bandwidth, so any card with more than that would be bottlenecked, and yet such cards have existed for years. For example, the GTX 970 has 224GB/sec of memory bandwidth, so by your logic it would bottleneck… So explain it better, because the way you explained it, you just seem wrong!

  32. Charles Charalambous

    http://www.bit-tech.net/hardware/2010/11/27/pci-express-3-0-explained/1

  33. Try not to get butthurt by him.

  34. Anybody have the energy for a class action? I feel there is false advertising going on. Chances are I would have bought the card with 3.5GB anyway, but “here’s a 4GB card, but oh wait – the last .5GB makes everything awful” is completely ridiculous.

  35. No matter how we all end up using the card, it’s very clearly deception by the company. There is no excuse. The buyers of this card were purchasing a video card with presumably fully usable 4GB, not a 3.5GB card with a bonus 512MB.

    Car analogy time: when I buy an AWD 300HP vehicle, I expect it to be that. I don’t expect it to actually be a 200HP vehicle, that occasionally gets 300HP under specific conditions, but then loses AWD at the same time.

  36. As you said, there is nothing to excuse NVidia on this one.
    But not all buyers bought the card with the same expectations. Frankly, if the card had been sold with 3GB, I would still have bought it (I was planning to buy a 780 at that time), but that's not the case for everyone.

    Still, I'm surprised at how NVidia is handling this "drama" so casually. I would have expected them to give away a free game or something like that to put out the fire.
    I'm also surprised not to see 970 prices drop a little, but it may be too soon for that.

  37. “Obviously cheated”
    Maybe if you had bought it because it had those exact specs that you seem to think it had. The only people that were cheated are those that REALLY care, probably the ones that spent to upgrade from an already decent card. The rest of us, ones who upgraded from an old card or just bought it because it was a good card for a decent price, don’t give a fuck.
    I bought my 970, I’m perfectly happy with it, I have yet to see any performance drops on any game that I play, so it’s fine. (Especially after coming from a 560ti)

  38. A single 970 cannot run Assassin's Creed Unity at max without being overclocked heavily. I get 30-40 FPS with all graphics features enabled at 1080p.
    And I've got the Gigabyte GTX 970 G1, which runs at 1400MHz.

  39. Yeah, I am not bashing the performance of the card really. I bought the Giga G1, and I could not be happier with it. That said, I was told it was a 4GB card, and it’s not really. Given that I did use that number as part of my decision-making process in buying the particular card, I feel deceived by nVidia. I don’t even care if I never hit 4GB, but I was sold a product under a false premise.

  40. Charles Charalambous

    For starters, I said with two. Also, I was running AC Unity on my i7 3770K @ 3.9GHz, DDR3 RAM at 2133MHz and one GTX 970 (this was before I went SLI), game maxed except shadows, which were on ultra, not soft shadows, and AA was FXAA at 2560x1440, and I was getting 45-60fps for the most part, with rare dips to 38-40fps, with my core clock at 1393MHz and my memory clock at 7.5GHz.

  41. Charles Charalambous

    Excuse the grammar… I cannot seem to edit my response…

  42. I would have waited. WTF is the reason for doing it this way? Big ol' Nvidia can't handle 4GB? I took the risk with them and they already screwed me over.

  43. An Nvidiot at its finest.

  44. It’s fine that you were given incorrect specs on something you paid for?

  45. “Still, I’m surprise how NVidia is handling that “drama” so casually. I would have expected them to give away a free game or something like that to put out the fire.”

    The fact is, retail add-in consumer board sales are a comparatively small part of their revenue stream. NVidia makes most of their money on professional graphics boards, GP-GPU computing sales, and large contracts with OEMs. In other words, to put it simply, they don’t care about us end retail users, and they know the legions of nVidiots will gladly bend over and take more from them.

    I’ve said it many, many times on this and other tech forums that this company is particularly arrogant and devious, and now they’ve been caught with their hand in the cookie jar (or our wallets). I used to like nVidia, before they:

    -ripped me off with a bumpgate failed GPU in a Toshiba Tecra
    -before they tried to bullshit everyone that 2GB was plenty of memory for the GTX680 and GTX670, the GTX670 going for $400 when the Radeon 7950 was going for $319 with 3GB of memory
    -before they tried to lock out AMD card users from games by paying off game makers to add useless PhysX effects
    -before they tried to make us all pay at least an extra $100 for a G-Sync monitor, and to lock AMD video cards out of proprietary G-Sync monitor technology, a technology that most monitors could have for no extra cost and which will be available to all card users with FreeSync, if only nVidia could pull their heads out of their asses and enable FreeSync in their drivers for their customers, which I doubt they can bring themselves to do, because that would be excellent customer service.

    If AMD had done this, they would be universally crucified on all the tech forums. But somehow, nVidia can actually lie and rip people off, blatantly, and there are still tons of nVidiots defending/minimizing their actions. For crying out loud, if you want to give money to a company that has a proven track record of outright lying and ripping you off, be my guest. But just realize you're a goddamn, straight-up hypocrite, make no mistake.

  46. “Nvidia is doing just fine. Great in fact. AMD are the ones who are dying.”

    Well, too bad nVidia shot themselves right in the head just before winning the race. Now any sane, intelligent person will think twice before buying nVidia. Only an empty-headed nVidiot would still bend over and take more from them now.

  47. I was relieved to discover that my 980M is unaffected by this funky design structure.

  48. For me, yes. I didn't buy it because it had 4GB of memory or because it had 64 ROPs and 2MB of cache. In fact, I didn't even look at the specs before I bought it. I did a comparison between my card, the 970 and the 980, and since the difference between the 970 and 980 wasn't very large, and still isn't, the price was good. I understand some people's frustrations, but all the people screaming that Nvidia is going to go down or take some huge loss because of this are just idiots.
    I’m still happy with what I bought, especially since I got it discounted and got a free game with it.

  49. “I bought my 970, I’m perfectly happy with it, I have yet to see any performance drops on any game that I play, so it’s fine.”

    Yeah, you have yet to reach the part of the chocolate bar with the cockroach in it, so everything’s just fine. Keep eating…

  50. I’ve played every graphically intense game I can think of (FC4, BF4, Metro, etc.) and I get a solid 60+ at all times. The only limiting factor in my computer is my processor. If I never get to the cockroach, does it matter?

  51. Un-optimized drivers are a fair criticism, but the heat issue was always a red herring: it only applied to reference coolers, and the power usage and heat with bespoke coolers were as competitive as could be wanted. Not with the 900s, but with the 700s. The throttling with the reference design was a travesty, but later coolers fixed that, and while criticism of the reference design is warranted, it does not mean the entire line was a failure.

    I had hoped they would improve their DX11 drivers soon, but Mantle and DX12 may be the best I get from them. Better than nothing, worse than what would be optimal.

  52. GTX 960!!!

  53. Charles Charalambous

    Please, if I am wrong, tell me where. I may well be. Rather than insult, how about you explain, AMD BOY. Oh, btw, I buy whatever works best for me regardless of brand. AMD has not been it for years. The last time I had an AMD card it didn't work with my monitor. AMD's response: buy a new monitor. So I swapped to Nvidia, it instantly worked with my monitor, and the drivers are a lot better. Also, swapping to Intel more than doubled my fps from my AMD X6.

    As I said, if I am wrong, CORRECT ME PLEASE. If I am wrong I want to know, as I have started taking a real interest in GPU architecture. Rather than the usual AMD insult, EXPLAIN WHERE I AM WRONG.

  54. The PCI-E bus has nothing to do with the internal speed of the memory talking to the GPU on the card. The bus speed is fine. Current-gen cards don't even fully utilise PCI-E 3.0 x8, let alone PCI-E 3.0 x16. The card utilises the bus for communication with the CPU and with other cards (in SLI/Crossfire configs).

  55. I don't know if you followed the FreeSync thing closely, but it's only available natively on laptops for now. Standard displays can't modulate their refresh frequency depending on the input.
    A new standard has been decided based on what FreeSync outlined, but no existing display supports it natively at the moment.
    Frankly, I don't see how different it is from G-Sync (having to buy a new display).

    But like I said in my first post, I like how AMD makes most of its discoveries non-proprietary, unlike NVidia (all the more because NVidia has some nice features that won't be widely adopted because they are restricted to their cards). FreeSync and Mantle are the last two examples of that (and I'm pretty sure we wouldn't have had DX12 if Mantle hadn't shown there was a lot of margin to free up the CPU for something else).

    But as a fact, NVidia's drivers are far more optimized, and their dev support program helps developers sooner, meaning that for the same hardware generation their cards give better performance than AMD's (pushing AMD to sell their cards cheaper).

    I like how AMD is doing things, but I like NVidia's products better. Nothing hypocritical here.

  56. It's explained fairly well in the above comment, unless you have poor reading comprehension. As far as being an "AMD BOY", well, a picture says a thousand words, right? Obviously you need to do more research.

  57. Charles Charalambous

    That does not explain jack about how the AMD hyper bandwidth will not be bottlenecked at all. Thanks for going straight to insults while I did not, by the way.

  58. And with another 970 you maybe get 40% more performance (not more). And at the same resolution with the last heavy features enabled… you won't get a stable 60 FPS at 2K… you're just sayin' some sh!t 😉 😛

  59. Charles Charalambous

    Want me to link you proof via 60fps videos? Guess what, I get more than a 40% boost.

  60. Not an AMD fan; in fact, I loved Nvidia until they falsified the specifications. Basically, you are buying because it's Nvidia, not for the specifications of the card.

  61. Great job, Nvidia, great job. As if they needed any more negative reputation (which was rightly deserved) from the people affected by the GeForce Go defect seven years ago (of which I am one), and then from the angry GTX Titan early adopters when the 780 Ti was released. I dearly hope they learn from this and stop trying to rip people off.

  62. PCI-e moves data from the storage drive to the VRAM of the GPU. A GPU’s bandwidth is how fast data is moved from the card’s VRAM to the shaders for processing. You’re a moron.

  63. Charles Charalambous

    Right, so there's no way then that a card with like 10 times the bandwidth will be bottlenecked by current tech, because it's all internal? Therefore everything will be able to use it no problem? I remember GPU manufacturers complaining about PCIe limits for years.

    Thanks for the insults, btw. Generally, thanks for the info, so I guess all AMD retards are aggressive pre-pubescent kids who can't just explain to someone why they are wrong without insulting. Cheers.

  64. What a joke.

  65. Wow, Charles. Oversell your friend's desire to switch back some more. I can see him now, hanging by his balls off a cliff. Drama much?

  66. Well, it's either a marketing mistake by Nvidia or just a ploy to keep them at the top of the GPU market… either way, I'm getting a 980 now.

  67. Can you provide a source showing FreeSync is inferior to G-Sync? Or a source showing HBM will fully saturate PCIe 3.0? Thanks.

  68. Jarrod White's comment explains exactly why it won't bottleneck or fully saturate a PCIe 3.0 x8/x16. I'm not trying to be rude, but… it doesn't seem like you know exactly how PCIe works. http://en.wikipedia.org/wiki/PCI_Express This might be a good place to start, so you can get a better understanding.

    Just an example: an R9 295×2 or Titan Z won't even fully saturate a PCIe 2.0 x8 slot. No bottleneck or frame loss.

  69. Heat and throttling are only an issue with the reference cooler. You're an idiot if you buy reference. Why do Nvidia fanboys always use that in their arguments?

  70. I wonder how Nvidia would cope if everyone returned their cards for a refund? I'm certain that in my country it's a protected right to do so if a product is not the genuine article. They play ads about it all the time on TV.

    Although, saying that, I think most people are pretty happy with their 970s, so it probably won't happen much.

  71. Well, you just confirmed what I said. If your argument is that the bandwidth of PCI Express 3.0 will reduce the new AMD cards' performance because they have higher memory bandwidth requirements, you are still wrong:

    1. There are already cards with memory speeds of well over 32GB/s, which is the bandwidth of PCI Express 3.0.

    2. Those cards, such as the GTX 970 with its 224GB/s of memory speed, still aren't held back by PCI Express 3.0.

    3. Given that, there's no reason to think the new AMD cards will suffer from it.

    Therefore you helped me prove your assumption wrong. That's why you are not an engineer: PCI lanes are used to transfer commands, not the actual "graphics" data, so that bandwidth isn't a problem in this case. The memory you are talking about only influences how fast graphics can be processed by the graphics card, and since that happens internally on the card, that memory has no relation to the PCI Express 3.0 lane bandwidth 😀
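
For reference, the numbers traded in this thread line up roughly as follows. A minimal sketch, assuming PCIe 3.0's nominal 8GT/s per lane with 128b/130b encoding (roughly 985MB/s per lane, per direction) against the GTX 970's advertised 224GB/s memory bandwidth; purely illustrative:

```python
PCIE3_GBS_PER_LANE = 8 * 128 / 130 / 8  # ~0.985 GB/s per lane, per direction

pcie3_x16 = 16 * PCIE3_GBS_PER_LANE     # ~15.8GB/s each way (~31.5GB/s both ways,
                                        # the ballpark of the 32GB/s quoted above)
GTX970_VRAM_GBS = 224                   # GDDR5 <-> GPU traffic stays on the card

print(f"PCIe 3.0 x16: ~{pcie3_x16:.1f}GB/s per direction")
print(f"GTX 970 VRAM: {GTX970_VRAM_GBS}GB/s on-card")
# The two figures never compete: VRAM (or HBM) bandwidth is internal to the
# card, while the PCIe slot carries commands and resource uploads, so a card's
# memory bandwidth cannot be "bottlenecked" by the bus.
```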

  72. That is what I'm trying to explain to this guy in a way his mind can understand.

  73. I did it in a simple way, and if you're as interested in GPU architecture as you say, you will understand why you are wrong; I explained it from an engineering point of view…

    PS: see my comment above.

  74. Oderus, you deserve my like. Even though I have both AMD and Nvidia cards, I didn't think of doing that 😀

  75. HBM is the memory used by the card for internal communication, while the PCI Express 3.0 lanes are only used to send commands for the CPU to follow, things the card can't do, so it tells the CPU: "Here you go! Do this, because I'm busy right now!"…

    This is the most basic explanation; if you can't understand this, go buy a brain 😀

  76. This guy is just one of those people who disagree with everything for no reason, even when the facts are right in his face…

  77. Yes, another well-done explanation.

  78. I explained it to you without using a single curse word and still you didn't believe me!

  79. indeed my friend

  80. Don't tell that to people :D… The rest of us like to laugh at these bragging boys who know nothing about things… ssssshhhhhhhhh

  81. Ummmmm… where have you been? AMD's drivers have been pretty solid for quite some time. If anything, Nvidia's Maxwell drivers have been underwhelming in comparison thus far. Way to spread FUD.

    https://forums.geforce.com/default/board/155/geforce-900-series/

  82. You do that… not when you enable all the features: soft shadows, HBAO+ and TXAA. You simply don't get a stable 60 FPS at 2K max settings… send me a link and I will apologize…

  83. Sign the petition for them to refund all NVIDIA customers who purchased one. Or sign it anyway to show your support for highlighting NVidia's disgusting practices:

    https://www.change.org/p/nvidia-refund-for-gtx-970

  84. People really need to calm down. For the majority (as in, 99%+) this has no effect at all – it does not change the fact that the 970 is the best bang-for-buck card around, nor does it diminish the accomplishments of getting 780ti-level performance for 660ti-level power draw. It’s also cheaper as a whole than the GTX 670 was, and that itself was quite the sweet spot for gamers (those on a lower budget would have gone for a 660ti but the 670 remained the true sweet spot).

    Note, I am not a ‘fanboy’ for either team, but much like how I will only ever purchase an Intel CPU until AMD’s offerings actually compete in the performance area, I have been buying Nvidia products for almost the past decade because they’ve managed to always have the crown in overall price/performance/power balance.

  85. Not sure what they were complaining about, considering you need to go all the way down to PCI-e 1.0 x8 to actually have any performance hits. Maybe multi-GPU setups?

    I called you a moron because you didn’t listen to anybody else that was already saying that, and didn’t realize that GPUs being limited to PCI-e bandwidth would make any bus wider than 64-bit completely useless even though you can see gains from increasing bus width a large amount of the time. Also, you keep calling everyone that disagrees with you an AMD fanboy, for no particular reason, while being generally very rude.

    I call ’em like I see ’em. It’s not my problem if you choose to act like that.

  86. Considering the 290X and 980 can play most games with semi-decent frame rates at 4K with maxed settings, I think the Big Maxwell and the 390X should be able to handle 4K just fine.

  87. Sprocket_almighty

    No, they are not only an issue with the reference cooler. Even so, you should be able to purchase a reference design without it thermal throttling. I had 2x MSI Twin Frozr R9 290s; the top card thermal throttled at 95C and the bottom one ran at 85-90C. This was all in a well-ventilated Corsair Carbide Air 540 with plenty of fans, readjustments to the fan configuration and so on. I've owned an ATI 5770 and an AMD 6970 previously; call me an Nvidia fanboy if you can't face the facts, but those temps are too hot. At those temps the cards sound like jet engines, way too loud. Can you understand my disappointment?

    AMD is going to have to manufacture products that do not get so hot before I consider purchasing from them again.

  88. I get 45~50fps in AC Unity on Ultra with FXAA running at 2560×1080, with my GTX 970 at 1480MHz.

  89. lol, I am tired of seeing these Nvidia/AMD fanboys lol… they always think their own product is superior, making their brains inferior lol

  90. Last month I bought this:

    CPU: Intel Core i5-4690K
    Mainboard: Asus Z97-K
    Graphics: MSI GeForce GTX 970 Gaming 4G
    RAM: 2x8GB Kingston HyperX Beast 1866MHz
    OS: Windows 8 64-bit Service Pack 1
    PSU: Coolermaster 650W Silver+
    Monitor: Asus PG278Q (27″ 1440p G-Sync 144Hz)

    SSD/HDD: 120GB Intel 520 SSD for OS, 250GB Intel 540 Evo for games, 1TB WD Purple for recordings.

    Every fast-paced game drops frames when playing at 1440p, and not small fps drops: I go from my normal 90-144 fps (G-synced) down to 5-26fps for 3-4 seconds, where the first 0.5 seconds feel like a complete freeze.

    Doesn't matter if I'm recording or not.

  91. give them corn flakes ( ͡° ͜ʖ ͡°)d

  92. How does that make him a fanboy? If anything, you're the one that seems like an angry Nvidia fanboy for lashing out just because someone brought up AMD…

  93. price says otherwise.

  94. funny thing is, you own an AMD APU…

  95. Yeah… NOW price says otherwise. And it’s not even in the product line. AMD’s just dropping prices left and right now.

  96. How does that make me seem like an angry Nvidia fanboy? OK, pretend that I'm an angry Nvidia fanboy; I'm just here to watch some movies ( ͡° ͜ʖ ͡°)d

  97. Funny, I had to re-image my computer to downgrade from the "Omega" drivers because they not only caused repeated blue-screening on boot, they couldn't be removed in Safe Mode thanks to AMD's driver package model. I may consider switching to nV for my next card, since I was staying with AMD mainly for the ability to add SMAA rendering (which is vastly superior to any other AA method at a similar level of processing power) with RadeonPRO, but RadeonPRO is no longer being developed, and before that, the support for Crossfire on non-AMD boards but no SLI on non-Nvidia boards sold me.

    I always found AMD’s drivers to be WAY better for my usage cases, especially multi-monitor options on Windows, so I’m never sure what driver issues people are always complaining about, but then I’ve mostly stopped trying to fathom the ridiculous rage of so many of my fellow gamers.