While only a small fraction of GeForce GTX 970 owners plan to return their graphics cards to stores following the scandal over the product's specifications, there are still quite a lot of people who are upset, not only because the company originally published incorrect specifications, but because it decided to remain silent even after the mistake was uncovered. Apparently, some of those people have decided to take Nvidia Corp. to court over it.
Andrew Ostrowski, individually and on behalf of all others similarly situated, this week filed a class-action suit against Nvidia in the U.S. District Court for the Northern District of California. The complaint accuses Nvidia and Gigabyte Technology of misleading advertising and of unfair, unlawful and deceptive business practices. The lawsuit seeks a jury trial as well as disgorgement, restitution, injunctive relief and all other damages and relief permitted under California law.
Back in January it was discovered that Nvidia had incorrectly stated the number of raster operation pipelines (ROPs), the actual memory bandwidth, the L2 cache capacity and the amount of high-speed onboard memory of its GeForce GTX 970 graphics adapter. Instead of 64 ROPs, the GPU features only 56; actual usable memory bandwidth is less than the advertised 224GB/s; the L2 cache is 1792KB, not 2048KB; and the memory should be indicated as 3.5GB + 0.5GB, because only 3.5GB can be accessed at full speed due to limitations of the cut-down GM204 configuration used for the GeForce GTX 970.
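The arithmetic behind those figures can be sketched from the card's publicly reported memory configuration (256-bit bus, 7Gbps effective GDDR5, eight 32-bit controllers). A minimal illustration follows; the per-pool split (seven fast controllers, one slow) comes from third-party analyses of the 970, not from this article:

```python
# Hedged sketch: theoretical peak bandwidth arithmetic for the GTX 970's
# segmented memory. Assumes the commonly reported configuration:
# 256-bit bus, 7 Gbps effective GDDR5, 8 x 32-bit memory controllers.

BUS_WIDTH_BITS = 256
DATA_RATE_GBPS = 7  # effective GDDR5 data rate per pin

total_bw = BUS_WIDTH_BITS / 8 * DATA_RATE_GBPS  # GB/s across all 8 controllers
fast_pool_bw = total_bw * 7 / 8                 # 3.5 GB pool: 7 controllers
slow_pool_bw = total_bw * 1 / 8                 # 0.5 GB pool: 1 controller

print(f"advertised peak: {total_bw:.0f} GB/s")      # 224 GB/s
print(f"3.5 GB pool:     {fast_pool_bw:.0f} GB/s")  # 196 GB/s
print(f"0.5 GB pool:     {slow_pool_bw:.0f} GB/s")  # 28 GB/s
```

This is why "less than 224GB/s" is the accurate description: the full figure is only reachable if both pools could be driven simultaneously at peak, which the segmented design does not allow in practice.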
“The defendants engaged in a scheme to mislead consumers nationwide about the characteristics, qualities and benefits of the GTX 970 by stating that the GTX 970 provides a true 4GB of VRAM, 64 ROPs, and 2048 KB of L2 cache capacity, when in fact it does not,” the lawsuit states. “Defendants’ marketing of the GTX 970 was intended to and did create the perception among purchasers that the product was, in fact, able to conform with the specifications as advertised.”
Nvidia has admitted that it incorrectly stated the specifications of the GeForce GTX 970. The company said that the performance limitations associated with the actual specs cannot be cured with a driver update. However, the GPU developer did not promise any compensation to owners of the graphics cards. Moreover, Nvidia unofficially wants to distance itself from the scandal, according to a media report. By contrast, numerous retailers and graphics card makers accept returns of GeForce GTX 970 adapters or provide partial refunds to their owners. So far, only between 1 and 5 per cent of owners have returned their GTX 970 add-in-boards.
Since the lawsuit is a class action, it may be joined by other people who believe that Nvidia and Gigabyte deceived them with the GeForce GTX 970 specifications.
Nvidia declined to comment on the lawsuit, according to PCWorld. Gigabyte did not comment on the story.
KitGuru Says: Proceedings like this one usually last for years, so do not expect any results any time soon. In fact, one of the reasons the lawsuit was filed is that Nvidia decided to admit its mistakes but offered nothing back to gamers. Keeping in mind that the mobile Nvidia GeForce GTX 980M graphics processor is also affected by the same memory and ROP issues as the GeForce GTX 970, expect notebook gamers to slam Nvidia too…
C’mon Nvidia what the fuck.
assholes right
This was bound to happen, and should happen, so Nvidia learns it can't just lie to customers the way it did.
This is still uncalled for and people have been WAY overreacting to it. It was an internal miscommunication. The cards still perform very well. NOBODY is actually complaining that their card is too slow, and you have to actually TRY to hit 3.5GB to do it. By the time you actually hit 3.5GB of VRAM usage, your framerate will already be too low to game smoothly anyway, and it takes a lot to do it. Nobody was complaining until reviewers like PC Perspective intentionally pushed it to the absolute max. Go ahead, go play BF4 at 1440p maxed out… it will NOT even come CLOSE to hitting 3.5GB of VRAM usage.
It's not about the performance, it's about the principle. Why would a company need to give a full product when people will sit passive when given a gimped version? It's exactly what is happening in the game industry currently.
Card is a beast and people are making a huge deal out of nothing. Typical human race.
People defending Nvidia saying its not a big deal… WTF. Would it not be a big deal if I gave your girlfriend not my whole dick but just the tip? That’s your arguing point?
Nvidia as a company just made a mistake. The Nvidia customer support will still stand up to its positive influences and help remedy the situation for dissatisfied customers, I presume anyway. 😀
You cannot misrepresent a product which you then intend to sell. It is deception.
The majority of people who purchased the GTX 970 most likely do not know about the flaws/deception; they bought on a measure of trust.
As for those that do know and see no issue: they're cheating themselves.
It speaks volumes about one's character when you're willing to accept a grievance against you and you're not even prepared to do the bare minimum to rectify it.
The worst are those who understand it, acknowledge it and yet defend the practice. How can anyone accept that the problem exists, understand they've been deceived, and then go on defending the practice as a "mistake", all the while defending Nvidia even as it refuses to do the correct thing, which is to offer recompense?
It doesn't matter if it was a mistake. A mistake does not exempt you from correcting the situation, which Nvidia has refused to do. By calling it a "mistake" you're apologising for them, and I think you perpetuate the idea that they've done nothing wrong.
They have. They refuse to accept accountability, which is a further insult. This makes them shockingly devious and cowardly, which in hindsight makes the "mistake" argument all the more suspect.
There are plenty of games that hit 3.5GB at 1440p, and in the next two years it's entirely possible that some games will do so at 1080p. I've also gotta wonder why the Nvidia engineers/marketing/whoever's responsible for this clusterfuck bothered with this scheme. What's the point? Is it even possible to do it unintentionally? I really doubt that. The card is still a killer, there's no denying it, but the fact is Nvidia at best made an engineering mistake and at worst lied and advertised a gimped product. The actual impact of this mistake and/or lie is irrelevant.
The "huge deal" is about whether or not it's acceptable practice to be lied to. Think not about what you've got, but about all the ROPs, cache and full-speed 4GB of VRAM that you did not get. The card would be better than it is. You got cheated.
People are human and so are the people at Nvidia. Humans make mistakes, and yes, as I stated before, they made a mistake. Also, it was stated: "By contrast, numerous retailers and graphics cards makers accept returns of the GeForce GTX 970 graphics adapters or provide partial refunds to their owners." I don't see why people like you, Mr. topdown, get so upset over the stupidest things in life. A company made a mistake, so you as a consumer can choose to get a refund, return the card to the retailer where it was purchased, and move on with your life, instead of dwelling on a mistake from Nvidia so much that you think it justifies trying to act hard in the comments section of a KitGuru article…
If your tip was enough to satisfy her more than the one they have, it would make a huge difference for their ego :p And that’s where Nvidia’s marketing decided to play with the specs and the buyers mentality.
Dying Light uses between 3.3GB and 3.6GB of VRAM at 1080p on max settings. Guess what happens when you hit 3.6GB of usage? Yeah, it stutters like crazy.
The judge will toss the lawsuit out. For a class-action lawsuit, the plaintiff(s) here have a weak case. I wonder what lowly douche of a law firm or lawyer took this case up. Probably looking for a quick buck.
I better see some of this money! The moment my VRAM hits about 70%-85% utilization it takes a dump. Single figure performance dip my arse the whole thing locks up.
It most definitely is not a weak case. No matter what side you are on, consumer rights infringements have been made, and that is not something that can be denied; it is not a matter of opinion, it is fact. Moreover, it is something which has pissed off A LOT of people, and the judge will have to consider the scale of the effect this has caused as well. Not a weak case, Ms. Harvard Lawyer Commenter.
I called it but the overreaction is REEAAL
They might say, purchase a DLC to unlock further abilities of the card 😮
Judging by some of the comments here, lying to your customers is OK, it’s a mistake or a mis-communication or something, and apparently any consumer who isn’t happy about being lied to and actually does something about it is overreacting. Really? Do some people really believe consumer rights and companies’ accountability are less important than, what, shareholders’ dividends and corporate self-image? They lied…let them pay, let them be publicly humiliated, let them learn not to do it again.
I believe we will win, so we should get at least a refund/step up to a 980. Go to http://www.gtx970lawsuit.com. I am referring you to the site because I am the plaintiff in the case, just so you know. I bought two G1 Gaming 970s and I have the same issue playing Shadow of Mordor or Dying Light etc. Even at 1080p I hit 3.5GB of VRAM easily, my minimum fps tanks and I get horrid stuttering. If I try to play at 1440p, especially above 60Hz, it's literally unplayable; nearly half the games I own freeze and stutter badly and some even crash completely. It ruins my chances of taking the college courses I wanted to take as well.
That’s a lot of lies. Nvidia should just apologize and make penance
Nvidia deserves to be sued for misadvertisement*, but people blew this way out of proportion. They acted like their GTX 970s became absolute dogshit as soon as the news came out, when in reality nothing has changed from day one, and if they were happy with what they got when they bought it then there is no issue.
The only real problem is less future-proofing, since that extra 0.5GB of VRAM might be needed in some modern unoptimised shit games and games to come within the next few years. But if you're playing at 1080p you're rarely going to need that much memory in the first place, and if you're playing at 4K, well, then you're just a straight-out moron for buying the GTX 970 for 4K gaming.
*Even though it wasn't technically misadvertisement: there is still 4GB in the 970, whether only 3.5GB is used or not.**
**It's not even that only 3.5GB is used; all 4GB is used, it's just that 0.5GB is slower.
I saw it coming…
If they’ve been caught doing it this one time, what other chips in the past could have been affected by such malpractice?
Problem isn’t that it can’t accept the .5 GB RAM. It’s that it _does_ and then crawls to a standstill because it’s not working correctly. This is a huge issue for multi-monitor or large-resolution setups.
This is the first time, since Maxwell is the first decently modular architecture from Nvidia. In the past, the extra memory module would be disabled and consumers would get a cut-down card with a 192-bit memory interface that was much slower in every way, but it wouldn't act weird and stutter like the 970. With this approach they could keep the extra memory and bandwidth, but once you go over 3.5GB, the remaining 0.5GB of memory runs at the slower speed.
Well said.
Signed,
GTX 970 owner.
Show me where the card “crawls to a standstill” after 3.5GB. I’ll wait…
FYI, The OS has control over the .5GB pool, and according to nVIDIA, it’s 4x faster than system memory.
That's what can happen any time your settings are too high or you don't have enough VRAM. Lower your quality settings.
Also, using that much VRAM @ 1080p sounds like a poorly optimized game if you ask me.
It’s not one of “the stupidest things”. Trust me, if you shelled out hundreds of dollars for a graphics card that isn’t what it should be, you’d be pretty irritated, too.
im having issues at 1440p with my 970.
No it doesn’t. Stop lying. You liar.
It is not a settings issue. As you move through the map, the game loads more assets into memory, and when it goes over 3.5GB it will stutter. Claims that the slow pool is faster than system memory do not mean much when you are going from over 200GB per second to less than 20GB per second. GPUs need a ton of bandwidth because of the way they access memory: they are designed with hundreds or thousands of cores, all hammering a single pool of RAM.
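To put rough numbers on that kind of bandwidth drop, here is a minimal sketch of how long streaming a fixed amount of asset data takes at different pool speeds. The 196GB/s and 28GB/s figures are the commonly cited third-party estimates for the 970's two pools, and the 100MB payload is purely illustrative, not a value from this thread:

```python
# Illustrative only: frame-time cost of reading assets from a fast vs. a
# slow VRAM pool. Assumes third-party bandwidth estimates for the GTX 970.

FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

def read_time_ms(megabytes, gb_per_s):
    """Time in milliseconds to stream `megabytes` of data at `gb_per_s` GB/s."""
    return megabytes / 1000 / gb_per_s * 1000

for pool, bw in (("fast 3.5 GB pool (196 GB/s)", 196),
                 ("slow 0.5 GB pool  (28 GB/s)", 28)):
    t = read_time_ms(100, bw)  # hypothetical 100 MB of assets touched per frame
    print(f"{pool}: {t:.1f} ms of a {FRAME_BUDGET_MS:.1f} ms frame budget")
```

The same transfer goes from roughly half a millisecond to several milliseconds, which is why touching the slow pool every frame shows up as stutter even though average fps barely moves.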
If I had one ;_;
internal miscommunication
that lasted for four months until it was found out
or alternatively
active deception
that lasted for four months until it was found out
what sounds more plausible given the timescale here?
Not enough VRAM? It's only using 3.6GB of VRAM when it starts to stutter; there should be another 0.4GB left. So no, it is not running out of VRAM, but it is running out of VRAM that is actually fast enough to do the job.
The people at Nvidia are people, yes,
but Nvidia is a company, and there are many, many people at that company. The fact that nobody in engineering apparently noticed the incorrect specifications, or decided to escalate the issue if there really was a miscommunication, for four whole months makes the excuse of "it was internal communication fuck-ups" seem a little thin.
Since Nvidia admitted their mistake, took steps to correct the specifications after the fact, vendors selling the cards are accepting refunds, these three things will account for a huge plus on Nvidia’s side if/when this hits a courtroom. Whatever the final amount is, whether it’s an out of court settlement or not, it’ll be a paltry amount due to these three things. Welcome to the world of big business.
You’ll get a coupon for $20 off your next Nvidia card. The lawyers are the only ones that will make money
OK, I have a 780, a card with only 3GB of VRAM, 0.5GB less than yours, and I have never seen performance drops at 1080p with Shadow of Mordor or Dying Light.
I thought it wouldn’t be a problem too but I have observed it on system monitors and benchmarks. So fuck you with a sharp stick.
Bear in mind it knows it only has 3GB, so it doesn't try to utilize 4GB. Because the 970 shows up as a 4GB card, games try to use it all and then tank when they hit the bottleneck.
Do you own a 970? If not you can’t comment. I have observed it on my card.
It’s not ” a beast” at higher resolution or multi monitor set ups when it reaches 3.5gb or more of vram usage. Performance tanks. Typical apologist!
Dafuq? Read back to yourself what you just typed. If it supposedly has 4GB of VRAM, then 3.6GB of VRAM being used shouldn't be a problem. Dying Light is horribly optimized, there's no denying that, but when you've got 4GB of VRAM and the game is using 3.6GB… how does that translate to not enough VRAM for you? It's that bandwidth performance tanks past 3.5GB. That's a straight-up fault in how the 970 was advertised. We should have 4GB of working, actual VRAM.
My issues with the 970 started when I found that the R9 290X was both faster and cheaper. Then I found that they didn’t stick to their power targets in reviews, often pulling up to 240-270W on load at stock speeds. That’s nearly TWICE the advertised power draw! Then the whole 3.5GB issue came up, followed by the findings that NVidia lied on the specs…
Makes me ashamed to have a 980, and it doesn't help Nvidia's case that they also locked the overclocking on my laptop. I was running just fine at 1300MHz core and 70C, thank you very much.
NVidia’s pissing me off lately, can’t wait for the R9 300 series so I can get this POS Maxwell out of my system.
R9 290X is both faster and cheaper. And AMD didn’t blatantly lie about the specs to sell more, then just hope that no-one notices…
They deserve to be sued over this. They are liars who cheated people excited about a card that on paper made much more sense than the 980 price/performance-wise. People should be very careful about buying their products again…
No, they lied about the spec of their product. It’s that simple. Saying you don’t need that memory for 1080 is also false as you can hit over 3.5GB in titles and especially with custom downsampling and high AA. The 970 isn’t as advertised and Nvidia are 100% to blame.
Card is not a beast. Card would be a beast if it had the spec as advertised. Some people are unwilling to accept they got conned
True. I don’t see Nvidia suffering much because of this even though they should.
Except even Crysis 3 maxed out at 1440p with 8x MSAA only hits about 3.2GB, so your point is invalid. The cards are NOT made for 4K; they are made for up to 1440p.
They messed up big time
It is still a card that performs at a much better price/performance than the 980. On paper and in benchmarks it still does perform at that level. Especially since overclocking the 970 can make it faster than a stock 980.
So you use their computer and know for sure? You are the liar and the person with a problem. Go back to your mom’s basement you air thief.
There’s a couple problems here. The first being that the GTX 970 and R9 290X are basically the same price. The other being that the R9 290X doesn’t really outperform the GTX 970. Sure you can cherry pick a few games at specific resolutions where the R9 is clearly better, but I can do much more of the same with the 970.
It’s a bad idea to mention how much “better” you believe the R9 290x is and then bring up power consumption on the very next line. On full load, the 290x uses 70 watts more than the 970, and the 970 uses nearly the same amount of power as the GTX 680, nVidia’s flagship from 2 model lines ago.
The bottom line is that despite their fudging of the technical specifications, the card performs as intended, as expected, and as advertised. You ARE getting the performance that you paid for. For justification on this, you might want to look at the last 3 launches of nVidia’s flagships and the next step down and cross compare their performances and prices. You WILL see that I am correct.
Accessing it requires tying up a controller that can only allow one read/write command at a time, which means the extra .5GB not only has lower throughput but also affects the timings of the 3.5GB pool. So why do you think Nvidia engineers decided to provide a convoluted path for that .5GB? Do you think it’s because it’s faster than system RAM or because Nvidia executives really wanted to print “4GB” on the box?
Your statement just might be one of “the stupidest things”. If I shelled out $370 for a GTX 970 and got the performance that it brings, then I got what I paid for. The REALISTIC performance difference between the 970 and 980 are in line with the differences between the 770 and 780, and the same between the 780 and 780 Ti, and the same for the 600 series. And the price difference of said cards is the same as the price difference between the 970 and 980. Absolutely NONE of the people who bought a GTX 970 realistically thought they would get within a 10% performance difference despite the $200 price difference.
The bottom line is that the graphics card IS what it should be, the technical specifications are not. There is a verifiable track record between cost & performance of nVidia’s #1 and #2 graphics cards in a model line, and these 2 are no exception.
WBM Opens Legal Investigation Into NVIDIA GTX 970 Graphics Card – http://www.wbmllp.com/investigation-of-nvidia-gtx-970-graphics-card
So fix your PC.
A GTX 970 without any defects is called a GTX 980 and costs $200 more. It is the same chip after all.
Can you point me to where they lied about their product to consumers?
It doesn't need that memory. You'd probably be over 6GB of VRAM before you ever needed 3.5GB of it at the same time. This is why the card works fine using 4GB of VRAM. It can even handle a 6GB load with less stutter than a 290X.
You sir are a moron. The TDP of the 970 is 145 Watts.
The 290X should be banned for being a fire hazard. It doesn’t follow safety standards and constitutes a danger to human life.
A GTX 970 without a flaw is called a GTX 980 and costs $200 more. Are you really so retarded as to think that there is no difference in a lower-end product costing hundreds less?
Please find a mental health professional.
They just need to put a warning label on their cards for retards like yourself.
Warning: Products that cost hundreds more have better performance.
or
Warning: The GTX 970 and GTX 980 do not have identical performance.
Apparently this is needed now?
When did Nvidia lie to customers?
It was a press packet, and no, engineering does engineering, not marketing. I haven't seen the press packet. Have you?
The press didn’t even quote Nvidia when they said Nvidia lied.
A GTX 970 without flaws is called a GTX 980. It’s literally the same chip. Why exactly did you think one cost $200 less?
You are a liar. If it was the memory it would affect everybody, yet nobody has shown evidence of an issue.
You are probably an AMD troll. That or are having issues with your system. Do you know what hard faults are?
Every benchmark posted when the card was released proves otherwise. The card still outperforms everything except for the GTX 980.
Seriously, it’s like everything was all fine and dandy until someone pointed out that the card was performing worse once it tapped into the last 0.5gb of VRAM. Then, oh no, the card was suddenly worse than everything else.
The card is still a beast. I have it. Every game I play runs at their highest settings with very, very little performance drops. Only exception is Arma 3. I don’t feel cheated just because of the last 0.5gb. I upgraded from a 560M. I’m happy with the performance as it is.
Get over yourself. They might have lied about the specifications but the card still kicks ass, performance-wise.
NVidia seems to be a bush of liers.
nearly rvrything i know they promised/said had hook or were plain lies.
if you want a list leaver a note in the comments, but this list is going to be long….
The 4gb arent the only isue, there werre also less L2 storage and less ROPS activated then said. You could say “there are still 4gb”, but their arent 64 Rops activated, as promoted, their are just 56, and the L2 is reduced from 2048 to 1792 KB, in both points they just plain lied.
They also lied about the ROPS, 56 instead of 64, and the L2 ,1792kb instead of 2048kb.
Were you dropped on your head as a child? If the card had an issue using 4GB everybody would have the issue. This is not the case.
Every time a new card comes out people complain that it stutters. It tends to be related to DPC latency, ISRs, and hard faults. There are some real special short-bus retards who flip out and never ask for help. They think they know everything, and that turns out to be absolutely nothing. People flipping out that frame rates drop at higher settings. Go figure.
Also, 32-bit apps can only address 2GB, or 4GB with the LAA flag. Of course they will stutter if you manage to fill the VRAM, as they keep a copy in system RAM.
If you can Alt+tab out of a DX 9 game, it’s keeping a copy of VRAM in system memory.
He speaks the truth. The TDP was a lie, the ROPs were a lie, the L2 was a lie and the memory layout was a lie…
If I buy a fast car but only use it at low speeds, doesn’t mean that the car maker can fool me and say it has a maximum speed of 200 Mph, when in reality it is just 100, because I’m still paying an extra for that performance. Nvidia is at fault, they either sell it for a lower price, or get all the lawsuits they deserve.
On the 970: the ROP count, L2 size and memory layout.
On Maxwell: an improvement of up to 20% (compared to Kepler) where up to 70% was promoted.
On Kepler: fully DirectX 11.1 compatible, when it just wasn't.
The list could continue for a very long time…
They just LIED to their customers!!!!
You're going to comment on all the posts here saying that people are morons and shit? Get a life. Someone just has self-respect and doesn't accept corporate bullshit the way you do. We're all different, you know?
The 290X is indeed cheaper. You can find good 290Xs for less than $300, like this one: https://pcpartpicker.com/part/powercolor-video-card-axr9290x4gbd5ppdhe . Meanwhile the cheapest 970 is about $320: https://pcpartpicker.com/part/gigabyte-video-card-gvn970ixoc4gd . From the reviews I've seen (and I've seen quite a few), the 290X is generally about 5-10% better than the 970, with the 980 about 10% better than the 290X. Is the 970 a good card? Yes. Is it as good as the 290X? No.
Yes, the 290X does use more power, but we knew it used more. Many times I've had to intervene on the PCPP forums warning people about the erroneous power figures so that they don't blow up their power supply. The 980 is even worse. They'll often pull high-200s to low-300s at full load. I can confirm this myself, as my 980 G1 pulls 305W at full load and 330W with an overclock, according to MSI Afterburner.
The more I scroll down reading your comments, the more I puke.
If it was the card's memory system, everybody would have the issue, and they don't. Every time a new card comes out, people complain about stutter. It's actually the fault of Microsoft and of people who give poor advice. Download LatencyMon and check for hard faults when you are having stutters. Which drivers are having DPC issues? I'll bet you don't have enough system memory. Many games keep a copy of VRAM in system memory, and this can push things into the page file. When those pages are needed you get a hard fault and the entire game stalls. Drivers can also act up and cause DPC latency.
It isn't Nvidia's job to fix your PC. They aren't Geek Squad. You should have looked for help with the issue. I had issues with my current build. The biggest issue was ASUS drivers and software. Asus AI Suite II gave me DPC latency so high I couldn't even watch YouTube videos without them stuttering.
How much RAM do you have? If you have 16GB have you tried turning off the page file? This can help a lot. Have you turned off things like prefetch, superfetch, readyboost?
AMD claimed the R9 290X was 1GHz, yet customers only got a 700 MHz card.
The GTX 970 performs the same as it did in the benchmarks.
AMD sent special cards to the press and did a full on bait and switch.
This is just a minor slip up. I do find the slip up somewhat insulting, although they weren’t really talking about the 970 at all. They literally said 2 sentences about it. All Nvidia was talking about was the GTX 980.
Want to see a video of the 970 using 3.9GB of VRAM and running butter smooth?
How much system memory do you have?
… And the 290X was selling for $550 before NV released the generally superior 970. You can thank NV for the great prices on the 290X, not AMD!
Just because NVidia advertise 145W doesn’t mean that’s what it actually draws.
Ex.
1.) This 970 pulls over 210 W at maximum load: https://www.techpowerup.com/mobile/reviews/MSI/GTX_970_Gaming/25.html
2.) The 970s in this review pull 30W less than a 250W 780Ti, and just 60W less than a 300W 290X: http://techreport.com/review/27203/geforce-gtx-970-cards-from-msi-and-asus-reviewed/5
3.) The 980 is even worse, pulling well over 300W on load: http://www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_980_G1_Gaming/25.html
4.) Another review of the 980, the reviewer states that the 980 draws 300 out of the 400W the system draws by itself: http://www.eteknix.com/gigabyte-g1-gaming-geforce-gtx-980-4gb-graphics-card-review/17/
5.) This review shows the 970 and 980 pulling just less than and more power than a 250W 290, respectively: http://www.techspot.com/review/885-nvidia-geforce-gtx-970-gtx-980/page7.html
6.) Note that the 970 pulls over 240W by itself: http://www.eteknix.com/asus-strix-geforce-gtx-970-4gb-graphics-card-review/17/
I love how you complain about the TDP of the 290X but the 980 is completely fine. Let's look at power draw, shall we? To humor your idea, we'll use a 290X Lightning, the biggest, heaviest, and fastest 290X, comparing that against the card I personally own, a GTX 980 G1 Gaming. For the power draw of the Gigabyte we can look at two of the links stated above, both of which show a power draw of over 300W, as well as my personal experience, having my G1 pull 305W on stock load and 330W on overclocked load. Compare that to the maximum draw of the Lightning: http://www.guru3d.com/articles_pages/msi_radeon_r9_290x_lightning_review,11.html states that the Lightning does 287W at full load, and http://www.tomshardware.com/reviews/r9-290x-lightning-performance-review,3782-5.html (and the next page) states that the maximum draw they could get was 272W.
How about temps then? For this we should compare two cards with similar coolers. So how about http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_980_g1_gaming_review,8.html and http://www.guru3d.com/articles_pages/gigabyte_radeon_r9_290x_windforce_3x_oc_review,11.html ? Note that the 290X runs just 9C warmer, with the older cooler rated for 150W less power dissipation. With some of the larger coolers, like the Lightning, Vapor-X, or PCS+, it's entirely possible to run in the mid to lower 60C range on full load.
Please, do enlighten me as to what rules the 290X is breaking, because I only see NVidia falsely advertising on not one, but two of their GPUs…
Eh, more like $450. I should know, I was planning on getting a couple before the 980 dropped.
They did it because it was an improvement. They didn't need to cut the bus down to 192 bits to repair the damage, and this let them sell the card for less. The 670 was $399. You lose 1%-3% at most and save a lot of money. The card is fine, and if there was an issue with the memory, everybody would have the issue.
Seeing as only 5% were returned, there could be some bad cards still out there, since lots of perfectly functional cards were returned and 5% is about the normal failure rate.
Lots of Nvidia cards have this. It hasn't caused issues before and it hasn't been found to cause an issue on the 970. Close to a dozen cards have had two pools of memory, like the 660 Ti for example. They highlighted the 660 Ti and explained its memory system. The 970 wasn't really talked about; Nvidia was all about talking about the 980.
The 970 doesn't stutter, though. There are systems that stutter, and they will get the same stutter with the 980. Now, there could be people with defective cards who didn't know, as the defect was in the 0.5GB. You'd see artifacts. I did see a video of somebody with a defective card.
Why wasn’t AMD taken to court for a pure bait and switch? This is simply an accusation of a mistake in a press packet. Nvidia wasn’t even quoted in the articles.
AMD falsely claimed higher clock speeds and sent out special cards to the press.
The marketers should get taken behind the shed. I'd also be for technical writers getting an industry-wide beatdown, as they lack any technical knowledge.
Why does AMD get special treatment though?
You literally have no idea what you are talking about. So you think that the card is using every texture at the same time? Really? It is drawing everything in the game at the same time? Nope.
So long as it’s on the card it can be swapped. It’s only when it’s not on the card that Microsoft demands that GPU’s stall. Are you aware that games tend to keep a copy of VRAM in system memory? So how much system memory do you have beyond what windows uses + VRAM + what the game needs?
So 8GB for dying light + what windows uses + you have a 4GB card.. I’m thinking 8GB isn’t enough. Do you know what memory faults do? They stall a game. If the game has been pushed onto the hard drive it’s stutter city.
Another issue is that the game puts out an obscene amount of draw calls. This can cause DPC issues, which lag the game and cause animations to jitter. The timing is off, giving the appearance of stutter without the FPS actually reflecting it. People will chart out the FPS and say you are crazy because there is no stutter in the numbers, but you see stutter. In the case of DPC latency you are not crazy, but it isn't the card; it's that drivers can bump the game and throw off timings.
This is all on Microsoft, and Windows 10 should be better about this. That is, if MS doesn't find another way to F it up.
Windows is not a real-time OS like consoles. Timing is offered on a best-effort basis.
Driver issues also show up as stutter in games. If it's bad enough, YouTube and even audio will stutter. It needs to be really bad for games to crash, but issues first show up as stutter.
This is a very weak case.
AHAHAHAHAHA. A POWERCOLOR CARD. Yeah, you enjoy that off-brand.
What happens is you move around and new assets are loaded into the slow block of memory. The card will not spend time swapping data back and forth between the slow and fast memory, so you end up with situations where certain areas stutter. It does not have to use everything at one time; it just has to be an area where the data in the slow section of RAM is being used for on-screen assets. You can replicate the issue by going to the second part of the city, then running across town and entering a few of those rooftop houses. You will see that areas where you could smoothly pan around will start to develop issues where moving in a certain direction causes stuttering, but restarting the game and spawning nearby will give smooth performance until the memory usage skyrockets again.
It takes some work to get that issue, but this is with current-gen games. As 4GB becomes the norm and you see games designed around 4GB VRAM for high settings, you will be in situations where the game will very likely be actively using data across more than 3.5GB of RAM almost all the time. One of the main things game devs do when there is more VRAM to work with is skip texture compression (to speed up performance, since VRAM is extremely fast). They also use higher-resolution textures, e.g., 8K-16K textures for a skybox and higher-res textures on all common objects; many professional texture packs are available as seamless 8K x 8K images (32-bit). With high bandwidth it does not significantly impact the card to use higher-res textures, as the calculations do not become more complex. In situations like this, the GTX 970 will choke on a game that is designed around 4GB VRAM.
If you want to test this, see if you can acquire 😉 a copy of the Unreal Engine 4 editor, then play with texture resolutions in any of the demo environments (if you have no massive textures, just have Photoshop generate random noise and save that as a texture). You will see that performance does not really change until something happens to cause the engine to be starved for RAM and begin using system memory. High-res textures take more memory but have only a very small impact on performance when the GPU does not have to do much beyond displaying the texture (added effects on a textured surface, e.g., making it a little shiny or putting water puddles on it, can be done at lower resolutions as part of a separate layer).
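For a sense of scale on those "8K x 8K, 32-bit" texture claims, the footprint is simple arithmetic. This is a minimal sketch; the 4/3 mipmap overhead is the standard rule of thumb for a full mip chain, and `texture_bytes` is just an illustrative helper, not any engine's API:

```python
# Memory footprint of one uncompressed 32-bit (RGBA8) texture.
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    return base * 4 // 3 if mipmaps else base

MB = 1024**2
print(texture_bytes(8192, 8192, mipmaps=False) // MB)  # 256 (MB, base level)
print(texture_bytes(8192, 8192) // MB)                 # 341 (MB, with mips)
```

So a single uncompressed seamless 8K texture is a quarter gigabyte before mipmaps; a handful of them, left uncompressed, blows straight past the 970's 3.5GB fast pool.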
How would you feel if you bought a car that was advertised as 500 horsepower but later found out you could only use 350 horsepower? Yeah, it's still powerful, but you were duped into paying hard-earned dollars to get the 500 hp. Wouldn't you be pissed if the Big 3 lied about the specs on your $40k+ muscle car? Well, this same concept applies here. If Nvidia had been truthful in the first place, all this would have been avoided.
The game engine was not designed around a GPU configuration like the GTX 970's. For modern systems, developers typically do not have to worry about video memory speeds, since cards powerful enough to render the game usually have such a massive amount of memory bandwidth that devs don't need to think about it. The issue comes when memory lacks the bandwidth needed.
Devs designing a game for a late-2014 to early-2015 release cannot reasonably be expected to plan for a modern high-end GPU with some of its memory performing at the speed of the RAM in an 11-12 year old video card (look at the GeForce FX series).
Bunch of wankers. If nobody had said anything about the issue, nobody would be complaining, because the card works fine; it's way better than a 770.
“only a small fraction of GeForce GTX 970 owners plan to return their graphics cards ”
Not so small:
http://www.pcper.com/news/Graphics-Cards/Amazon-Newegg-and-others-offering-partial-refunds-GTX-970-purchases
also, two more lawsuits are brewing:
http://www.wbmllp.com/investigation-of-nvidia-gtx-970-graphics-card
http://bursor.com/investigations/nvidia/
Saying you can't show me would have sufficed. Pity…
If the 0.5GB pool was not there, with any other architecture you'd have a 3GB card with a 192-bit memory interface and even fewer ROPs and less L2 cache instead of what we got. Did nVIDIA fuck up? Yes. Was this mistake blown out of proportion? Yes.
You know what I saw? I saw numerous sites try to bring the card to a crawl and they were all unsuccessful. One of those sites was pcper.com.
Even running BF4 at the equivalent of 6K by setting the resolution scale to 150% didn't do it. Did it come to any kind of crawl? Not even close.
In fact they said the only way to do such a thing would be to enable unreasonable settings for a GTX 970.
So I showed you mine, now you show me yours.
WHOA! Slow down. I can’t process all this information right now. I’ll need a couple days to do some extensive research on your unfortunate problem.
Blah blah blah. "The" game engine? What are you talking about? Any situation where more than 3.5GB of VRAM is used slows down the 970.
Someone hasn't seen the PCS+ in action… PowerColor hasn't been the best in the past, but they really stepped up their game with the R9 200 series, and the PCS+ is one of the best cards you can get. Still don't like it? Here's a Gigabyte for even less: https://pcpartpicker.com/part/gigabyte-video-card-gvr929xoc4gd
Question: if the 970 came with all the ROPs and the extra 256KB of L2, would it perform better? Possibly. So are you happy with your quite possibly under-performing GPU that you paid for? You might be, but it seems others aren't, because it isn't what they paid for. They paid for a product that was said to have all the ROPs and L2 as specified.
I can understand why people are suing.
i have GTX 970 and i will still buy GTX 970 so suck it lawsuit hahaha
You're a bit of a twat, aren't you, Abram? We all know a 970 hasn't got the power of a 980! What people are against is the fact that Nvidia told a blatant lie to its customers and the press, and hasn't even so much as produced an apology.
That is why he said TDP, not draw wattage.
You're putting words in his mouth.
I have no issues in most games with one 780 Lightning at 1440p, but for some games with big open worlds or lots of PhysX I need a second 780 Lightning. Some games also run at 2160p with two Lightnings, but again it's the PhysX explosions bottlenecking the GPUs, not so much VRAM peaks. I would rather buy the 780 Ti Classified at this moment in time.
I've used Afterburner to see that it's always when the game hits that slower 500MB of VRAM that the FPS gets hit… All these Nvidia fanboys are pathetic; the company isn't paying for this…
Does the 780 have VRAM modules at two different speeds?
This will do nothing.
It's like mrluckypants96 said: $450 ($550 was the entry price when it was released).
Also, we have spoken about the TDP lie, and all the other lies by Nvidia about the 970.
Just admit the problem with an apology and give all 970 owners a free game; that'll probably do it.
To be fair, I only game at 1080 HD, and yeah, Nvidia should just do a proper apology. But in all my benchmarks and real game scenarios, where I see up to 4066MB being used because I force it, I don't see a single stutter, so I'm getting a very different experience to you.
A defect implies a fault or mistake, and as Nvidia did this by design, that terminology is incorrect. Do you really think Nvidia meant to lie, when even with the separate slower pool of RAM this card annihilates all the competition? This card is fantastic, and it's just that good that there's no need to lie, so I believe it was a mistake on the spec. But Nvidia have handled it poorly; they should have just come out, apologized for the error, and offered a small token in the form of a credit note, small refund, or a free game, you know, the same free game they offered to the 970 users who bought the card later. But again, they designed the card the way it is, and as much as I hate their mistake, the card is simply the best I have seen in years; even the overclocking headroom is beyond any card I have seen in about 12 years, lol, it's that good and quiet. I've yet to experience any stuttering whatsoever. Far Cry 3, 4, Blood Dragon, BioShock 1, 2, GTA IV (though a crap game), BF4, Tomb Raider 2013, Crysis 3, Wolfenstein all run at insane FPS and ultra smooth on ultra graphics settings in HD and at 1440, so I don't get that part people are going on about.
Yeah, I saw them try to, and they just couldn't, lol.
Yeah, in the marketing they said 64 ROPs, and the speed was also higher; the speed was a factor that got me to buy the 970 over a 290X. So it was a lie via a marketing mistake. The truth is, people are upset because Nvidia tried to sweep it under the carpet instead of saying sorry and "hey, here's a little token, like a free game", because a marketing lie or mistake is misleading. However, the 970 is a magnificent card, and even professional game testers are struggling to make it stutter; just watch the YouTube vids, lol. There are also a lot of lying vids; I've seen people run BF4 in HD, and even before it reached the 3.5GB RAM mark the game was stuttering, and I know that's complete bollocks, lol. Gotta love all those trolls. Nvidia, sort out your reaction.
The 290X isn't faster, dude. I had both cards, and yes, as you go up into higher resolutions the 290X does start to overtake the 970 G1 a little bit. But in HD the 970 annihilated the 290, and by a big amount. Add to that a card that runs very hot, is very noisy, and uses more power, and yeah, I'll pay 30 pounds more for a quieter, cooler, more power-efficient 970 G1, which I'll make back in what I save on power in two years, so cost is actually a moot point. I did some calculations, and as I have my PC on 24/7, I can tell you it's going to be an estimated 20-40 pounds cheaper a year to run than the 290X. Also, the many benchmarks by credible reviewers all show the 970 G1 beating the 290X in FPS at HD, with the 290X overtaking at the 4K level, so you don't have to take my word for it; you can just Google it. Either way, it depends what your needs are. Mine is HD to 1440 with a power-efficient, cool, quiet card; that's the 970 G1 by Gigabyte for me. So my Sapphire Tri-X 290X went back.
Do you have some sort of reading comprehension problems or something?
What if the car maker sold you a car saying it could go 200 MPH, and it went 200 MPH? The analogy doesn't fit the situation. I'm an owner of a 970, and I'm pissed at what they did, but let's call a horse a horse and stop comparing apples to oranges.
A better analogy would be buying a car whose engine is advertised to last 250,000 miles, but where the car itself will explode at the 200,000-mile mark. Now we check, and it turns out the engine will only last 201,000 miles. The analogy I just gave still needs some work, but generally you are going to be limited by the power of the GTX 970 before you're limited by the RAM.
I am not saying Nvidia is blameless here or that you should forgive them. They sold a product and misled their customers, i.e., me and everyone else. Just keep your analogy more in line with reality, or give warning where it strays.
http://onegoodmove.org/fallacy/falsean.htm
Yep, I was explaining the reason why. To put things in perspective, the last 512MB of RAM on the GTX 970 performs at the same speed as the RAM found on 11-12 year old video cards (think about what PC gaming looked like 11-12 years ago). It is extremely slow by today's standards, and that is a limitation many game developers do not take into account, because it is an issue that should really not exist. Nvidia messed up on this card: they crippled the memory bandwidth in a video card at a time when, for years, game developers never had to worry about the memory bandwidth of the cards rendering their games. It is also unlikely that game development companies will invest in modifying game engines to handle the memory config of the GTX 970, as it is the only card with such a massive memory bottleneck.
Dying Light is a poorly optimized game, no doubt about that. But the issue is that even a game perfectly optimized to give the best possible graphics at the best possible performance on a high-end GPU will lag, if it uses more than 3.5GB of video memory and any of the data actively being used is housed in that last 512MB chunk.
Nvidia cheated people with what is now revealed to be a card that runs into trouble when you hit the last 0.5GB of VRAM, which isn't hard to do with downsampling, high AA, etc. That gives the card an even shorter shelf life and no future-proofing.
What a humiliation for Nvidia.
The VRAM issue means you run into trouble any time you go into that last 0.5GB. It would be fine if it had been sold with correct specs as a 3.5GB card at a slightly reduced price. Nvidia pretended the card was one thing, and reality showed something else…
That is a completely different thing from the price/performance argument that you yourself put forward in your statement. The benchmarks have not changed, because the cards came with this issue from the get-go; so really, minus Nvidia lying about the true specs, the price/performance of this card is still pretty good.
Upset that Nvidia lied? Yes. But arguing price/performance against the 980, when overclocked consumer 970 cards can beat a stock-clocked 980, has no basis.
I mean, at $330 for that performance compared to $550+ for the GTX 980, the premise you are trying to work has no grounding.
Nice argument dawg. Say it again.
OK, no idea why you think I need it explained to me. The reasons why there are issues beyond 3.5GB of VRAM usage, and the design of the GTX 970, are neither here nor there. People who buy a 4GB VRAM GPU expect to have access to 4GB of VRAM. I was responding to the "That's what can happen anytime your settings are too high/not enough VRAM" post.
Were the GTX 970 not designed this way, 4GB would be "enough", even when VRAM usage is between 3.5GB and 4GB.
Also, whether Dying Light is poorly optimized is neither here nor there. We're talking about the fact that performance tanks if it uses the last .5GB cluster.
Nvidia is on the hook for false advertising, for sure.
And your last part is flat-out wrong. The only GPU that has issues with using more than 3.5GB of VRAM is the GTX 970. Why do I even need to point that out to you? Now you're making it sound like all GPUs are designed with this stupid .5GB of gimpy buffer VRAM. No. Just the GTX 970. And they were being sold with no mention of this, leading buyers to believe they were getting 4GB of working VRAM.
I'd be surprised if anyone runs off to link you benchmarks and articles. They are out there. Hence why we're here.
Derp? edit: haha his post was removed.
My point isn't invalid at all. Hence why people have complained about performance tanking when VRAM usage exceeds 3.5GB, say at higher resolutions or in multi-monitor setups.
And again, you have to push the card past what it can handle in order to hit that mark in the first place. THAT IS THE POINT.
As long as there are viable options, I would never buy Nvidia graphics cards again… They even say it was a miscommunication with the marketing department. Are they kidding?! At a company like this, everything is planned to the last detail; don't bullshit us…
I compare this with buying a 600bhp Lamborghini. I mean, you may never need the 600bhp; in fact, the car runs great with 400bhp. But if you buy and pay for a Lamborghini with 600bhp, it had better have 600bhp.
I don't want to overreact, but I feel scammed…
I would sign and be a part on that lawsuit…
The fuck are you even talking about? We're talking about VRAM usage. Resolution is VRAM-dependent, and you can easily force VRAM usage with texture mods and anti-aliasing. The point is, if it utilizes more than 3.5GB, performance plummets, and you go off on a tangent trying to be a wise-ass?
Not sure if just stupid…
"Off-brand"? It's one of many third-party vendors, such as XFX, MSI, Sapphire, etc.
My old 7870 Myst PCS+ was great, btw. Overclocked to a 1175MHz core clock without additional voltage.
You do realize third-party vendors are usually just a different cooler, sticker, and warranty, right?
I'm not some damn idiot who knows nothing about how a PC works. I had ZERO stuttering with GTX 660 SLI, a GTX 780, and now with a GTX 980. With 970s, single or SLI, doesn't matter, I get terrible stuttering ONLY when it goes over 3.5GB of VRAM usage. And I have 32GB of system RAM, so no, nice try. I even threw in another 8GB stick of some slower DDR3 RAM and saw no improvement; tried with 16, 8, 4, etc., no difference. And if it had anything to do with SuperFetch (it doesn't, since I turned it off long ago, but I humored you and turned it back on… same thing) or ReadyBoost (don't use it), it would be doing it all the time, not just in games and ONLY when it hits 3.5GB of VRAM usage on the 970. If this were any kind of system or RAM issue, it would be doing it on the 980 I have in the system now, or the 780 I had in there a few months ago, or the 660s I had in there a year ago. I even took the 660 and 780 and put them back in today, and they were smooth as can be, no stuttering. In fact, when I run an FCAT benchmark, the 970 is the only card that shows significantly wonky frame timings and stuttering, regardless of single card or SLI. I even tried a friend's EVGA 970 (mine is Gigabyte), and his did the same on his system and mine. So cut the crap. Nvidia doesn't have to fix my PC, but my PC isn't the issue; it's this card, and they know it!!
Yes, I usually choose Asus or EVGA cards. I'm well aware of the fact that third-party companies customize and sell graphics cards. I just think PowerColor is trash. I work in I.T., and I watched one of our bigger clients go through 300 of them in a year.
And the drivers don't matter either, forgot to mention that. I've rolled back all the way to the December 2013 drivers and forward again, and not a single one made a difference. And I'm talking GPU drivers, mobo BIOS, all kinds of system drivers and things I tried changing. And I have no Asus software, so that's pointless for me.
I’m more of an nVidia fan than AMD/ATI, but that said, that was some shady business there, nVidia. You know what you can do to partially fix this?
Price drop! Price drop! Price drop! Price drop! Price drop!
An even better analogy is advertising a car as a V8 when it is actually a V6. Two fewer cylinders would be false advertising, same as this card.
There is a difference, though: CUDA cores. Compare a 970 and a 980 on the Nvidia site: no difference in memory, just the core count and clock speeds. Asking for a free 980 is indeed just greedy. But at least allow users who bought these cards, seeing only a slight difference from the 980, a refund, so we can choose which card suits our needs best, this time knowing all the details.
I think the issue is whether the card should truly be considered a 4GB card.
For example, can you market a house as having 4 bedrooms if the 4th bedroom is made out of cardboard, and attached to the side of the second floor of the house with duct tape?
The card may technically have 4GB, but the last .5GB is virtually useless. Its speed is pretty close to that of DDR3-1866 system memory; by GPU standards, we had video cards 11-12 years ago which performed at about the same speed as that last .5GB of RAM.
For a modern game that can take advantage of a GPU as powerful as a GTX 970, you will likely need more bandwidth than that last .5GB can provide; thus, if you are unlucky enough for that last .5GB to be used for something actively being rendered, performance will suffer.
It may not be too much of an issue now, but if a game is designed around having access to a full 4GB of VRAM that is not performing like 12-year-old VRAM, then the card will not be able to handle the game even if the GPU has the raw power to render it. (You will likely end up with issues such as: you look in a certain direction, and both frame rates and GPU usage drop as the card gets bottlenecked by that last .5GB.)
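The DDR3-1866 comparison above can be checked with the peak-bandwidth formula (data rate times bus width). The GTX 970 figures here are the ones widely reported during the controversy (7 Gbps GDDR5, with the slow pool reachable over an effective 32-bit path); treat them as reported values rather than official spec:

```python
# Peak theoretical memory bandwidth in GB/s: data rate x bus width / 8.
def peak_bandwidth(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

fast_pool = peak_bandwidth(7.0, 224)   # 3.5GB pool: 196.0 GB/s
slow_pool = peak_bandwidth(7.0, 32)    # 0.5GB pool: 28.0 GB/s

# Dual-channel DDR3-1866 system RAM: 1.866 GT/s x 8 bytes x 2 channels.
ddr3_1866 = 1.866 * 8 * 2              # ~29.9 GB/s

print(fast_pool, slow_pool, round(ddr3_1866, 1))  # 196.0 28.0 29.9
```

Which is the commenter's point in numbers: the slow pool's ~28 GB/s is in the same league as plain system RAM, roughly a seventh of the fast pool.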
It's impossible for you to be seeing 4066MB used…
Oh… man… where to start with this?
The GTX 970 may be "designed", but not the way you see it. It was designed to increase GM204 chip yields; essentially, to market defective GTX 980s as lower-spec'd, lower-priced products by disabling faulty SMMs. So, yes, it is somewhat defective and somewhat designed. That's exactly why the misleading marketing is unforgivable: they knew what they did, yet they wanted to hide the facts from consumers.
As for the real-world performance of the GTX 970, I can tell you first hand that the card is great bang for the buck. I tested several models in different situations and, for truth's sake, I didn't run into those overhyped issues, but that does not mean someone else didn't. Still, the main point of this article stands: nVidia deserves a class-action lawsuit over the GTX 970 specs. They lied, and I do not believe it was merely a miscommunication.
Exactly. It's hard to think of it as a 4GB card. That .5GB is like $20 hidden in the wallet for emergencies: you know it's there, but you don't want to use it 🙂
Simple math tells you it will cause issues once it is being properly used, but games that can hit that .5GB of VRAM are fairly rare today. Doesn't mean it will be the same tomorrow, though…
You sir obviously can read. So, why don’t you read some reviews?
I agree. It’s not the same thing. But 145W TDP is also not the same as ~160W TDP.
Like I said, I don't condone the lie they told, but I do think it was likely a marketing error that led to the lie (why? Because the card is so good and so special that they didn't need to lie). The 970 G1 card I have takes all my modern, up-to-date games and annihilates them in ultra HD mode. There's no denying this card runs HD games fantastically and better than most cards, including the 290X, though I'd have gotten the 290X if I were doing 4K. Not only does this card fly in HD at ultra settings in all current games, it also does it using less power, which means an incredibly cool and quiet card. So, side-stepping the lie and the terrible response from Nvidia, I'd be very happy with this card even if it only had 3.5GB, hell, even 3GB, and I wouldn't care if they separated the RAM into three pools with two pools being very slow, if it meant getting the awesome performance I and many reviewers are getting from the 970 G1. Oh, and the overclocking headroom is the best I have seen in any card I have owned for years.
Now, I was going to jump on my moral high horse and return this, because what happened did change my decision from a 290X to the 970; ebuyer.co.uk was even ready to offer me a full refund, and that's after my three months with the card. But I just couldn't do it. And why? Well, first, the card is that good in games, but the fact that I have a very quiet PC that doesn't overheat is also too huge a treat to give up.
I've owned the 4850, 4870, 5970, and 6870, and every single one, though great cards, was very hot and extremely loud. My PC sits at 40-45 dBA in non-gaming use, and when I crank up a game it goes up to maybe 46-48 dBA, and that's with the GPU at max. The ATI cards easily surpassed those sound levels; I'm guessing, but I'd say easily triple the loudness, so add 12 dBA, putting them at 58-62 dBA. (If you're not sure about the dBA measurement, you can always Google it.) I don't use headphones, so to me that was a massive issue over my last 10 years on ATI. I'm not saying ATI are bad, because they are not, and I miss certain Catalyst options (not all), but I've put up with loud cards too long, and I'm not going to do that again. PS: all my cards were non-reference.
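For anyone Googling the dBA claim above: the usual psychoacoustic rule of thumb is that every +10 dB sounds roughly "twice as loud". A quick sketch under that rule (it is an approximation, not an exact law) shows what a 12-14 dBA jump works out to:

```python
# Rule of thumb: perceived loudness roughly doubles per +10 dB.
def loudness_ratio(db_delta):
    return 2 ** (db_delta / 10)

# Going from ~46-48 dBA under load to ~58-62 dBA is a +12 to +14 dB jump:
print(round(loudness_ratio(12), 1))  # 2.3
print(round(loudness_ratio(14), 1))  # 2.6
```

So a +12 dBA jump is closer to "a bit more than twice as loud" than "triple" under this rule, but the direction of the complaint stands: those coolers were dramatically louder.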
We all have our opinions. I personally think it was originally a mistake, but I also think they realized it once all the cards were made and packaged, so instead of spending millions they chose to lie at that stage; I don't like that either. Show me a 300-pound ATI card that runs at 1544MHz with 4GB of memory and is as silent and cool as the 970 series, and I'll swap. But for now I'll wait 2-3 years and then go back to ATI, and that's only because Nvidia didn't offer a proper apology and a little token like a free game.
Do you know what binning is? (I'm guessing you do, so this is for others.) They've been doing it for at least the 15 years I know of. It's a practice where you remove or disable something in order to sell a product at different levels, e.g., low, mid-range, or high. It's a very cheap and convenient method to lower costs, because it allows the manufacturer to build the same exact product. If you need to know how that works, take economics, dude, because I'm not here to explain it, but trust me, it results in cheaper manufacturing costs and hence cheaper prices for you and me, the consumers. I remember when you could draw a line with a pencil to make a GPU overclock loads, but they soon made that less simple to do, lol. Also, when you talk about GPU processors, the same system applies to CPU processors: a huge wafer of processors is made, then tests show which ones can run faster and which run slower (they run a current through them); again, that categorises them as the low-end or high-end chips. Overclocking headroom is a happy result that can sometimes come from these processes.
Either way, my G1 card, you know, the binned 980, exists so that you and I can buy and afford the card at 300 pounds; 500 pounds is beyond my means. And then with some overclocking I'm reaching 1544MHz, which is a little better than a stock 980, and I'm saving 200 pounds, and that ain't a bad thing in my eyes. But then again, I can only go by HOW I SEE IT and the very basic theory I have read about. I'm sure if any information here is dodgy you can just Google it. Thank you for the education, and feel free to correct anything I may have gotten wrong, lol. I actually really do appreciate it, as it's good to learn new things, and being corrected is often part of that process, so thanks truly.
Like you said, others might have issues, but unless they wanna play games at 20-30fps I really don't see anyone hitting this issue; then again, that's assuming they don't play at 20-30fps at 4K. They could just turn AA down one notch, and that would sort out the very rare issue.