Although it was expected that Nvidia Corp. would release its new GeForce GTX 800-series graphics cards for desktops this September and October, the company has decided to skip the GeForce 800 family for desktop PCs and jump straight to the GeForce 900 line of products.
Model numbers of modern central processing units and graphics processing units have been criticized for years: in many cases they barely reflect the performance and positioning of products, and often do not even indicate their generation. At present Nvidia offers the high-end GeForce GTX 700 series for desktop computers. At the same time, the company sells the GeForce GTX 800M family for notebooks. What is rather confusing is that both lineups – the GTX 700 and the GTX 800M – are based on similar chips powered by the Kepler architecture.
In a bid to avoid confusion between its upcoming families of graphics processing units for desktop and mobile computers, Nvidia decided to sell its products powered by the second-generation Maxwell GPUs code-named GM200, GM204 and others under the GeForce 900 moniker, reports VideoCardz.
As a result, Nvidia is now expected to formally introduce the GeForce GTX 970 and the GeForce GTX 980 graphics cards on the 19th of September, 2014. The GeForce GTX 960 is projected to arrive sometime in October.
Nvidia did not comment on the news story.
KitGuru Says: While the idea of selling chips of the same generation under similar model numbers makes a lot of sense, it should be kept in mind that in order to complete the GeForce 900 and the GeForce 900M product lines, Nvidia will likely add old chips to the new families and therefore confuse end-users again.
really?
Why the hell skip it? This just means they have to start a whole new naming scheme even sooner…
Didn’t they do this before?
They did this with the 300 series. There were no mid- to high-range cards in the 300 series.
You act like there won’t be a GTX 1000..
What is going on here?
You forgot to log in to another account to answer your own question(???)
They’re acting as if we think that a card’s quality is tied to its number – like the 900s will be hugely improved over the 800s, when it’s just a naming convention. It makes no sense to skip a product lineup’s chronology.
Unless, of course, they’ve already developed the next-generation video card’s architecture and are planning on moving away from the GTX line of cards.
A lot of people do think that believe it or not.
they might have rebrands in the 800 series though, for laptops or something.
It’s not that they think we align performance with the nomenclature. This was a good move because, in any case, the desktop versions of the 800 parts weren’t going to deliver an all-outclassing performance jump compared to the 900 series, which they had already developed to a good extent. On top of that, the public response to the 800 parts would again be negative – people asking why release a new series for a performance update the size of a game patch. And then there are the competitors, who are just raring to use any opportunity.
so, if it’s supposed to be out next month, what are the specs?
No specs = no confirmation
The cards will not be released next month; rather, that is when they will be formally introduced – a paper launch showing off the cards and explaining specs and features, with actual products coming to market afterwards.
If it’s basically the 800 series with a 900 tag slapped on it, then the first line states they were supposed to be released in September and October.
lmfao
They probably did this to deal with the messed up Maxwell naming scheme.
you’re an asshole, maybe he just recalled it later, or found it online.
They also did it with the GTX 100 series (Which was a rebrand of the 9xxx series too IIRC)
I’m really still waiting for the high end cards. Remember when they used to introduce high end cards first? 😐
I want/need a GTX 980Ti that is significantly faster than the 780Ti and has a bit more memory and memory bandwidth!
Well, now they are gonna mess up the 900 tier of chips.
Keep waiting, in the meantime, AMD is not only catching up, they are actually getting ahead.
It’s official! GTX 880 confirmed weak.
It’s slower than 780 Ti.
For me, all of the talk about GM 2xx chips in the 8xx series was weird; per Nvidia’s naming scheme those were supposed to be 9xx parts since Maxwell was announced. Though the question remains why they skipped the GK 204/210 and GM 104/110 parts.
It’s all a way to promote. My GTX 680 is still good to go. My first upgrade before a 980 would be a 4K screen.
“KitGuru Says: While the idea of selling chips of the same generation under similar model numbers makes a lot of sense, it should be kept in mind that in order to complete the GeForce 900 and the GeForce 900M product lines, Nvidia will likely add old chips to the new families and therefore confuse end-users again.”
I disagree with this statement; if you have an interest in technology you can see straight through to the architecture and process node behind the naming and make an informed decision. There are people out there who will happily buy a BMW 530, not realising they can buy a 525 with the same engine slightly detuned – is it important that it’s the same engine? Maybe for some petrol heads it is.
If you are a disinterested mainstream consumer, you only need to know that a GTX 660 is faster than a GTX 650. As long as this is the case, why would it pose a problem for the average person with no technology interest who casually buys a PC for the family and wants to add a bit of graphics pep for the kids’ gaming? They are even more likely to regard Intel integrated graphics as enough. To me, anyone interested in a GTX 660 or above, with $200+ to spend, knows the technology to a tee and is shrewd/savvy enough to make up their own mind; let’s face it, we’re talking about the price of a next-gen console for a graphics card. Nvidia’s marketing heads clearly know this, which is why they are paid seven-figure salaries.
Shadowplay and Nvidia Drivers still tend to be superior.
Ahead? Those power hungry overheating things… getting ahead? Hah.
Then they had better kick up the performance so the 980 has 60% better performance! Because almost every new release has been about 30% better than the previous x80.
I’m guessing this is the other account then…
Just in case it’s not, he could have used the “Edit” option here on comments 🙂 So just chill, dude.
oh wow nvidia upping their gpus already dayummm!!!
I paid £150 for an R9 280. I’ve overclocked it to 25% higher than default and it doesn’t even hit 70c in games. And I can’t hear it either. It beats a GTX 770 which is well over £200. It uses less power than my GTX 570 and is considerably more quiet. It’s a great investment and I will be sticking with AMD.
I think that is what I would call “Ahead”
Well, to be honest, I’m not bothered about AMD/nVidia arguments regarding desktop GPUs, as each is better at different things that don’t include gaming.
AMD > nVidia for GPU mining.
nVidia > AMD for scientific computation.
But when you get to notebook GPUs, AMD are horrendous and tend to have a large slew of overheating issues (the last notebook I used had an AMD card that wasn’t desperately bad performance-wise, but for a low- to mid-end GPU the amount of damage it did to the system was unforgivable).
Currently running a 780M on my laptop and everything runs great – no issues. Even when the GPU runs around 90°C I get no issues with the other hardware, because nVidia thought about the cooling system.
Well, Nvidia still has built in Driver HBAO+ http://forums.guru3d.com/showthread.php?t=387114
along with SGSSAA and other AA method support
http://forums.guru3d.com/showthread.php?t=357956
(In DX9 only at the moment, at least) that AMD doesn’t have. And that’s enough to keep me on the green team… unless Nvidia decides to ignore https://forums.geforce.com/default/topic/544701/geforce-drivers/petition-for-directx-11-anti-aliasing-driver-profiles/
Then I’ll probably end up keeping an Nvidia card for that kind of stuff and buy an AMD card for the future.
in 4 years they will reach the 4000 series and it will be the end for both x000 and x00
You both need to chill. 😛 You’re jumping to conclusions based on the guy responding to his own post.
It was a terrible joke, I didn’t think it’d get out of hand.
If anyone was offended, I apologize sincerely 🙂
careful with joking on the internet, mate
the internet is serious business ;D (j/k)
Then I think you mean it better have 69% better performance. 1.3 * 1.3 = 1.69… not 1.6.
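The compounding arithmetic in that correction can be sanity-checked with a short Python sketch (the helper name `compound_uplift` is purely illustrative, not from any source): two successive 30% generational uplifts multiply rather than add, giving 1.3 × 1.3 = 1.69, i.e. 69%.

```python
def compound_uplift(per_gen: float, generations: int) -> float:
    """Total fractional speedup after `generations` uplifts of `per_gen` each."""
    total = 1.0
    for _ in range(generations):
        total *= 1.0 + per_gen  # each generation multiplies the previous baseline
    return total - 1.0

# Skipping a generation with ~30% per-gen gains implies ~69%, not 60%.
print(round(compound_uplift(0.30, 2), 2))  # prints 0.69
```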
They thought about the cooling system, yet it runs at 90°C? Okaaaaaayyyyyyy.
Yeh, I’ll agree with you on the laptop side of things. On the low-end gaming scale I’d recommend an AMD APU if someone is just playing some Gmod or something like that. But for anything more I’d have to go with nVidia, as they use less energy and don’t get as hot. Although I can’t speak for the latest ones.
90°C is expected on a laptop. It’s pretty standard when playing a game. Laptop chips can generally withstand up to 110-120°C before shutting themselves down or getting damaged.
Nvidia will always be better than Radeon. In gaming both cards perform the same. When it comes to CUDA support, PhysX and render support, Nvidia wins all day, because Radeon has no support for those features. And that is the only reason why Nvidia is more expensive. Also, AMD CPUs and Radeon cards can die faster than Intel and Nvidia.
I’m not sure a lighting effect and AA are really worth the extra. Also, HBAO+ is available on AMD, but can’t be added to games like it can on Nvidia.
My R9 280 cost me £150. It outperforms a GTX 770 which will cost £230+. How do they both perform the same for the price?
And AMD is better at many things as well, not just pure game performance. For example, they considerably outperform their similarly priced Nvidia equivalents at Bitcoin mining, or any mining.
My previous laptop had an 8970m which ran at most 85°C during the summer with an ambient temperature of at least 32°C. It has the performance of a 780m. Say what you want but AMD’s mobile GPUs are amazing.
okay have fun mining your BitCoins while I enjoy my higher framerate
I doubt they will do that…Nvidia will probably think of an alternative naming scheme like AMD did.
They have the 800 series for mobile GPUs.
It gets weirder than that, even. The 8800GT was released after the 9800GT using the same GPU, but a narrower memory bus. The confusion can occur in both directions. *shrug*
Well no shit…
So now I have to wait til October, might as well just get a GTX780 now
It’s because the mobile parts are already the GTX 800 series and the new 900-series parts will be out soon. They skipped the GTX 300 series in the same fashion, lol. Are you new?
What higher frame rate? The 295X2 kills anything that Nvidia has on the market.
Overheating? Only the reference card was hot – everyone knows AMD’s reference design sucks – but I have not heard of any of the non-reference cards overheating. Hell, I have 3 290s that run nice and cool all day long.
It won’t be on 20nm, so unless AMD or Nvidia come out with 20nm GPUs I am skipping this round, because why would I buy a new card that is just a rebranding of an older card that might be 5% faster if we are lucky?
Well Kieran, are you one of those AMD fanboys who never owned the new Nvidia cards and just Google benchmarks, and seeing better results, think it will actually outrun them? Sorry bro, I owned both cards, and I admit it’s true the R9 280 outruns it in benchmarks any day. But when it comes to actually playing games like Skyrim, Watch Dogs and all the other GPU-hungry games, the performance is the same. As I mentioned, AMD is pure gaming; Nvidia and Intel are for gaming and also design work. That is the reason I have Nvidia and Intel: I am not a pure gamer, I am a gamer and I also do design and editing. So what I am saying is (since, once again, I have owned both cards): in benchmarks you will see the R9 280 outperform a GTX 770, but when it comes to actually playing the game, there is no difference at all. Think what you want, I don’t mind, but all I am sharing here is my experience with the R9 280 and the GTX 770, both of which I have owned. I sold the R9 280 and got a GTX 770.
Are you new? Fucking look at the other comments – I know that already. And even then, it says it in the fucking article; how could I not know that?
Thing is, both AMD and Nvidia run neck and neck for the most part (I leave the 295X2 out of this because it is a monster of a card); one never really outdoes the other, which is good for prices. The only thing I don’t like is how hard some of you guys want one or the other to die. God, if either of them goes belly up, say hi to $1000 mid-range video cards.
Well, actually, not really. My £150 AMD card outperforms a £230 Nvidia card in all games. For £150 I couldn’t even get a GTX 760, which is miles behind an R9 280. So no, I’ll be enjoying my higher framerates, because I chose the best for my money, while you fanboy away claiming that the drivers are better or some shit like that.
I have overclocked my R9 280 and I compare my benchmarks to ones posted online for the GTX 770, I tend to beat almost all of them. A R9 280 also costs £80 less than a 770.
And no I’m certainly no fanboy, if anything I was the complete opposite to an AMD fanboy, I only had Nvidia up until this point. I had 2xGTX570, my laptop is a GTX850M and I even bought 2 tablets with Tegra 3 chips in them.
If you have the MSI Gaming version of the R9 280, then check this out.
http://www.xbitlabs.com/articles/graphics/display/msi-r9-280-gaming-3g_8.html
Btw. the best ver. of that card is from MSI.
Now, the R9 280 is 5% better than the GTX 760 in gaming. The GTX 770 crushes your sweet R9 280. You can’t even compare those two.
I have the XFX one and I have overclocked it by 25%. I have run benchmarks MYSELF and in almost all cases it outperforms a GTX 770 by a pretty large margin. Why are you trying to disprove me with a link to some unknown site when I have tested it for myself? By default it is not more powerful, but with a large overclock like mine, it is!
They use maybe 50 watts more power. That’s 10-20%. On a desktop, unless you have utterly horrible ventilation and obscene power costs, you shouldn’t care about something like that. That’s why I have 2 r9 290’s in my rig. In my laptop, I went with nVidia maxwell chip. I get the best of both worlds – extreme price/performance on the desktop, and power efficiency for the laptop – where it actually matters.
My old laptop had an HD 5470 so I’m going to assume you had some design issues with that laptop, unless it was more recent. At that time, nVidia was actually having quite a few issues, and I personally knew people with nVidia chips in their machines who were experiencing serious overheating.
http://semiaccurate.com/2010/07/11/nvidia-plays-meltdown-blame-game/
Well you asked a dumb question
Sure…
Quite happy with my dual 7870s, maxing out every game out there for less than the price of the current super-high-end single-GPU cards, AND having great OpenCL support for video rendering, AND having my Intel iGPU to process videos while I play my games.
I don’t feel the need to shell out for an Nvidia Titan or a 780ti but fuck me, right?
“power hungry overheating things… ”
I smell a rat that doesn’t know how to google!
Meh, might as well. There will always be something new. I have 4 Titans and I’m willing to bet it will take no fewer than 3 of the new cards to match them, so I will be keeping these for another few years.
Unless you’re rich and like to upgrade your PC and work on it more than use it, you shouldn’t let upcoming parts stop you from purchasing, ever.
Yea they’re just greedy money grabbing self promoting garbage.
4K ain’t even worth buying now. Get three 1440p screens and run a higher resolution with double the refresh rate for the same price, or less depending.
Ironic that with radeon you get the same performance, better software, and better customer support for half the price.
You’re delusional. The only place they are superior is marketing, due to clowns like you lol. Mind you, this is coming from a lifelong Nvidia fanboy. I currently have 4 Titans. I had so many problems with them, Nvidia’s software and their reps that I built my first AMD rig to prove a point and confirm my suspicions. Now my three-290X rig sits alongside it and has none of the problems that my Nvidia rig does.
The fans are a bit louder though……
Oh u mean your 3 more fps for 350 extra? Bwahaha. I’d say you got had. AND you can’t mine..
I have no experience with triple or quad SLI/4-way CrossFire and all that, just CrossFire and SLI, and I never had any issues with either – 2x 660 Ti in SLI and 7850s in CrossFire, both the same performance in gaming. Nvidia is more expensive due to CUDA, render support and PhysX. I am not sure what you did to get problems with your 4-way setup, and I am not sure why you would want to 4-way Titans either – 780 Ti SLI is way better than Titan SLI. And since I have never had any issues with SLI, you probably did something wrong with your build.
The maxwell release was delayed a full generation. So, they decided to skip one.
I’ve OC’ed my GTX 770 and it wrecks every R9 280X and even R9 290 benchmark. It doesn’t make sense to compare an OC’ed GPU to a non-OC’ed GPU, especially when it’s running in a whole different setup. Go get a GTX 770, OC it 25% and then bench both cards; I bet the GTX 770 would wreck your R9 280.
Btw, I’m not saying the R9 280 is a bad card; it’s one of the best price/performance cards on the market right now, just like the R9 290. But comparing it to a GTX 770 and saying it’s better is ridiculous.
The GTX 770 is positioned against the R9 280X with about a €10-15 difference, and I think it’s positioned quite well. Nvidia will reposition its GPUs as soon as these new cards hit the market.
I’m still running my close to 5 year old 5870 without any problems. Although that is anecdotal… you’d have to look at actual failure rates to validate your comments.
I do agree that NVidia cards tend to be better built and PhysX counts in some of the games I like to play.
My next one will be an Intel / NVidia combo, moving from a full AMD system that has been running very reliably for me since ’09.
4 Titans in the same system? Do you mostly work or game with it?
Nvidia > AMD for the single fact that I can still download updated drivers for my 8600gt, which is about 8 years old.
What benchmarks did you run? Some synthetic garbage or in-game experience?
Sry for the late reply…
Both, I ran a few game benchmarks such as the one in Metro 2033, I ran a few games for a while monitoring fps in the background and I ran Firestrike benchmark.
hey..was looking for info on why nvidia skipped 800 series and saw your comment. It’s been 2 years and I’m replying to your comment to remind you that you were right 😀
Actually he was wrong, it’s GTX 10 not 1000.
Technically he’s still right because we’re in the 10XX era… but Nvidia’s naming is absolutely ridiculous: 400, 500, 600, 700, 900 series (hundreds) and now it’s 10.
I think it’s because GeForce 10 series are successor of GeForce 9 series (9xxx).
Makes sense. Cheers
I am from the future there are GTX1xxx
XD