It looks like Nvidia might be making the jump from the 16nm process to 12nm for its upcoming Volta GPU architecture as reports this week indicate that TSMC has received a new order for High-Performance Computing chips from Nvidia. This isn't the first time we have heard whispers of Volta recently either, with traces of the GV100 being spotted in drivers just a few weeks ago.
According to a report over on DigiTimes, which can admittedly be hit or miss, TSMC this week landed orders from Nvidia for HPC chips aimed at AI applications, potentially including the GV100. The report goes on to say that TSMC will fabricate Nvidia's next-generation Volta GPUs using its 12nm process.
Nvidia's Volta GPUs may end up being paired with the company's own Xavier supercomputer chips, which are used for things like AI and self-driving cars. This means the GV100 core will primarily be aimed at advancing Nvidia's supercomputing goals. Meanwhile, we can expect GV102, GV104 and GV106 graphics cores to make up Nvidia's GeForce gaming lineup of Volta architecture GPUs.
KitGuru Says: As someone who is waiting for the price of achieving 4K/60fps to come down a bit before upgrading my GPU, I am very much looking forward to what Volta can bring to the table. However, while Nvidia might now be placing orders for the GV100, it will be quite some time before we see the Volta architecture in a gaming-grade GPU, especially since the GTX 1080 Ti only just launched a week ago.
Nvidia made a very nice jump going to Pascal, which I'm told is essentially a Maxwell node shrink that overclocks a lot better. If they want to repeat that success it'll take a genuinely new architecture, but if they make their own high-end Titan X obsolete again, that would instantly put 4K/60/Ultra gaming within reach of a midrange budget. So we'll see how it goes!
Pascal is a heavily optimised Maxwell; they changed the execution paths and optimised the silicon layout itself to allow for higher clocks. Volta will likely be much the same. No one is trying anything fundamentally new in terms of architecture: Nvidia is still refining its CUDA cores as much as possible, and AMD is still on GCN (no idea which generation that's on now). GPUs won't advance much until we hit the limits of silicon; then we will see the push towards better-performing cores, higher clocks, better efficiency and so on.
Vega may still be GCN, but it is a massive change, almost as big as the original switch to GCN and a bigger change than Polaris or the move from VLIW5 to VLIW4. The High Bandwidth Cache Controller alone could change things massively. But we will have to wait and see whether it lives up to its potential, which depends on more than just AMD; developers matter too.
EDIT: Fixed an incorrect name.
True, Polaris didn't really live up to its expectations; it was essentially just standard GCN with the clock boosts that came from the new 14nm process. I'm waiting for Vega to drop, or perhaps news of Nvidia's Volta. Either way, it will be interesting. My Maxwell Titan X is mostly OK at 4K for the time being, but the itch to swap it out for a 1080 Ti is strong…
I think it is more that people set their expectations too high and assumed Polaris was the big change, when that was always intended to be Vega. It does not help that before we heard about Polaris and Vega, we had heard of "Greenland" as the big change, so people assumed both were Greenland and that Vega was simply larger Greenland chips than Polaris. The truth turned out to be that only Vega was the former Greenland.
But no, Polaris was more than just clock boosts. It included a few new features and improvements to existing features. However, Vega has over 200 new and/or improved features.
As for Titan, I have long thought it was overpriced. I remember the days when top-end cards were US$500, but Titan changed that. And the 1080 Ti means I will never buy a Titan regardless of how good they are: the card is almost identical to a Titan for barely over half the price. At least with the 900 series there was a semi-noticeable difference between the Titan and the Ti. With the 1000 series, the Titan seems to just be an early adopters' card, whereas Titan used to be more of a professional video editor's card that also happened to be the best gaming card.
A lot of things nowadays are there to milk the early adopters. DDR4 is a weird one, though: when it first came out it was fairly expensive, but the price quickly dropped a lot, so much so that I got a 32GB 3200MHz kit for £185 (its RRP on Corsair's site at the time was £195). Yet the very same kit right now costs £319.99, and for a red heat spreader I'd have to shell out £419.99, over double what I paid for mine. They weren't even that much when they released… probably because Ryzen has just launched, so all the diehard AMD fanboys are suddenly putting a huge strain on supplies. Look at how motherboards were out of stock everywhere for over a week lol