Having the world's fastest car is no fun unless you have the right racetrack. If that track does not exist, then you will need to build it. For cars, that means splashing out on a Nürburgring or something similar. With graphics cards, it means having a triple-A title that pushes everything you do well to the absolute limit. Enter EA.
With the GTX480, nVidia had a massive lead in graphics processing techniques like tessellation. Unfortunately, there were almost no games on the market that utilised that additional processing power. Aliens Vs Predator included tessellation – but only just enough to let AMD and nVidia compete on a roughly level playing field. Great for gamers, not so good for nVidia. Millions have been spent creating a Fermi processor with butt-kicking tessellation capability, and game developers aren't utilising it. What a quandary. How to fix it?
Enter Crysis 2 and a $2m spend from nVidia's marketing team.
Looking at the GTX580 specifications revealed by KitGuru here, this card has a very specific profile. To justify a price tag of almost £450, it will need to do some very special stuff: pull clear ahead of AMD's best GPU in a series of top games.
It's likely that Crysis 2 will become a major benchmark for 2011.
In recent months, in development labs across the region, Crytek experts have been beavering away to get the game into stores on 25th March 2011. That date has been set in stone on sites across Europe for many months. We're betting that it will change. By quite a bit.
One of KitGuru's cohorts was at the Multiplay i29 event on 24th November 2006, when several people involved in Crysis, including Sebastian Spatzek, were asked a ton of questions about the (then) new game.
Just prior to the main onstage demo, the question was asked: “Why the delays, and why does the game now seem so slow on normal hardware compared to the earlier builds that beta-ed their way into the market? Anything to do with recent sponsorship deals?” The reply was a wry smile and a gentle shrug of the shoulders. You can't read anything definitive into that, except to say that the game did come out long after it was expected and it ran like a pig on most set-ups. Strangely, on Zardon's Tri-SLI nVidia uber-rig it worked fine. Coincidence?
KitGuru says: There are lots of reasons why a company would spend $2m sponsoring a game so close to launch. Whatever the reason, it's hard to ignore the possibility that it is being re-styled for the GTX580 and GTX590 cards with an absolute ton of additional tessellation IQ. For those who can afford a GTX580 going into Xmas, we believe that your Crysis 2 experience when it launches (several months after March) will be as good as it gets.
Nano blast below, BFG in the KitGuru forums.
Makes sense to me. Why spend $2 million and not get it to use your hardware strengths to the max? Crysis was always better on NV.
I am looking forward to this, but I sure as hell ain't spending £450 on a GTX580 for the pleasure.
Should be good. I was one of the few people who liked Crysis.
It should run well on most Fermi hardware; you'd probably just need to dial it down a bit for a 460 at high res with high IQ settings.
Playing it on console anyway. Most FPS I do on console; RTS, RPG, sim etc. = PC.
…but you know, just once I'd love for AMD to have a release that no one expected, one that made the back-door deals with NV look like a waste of money.
I am one of that strange species who loved Crysis. Now, if the new game is going to be better optimized for the PC, why not? What is the point in buying all the hardware if there is no software to make use of it?
I am looking forward to hearing more.
If they delay Crysis 2 one more time, it will have taken longer to make the sequel than the original game!
I also liked Crysis… actually very much to be honest… though the aliens were (too) dumb… apart from that, all the rest was pretty good 🙂
Everyone seems to be missing the point. The GTX 580 is not significantly different to the GTX 480 that has been on the market for many months. It may be that Nvidia is spending money to make sure it is a TWIMTBP game, but I really do not see months of extra work redesigning it for the GTX 580 specifically (unless, of course, it was never designed to take advantage of the Fermi architecture in the first place)……
The thing is, though, that the GTX580 will have such a power advantage over even the GTX480 (the rumour is 15-20%) that they may very well tweak the IQ settings and tessellation options to allow for higher maximum settings, ensuring those who buy the GTX580 can select the ‘insane’ level of candy in the options. They don't want to undersell the card. It was the same, to a degree, with Crysis: the highest settings just trounced the hardware of the time, allowing only people with Tri-SLI to max it out. I can see them working with Nvidia to adjust the tessellation settings at the maximum levels to ensure it is just about playable on the 580 at good resolutions. This means that when people get around to using it as a benchmark tool, the rest of that day's hardware can't handle it; only the GTX580, and 580s in SLI, can……
@Louis.
Even though the GTX 580 is supposed to be 15-20% faster than the 480, the architecture is the same, so I see no need to redesign the game code for the addition of the GTX 580. DX 11 is DX 11 and tessellation is what it is…… There might be continued optimization to allow this game to run faster from the get-go on Nvidia hardware, but that's about it……
The only logical reason I see for Nvidia to spend $2 million with the game developer is to gain sole sponsorship for the purpose of advertising that the GTX 580 video card is the one to purchase if you intend to play Crysis 2…
Nvidia are banking on Crysis 2 being a hit on the PC, which in turn will push sales of the GTX 580……. Or so they hope……
Let's not forget that there is a very, very strong possibility that Nvidia are making sure the developers use code which favours the Nvidia way of doing things. There is no way in hell Nvidia are paying 2 million dollars for a title loading screen and a shiny logo on the box. While I would take what AMD said in the interview with the tech guy with a pinch of salt, we already know that NV paid for Stone Giant to use levels of tessellation which only worked with Nvidia boards, even though it didn't add anything visually to the benchmark. You can be damned sure some serious in-game tweaking will be in place by Nvidia techs at Crytek to ensure it runs well on Nvidia, even at the cost of it running badly on AMD. That's what these cash handouts are for.
I am in two minds about what happens with TWIMTBP games. Sure, spending money and working with companies to get the best experience for your users is fine; I'm all for that. But do you not think that sometimes moves are made behind closed doors?
For example, say there were three ways of coding a specific piece of functionality: one way benefited AMD, one benefited both, and the last benefited Nvidia. Which way would suit Nvidia more? The second? I don't think so. It already happened with Batman: http://www.kitguru.net/components/graphic-cards/ironlaw/nvidia-offers-the-constant-smell-of-burning-bridges-says-amd/ – so I would assume there are other titles that did the same.
I remember when Crysis was released it ran well on ATI cards; then, all of a sudden, with a patch it ran worse on ATI and better on Nvidia.
It's naive to think that 2 million dollars passes hands with no restrictions. The coders at Crytek already know which hardware is good at what. You can be pretty sure that tessellation won't be working at a sensible 2×2-pixel grid level; tessellation in this game will more than likely be running at the single-pixel level, which will badly harm AMD solutions.
We could argue the point and say ‘it's a weakness of AMD hardware, because if Nvidia can do it, why can't AMD?’ – but what is the point if we can't actually see any difference?
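KitGuru says: to put the ‘single pixel’ argument into concrete numbers, here is a minimal sketch, in plain C, of the one knob being argued over: the target on-screen triangle size. It is purely illustrative – the function and the numbers are ours, standing in for the per-edge maths a DX11 hull shader would actually do – but the 64x factor ceiling is a real DX11 limit.

/* Illustrative only: how a screen-space triangle-size target turns into
 * a tessellation factor. Real engines compute this per patch edge in a
 * hull shader; the names and numbers here are made up for the example. */
#include <stdio.h>

/* Subdivide an edge so each piece spans roughly target_px pixels on
 * screen. DX11 caps the tessellation factor at 64. */
static float tess_factor(float edge_px, float target_px)
{
    float f = edge_px / target_px;     /* subdivisions needed        */
    if (f < 1.0f)  f = 1.0f;           /* floor: leave edge alone    */
    if (f > 64.0f) f = 64.0f;          /* ceiling: DX11 hardware max */
    return f;
}

int main(void)
{
    /* The same 96-pixel patch edge under two policies. */
    printf("8 px triangles: factor %.1f\n", tess_factor(96.0f, 8.0f)); /* 12.0 */
    printf("1 px triangles: factor %.1f\n", tess_factor(96.0f, 1.0f)); /* 64.0 */
    return 0;
}

An 8-pixel target keeps the factors modest and the extra geometry visible; a 1-pixel target pins the factor at the hardware ceiling and buries the detail below a pixel, which is exactly the scenario being debated above.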
@Optix: We believe any Crysis changes will be done in a completely different way from Batman AA. With Batman, there was a vendor detection and AA was simply turned off. That's pretty blatant. With Crysis, the game could have been right near a final RC (release candidate) when it was decided that a significant increase in tessellation would ‘enhance’ the game. That would be true for nVidia hardware, but not for AMD. This is a much tougher case to argue, because you need to make a decision on ‘how much tessellation is enough’. We have highlighted the fact that this kind of ‘increase to image quality processing’ may well happen, but you'd need to see the final game and benchmark running to have a real opinion about whether it has been done to be primarily (a) beneficial to nVidia customers or (b) hurtful to AMD customers. We'll have to wait and see.
@Optix (follow up): If additional enhancements have been done that genuinely improve the experience for Fermi card users, and they happen to negatively impact AMD users, then that is one thing. If they negatively impact AMD users but you can hardly tell the difference when you put two screenshots side by side, then that's just wrong.
Fingers crossed that any enhancements are done to improve IQ and the gaming experience.
It would have been nice if Nvidia actually did hurt ATI and their crappy drivers with their work on Batman and its AA, but the fact is, AMD/ATI failed – even after Richard ‘the liar’ Huddy squealed that he cared about gamers and had his top programmer on it, making certain he delivered AA for ATI card purchasers, he FAILED.
ATI STILL has CRAPPY tessellation. I'm sure they want NVIDIA to write all the extra drivers and enhancements for all the games so they can just glom onto someone else's work for free, as usual.
Oh, and they need to, with their 8 billion in debt problem.
I'm amazed how the idiots of the world bought ATI's whine about Batman and AA, and there are a LOT of idiots in this world.
GET OFF YOUR BROKE BUTT AND PAY YOUR OWN PROGRAMMERS TO DO THE WORK IF YOU WANT YOUR CRAP BROKEN ATI DRIVERS TO WORK WITH ANY GAME AT ALL – STOP WHINING AND BLAMING IT ON EVERYTHING AND EVERYONE ELSE! IT'S YOUR FAULT, AMD/ATI AND RED ROOSTERS! YOUR CARD SUCKS, ITS BOTTOM LINE SUCKS, ITS DRIVER TEAM SUCKS AND IS A TOTAL OF “A FEW PEOPLE”, AND IT BLOWS, BLOWS, BLOWS!
Not Nvidia's fault, never was and never will be.
Your Eyefinity implementation SUCKS: it's too expensive, and the need for an adapter and special monitors BLOWS as well.
If you had a chance to blame it on Nvidia you no doubt would, you insane, deranged, lying red roosters! YOU SUCK and you BLOW.
Yep, you're an nvidiot.
Instead of being on the fence like a consumer's supposed to be, you're a nutter.
If the move is done in order to hurt those who have AMD/ATI cards, wouldn't Crysis 2 sales go under? Wouldn't it be much better for EA to sell as many games as possible? Isn't this the reason why so many producers and publishers are starting to develop for both the XBOX 360 AND the PS3? I have both systems, besides a Core i7 / GTX 480 SLI rig, so, at least in theory, I can play any game at all. I'm anything but a fanboy and, if the soon-to-be-released HD 6990 ends up better than the GTX 480, I'm going to buy two of them.
The GTX 480 is not faster than the HD 5970, so it's highly unlikely the GTX 580 is gonna be better than the HD 6990, even if tessellation comes into play.
False news, created by an nvidia fanboy.
SiliconDoc, I have never seen such idiocy written in one place! He has the cheek to call everyone else dumb and stupid; he should go and learn more about GPUs.
Nvidia is not going to pay Crytek to deliberately make a game that only runs at an acceptable frame rate on its flagship card, because that would mean the 580 couldn't run it in stereoscopic 3D, and because flagship cards are a very small part of its sales.
Another issue with this article is simply that, while the details of the GTX580 are big news for you, they are probably very much old news for Crytek's bosses and the team doing the PC version of Crysis 2. And if that's true, it's certainly old news for the Nvidia engineers writing the drivers for Crysis 2.
Don’t get me wrong, expect delays. Just not for that reason.
“Re-designed for the GTX 580”? Sounds a lot like “inserting crippling code for ATI” to me. Just like HAWX 2, we'll be getting sub-pixel-sized tessellation that literally can't be seen, because the triangles are smaller than a pixel.
I really like this game.