Advanced Micro Devices has reiterated plans to start high-volume shipments of new graphics products and accelerated processing units later this quarter, with meaningful revenue contribution starting in the second half of the year. As reported, AMD is set to officially release its new lineup of GPUs in June.
Because of weak demand for personal computers, AMD’s issues with channel inventory, and the competitive positioning of its products, shipments of AMD microprocessors and graphics processing units have been dropping for about a year now. Moreover, AMD’s CPU and GPU market shares have been decreasing for many quarters in a row. As a result, the primary task for AMD’s management is to boost sales and improve the company’s market position. According to the company’s chief executive, this is exactly what AMD plans to do in the second half of the year after launching “Carrizo” accelerated processing units, “Godavari” (Kaveri Refresh) APUs and all-new Radeon graphics processors in the late second quarter.
“As we go into the second half of the year, we would like to see some regain of share in both the desktop and the notebook business,” said Lisa Su, chief executive officer of AMD, in the company’s quarterly conference call with investors and financial analysts. “I talked about Carrizo being a strong product for us, I talked about some of our graphics launches that we will talk about later this quarter.”
As reported, AMD plans to officially introduce its new Radeon R9 300-series graphics processors in June, around the Computex Taipei 2015 trade show. According to sources with knowledge of AMD’s plans, this time the company will take a rather unconventional approach to the introduction. Instead of unveiling numerous new graphics cards one after another, the company is going to reveal the whole family at once. As reported, the new series will contain both new and old graphics processing units.
Based on media reports and sources with knowledge of AMD’s plans, it is expected that the Radeon R9 300-series family will contain three all-new graphics processors: Fiji, Grenada (improved Hawaii) and Trinidad. In addition, some expect AMD to finally use the Tonga XT chip with all compute units/stream processors activated.
Here is a compilation of what we know about the Radeon R9 300-series range so far (please keep in mind that not all model numbers and specifications may be accurate):
- AMD Radeon R9 390/390X – Fiji Pro/Fiji XT graphics processing units featuring GCN 1.3 architecture with up to 4096 stream processors and 4096-bit interface to HBM memory. Price range: $649 and upwards.
- AMD Radeon R9 380/380X – Grenada Pro/Grenada XT graphics processing units featuring GCN 1.2 or GCN 1.3 architecture with up to 2816 stream processors and 512-bit interface to GDDR5 memory. Price range: $249 – $299 – $329. Since “Grenada” GPU is basically a revamped “Hawaii”, it is possible that instead of making a new GPU, AMD will simply use the old one under a new moniker.
- AMD Radeon R9 375X – Tonga XT graphics processing units featuring GCN 1.2 architecture with up to 2048 stream processors and 384-bit interface to GDDR5 memory. Price range: around $229.
- AMD Radeon R9 375 – Tonga Pro graphics processing units featuring GCN 1.2 architecture with up to 1792 stream processors and 256-bit interface to GDDR5 memory. Price range: around $199.
- AMD Radeon R9 370/370X – Trinidad Pro/Trinidad XT graphics processing units featuring GCN 1.3 architecture with up to 1536 stream processors and 256-bit interface to GDDR5 memory. Price range: $119 – $149.
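For context on the bus widths listed above, theoretical peak memory bandwidth is simply bus width times effective data rate. A quick sketch (the clock figures are illustrative assumptions, not confirmed specifications for these cards):

```python
# Theoretical peak memory bandwidth = bus width (bytes) x effective data rate.
def peak_bandwidth_gbps(bus_width_bits, data_rate_gtps):
    """Return theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gtps

# Assumed effective rates: first-generation HBM ~1 GT/s, typical GDDR5 ~5 GT/s.
print(peak_bandwidth_gbps(4096, 1.0))  # 4096-bit HBM   -> 512.0 GB/s
print(peak_bandwidth_gbps(512, 5.0))   # 512-bit GDDR5  -> 320.0 GB/s
```

This is why a comparatively low-clocked HBM stack on a very wide bus can still out-run a much faster-clocked GDDR5 configuration on a narrower one.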
The situation with the upcoming Radeon R7 300-series looks less clear. On the one hand, AMD could continue offering its code-named “Curacao” graphics processing units for Radeon R7 360-series graphics cards, but the company could also introduce something based on GCN 1.2 or GCN 1.3 for the price range of around $100.
If the compiled information about the new Radeon R9 lineup is correct, then AMD’s position in the second half of this year will be somewhat better than it is today. Still, given that the market is very competitive, a lot depends on Nvidia’s GeForce GTX 900-series refresh plans. Nonetheless, AMD remains optimistic about its future products.
“I think from the standpoint of being able to capture more of the market and increase more to where our normal shares are in graphics, I think that something that we believe we can make progress towards,” said Ms. Su.
AMD did not comment on the potential Radeon R9 300-series lineup.
Discuss on our Facebook page, HERE.
KitGuru Says: It will be interesting to see how successful the new Radeon R9 300-series family will be. Releasing one new graphics card after another ensures that every GPU receives a lot of attention from the media. When AMD launches the whole lineup at once, the press concentrates on the flagship product and hardly covers mainstream offerings. As a consequence, some customers may never learn about AMD’s new offerings in the $100 – $200 range.
Savings account…HERE I COME <3
You are saving but I’ve already got mine in my account! Just waiting for the launch! 😀
No thank you AMD, HBM1 is limited to 4GB and memory is not the problem; the problem is AMD’s hot watt wasting micro stuttering GPU core and inferior drivers, both for single and misfiring crossfire. Rebranding old watt wasting GPUs and gluing them to HBM1 is not gonna fix AMD’s issues, and I don’t want to add a massive water cooled radiator just to keep up with an air cooled Maxwell, so NO thank you AMD.
I’ll be looking for the performance-per-watt champion Maxwell 980 Ti with its full fat GM200 GPU and 6GB, which with help from Win10 DX12 memory stacking will make this mighty Maxwell the 4K future-proofing solution. Add in the fact that Maxwell runs cool and has an enormous amount of overclocking potential; OEMs will be offering 1400 MHz+ speeds.
You are full of shit.
With a response like that you must have an IQ of MINUS -100, you’re such a special shit-for-brains mental midget.
AMD’s hot watt wasting micro stuttering GPU core and inferior drivers both for single and misfiring crossfire.
No really?? Why do you post shit like this when it’s not even accurate? The truth is Crossfire is better than SLI and microstutter is a thing of the past. I am running 3 290Xs in Crossfire and it’s the smoothest multi-GPU setup I have ever owned. You are complaining about power usage.. Lol. This card is old.. When it came out the GTX 780 Ti was on the same page in terms of power usage. And by the way I own 2 of those as well, I buy either side and I have ever since the late 90s.. As far as the GTX 980 Ti, those are rumours.. Same with the 390X. So quit looking like a dumbshit and posting shit that’s not accurate, idiot.
Your words are true but I still want a Titan X.
The prices are really horrible, I mean just look at them, it’s either the 100-320 area or it’s over 600; even Nvidia has better pricing this time around, and that’s even after taking into consideration that not everyone trades in $ but euros, which translates into even worse pricing. Also realllllyyyyyy, rebrands again? I understand they have a surplus of 290s but that’s just …..
I don’t find the 390 prices believable. Maybe if those were 8GB versions. AMD should know they won’t sell too well at that price. Kitguru has always had a higher estimate.
AMD’s reference cooler needs to look less tacky. That should help them a bit. Less red maybe. Red accents look better.
I CALL BS, AMD’s runt rendering, micro stuttering Crossfire was never better than SLI, its Drivers are a nightmare and performance a hit or miss (mostly miss).
AMD’s xfire is a micro stuttering misfiring horror slide show Grand Theft Auto V comparisons prove it. Google it learn and quit crying and lying.
Nvidia’s GM200 GPU is DOMINATING ALL of AMD’s Watt Sucking GPUs, the 980Ti is ready to rip even more market share from debt hobbled AMD. Stop being an AMD POS pumper, AMD and Nvidia use the same 28nm TSMC process BUT Nvidia’s Performance Per Watt Advantage Blows AMD’s GPU Watt Wasters into the Stone Age.
Its Not Nvidia’s fault Debt Laden AMD can’t keep up and is Geforced to Re-Brand Watt Sucking Lame Old GPUs.
XDMA Crossfire has been reviewed and at this point it’s general consensus that it’s better than SLI.
HBM1 is only limited to 4GB w/o interleaving. AMD is interleaving 2x4GB on the 390x.
Minus -100? So…. he has an IQ of 100. Double negative = a positive, dude.
AMD’s Crossfire SUCKS
GTA V review: “What’s really interesting here is Nvidia’s GTX 970. I mean, look at it! It’s neck and neck with the GTX 980, and the 970 I’m using has reference clocks. It takes 2x Radeon 290x in CrossFire (or the Radeon 295×2) to hit those same numbers. It’s a strong argument for the price/performance value of the 970 (and a not so strong argument for AMD’s scaling potential, with only a 13% scaling uplift with 290x).”
AMD: aside from some stuttering in CrossFire configurations! “The results with Nvidia GTX 2x 980 SLI back up Rockstar’s decision to use that very configuration when demoing the PC version of GTA V. And AMD’s dual 290x option provides a compelling value proposition, though they need work on their frame pacing issues. I didn’t notice any stuttering while using Nvidia SLI configs.”
AMD’s Crossfire SUCKS
The 380 is similarly priced to what the GTX970 and 290 are right now, depending on performance, that seems reasonable enough. And it’s not like the GTX980 is that much cheaper. But yeah, something in between 330 and 650 would be useful.
Have you ever used an AMD graphics card?
You’re obviously a fanboy talking shit that you can’t compute in that tiny brain of yours. You’re expecting a card released in 2013 to compare itself to one made in late 2014? How do you know most of these cards are rebrands? The 780 Ti had the same performance/watt but with a heavier price tag. The R9 290X has the same or a lower price tag than a 970 and you may be surprised to see that the AMD card actually has 4 GBs of ram.
Just keep making stuff up as you go to fit your narrative. You can also go back and check reviews from any game out there. XDMA Crossfire is better than SLI. Again, it’s general consensus at this point.
http://www.guru3d.com/articles_pages/gta_v_pc_graphics_performance_review,5.html
Go away sponsored troller….
June! Ugh! Upgrading for Witcher 3 and I can’t wait till June 🙁
You keep using GTA V in your comparisons even though it hasn’t even been released for more than a week. Are you sad that you have to use SLI bridges? Your claims are very inaccurate and you just use games that haven’t been fully optimised yet.
Guys… why are you all feeding such an obvious troll?
I want to know what AMD is going to do (has done) about memory granularity, because I can’t see it being easy to split that 4096 bit bus into 32bit controllers – but maybe they have managed that. If they haven’t though, they will have a hard time getting the peak bandwidth out of HBM which would be a crying shame.
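For what it’s worth, the JEDEC HBM specification organizes each DRAM stack as independent 128-bit channels rather than one monolithic bus, so the rumoured 4096-bit figure decomposes cleanly. A quick sanity check of the channel math (the stack count is assumed from the Fiji rumours above, not confirmed):

```python
# JEDEC HBM: each DRAM stack exposes 8 independent 128-bit channels.
channels_per_stack = 8
channel_width_bits = 128
stacks = 4  # assumed from the rumoured Fiji configuration

total_bus_bits = stacks * channels_per_stack * channel_width_bits
total_channels = stacks * channels_per_stack
print(total_bus_bits, total_channels)  # 4096 32
```

So the granularity question becomes how the memory controller schedules 32 independent 128-bit channels, rather than whether one 4096-bit bus can be split at all.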
Problem in GTA V Crossfire scaling ?? LOL
Look, it’s 100% scaling. Does any SLI do 100% scaling??
Maxwell’s Premium performance commands a premium price/margin that Debt laden AMD wishes it could have but can NOT, because their old watt sucking rebranded GPUs can’t compete despite using the same TSMC process. It’s Not Nvidia’s fault Debt Laden AMD can’t keep up and is Geforced to Re-Brand its Watt Sucking Lame Old GPUs and lose massive market share. I don’t give a crap about what’s cheapest, I want the best and it’s up to me to decide what I can afford.
LOL, “AMD card actually has 4 GBs of ram” BUT it can’t keep up with a 970 that’s using reference clocks. It takes 2x Radeon 290x in CrossFire (or the Radeon 295×2) to hit those same numbers, LOL.
STOP being a Lying Crying AMD POS Pumper.
This is a big deal and AMD missed a chance. They missed both GTA and witcher 3. What other PC games are people going to be looking to upgrade for afterwards?
I will wait, but a lot won’t.
“Crossfire is better than SLI” ONLY to a Blind as a Bat AMD POS Pumper. Crossfire has been a stuttering runt rendering misfiring mess FOREVER! AMD did this to up its FPS at the cost of sacrificing visual quality. Crossfire drivers CRASH and BSOD Burn like Kamikaze on crack.
STOP being a Lying Crying AMD POS Pumper.
You can call BS till the cows come home. The fact of the matter is the SLI bridge has a bandwidth issue; even Nvidia has stated it, that’s why they’re working on a new interconnect. This is also why AMD dropped the Crossfire bridge on their newer cards & is now using PCI-E.
Maxwell still uses the SLI bridge for now, which is the worst of the 4 options. It goes NVLink > PCI-E > Crossfire bridge > SLI bridge. This is why AMD has better scaling in most games. AMD has always had better scaling.
Next generation Nvidia finally does away with SLI & will have near-perfect scaling. Until then AMD is king of multi-GPU.
AMD gets 92 – 99% scaling in most new games now.
Almost all games launched last year give 92% scaling in 4k & 99% scaling in 1440p.
I CALL MORE Blatant BS from AMD POS Pumpers
Nvidia DOES NOT have a SLI Bridge bandwidth issue with Maxwell, and Nvidia NEVER said it did.
Prove it or SHUT the F up.
star wars battlefront with the Frostbite engine looks promising
I think the 980 Ti will be awesome but AMD’s drivers are actually better than Nvidia’s at the moment. AMD cards run better than Nvidia when using DirectX 12 right now, Nvidia needs to catch up: http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/6
My dual r9 290’s never go above 60 degrees C even when maxing games out at 4k. Its all about proper ventilation
AMD’s debt has nothing to do with Nvidia’s graphics cards. It’s all due to the fact they compete in TWO different markets, and bought a fabrication shop and then had to sell it again when the economy crashed in 2008. Nvidia may have better performance per watt, but AMD has better performance, period. Look at the DirectX 12 benchmarks. The 295×2 is still the fastest card on the market for the price. Sure you sacrifice power, but in the end both companies aim for different objectives for different consumer requirements. No need to bash on AMD so hard.
if you want what is best you would buy a 295×2! LOL
best card on the market for the price. AMD cards are made for 4K; they have a 512-bit memory bus, Nvidia only has 384-bit.
dude the chart above just destroyed your argument, the 295×2 wrecks the Titan X and is cheaper
Finished my new build a few weeks ago (i7-4790k, Sabertooth Mark 2, 32GB 2133 MHz memory) but waiting on a GPU (currently twin 670s), as AMD’s cards were reportedly supposed to come out in early Q1, not late Q2. Nvidia also has the added kicker of a free copy of Witcher 3 through May…. That was a day one purchase for me, so there’s 50 bucks saved. But that promotion wouldn’t have meant shit if AMD had just released late Q1 or early Q2. Disappointing. Add the Nvidia video memory fiasco (I don’t give a shit, the cards are still awesome) and AMD has really missed the boat. Could have been a big market shift in their favor.
There are No DX12 games, there is No need for DX12 optimized drivers, any benchmarks now mean absolutely nothing so Nvidia drivers are better for the here and now.
Good for you on having great ventilation to keep your 290s cool, they’re cheap but they sure suck a lot of watts.
I agree Maxwell 980Ti will be awesome as is the Titan the GM200 GPU delivers gaming at its best.
You gotta take care of pets, otherwise it’s animal abuse.
is there any better processor than the FX 9590? I think AMD should really do something about it, don’t you think? Intel and their 59xx processors are overkill.
THE BEST IS NEVER a 2 GPU Solution that requires a MASSIVE Water Cooled Radiator and AMD’s Bug infested Crossfire.
How much is AMD’s marketing paying you to pump their Lame Old Watt Sucking Re-Branded GPUs?
Just Asking
“…hot watt wasting…” I bet you’re one of those idiots who came in their pants when nVidia mentioned Maxwell’s power efficiency (a whopping ~100W savings under *load*, and only for gaming workloads) while you keep most of your lights on in your house throughout the day and never shut off your PC.
“IQ of MINUS -100” …So you’re saying he has an average IQ?
$400 difference between a 380 and a 390? Should we make funeral arrangements for AMD now, or do you want to recant your statement saying this is what we know about the 300 series?
Playing GTA V on max settings with an old horse 7950 Sapphire Boost at 60fps, bar SMAA and Shadows on High.
No stuttering, but on some other games it does stutter and my guess is that this is bad drivers….
AMD has the horsepower in their cards and high bandwidth; Nvidia also has their cards optimised a ton better for DX11, and perhaps with DX12 the tables turn with AMD having a clear advantage due to Mantle.
They need it, as another year of this kind of sales might break their back, so I hope the 300 series blows away everything on the market.
They need it!
With Nvidia having already launched both the GTX 980 and Titan X, there should be no more surprises left. So why is AMD delaying the launch further? Does AMD still worry about inventory problems, or do they really have anything compelling to launch at all? Furthermore, June is kinda late because that’s another whole quarter to go, and by that time more of AMD’s graphics card market share will have been eroded away.
R&D > development > testing > large-scale fabrication and testing > more large scale fabrication > release > revision release
cache, and 95% of the memory on a GPU is normally fixed size buffers, so HBM would be limited more by the cache and process throughput more than the memory itself.
it’s also still a GPU, so random memory access can be fired multiple times per clock if required, it’s generally only when one task thread is running that performance is low, which is a stupid thing to do anyway.
It does, and if you look at the docs for SLI they state it has a mere fraction of the bandwidth compared to PCIe, which causes problems in high-res SLI and is why their surround system requires monitors connected to multiple cards vs just one (AMD’s choice).
That being said though, I’m sure they would switch the bandwidth mode to use PCIe instead of the SLI bridge the exact same way AMD’s non-hawaii cards do when above a certain res, so the bridge is useless regardless and just an artefact of an ancient design.
Might get a 390/390X to pair with my new 5820K, but no way am I going multi-GPU.. sad that micro stuttering still exists, neither AMD nor NV have completely gotten rid of it. Maybe those 2 cards will actually be great performers at 1440p max. Speaking of resolution, it seems 4K is at least 2-3 years off from getting decent frame rates with single cards. Wonder if I should stick to 1080p 120Hz or go 1440p.. some say it’s not worth it. Hmm.
Ok Paul, you I sort of believe but when you say it does, do you mean Maxwell in 2x SLI on 1 monitor has bandwidth issue then I don’t believe it.
Please be more specific about what configuration you think the GM200 has a bandwidth issue in. You mention surround connected to multiple cards, but that’s not a performance problem; in fact it’s no problem whatsoever, you have 2 cards so why not connect both of them. So where is Maxwell’s bandwidth issue? I don’t see it.
Inquiring minds want to know.
Read, Weep and learn why AMD Lost Massive Market Share even though they are cheaper.
You don’t have to look long at our results today to see that the AMD Radeon R9 290X is crying out for help. The AMD Radeon R9 290X is currently AMD’s flagship single-GPU, just like the GeForce GTX 980 is NVIDIA’s flagship single-GPU. Other than adding more GPUs, this is as fast as it gets in single-GPU form from both AMD and NVIDIA. Yet, it looks like the AMD Radeon R9 290X is lagging severely compared to what NVIDIA currently has on the table.

The GeForce GTX 980 is making the AMD Radeon R9 290X look like last-generation technology, which you can argue it is since its launch nearly 1.5 years ago. We are shocked how far behind the Radeon R9 290X is falling from the newer GeForce GTX 980. It seems to keep falling further and further behind in every evaluation we write! Chalk some of that up to the clocks we are seeing, but drivers are certainly part of that equation as well.

The ASUS ROG Poseidon GTX 980 Platinum has surely laid the smack down on the AMD Radeon R9 290X. We were using a very highly factory overclocked, customized and expensive AMD Radeon R9 290X GPU based video card today. The Sapphire Vapor-X R9 290X Tri-X OC debuted at the same price as the ASUS ROG Poseidon GTX 980. However, today the Sapphire can be had for about $400. The SAPPHIRE card let us overclock the AMD Radeon R9 290X to the highest frequency we’ve ever achieved consistently.

This is the AMD Radeon R9 290X at its best, at its absolute highest performance potential on air. However the AMD Radeon R9 290X is slower in every game, not just by a little, but by a lot. A highly overclocked R9 290X cannot keep up with a factory overclocked GTX 980, or a manually overclocked GTX 980. It would be even worse if this were a stock, default clocked AMD Radeon R9 290X. It is time for AMD’s next generation, because Hawaii (R9 290/X) just got old.
If I buy any of the new AMD GPUs, it will be the 390X. (and it will be a pair of them)
It will also be AMD partner GPUs with better cooling solutions as well. Bring them on!
Rebranded cards are not going to do it for me, and I can’t believe that they’re still doing this crap!
In short:
-The cards are not finished yet 😉
Star Wars Battlefront and The Division will be worth waiting for, personally. Not to mention reviews, early adopter card/driver issues, etc. Get the new cards in a black friday/cyber monday bundle, I say.
Pretty sure the single price he mentioned was meant for the 390X. Usually it’s a $200 jump between the initial high end and whatever card is right below it.
There’s warehouses full of unsold R9 Housefires they’d like to clear before selling anything new.
Again, the Titan X is more impressive than the R9 295×2 because it can run GTA V at 2560×1440 with 2x MSAA. With only one GPU.
Just got my R9 290 Tri-X OC for $239. Not even mad, that’s about the best bang-for-buck deal you can get.
Your words are made out of lies. Maxwell GPUs are better than any R9 3xx.
May I ask where? Was it Newegg?
Shut up. Your claims don’t work at all. AMD spent years just making rebrands of their GPUs from the year 2012 and it still continues this bull**** tradition now that the R9 3xx series is launching. And although the GTX 970 has bad VRAM, it is still better and quicker than the R9 290X.
June, here I come. I hope Sapphire will release its new products asap.
Guy is clearly a tool, it’s incredible. It’s a 60-watt difference at load. If you game 8 hrs a day (idk anyone who does unless they don’t have a job haha) the R9 290 costs you a whopping 24 euros a year more in electricity here in France. Don’t forget that’s 8 hrs a day for 365 days. Yeah, that’s a total watt sucker.
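The arithmetic above roughly checks out. A quick sketch (the residential tariff is an assumed figure, not a quoted rate):

```python
# Annual running-cost difference of an extra 60 W under load, gaming 8 h/day.
watts_delta = 60
hours_per_day = 8
days_per_year = 365
eur_per_kwh = 0.14  # assumed French residential tariff circa 2015

kwh_per_year = watts_delta * hours_per_day * days_per_year / 1000
cost_eur = kwh_per_year * eur_per_kwh
print(kwh_per_year)        # 175.2
print(round(cost_eur, 2))  # 24.53
```

About 175 kWh a year, or roughly 24-25 euros at that tariff, which matches the commenter's figure.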
970 atm. Going to return it and get the 390x or the 380x. Either one
any [modern] nvidia card in SLI with a single monitor of 1440p or less is unlikely to suffer from the issue, though at 144Hz it might once it starts to hit that, and depending on the game’s shader code (some games are basic, some require heaps of buffer copies). At 4K or with multiple monitors on the primary card however, you can hit the bridge’s peak very easily, which is half the reason why AMD’s Hawaii scales noticeably better at such resolutions. Especially the 295X2 and its internal PCIe controller, which gives it a latency advantage.
It was never going to be released in Q1; it has been said for a long time that it would be released in early Q3.
He means his savings account, not that he is saving…
Only in multiplication dude.
Funny how the R295x2, running a couple of those “hot watt wasting micro stuttering GPU cores”, kicks sand in the face of a $1000 flagship Titan X though, isn’t it…?
What are you babbling about fool?
Is it school holidays or something?..seems to be a lot of dumb kids here spouting off on subjects about which they know little…
my 7970 is costing me about £15-£20 a year more than a 680/770 for energy, and at the point of buying I would have to be using this card for another 4-5 years (had it 3 years now) before the 680 becomes the cheaper option (and I am planning on upgrading once the 300 series launches, so that’s roughly a £60-£70 saving that I just put into a new 2TB hard drive)
Hey Anton, I’m gonna say this here just so everyone knows: the 285 Tonga doesn’t have a cut-down memory bus, its full core variant will also have a 256-bit memory interface and come in 2GB and 4GB variants. GCN cards aren’t cut down in terms of memory interface. Also, if Grenada is based on GCN 1.2 it will have a 384-bit memory interface and 3/6GB of VRAM, since GCN 1.2 includes colour compression improvements.
Man I don’t know what to do, get an R9 290X now or wait for the 300 series. How much better is the 300 series gonna be than the 200 series? What should I do?
Point to an article where AMD clearly stated this please.
LOL, and it takes 2 lame old rebranded Watt sucking AMD GPUs and a massive water cooled radiator to beat a reference stock clocked air cooled Titan, LOL. You and AMD are a joke, dual micro stuttering AMD GPUs SUCK compared to 1 mighty Maxwell GPU, and just wait for the highly overclocked 980Ti versions to blow your AMD watt wasters into the stone age. For you water cooled fans, it’s coming to Maxwell too and will rip even more market share from debt laden AMD.
Not everyone has a nuke station in their backyard like you Radiated Frenchies do. Sucking Watts = Heat = Water Cooled Radiators = AMD’s Watt Wasting Rebranded GPUs. Maxwell = Performance Per Watt Champion = Market Share Champion
as soon as the 390x comes out the 980TI will be released a few days later.
Wait for the 300 series and 980ti then decide, HBM1 is a great big bullet point nothing, but don’t waste your money on a lame old rebranded AMD watt sucker especially the dual water cooled monstrosity.
Look at that! A 290x kicking the shit out of a 780 and 780 Ti! And only a few fps behind a 980!
Now either that is AMD’s drivers working some magic or Nvidia is shafting its Kepler users. Either way AMD cards are aging quite nicely
wait for the 300. if anything the 200’s should drop in price once the 300’s hit the market
You are just loving Nvidia, aren’t you. How much do you get paid per comment? P.S. My card runs at 58C under load with the stock Sapphire cooler; I wouldn’t call that heat. Btw 60 watts is nothing, I hope you do understand that, it’s like turning on an extra light bulb or turning one off. For the price of one GTX 980 I can buy 2 R9 290s or one 295×2 and piss on absolutely everything. You can continue your rant about AMD drivers; I have not had a single problem with their drivers and all my games work and run flawlessly, and that includes GTA V running at 60fps on very high at 1080p. P.S. This is with a card that draws 100 watts of power. Are you still blabbing on about power consumption? What does your car pull? Yeah, I thought so, shut up. My whole system probably draws max 350 watts with 92% power efficiency.
I base my buying decisions on Facts, you base them on AMD fanaticism fiction.
Try holding a 60 watt light bulb in each hand you’ll be crying in seconds, that’s the heat you’ll put in your rig and need to get out. The market and I don’t care about how cheap AMD’s watt sucking lame old rebranded GPUs are, the market and I want and are willing to pay extra for the best Performance Per Watt WITHOUT massive air/water cooled radiators.
Just Google AMD driver issues, especially micro stuttering crossfire, the hoops you have to jump through to update AMD drivers, and how late and lame they are for new games.
read, weep and cry me a river
Both AMD and Nvidia released new drivers for GTA V. “AMD card actually has 4 GBs of ram” BUT it can’t keep up with a 970 that’s using reference clocks. It takes 2x Radeon 290x in CrossFire (or the Radeon 295×2) to hit those same numbers.

AMD: aside from some stuttering in CrossFire configurations! “The results with Nvidia GTX 2x 980 SLI back up Rockstar’s decision to use that very configuration when demoing the PC version of GTA V. And AMD’s dual 290x option provides a compelling cheap proposition, though they need work on their frame pacing issues. I didn’t notice any stuttering while using Nvidia SLI configs.”
Interesting. According to this, with very high textures the R9 290X beats the GTX 970. An almost 2-year-old card, well well.
http://kotaku.com/grand-theft-auto-v-benchmarked-pushing-pc-graphics-to-1698670906
Oh wait and another.
http://www.guru3d.com/articles_pages/gta_v_pc_graphics_performance_review,6.html
You guys sound like a bunch of fucking kids arguing over which console is better, Just shut the fuck up jesus.
New rumors said that the 980ti will be released earlier than the 390x…
For that price it should, don’t you think?
I mean damn, 1200 euro for 1 GPU compared to the 295×2 for 620 euro, that’s half the price 😛
Dude the market share champion is Intel with 75% total dominance with Nvidia and Amd fighting for the remaining 25%.
Check your facts there kiddo 😛
Dude Nvidia OWNS over 75% of the discrete GPU market and growing, due mostly to Maxwell’s Performance Per Watt Advantage; AMD’s share has been dwindling due to its Watt Sucking Old Rebranded GPUs, and yes Intel DOMINATES the APU market. Skylake will put a stake in AMD’s feeble APU heart.
You’re such an ignorant idiot who knows little; the AMD R9 290X competes with the 780 Ti, not the 980, so wait until the R9 390X releases and then see how it wipes the floor with the GTX 980. And before you call me an AMD fanboy, I currently own a GTX 760; I’m just pointing out the truth and how dumb you are.
You and Judge_Chip = RETARDS.
Go to bed, fucking fanboy
Loser
Sorry AMD pump boy but AMD’s watt sucking old rebranded GPUs compete with whatever is on the market, and AMD has been losing massive market share to Maxwell for months now and will continue to lose share for many more months. Gamers want what they want when they want it, and it’s NOT Nvidia’s fault that debt laden AMD can’t keep up despite using the very same TSMC process that Nvidia uses for its Performance Per Watt Champion Maxwell, which owns over 75% of the dGPU market. So sorry, but debt doomed AMD fights on too many fronts with too many late and lame products that get killed by the competition.
Batman is coming & I guess Star Citizen, although the latter is more next year. 2015 is pretty light on game releases unfortunately.
At this point I’d be waiting until Star Citizen launches to buy a card unless I really needed one now. Star Citizen with its 8K textures will likely be the most demanding game we have for a long time.
Alright. Got to the bottom and I’ve just about had it with your ignorant, foolish, retarded, insensible, redundant pieces of shit that you call “comments.” All you’re doing is going on and on about a fucking 290x and 295×2 against Nvidia’s much newer 900 series.
Hey retard, I’ll let you in on a little secret.
You’re comparing the wrong goddamn cards on one goddamn game that hasn’t been fucking released for PC in ages. For the past couple years you haven’t said shit, clearly, because the only game you seem to mention is GTA V. THERE ARE MORE FUCKING GAMES you dumbass.
Oh, by the way? Your whole “watt sucking,” “old hot,” remarks are found each time. Must you say the same shit whenever you decide to let your inner idi–I’m sorry, not inner, it’s just there–idiot out? Come up with something new.
By God, I know this moron is going to comment by saying AMD needing to do something new first, add the words watt sucking, include R9 3xx, and press post. So you know something else? They aren’t all rebrands. And if they are, they’re refreshes. There’s a difference.
Your driver argument has been invalid for quite a while.
Your CrossFire vs SLI argument has been disproven.
You compare a 290x to a 980 and yet this “watt sucking old” GPU still can make it close in the game you’d make babies with, GTA V.
You want to be a fucking troll you piece of shit 12 year old kid?
Go tell your mother the human eye can only see 24 frames per second.
She probably won’t care.
We don’t care about the shit you’re spewing around either.
(If you say I must care if I typed all this, no, I hold no deep sentimental value for anything you have said.)
Now shut the fuck up kid and go to sleep.
Goodnight.
I expected this and it is actually nothing new. And certainly not if you knew they were going to make this announcement at CES. It is also a common factor that graphics cards are not available in large volumes directly after an announcement like this. So why the fuss? And if you waited this long, a month or so doesn’t matter either, or you should have bought a card from nVidia already. 😉
I got an R9 290 on sale this week, no regrets. I had a GTX 570. Really depends on what you are upgrading from. That said, I doubt the price drop will be as massive as people think, plus it’s another two months of waiting. My max watt usage from the wall is 400 W, but that involves running Crysis 3 pretty much on max. Other games are in the mid 300s. So don’t listen to all of the power consumption people. If you have a decent card now I would wait, but if you are upgrading from really old hardware, say 4 years plus, I would jump if you see a good sale.
What are they doing?! I know they cut their R&D but have they cut out something crucial? We haven’t seen anything new from them in a LONG LONG time.
Last year was 295x in April and Kaveri in June which didn’t really deliver anything for single thread. After a year Beema or Mullins are impossible to find with any retailer as an SoC, even though they could have given a decent choice against the ridiculously expensive Rangeley Atoms. Looking at this year we have had Carrizo announced since Feb, but there has been no news since.
I have lost all hope for AMD in the past quarter :/
You and RUBY and Jo Blo = Retarded AMD idiots.
Shut up. You are retarded.
You would make a great argument in politics.
Buzzwords, repeating ones self, acting aggressively = Perfect for parliament!
Really hoping for a promising ~$250-350 (canadian) card from AMD… would love an R9 290X at that price point (in Canada R9 290 is around $375+ and R9 290X around $450+) as I’d like to get a FreeSync monitor.
Right now it’s looking like Nvidia is a better buy because AMD is just constantly losing money, losing market share, etc., which means driver support and product innovations could very well go downhill from here… but god damn, I’m not going to support G-Sync. Proprietary crap module that costs an extra $200 on top of the scaler just to ALLOW the monitor to use technology it would be capable of without that stupid G-Sync module.
Lol. Congratulations, you’re either a mediocre troll or the dumbest person on the internet.
Microstuttering is a thing of the past.
Crossfire works across a wider range of games than SLI.
Eyefinity is better than Nvidia surround.
Nvidia cards don’t go into proper idle state when multiple monitors are connected (and you complain about AMD wasting watts?).
Drivers perhaps are worse on average, but Nvidia’s have really stupid issues too. Nvidia drivers broke my registry and critical system files requiring me to do a full reinstall of Windows before.
If I upgrade… it’ll be around that time or by end of this year… we shall see…
Dual GTX 770s with 4 GB each @ 1080p is still more than enough for me.
GTA V is a BIG Deal, Best Selling Game on Steam for a long time. AMD and Nvidia Both released Drivers for GTA V, BUT AMD’s Watt Sucking GPUs Micro Stutter, and it takes 2 of AMD’s Flag Ship Watt Suckers to do what 1 (ONE) 970 can do. WTF, AMD is Lame and Lame to the GTA V Game.
Read AMD Apologist and Cry Me a River
What’s really interesting here is Nvidia’s GTX 970. I mean, look at it! It’s neck and neck with the GTX 980, and the 970 I’m using has reference clocks. It takes 2x Radeon 290X in CrossFire (or the Radeon 295X2) to hit those same numbers. It’s a strong argument for the price/performance value of the 970 (and a not so strong argument for AMD’s scaling potential, with only a 13% scaling uplift with 290X).
“AMD Aside from some stuttering in CrossFire configurations” !
And AMD’s dual Watt Sucking 290x option provides a compelling CHEAP proposition, though they need work on their frame pacing issues. I DID NOT notice any stuttering while using Nvidia SLI configs.
LOL, Dual AMD ReBranded 290x Watt Suckers Can Not Beat a Single Maxwell 970, LOL.
Micro Stutter is here and now in AMD’s Watt Sucking GPUs
Misfiring Micro Stuttering Crossfire DOES NOT Work in a wider range of games and IS ALWAYS Late and Lame to the Game.
Eyefinity SUCKs; AMD duped gamers drop it and go back to one monitor after all the try-and-fail attempts. Multi monitor setups are a minuscule market at best; I don’t know anyone gaming on one.
Its AMD duped customers that cry me a river the most because of all the hoops they have to jump through just to update drivers, they end up dumping their Watt Sucking ReBranded AMD GPUs on eBay, and joining the Green Team and are gaming happily ever after.
I reinstall Windows every 2 years on all my rigs; it takes about 2 hours, no big deal, and in 14 years of being on the Green team I only needed to do that ONE time after a beta driver installation, again no big deal.
You still didn’t answer. How much does nvidia pay you per comment?
P.S. I posted 2 benchmarks that prove your comments are ignorant and, well, wrong. The R9 290X beats the GTX 970 in both benchmarks or is dead even.
I hope they ban this idiot.
Do you really want to see a monopoly on the discrete GPU market? Because if that happens, expect the next 980 to cost $1000+ and the Titan $2000+. Do you really want to see that?
AMD’s Watt Sucking ReBranded GPUs aren’t putting any market pressure on Nvidia now, so why should I expect crazy prices from Nvidia?
It’s Debt Laden AMD that has been Geforced into selling their lame old watt wasters at or below cost, which makes you poor AMD duped fanatics think Nvidia overcharges for performance per watt champ Maxwell. BUT the market does NOT agree with you AMD duped fanatics; despite what you fools think, Maxwell is selling like hot cakes in a freezing blizzard.
There will be no monopoly when AMD sells off its ATI IP in BK court, someone who doesn’t pay SA Charlie to blatantly fling FUD will pick up the Graphics IP and do much better than Debt hobbled AMD.
Verdict
SA Charlie reaps what he sows and what goes around comes around to rip Charlie (NVDA Hating paid AMD Pumper) to shreds.
The Judge
I think there’s no point in arguing with a blind nvidiot like you! You really like trolling, don’t you?
Okay.
i have 7970
Who remembers the time an Nvidia employee apologized and advised upset 970 customers that they could get a refund, as they were mis-sold?
http://www.maximumpc.com/nvidia_will_help_disgruntled_gtx_970_owners_get_refund_says_driver_update_coming_2015
Only to be told shortly afterwards this was not the case.
Then their CEO called it a feature and never even apologized.
http://blogs.nvidia.com/blog/2015/02/24/gtx-970/
Absolutely disgusting behaviour.
Remember 970 owners in the UK. Trading standards rules will allow you to make a claim for being mis-sold either through your retailer or credit card company.
We are nearly at the time of the 300 series release so grab yourself a cheap upgrade and kick Nvidia and their arrogant, lying practices into touch.
He was posting shit, but so are you, sir. CROSSFIRE IS PROVEN TO BE WORSE – ON PAR WITH SLI, depending on the game. SLI 970s are smooth as butter here, no microstuttering. Doubles my FPS in Far Cry 4. LITERALLY DOUBLE.
YES, FAR CRY 4 on my SLI 970s while using 4 GB VRAM, NO MICROSTUTTER.
Now what? Gonna fight us? You’re spitting shit like you’re about to get up and fight us. C’mon, stop being a little bitch behind a computer screen when you know damn well you’re wrong and retarded. Unless you’re really *that* incapable. No? Stfu and gtfo.
WTF you crying about, I wasn’t posting to you Skycryguy, you posted nothing but BSing fantard crap, so quit wasting my time fantard with your lame ass say nothing posts. I post facts while Howler monkey Skycryguy flings poop from a tree branch, say something worth saying or just shut the f up.
Are you gonna punch me bro? Are you that upset bro? Lmfao “say something worth saying or shut the f up”
That’s funny, that applies more to you than anyone xD
Yep, Newegg ftw.
Ignore the dumbass. Intel doesn’t dominate the APU market at all. Intel doesn’t even produce APUs; it produces CPUs with integrated graphics. That doesn’t make it an APU.
Hmmm
Considering the actual yearly operating cost differential between a 290X and a GTX 980 is (here in the US) a matter of a few dollars, and the actual performance differential when one considers the entire range from 1080p through 4K is very close, then the overriding factor is obviously the fact the 980 COSTS 90% MORE MONEY!! And let us not forget that the R9 290X platform has better double-precision processing power than the 980. And it absolutely positively stomps the crap out of the new Titan X at 1/3 the cost.
So looked at in terms of cold hard performance per DOLLAR, as opposed to an essentially irrelevant performance per watt (people do realize that the 980 is not that much more efficient than current 290Xs) and the constant references to heat – the latest crop of 290Xs run within a few degrees of the 980s, in some cases actually cooler – then AMD wins by a wide margin.
How could anyone rationally defend buying a product that is on average a few percentage points faster in many games below 4K (and actually SLOWER in many business applications), SLOWER at 4K, costs only a few dollars a year less to actually operate, doesn’t run any cooler, and runs any non-game app requiring double-precision power much slower, WHEN IT COSTS 90% MORE MONEY? I am truly curious as to what motivates people to spend so much more money for such marginal and in many cases mythical increases in performance.
I repeated myself as this appears to be the methodology that has been working for the 980 fans.
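The “few dollars a year” operating-cost claim above can be sanity-checked with some quick arithmetic. This is a minimal sketch; every number in it (wattage draws, hours of play per day, and the electricity price) is an assumption for illustration, not a figure from the thread.

```python
# Rough sketch of the yearly operating-cost arithmetic.
# ASSUMPTIONS (not from the thread): 290 W vs 180 W typical gaming draw,
# 2 hours of gaming per day, US electricity at ~$0.13/kWh.
def yearly_cost_usd(watts, hours_per_day=2.0, price_per_kwh=0.13):
    """Electricity cost in US dollars to run a card for one year."""
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

r9_290x = yearly_cost_usd(290)  # assumed gaming draw for a 290X
gtx_980 = yearly_cost_usd(180)  # assumed gaming draw for a GTX 980
print(round(r9_290x - gtx_980, 2))  # a difference on the order of $10/year
```

Under these assumptions the gap really is only about ten dollars a year, which is the shape of the commenter’s argument, though heavier usage or pricier electricity widens it proportionally.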
I only have £470! D:
I salute you
Tool.
He’s like this everywhere; like some kind of child that’s refusing to go to sleep because he hasn’t had his bedtime story read to him.
Actually, not multiplication. A negative minus a negative makes it adding, therefore it’s 0 + 100, so 100. No multiplication necessary.
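The sign rule being described can be checked directly; this trivial sketch assumes nothing beyond standard integer arithmetic.

```python
# Subtracting a negative is the same as adding: 0 - (-100) equals 0 + 100.
result = 0 - (-100)
print(result)  # 100
```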
You will have to excuse the Judge_Chip dude. He’s very pro-NVIDIA and will talk absolute trash about AMD regardless of whether there’s a basis or not. As for the Crossfire remark, AMD’s drivers offer a much better performance increase with Crossfire than NVIDIA’s drivers for a second card in SLI. The gain is close to 50%, while it’s merely 20% for a second NVIDIA card.
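The scaling percentages above translate into frame rates like this. A minimal sketch; the ~50% (CrossFire) and ~20% (SLI) gains are the commenter’s claimed figures, not verified benchmarks, and the 60 fps baseline is an arbitrary example.

```python
# Illustrating the commenter's claimed second-card scaling figures.
# ASSUMPTIONS: 50% CrossFire gain, 20% SLI gain (the commenter's numbers),
# and a 60 fps single-card baseline chosen arbitrarily.
def dual_card_fps(single_card_fps, second_card_gain):
    """FPS with two cards, given the fractional gain from adding a second."""
    return single_card_fps * (1 + second_card_gain)

print(dual_card_fps(60, 0.50))  # claimed CrossFire result: 90.0 fps
print(dual_card_fps(60, 0.20))  # claimed SLI result: 72.0 fps
```

The point of the comparison is that the second card’s marginal gain, not the flagship’s single-card number, decides whether a dual-GPU setup is worth it.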
They are. It’s called Zen.
LMAO I didn’t know what Zen was a year back xD
Zen is a complete ground up arch.
Aye. Waiting to get an AMD rig. They should rename their new processors from FX to ZN.