videocardz AMD Radeon R9 290X Memory Bus: 512-bit


iMacmatician

Member
Oct 4, 2012
88
0
66
Looks like a strong card, but it all depends on what the reaction from NV is going to be.
GK110 is clearly competitive against this card; NV can play with prices, maybe higher clocks, a full GK110 card? Also, how far is NV from releasing a new GPU?
The Quadro K6000 shows what they are capable of; whether we'll see that level of specs in the Tesla/GeForce line is another question.

There have been rumors of a future Tesla "K30," my guess is that a full GK110 for the desktop won't come before a K30. If the Maxwell on 28 nm rumors are true then I wouldn't be surprised if the x70-x80 lines go straight to "GM104" without a Kepler refresh. The Titan on the other hand has high DP rate as a feature, which might not be present in "GM104," so it might get a full GK110 refresh for next year.
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
I went from a 7970 to a Titan. Basically I wanted a gaming card, but also one for CUDA apps, and the change was worth the jump. As for the difference between the 780 and the Titan, the extra 3 GB of RAM is a huge bonus for complex scenes in 3D rendering (using Octane 3D in LightWave).

And before you lot start questioning my sanity for going from a 7970 to a Titan: Octane 3D on a Titan is about 10x to 20x faster at rendering than my six-core i7 970 rig, and that is on the conservative side. It is insanely fast.
My friend has a 2.2 GHz Core 2 Duo with a GTX 460. Rendering with Blender Cycles using CUDA outright destroyed my dual core i5 at 2.8 GHz (order of magnitude seems accurate).

Since there isn't yet an OpenCL solution available, I'm torn between AMD and Nvidia.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
My friend has a 2.2 GHz Core 2 Duo with a GTX 460. Rendering with Blender Cycles using CUDA outright destroyed my dual core i5 at 2.8 GHz (order of magnitude seems accurate).

Since there isn't yet an OpenCL solution available, I'm torn between AMD and Nvidia.

Buy a Fermi card if you're starting with Blender; they're worth every penny you spend on them for Cycles. If you need more RAM, trying to get a GTX 580 3 GB for ~$250 makes much more sense than going for a GTX 780 that barely beats it in Cycles.

And forget about AMD for Cycles. It would be a miracle if they sort out their OpenCL driver.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Ooo, a new GPU release... so exciting!

About time too. It's very unlikely I will be buying a 290X, but it's still pushing performance forward, and most likely prices down on older models. Perhaps when I upgrade, a good two years or so from now, Titan performance can be had for $250 or less.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
If this card is roughly near Titan, it looks like AMD is stuck in a cycle of playing catch-up, much like the landscape prior to 4XXX.

Maxwell will be released next year, and will most likely offer significant improvement over Titan.
 

ICDP

Senior member
Nov 15, 2012
707
0
0
If this card is roughly near Titan, it looks like AMD is stuck in a cycle of playing catch-up, much like the landscape prior to 4XXX.

Maxwell will be released next year, and will most likely offer significant improvement over Titan.

Swings and roundabouts; sometimes AMD are ahead and sometimes it's Nvidia. Maxwell will not be the only GPU released next year; AMD will be there or thereabouts. The cycle continues.
 

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
If this card is roughly near Titan, it looks like AMD is stuck in a cycle of playing catch-up, much like the landscape prior to 4XXX.

Maxwell will be released next year, and will most likely offer significant improvement over Titan.

IIRC, AMD was REALLY behind with the 29XX and 38XX series, but was ahead from the 9700 Pro days up until G80 launched. And starting with the 58XX series they have been going back and forth, with AMD usually launching first, giving nVidia time to tweak and beat whatever is out. You would expect whatever comes out later to be able to beat whatever came before.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I noticed that; it seems they have very short memories at AMD. Why they have insisted on preventing that fan from sucking in air in CrossFire mode is beyond me. A two-slot exhaust cooler is good, but leaving a 3mm gap between the cards for the ingress of air when in CrossFire is simply moronic, although I know many newer boards have a bigger gap between the two primary PCIe slots. :(

Yes, but we'll have people and reviewers cramming them in, and there will be issues. I don't know what it is with manufacturers; something can cost ten cents less to manufacture but cause issues for a percentage of their customers, and they'll opt to save the ten cents and not even consider the consequences.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
My friend has a 2.2 GHz Core 2 Duo with a GTX 460. Rendering with Blender Cycles using CUDA outright destroyed my dual core i5 at 2.8 GHz (order of magnitude seems accurate).

Since there isn't yet an OpenCL solution available, I'm torn between AMD and Nvidia.

Without OpenCL, the CUDA-enabled renderer is going to slaughter anything else. You'd think with Blender being the open-source darling of the animation/modeling crowd they would have jumped all over OpenCL by now. I'm sure down the road they will, but I'd choose by what's currently available in the app, and that's CUDA acceleration.

I don't follow the development of Blender, so go with CUDA unless you know that they're releasing an OpenCL-enabled renderer real soon and can afford to wait until it's tried out on AMD hardware. It would suck to buy something like a 780 and then, a month down the road, have the renderer become OpenCL-capable and run better elsewhere.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Buy a Fermi card if you're starting with Blender; they're worth every penny you spend on them for Cycles. If you need more RAM, trying to get a GTX 580 3 GB for ~$250 makes much more sense than going for a GTX 780 that barely beats it in Cycles.

And forget about AMD for Cycles. It would be a miracle if they sort out their OpenCL driver.

Elaborate, please. :)
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
IIRC, AMD was REALLY behind with the 29XX and 38XX series, but was ahead from the 9700 Pro days up until G80 launched. And starting with the 58XX series they have been going back and forth, with AMD usually launching first, giving nVidia time to tweak and beat whatever is out. You would expect whatever comes out later to be able to beat whatever came before.

Except this isn't a refresh AMD is putting out to barely have a faster card... this is a new generation, which you would expect to show significant gains over the competition.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
If this card is roughly near Titan, it looks like AMD is stuck in a cycle of playing catch-up, much like the landscape prior to 4XXX.
What are you talking about? nVidia regularly releases behind AMD. The 580 and GK110 have been the only exceptions, and it can be argued with those two that they were actually heavily delayed versions of their prior generation.

I'll go as far as saying that AMD will likely have the entire model lineup from top to bottom out this gen before nVidia does, which, if there's a cycle that we've "been stuck in" lately, it's that one.

Maxwell will be released next year, and will most likely offer significant improvement over Titan.
Lots of stuff will be released next year. What part of next year? Before or after Pirate Islands? Will they get the top Maxwell chip out in a timely manner, or will it be later on like Fermi and Kepler?
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
In the photos on the first page, the last photo appears to show a card bearing two SINGLE-link DVI outputs. It was bad enough that they cheaped out on the 7000 series by giving it one dual-link and one single-link DVI output, but if they switch them both to single-link that's really unconscionable. It had better have two dual-link DVIs in its public form, or at the very least the same setup as the 7000 series; otherwise they should just reduce it to one DVI output and swap something more functional into the other's spot.
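For context on why the link type matters, here is a rough bandwidth sketch. The 165 MHz figure is the single-link TMDS pixel-clock limit from the DVI spec; the blanking-overhead percentages are loose assumptions for illustration:

```python
# Rough DVI bandwidth sketch: single-link DVI tops out at a 165 MHz
# pixel clock; dual-link doubles that by adding a second TMDS link.
PIXEL_CLOCK_SL_MHZ = 165                    # per-link limit in the DVI spec
PIXEL_CLOCK_DL_MHZ = 2 * PIXEL_CLOCK_SL_MHZ

def fits(width, height, refresh_hz, mhz, blanking=1.20):
    """Crude check: does a mode fit, assuming ~20% blanking overhead?"""
    needed = width * height * refresh_hz * blanking
    return needed <= mhz * 1_000_000

# 1920x1200@60 squeezes onto single-link (with reduced blanking),
# but a 2560x1600@60 panel needs dual-link.
print(fits(1920, 1200, 60, PIXEL_CLOCK_SL_MHZ, blanking=1.05))  # True
print(fits(2560, 1600, 60, PIXEL_CLOCK_SL_MHZ))                 # False
print(fits(2560, 1600, 60, PIXEL_CLOCK_DL_MHZ))                 # True
```

So a card with only single-link DVI ports locks 30-inch 2560x1600 DVI monitors out entirely, which is why two SL-DVI ports on a flagship would be baffling.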
 
Last edited:

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Except this isn't a refresh AMD is putting out to barely have a faster card... this is a new generation, which you would expect to show significant gains over the competition.

They have better perf/W and perf/mm², and by quite a margin...

If this chip is 430mm², then increasing its size to GK110's level would improve its performance by 40%, given that the added area would be dedicated solely to computing, since all the other parts are already adequate for such an extension.
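The arithmetic behind a figure like that 40% can be sketched, though it needs assumptions beyond the post: the ~550 mm² GK110 die size is approximate, and the 70% compute-area share is an illustrative guess, not a measured number:

```python
# Sketch of the die-scaling argument: grow a ~430 mm^2 die to roughly
# GK110's size and spend all of the added area on compute units.
die_mm2 = 430.0          # rumored Hawaii die size (from the post)
gk110_mm2 = 550.0        # approximate GK110 die size (assumption)
compute_fraction = 0.70  # assumed share of the die used by shader engines

compute_area = die_mm2 * compute_fraction   # ~301 mm^2 of existing compute
added_area = gk110_mm2 - die_mm2            # 120 mm^2, all spent on compute
gain = added_area / compute_area            # relative compute increase

print(f"extra compute area: {added_area:.0f} mm^2")
print(f"compute throughput gain: {gain:.0%}")  # ~40% under these assumptions
```

The estimate only holds if, as the post argues, the memory interface and fixed-function blocks are already sized generously enough that none of the added area goes to them.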
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
In the photos on the first page, the last photo appears to show a card bearing two SINGLE-link DVI outputs. It was bad enough that they cheaped out on the 7000 series by giving it one dual-link and one single-link DVI output, but if they switch them both to single-link that's really unconscionable. It had better have two dual-link DVIs in its public form, or at the very least the same setup as the 7000 series; otherwise they should just reduce it to one DVI output and swap something more functional into the other's spot.

It's not likely going to have 2x DL-DVI. AMD considers DVI legacy. If your requirement is 2xDL-DVI don't bother waiting, go buy nVidia now.
 

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
Except this isn't a refresh AMD is putting out to barely have a faster card... this is a new generation, which you would expect to show significant gains over the competition.

Titan and 780 were released WAY after the 7970, nearly 1.5 years later in the case of the 780, so you would expect it to have a large performance lead. If the 9970 is released within this month, it is less than 6 months after the release of the 780, and so I wouldn't expect it to have as large a performance gap compared to if it was released 1.5 years after the 780. I expect the 9970 or whatever it is called will be a decent boost over the 7970 however.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Titan and 780 were released WAY after the 7970, nearly 1.5 years later in the case of the 780, so you would expect it to have a large performance lead. If the 9970 is released within this month, it is less than 6 months after the release of the 780, and so I wouldn't expect it to have as large a performance gap compared to if it was released 1.5 years after the 780. I expect the 9970 or whatever it is called will be a decent boost over the 7970 however.


780 and Titan were never answered.....they were cards nobody expected to see. NV could have done what AMD did with the 7970 and let the GTX680 stand as their only release for years, but they did not.

As to your last sentence...I would definitely hope it is a decent boost over 7970, as this has been quite a long wait.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
Elaborate, please. :)

This will explain it way better than me:

http://wiki.blender.org/index.php/Dev:2.6/Source/Render/Cycles/OpenCL

TL;DR: They have CUDA running nice and smooth, and they won't bother to fix the OpenCL version of Cycles until AMD fix their [mistakes], since an OpenCL version would help only AMD.

In fact, even the CUDA version lags a bit behind the CPU one feature-wise. LuxRender, which you may have seen around in benches as SmallLuxGPU or something like that, is running into those issues too, even with way smaller kernels.

So it's 100% AMD's fault.

Profanity isn't allowed in the technical forums.
-- stahlhart
 
Last edited by a moderator:

yacoub

Golden Member
May 24, 2005
1,991
14
81
It's not likely going to have 2x DL-DVI. AMD considers DVI legacy. If your requirement is 2xDL-DVI don't bother waiting, go buy nVidia now.
It's not my requirement. It would just be dumb to put 2xSL-DVI on a modern high-end card like that. Yet that image appears to show such an arrangement. It's baffling.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
This will explain it way better than me:

http://wiki.blender.org/index.php/Dev:2.6/Source/Render/Cycles/OpenCL

TL;DR: They have CUDA running nice and smooth, and they won't bother to fix the OpenCL version of Cycles until AMD fix their [mistakes], since an OpenCL version would help only AMD.

In fact, even the CUDA version lags a bit behind the CPU one feature-wise. LuxRender, which you may have seen around in benches as SmallLuxGPU or something like that, is running into those issues too, even with way smaller kernels.

So it's 100% AMD's fault.

Cheers.

You seem to read it differently than I do. It sounds like it's simply a work in progress, seeing as OpenCL works on everybody's hardware, not just AMD's, and it's not working on any of it yet. I imagine they are more concerned with Autodesk's renderer first. They do need to get it sorted, though.
 
Last edited by a moderator:

tential

Diamond Member
May 13, 2008
7,348
642
121
If ATI keep sorting out their frame-pacing issues, then CrossFire 290X may be a viable option for 4K gaming. Spec-wise, the 290X appears to fit the bill of what I would have thought upcoming 4K-capable cards would be.

I want to be excited for this, until I remember a 70-inch 4K screen for under $2.5k is not happening for years to come; when that happens, I'll care about 4K resolution a LOT. Right now, it's just nice to know that when 4K screens are available, the cards will be there to drive them.
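As a rough sketch of why 4K is so demanding (ignoring anti-aliasing and overdraw), the raw pixel counts alone tell the story:

```python
# 4K pushes four times the pixels of 1080p, so for the same frame rate
# a card needs roughly 4x the fillrate and shading throughput.
def pixels(width, height):
    return width * height

p_1080p = pixels(1920, 1080)  # 2,073,600 pixels per frame
p_4k = pixels(3840, 2160)     # 8,294,400 pixels per frame

print(p_4k // p_1080p)                         # 4
print(f"{p_4k * 60:,} pixels/s at 4K60")       # ~498 million per second
```

That 4x jump in per-frame work is why a single 2013-era flagship struggles at 4K and multi-GPU setups like CrossFire get floated as the stopgap.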
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It's not my requirement. It would just be dumb to put 2xSL-DVI on a modern high-end card like that. Yet that image appears to show such an arrangement. It's baffling.

I agree. I really doubt that would happen, though. We'll see. Although MSI did away with DL-DVI on the 7970 Lightning, and it didn't appear that too many people noticed.

You aren't going to see 2x DL-DVI, though. AMD moved away from that, and I can't imagine them going back.