AMD 7000 Series Desktop Graphics Parts Delayed to Q2 or Q3 2012?


Firestorm007

Senior member
Dec 9, 2010
Unfortunately, I do believe that was in reference to GPGPU (double-precision floating point) and therefore is not directly related to graphics performance (although I'm sure they will still dramatically improve performance per watt).



I like how you always assume the best scenarios for AMD's future hardware (the 6950 having 5970 performance and 50% more tessellation power than the GTX 480, LOL), and choose to assume moderate to poor improvements for Nvidia. With nearly the exact same die size and on the exact same process node, Nvidia improved the performance per watt of Fermi G1 vs. G2 by 25%. No die shrink, no new architecture, no massive revisions.

Your EXTREME bias when talking about companies you do not prefer oozes from every post you make.


Now that's funny. Pot, meet kettle. Let's just remember the revised Fermi should've been that way from the beginning.

Nvidia improved the performance per watt of Fermi G1 vs. G2 by 25%. No die shrink, no new architecture, no massive revisions.

So, in essence, they haven't done squat since '09. They simply fixed their original screw-up. AMD had no choice but to stay on 40nm. If they had gotten the die shrink they should have had, things would be a lot different. Add to that the fact that Cayman was a new architecture. They didn't have to FIX anything, since they were already winning. ;)
 

bunnyfubbles

Lifer
Sep 3, 2001
I just wish their driver department would be as good as their hardware department is.....

AMD is excellent when it comes to putting out frequent driver updates; it's pretty much impossible not to find a set that works, if not the luxury of finding the set that works best for whatever game you're focusing on.

The only thing they could be seen as being behind in is the driver control panel - nVidia still has a slight edge in that regard - but when it comes to simply getting games to run, and run well and stably, there's no way AMD is behind.
 

wahdangun

Golden Member
Feb 3, 2011
I like how you always assume the best scenarios for AMD's future hardware (the 6950 having 5970 performance and 50% more tessellation power than the GTX 480, LOL), and choose to assume moderate to poor improvements for Nvidia. With nearly the exact same die size and on the exact same process node, Nvidia improved the performance per watt of Fermi G1 vs. G2 by 25%. No die shrink, no new architecture, no massive revisions.

Wow, did you even realize that Nvidia NEEDED to CUT DOWN the GTX 480 JUST for it to be MANUFACTURABLE?

It's not an improvement at all. Please set aside your extreme bias and see the truth.
 

nitromullet

Diamond Member
Jan 7, 2004
Yes, they do. Just because they both use the same foundry doesn't mean AMD doesn't have a lead in the manufacturing process. If you're working more closely with the foundry and have dies that are much smaller, you'll have much better yields and will be able to get working samples sooner - hence an advantage in the manufacturing process.

And given the tone he was using, he was presenting NVIDIA getting better perf/watt as something very noteworthy. It's not; it's like mentioning AMD getting 50% higher perf/watt with a Bulldozer revision when they're 100% behind Intel in that metric.

I think you're basing too much on Evergreen vs. Fermi I. Last year NVIDIA launched their high-end GPU before AMD, and I don't think AMD or most of us saw that one coming. NV and ATI have gone back and forth between well-executed product launches for years now.

I thought ATI was done for back in 2006 when NV launched the 8800 GTX and ATI came out with the slow, hot 2900 XT only months later. Even the 3800s weren't all that great... However, the 4800s put ATI/AMD back in the game with a vengeance, and NV didn't really impress all that much with their GTX 200 series, especially in terms of price/perf.

My point here is that you can't really count either AMD or NV out. They've both had their successes and failures, and they're both the absolute best at what they do. The fact of the matter is that neither you nor I know what they have planned, and if you've been following graphics cards for a while you should know they've both pulled off some unexpected miracles.
 

tviceman

Diamond Member
Mar 25, 2008
So, in essence, they haven't done squat since '09.

If - as you say - Nvidia hasn't done anything since '09, it's embarrassing that their main competitor has yet to release a chip that is hands down the best GPU to buy. Really, taking a step back and looking at the overall picture, AMD hasn't released a GPU that is convincingly faster than Nvidia's broken, year-and-a-half-old GF100 GTX 480. Sure, AMD's HD 6970 wins in half the benchmarks thrown at it, but we're talking about the current flagship GPU, and it's up against a broken, unmanufacturable chip that was out 9 months before it....

weird....

Wow, did you even realize that Nvidia NEEDED to CUT DOWN the GTX 480 JUST for it to be MANUFACTURABLE?

It's not an improvement at all. Please set aside your extreme bias and see the truth.

So let me get this right: you are saying the GTX 580 - being 20% faster and consuming 5-10% less power - is not an improvement over the GTX 480?
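
For what it's worth, those two numbers alone already imply roughly the ~25% perf/watt figure cited earlier in the thread. A quick back-of-the-envelope check (taking the 20% and 5-10% figures in this post as given, not as measured data):

```python
# Rough perf/watt arithmetic for GTX 480 -> GTX 580,
# using the figures claimed above (not measured data).
perf_gain = 1.20       # "20% faster"
power_ratio = 0.925    # "5-10% less power" -> midpoint, 92.5% of the 480's draw

perf_per_watt = perf_gain / power_ratio
print(f"perf/watt improvement: {perf_per_watt:.2f}x")  # ~1.30x, i.e. ~25-30%
```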
 

Stuka87

Diamond Member
Dec 10, 2010
Arg... I need to upgrade and have been holding out for a 7K-series chip. A 6950 will work fine for me, but I'd rather get a card that will last me as long as possible :/
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
I think you're basing too much on Evergreen vs. Fermi I. Last year NVIDIA launched their high-end GPU before AMD, and I don't think AMD or most of us saw that one coming. NV and ATI have gone back and forth between well-executed product launches for years now.

I thought ATI was done for back in 2006 when NV launched the 8800 GTX and ATI came out with the slow, hot 2900 XT only months later. Even the 3800s weren't all that great... However, the 4800s put ATI/AMD back in the game with a vengeance, and NV didn't really impress all that much with their GTX 200 series, especially in terms of price/perf.

My point here is that you can't really count either AMD or NV out. They've both had their successes and failures, and they're both the absolute best at what they do. The fact of the matter is that neither you nor I know what they have planned, and if you've been following graphics cards for a while you should know they've both pulled off some unexpected miracles.

Doesn't prove much, really. AMD didn't release anything new in the very high-end because they didn't NEED to. They had already built up a sizable advantage in various ways.

AMD didn't expect the GTX 460, but it didn't really make them go crazy or anything, as it was only one product. The HD 5850 and 5870 were better for the money than the GTX 470, and the GTX 480 was simply horrible. The HD 6800 series launched, and then there was pretty much no match: the GTX 460 was worse bang-for-buck than the HD 6850 given the $20 spread and higher power consumption, the HD 6870 quickly asserted itself as a better buy than the GTX 470, and so NVIDIA shortly thereafter released the 500 series. The GTX 580 was, again, a power-hungry monster that was too expensive. The GTX 570 was a very good contender: not THAT power hungry, and at a decent price. The GTX 560 Ti was meh - a bit more powerful than the HD 6870, cost $30 more, less power efficient. The non-reference designs were great, though. Then came the GTX 560, a card whose purpose I simply do not understand to this day. It was the same speed as the HD 6870 and cost $30 more for some reason, even though it was just a tweaked 460 with 5% or so higher OCing headroom and somewhat better power consumption.

Overall, ever since 2008 AMD has consistently won against NVIDIA, just like Intel has consistently won against AMD since 2006; seeing the other side win has been the exception to the rule. I don't think we'll see that change.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
If - as you say - Nvidia hasn't done anything since '09, it's embarrassing that their main competitor has yet to release a chip that is hands down the best GPU to buy. Really, taking a step back and looking at the overall picture, AMD hasn't released a GPU that is convincingly faster than Nvidia's broken, year-and-a-half-old GF100 GTX 480. Sure, AMD's HD 6970 wins in half the benchmarks thrown at it, but we're talking about the current flagship GPU, and it's up against a broken, unmanufacturable chip that was out 9 months before it....

weird....



So let me get this right: you are saying the GTX 580 - being 20% faster and consuming 5-10% less power - is not an improvement over the GTX 480?

It's still a power-hungry monster that's extremely overpriced, so I certainly wouldn't call the 580 a good card. The GTX 570 - apart from the power-phase and DOA issues - yes, that one is good for the money.
 
Feb 19, 2009
How is the GTX 580 the best card? Are you purely judging at 1080p?

The much cheaper 6970 has something to say when we're talking 1600p or multi-monitor and multi-card scenarios, all while using a lot less power. 3x 6970 costs nearly the same as 2x GTX 580. Which one wins?

Also, the top single card is the 6990.

Sure, NV did boost the perf/watt of the GTX 580 vs. the 480 by 25%. But fixing a card that was power-hungry and inefficient to start with isn't a big deal. We're talking about the 40nm-to-28nm transition here.
For both companies, if they keep die sizes similar, the most performance/watt improvement you can expect is in the 1.6x to 1.8x range. NV claiming a 3x boost is too ridiculous to even consider.
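
As a rough sanity check on that 1.6x-1.8x figure (my own back-of-the-envelope numbers, not anything from NV or AMD): a full shrink from 40nm to 28nm roughly doubles transistor density, but at a fixed die size and TDP only part of that gain turns into performance per watt.

```python
# Back-of-the-envelope node-shrink scaling, 40nm -> 28nm.
# The 80-90% "realized" fraction is an assumption for illustration,
# not a published figure.
density_gain = (40 / 28) ** 2    # ~2.04x transistors per mm^2, ideal scaling
realized = (0.80, 0.90)          # assumed fraction that becomes perf/watt

low, high = (density_gain * r for r in realized)
print(f"expected perf/watt gain: {low:.1f}x to {high:.1f}x")  # ~1.6x to ~1.8x
```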
 

SlowSpyder

Lifer
Jan 12, 2005
Looks like I'll be better off getting another 460 for SLI to hold me over until these 28nm parts arrive...

Once I saw the GeForce GTX5xx cards and Radeon 6xxx cards, I grabbed a second 5870. Of course with the 7xxx cards starting to roll out at the end of the year, I'm not sure I would buy a second card just yet. But my guess would be that we'll see a 78xx part first, something that gives ~6970 performance but runs cooler and uses less power. I doubt we'll see the 'good stuff' until next year.
 

tviceman

Diamond Member
Mar 25, 2008
How is the GTX 580 the best card? Are you purely judging at 1080p?

NO, IT DOES NOT MATTER. There are games and situations where the HD 6970 is faster, but more often than not the GTX 580 is faster than the HD 6970 when running games at playable frame rates. You can't possibly dispute that and believe yourself at the same time.

The much cheaper 6970 has something to say when we're talking 1600p or multi-monitor and multi-card scenarios, all while using a lot less power. 3x 6970 costs nearly the same as 2x GTX 580. Which one wins?

Also, the top single card is the 6990.

I wasn't talking about price, I wasn't talking about multi-card performance, I was not talking about the hd6990.

For both companies, if they keep die sizes similar, the most performance/watt improvement you can expect is in the 1.6x to 1.8x range. NV claiming a 3x boost is too ridiculous to even consider.

This is the second time I have clarified that statement in this thread - the claim was in direct reference to Kepler's double-precision floating point performance. If you've read this far, and hopefully know now that the claim was not at all talking about Kepler's graphical performance vs. Fermi, is it still too ridiculous to consider?
 
Feb 19, 2009
Unfortunately, tviceman, the vast majority of consumers do not live in a world of unlimited money to pay for overpriced goods, so price and power consumption are ALWAYS factors in determining "the best" graphics card.

DP FP matters in games since when? It's irrelevant; I was talking about general expectations for 28nm performance - 60-80% is the logical expectation. You jump in with a personal attack, quoting 3x DP FP... why?
 

7earitup

Senior member
Sep 22, 2004
Arg... I need to upgrade and have been holding out for a 7K-series chip. A 6950 will work fine for me, but I'd rather get a card that will last me as long as possible :/

I was going to do the same for BF3 because I thought something in the 7 series would be out soon. I guess I am going to pull the trigger on a 6950 now.
 

blackened23

Diamond Member
Jul 26, 2011
If - as you say - Nvidia hasn't done anything since '09, it's embarrassing that their main competitor has yet to release a chip that is hands down the best GPU to buy. Really, taking a step back and looking at the overall picture, AMD hasn't released a GPU that is convincingly faster than Nvidia's broken, year-and-a-half-old GF100 GTX 480. Sure, AMD's HD 6970 wins in half the benchmarks thrown at it, but we're talking about the current flagship GPU, and it's up against a broken, unmanufacturable chip that was out 9 months before it....

What a short memory you have. ATI has, over the course of their existence, had multiple GPUs that were hands down the fastest on the market. The 5870 I believe was one of them, not to mention the 9700, the X1950 XT, etc., etc.

Nvidia and ATI have traded blows for many years -- sometimes Nvidia had the fastest part, sometimes ATI has. You are naive indeed if you don't believe AMD can retake the crown. Hopefully the 7xxx will be released in December like AMD is projecting and will be super fast - competition is a good thing in the GPU business.
 

bunnyfubbles

Lifer
Sep 3, 2001
To put all this e-peen fighting to rest: we really haven't seen a major leap in single-GPU technology since October '09, when AMD put out the Radeon 5870. In April 2010 we saw nVidia's much-delayed GTX 480, and of course both sides put out refreshes, with the 6970 and GTX 580 pushing each company's flagship ever so slightly higher. So anyone suggesting that either side has been disappointing is full of themselves, and it should be obvious why so many of us are anxious to get our hands on these next-gen parts.

Of course, the major reason we haven't seen a major leap in GPU technology is that we've been stuck at 40nm since '09 (in fact, the Radeon 4770 debuted as far back as April 28, 2009), meaning we've ultimately been without a new process shrink for GPUs for going on 2.5 years - that's like 75 in computer years :p

So yeah, even though I'm currently using nVidia because I can make use of CUDA for work as much as I use the GPU for gaming, I'm starved enough for single-GPU gaming performance that I would buy a Radeon 7900 in a heartbeat to tide me over until I can get a Kepler-based card to replace my GTX 580 (although if Kepler's single-GPU game performance weren't in line with a Radeon 7900, I'd skip it). For me, 28nm GPUs can't get here fast enough, so any news of further delay from either side is very disheartening - and it should be for anyone :p


How is the GTX 580 the best card? Are you purely judging at 1080p?
Because it's simply the best single-GPU card money can buy? And by a convincing margin?

http://www.anandtech.com/bench/Product/292?vs=305

Looks to me like the GTX 580 wins every benchmark except for one game at one resolution, and its only other performance loss is in a synthetic benchmark, not a game. Sure, it loses on power consumption, but as I'll point out later, its idle and gaming loads aren't unreasonable.

Sure, the 580's value is still very much subjective, but that doesn't stop it from being the fastest single-GPU card money can buy (again, wins in dozens of benchmarks should prove that convincingly; to argue otherwise is ridiculous).

If you really want to get into the semantics of GPU value, we can make arguments all day that make the 6900s look like extremely poor values in comparison to the GTX 460s or even AMD's own 6800s.

The much cheaper 6970 has something to say when we're talking 1600p
Nope, not even at 1600p - the GTX 580 is still better. See the benches above: it only technically loses once, and by a negligible margin.

or multi-monitor and multi-card scenarios
Again, an entirely different animal. Multi-monitor setups all but require multiple GPUs to get playable frame rates in modern games without sacrificing image quality to ridiculous levels. And again, the argument was over the best GPU, not the best multi-card setup - which could still be argued in favor of nVidia, but I'm not even going to get into that, as it wasn't the original point of contention.

all while using a lot less power
An 11W difference at idle and a 49W difference under the most stressful game loads means the difference in total system power draw is pretty much in line with the % performance advantage the GTX 580 offers, so performance per watt really isn't an advantage for AMD with the 6970 vs. the GTX 580.
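
To put rough numbers on that (using the +49W load delta above and a HYPOTHETICAL baseline for total system draw, since I'm not quoting a specific review figure here):

```python
# Illustration only: the 350W baseline for the 6970 system is a
# hypothetical placeholder, not a measured figure from any review.
amd_system_w = 350.0               # assumed total system draw, 6970 under load
nv_system_w = amd_system_w + 49.0  # the +49W delta cited above

extra_power = nv_system_w / amd_system_w - 1.0
print(f"GTX 580 system draws {extra_power:.0%} more power")  # ~14%
# If the GTX 580 is also roughly that much faster in the same games,
# the two cards come out at about the same performance per watt.
```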

3x 6970 costs nearly the same as 2x GTX 580. Which one wins?
Again, it was pretty obvious, at least to me, that he was talking about single GPUs.

Also, the top single card is the 6990.
See above
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011

One thing you forget: the fastest card does not equal the best card.

If I were to say which was the single best graphics card out on the market for the vast majority of gamers, it'd be hands down the Radeon HD 6870. It's great bang-for-buck and it's enough to drive most games at max settings with AA at 1920x1080/1200 while consuming a relatively low amount of power.
 

tviceman

Diamond Member
Mar 25, 2008
JHH stated that Kepler should have 3-4x the performance/watt vs. Fermi.

A realistic expectation is in the 1.6x to 1.8x perf/watt range.

Unfortunately, I do believe that was in reference to GPGPU (double-precision floating point) and therefore is not directly related to graphics performance (although I'm sure they will still dramatically improve performance per watt).

For both companies, if they keep die sizes similar, the most performance/watt improvement you can expect is in the 1.6x to 1.8x range. NV claiming a 3x boost is too ridiculous to even consider.

This is the second time I have clarified that statement in this thread - the claim was in direct reference to Kepler's double-precision floating point performance. If you've read this far, and hopefully know now that the claim was not at all talking about Kepler's graphical performance vs. Fermi, is it still too ridiculous to consider?

DP FP matters in games since when? It's irrelevant; I was talking about general expectations for 28nm performance - 60-80% is the logical expectation. You jump in with a personal attack, quoting 3x DP FP... why?

Please show me where I said DP FP matters in games. I didn't.

The subject of performance per watt started when RussianSensation quoted JHH saying Kepler would have 3x the performance per watt of Fermi - and I clarified that statement by saying the quote was in relation to DP FP. Then you jumped in and said a 3x performance-per-watt improvement was too ridiculous a claim, not understanding that JHH was speaking about a very specific function of their future architecture. I again clarified the statement for you, but you again did not connect the dots and thought I was relating DP FP to graphics performance. Sorry, I didn't realize I would have to go back and quote 5 prior messages for you to see you misunderstood what JHH was referring to.

And there is nothing personal about pointing out - whether it's about current or future unreleased architectures - that you are extremely biased in favor of AMD and extremely pessimistic toward Nvidia. It's very obvious; like I said, it oozes from your posts.
 

tviceman

Diamond Member
Mar 25, 2008
What a short memory you have. ATI has, over the course of their existence, had multiple GPUs that were hands down the fastest on the market. The 5870 I believe was one of them, not to mention the 9700, the X1950 XT, etc., etc.

Nvidia and ATI have traded blows for many years -- sometimes Nvidia had the fastest part, sometimes ATI has. You are naive indeed if you don't believe AMD can retake the crown. Hopefully the 7xxx will be released in December like AMD is projecting and will be super fast - competition is a good thing in the GPU business.

You are entirely correct about everything you said regarding all the past AMD/ATI GPUs you mentioned. Unfortunately, none of it has anything at all to do with what I was talking about. I never said ATI didn't have great parts, and I never said they couldn't retake the crown in the future.
 

bunnyfubbles

Lifer
Sep 3, 2001
One thing you forget: the fastest card does not equal the best card.

If I were to say which was the single best graphics card out on the market for the vast majority of gamers, it'd be hands down the Radeon HD 6870. It's great bang-for-buck and it's enough to drive most games at max settings with AA at 1920x1080/1200 while consuming a relatively low amount of power.

best [best]
–adjective, superlative of good, with better as comparative.
1. of the highest quality, excellence, or standing: the best work; the best students.
2. most advantageous, suitable, or desirable: the best way.
3. largest; most: the best part of a day.

By any definition of the word "best", yes, the 580 best fits the bill without further stipulation. If we add in "for the buck" or "for my needs", then yes, that's completely subjective - and I even go over that, so obviously you didn't read my entire post. The part you obviously skipped: "If you really want to get into the semantics of GPU value, we can make arguments all day that make the 6900s look like extremely poor values in comparison to the GTX 460s or even AMD's own 6800s."


The best card money can buy =/= The best card I can buy with my money.

And the world does not revolve around you. The minute everyone buying computer hardware realizes that simple fact, we'll be saved a lot of grief on the forums.
 

Leyawiin

Diamond Member
Nov 11, 2008
One thing you forget: the fastest card does not equal the best card.

If I were to say which was the single best graphics card out on the market for the vast majority of gamers, it'd be hands down the Radeon HD 6870. It's great bang-for-buck and it's enough to drive most games at max settings with AA at 1920x1080/1200 while consuming a relatively low amount of power.

That was a few weeks ago. I just checked Newegg, and the cheapest HD 6870 is now $180 - exactly the same as the cheapest GTX 560. More or less the same performance from both (the deciding factor being whether a game is Nvidia- or AMD-friendly). The HD 6870 does use about 30-40W less power, but neither card is what you'd call power hungry. I'd rather use the Nvidia Control Panel to set up custom quality settings for individual games than the CCC. That was pretty much the deciding factor for me, all other things being more or less equal (or relatively unimportant, like slightly higher power consumption).

With a sub-$200 card, I don't understand people getting so wound up over a 10 or 20 spot one way or the other, or some teaser rebate, on a card you'll keep for a couple of years. It makes me shake my head over some of the scathing comments I've seen posted about the GTX 560, as if it were the worst possible buy of our day. It's basically a refined GTX 460 for $20-$30 more. My old reference GTX 460 (which just went into a family member's rig I assembled for them) wouldn't top 820 MHz on stock voltage - definitely not a great sample. This GTX 560 easily hit 950 MHz. Worth the little extra to me.
 

Concillian

Diamond Member
May 26, 2004
I pop in to check out one thread and am hit with a big fat reminder in the face of why the VC&G forum sucks so bad. Why are you guys arguing the nVidia 5xx series vs. the AMD 6xxx series in a thread about the AMD 7xxx series?

Someone trolled you guys hard.
 

96Firebird

Diamond Member
Nov 8, 2010
Once I saw the GeForce GTX5xx cards and Radeon 6xxx cards, I grabbed a second 5870. Of course with the 7xxx cards starting to roll out at the end of the year, I'm not sure I would buy a second card just yet. But my guess would be that we'll see a 78xx part first, something that gives ~6970 performance but runs cooler and uses less power. I doubt we'll see the 'good stuff' until next year.

Yeah, I'm going to see how much I need to knock off the image quality settings to run smoothly before I decide if another card is needed. Worst case scenario, I get another 460, and then when the new cards are released I get one of those. I can keep one of the 460s in my current rig, which will become an HTPC, and the other can be a backup. Just bad timing, I guess...
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
That was a few weeks ago. I just checked Newegg, and the cheapest HD 6870 is now $180 - exactly the same as the cheapest GTX 560. More or less the same performance from both (the deciding factor being whether a game is Nvidia- or AMD-friendly). The HD 6870 does use about 30-40W less power, but neither card is what you'd call power hungry. I'd rather use the Nvidia Control Panel to set up custom quality settings for individual games than the CCC. That was pretty much the deciding factor for me, all other things being more or less equal (or relatively unimportant, like slightly higher power consumption).

With a sub-$200 card, I don't understand people getting so wound up over a 10 or 20 spot one way or the other, or some teaser rebate, on a card you'll keep for a couple of years. It makes me shake my head over some of the scathing comments I've seen posted about the GTX 560, as if it were the worst possible buy of our day. It's basically a refined GTX 460 for $20-$30 more. My old reference GTX 460 (which just went into a family member's rig I assembled for them) wouldn't top 820 MHz on stock voltage - definitely not a great sample. This GTX 560 easily hit 950 MHz. Worth the little extra to me.

The only GTX 560 at $180 is an ECS model, plus you have to pay $7.50 shipping. It's also a card that has had almost no buyers. The Sapphire HD 6870 is $170 with free shipping, and it's a popular card. It also consumes less power. The GTX 560's pricing makes no sense, but you could say the same of the GTX 560 Ti, which is slower than the HD 6950 1GB, consumes more power, and costs the same.

Also, IIRC, neither the NVIDIA nor the Catalyst Control Panel actually changes the quality settings in around half of all games.
 