Radeon 7990 to debut at Computex - expected $849


blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The interesting question will be:
Will people be blinded by fps alone, or will they take AFR woes into account? Where are these advanced frame time measurements? It's about time...

I've used both setups, 7970 CF and 680 SLI, and can state there is no such microstutter on 7970 CF on a single screen. I do know that Kyle commented on it at 3D Surround resolutions, and I've seen others mention jerkiness in Eyefinity, so I don't know if it's limited to Eyefinity, but it's definitely not there on a single screen at 2560 resolution. Of course that won't stop people who haven't used both setups from using it as ammo to pounce for the attack, but I'm honestly confused when I see people mention it. I never saw it, ever - maybe it's limited to Eyefinity. Shrug, I dunno; I don't plan on messing with surround in the near future anyway. Most of the people commenting on it seem to be people who have never used either setup, so while I won't completely discount it, I'm just confused - maybe it's Eyefinity. I have no idea, like I said.

Anyway, if this news is true, the 7990 won't win an outright battle with the 690 because Kepler is the more efficient chip - however, there are definitely things that work in the 7990's favor: price ($849 rumored), possibly availability, overclockability, no voltage lock (you cannot overvolt the 690 at all), and Eyefinity tools (Eyefinity desktop management is said to be worlds better than NV Surround). Of course the 690 will win on other metrics - it will have better efficiency and all the benefits of the Nvidia ecosystem in terms of good software support. So the 690 might win overall, but there are definitely things to like about the 7990 if these rumors are true. I think it's certain this card will be great for water cooling because it doesn't impose any limitations on voltage. We'll have to wait and see if the rumored news is true, I guess.
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
It's inherently bound to AFR and it does exist. It doesn't have anything to do with Eyefinity or Surround per se, but with the absolute (displayed) fps and the GPU load. Either your fps are too high (let's say 50+), you're CPU bound (yes, even with your CPU it may happen here or there), or you're not that sensitive to it. It's also possible to become accustomed to it so that it seems normal.

Because there are so many variables to this issue, I would like to see some objective testing methodology. Have you read the TechReport article? They have been experimenting with a high-speed camera - a very interesting way to highlight the problem.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
It's inherently bound to AFR and it does exist. It doesn't have anything to do with Eyefinity or Surround per se, but with the absolute (displayed) fps and the GPU load. Either your fps are too high (let's say 50+), you're CPU bound (yes, even with your CPU it may happen here or there), or you're not that sensitive to it. It's also possible to become accustomed to it so that it seems normal.

Because there are so many variables to this issue, I would like to see some objective testing methodology. Have you read the TechReport article? They have been experimenting with a high-speed camera - a very interesting way to highlight the problem.

I have noticed that NVIDIA SLI is smoother with vsync disabled. There is definitely tearing on both setups with vsync disabled, yet NVIDIA is smoother somehow - I can't explain it. But with vsync enabled there wasn't any microstutter that I saw on either setup; they were both really smooth. I assume most people play with vsync and only disable it for benchmarks (unless they're using 120Hz), though I could be wrong. I haven't tested either setup on a 120Hz display so I can't comment on that.

I agree with you that a good objective testing methodology is long overdue.
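For what it's worth, a minimal sketch of what such an objective methodology could look like, assuming a FRAPS-style frametimes log with one frame time in milliseconds per line (the file name and the summary numbers below are just illustrations, not anyone's actual review tooling):

Code:
import statistics

def frame_time_stats(path="frametimes.csv"):
    """Summarize per-frame render times from a FRAPS-style log (one ms value per line)."""
    with open(path) as f:
        times = [float(line) for line in f if line.strip()]

    # Frame-to-frame swing is what shows up as microstutter, even when the fps average looks fine.
    deltas = [abs(b - a) for a, b in zip(times, times[1:])]
    return {
        "avg_fps": 1000.0 / statistics.mean(times),
        "99th_percentile_frame_ms": sorted(times)[int(0.99 * len(times)) - 1],
        "mean_frame_to_frame_delta_ms": statistics.mean(deltas),
        "frames_over_16.7_ms": sum(t > 1000.0 / 60 for t in times),
    }

if __name__ == "__main__":
    for name, value in frame_time_stats().items():
        print(f"{name}: {value:.3f}")

Numbers like the 99th-percentile frame time or the average frame-to-frame delta capture exactly the kind of jerkiness that a plain fps average hides.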
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
What fps are you usually getting? If you can hit 75 but vsync limits you to 60, GPU load is well below 99%. The jitters are also less noticeable at higher fps in general. I cannot play with SLI below 40-50 fps, depending on the game.

Anyway, this issue was well documented long before any kind of frame metering was introduced by Nvidia. Think back to 8800 GTX SLI or even before that.
 
Last edited:

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
I have noticed that NVIDIA SLI is smoother with vsync disabled. There is definitely tearing on both setups with vsync disabled, yet NVIDIA is smoother somehow - I can't explain it. But with vsync enabled there wasn't any microstutter that I saw on either setup; they were both really smooth. I assume most people play with vsync and only disable it for benchmarks (unless they're using 120Hz), though I could be wrong. I haven't tested either setup on a 120Hz display so I can't comment on that.

I agree with you that a good objective testing methodology is long overdue.

It's smoother because the Nvidia setup keeps a more consistent delay between the rendering of each frame. This is also why microstutter can be more or less eliminated if your fps is roughly constant and capped with a frame limiter.

All microstutter is, at bottom, is uneven delay created by multiple GPUs having to work together: one has to check with the other to see if it's ready and prompt it to render the frame. Think of it like two people trying to pedal a bike with one set of pedals compared to one person pedaling alone, except with the added complication of rapidly varying loads as players go from complex to simple scenes and back again from one moment to the next. The extra power is there, but if it isn't coordinated well, the pair can end up less effective than a single person.
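To make the frame-limiter point concrete, here is a minimal sketch of the idea: hold each frame until a fixed frame budget has elapsed so frames go out on an even cadence. The 60 fps target and the render_frame() stand-in are illustrative assumptions, not how any actual driver or game implements its limiter.

Code:
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted per frame at the cap

def render_frame():
    """Stand-in for the real GPU work; real render time varies with scene complexity."""
    time.sleep(0.008)  # pretend this frame took ~8 ms

def capped_loop(num_frames=300):
    """Render frames, then sleep out the rest of each frame budget.

    Because frames are presented on (roughly) the same interval, the
    frame-to-frame variation that reads as microstutter shrinks, as long
    as the GPUs consistently finish under the budget.
    """
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # hold the frame until the budget elapses

if __name__ == "__main__":
    capped_loop()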
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
7970 < 6970 power use, by ~40W.

7990 < 6990 power use, should be heaps lower.

[power consumption chart]


What were you saying about the HD7970 using 40 fewer watts than the HD6970? It looks like 20 more, not 40 less. And if AMD is going to upclock these chips for a dual part, then we're definitely looking at a part more power-hungry than the HD6990.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
And OCCT doesn't get any better either. If we assume the worst case:

[OCCT power consumption chart]


And as a note, the HD6990 was downclocked compared to its HD6970 sibling.
 
Last edited:
Feb 19, 2009
10,457
10
76
What were you saying about the HD7970 using 40 fewer watts than the HD6970? It looks like 20 more, not 40 less. And if AMD is going to upclock these chips for a dual part, then we're definitely looking at a part more power-hungry than the HD6990.

I thought 7970s were rated at 210W TDP and 6970s at 250W... I guess AMD is playing by different measurements of TDP.

But yeah, if it's more power-hungry than the 6970, then the dual card is going to be extremely hard to cool while staying quiet.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
I thought 7970s were rated at 210W TDP and 6970s at 250W... I guess AMD is playing by different measurements of TDP.

But yeah, if it's more power-hungry than the 6970, then the dual card is going to be extremely hard to cool while staying quiet.


Which is what I was trying to express in the OP before I got gobbled up by all the boys in red.

Let's say the 28nm process has matured and power consumption has decreased. We can safely assume that a 1000 MHz 7970 is going to use about the same power as a 925 MHz one.

If we assume the same power savings going from GTX 680 SLI to GTX 690 (around 12%), that puts the 7990 at 648W minus 12%, which is roughly 570W.

That's 100W more (and probably many dB of noise more) than the GTX 690, for roughly the same performance.
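As a quick sanity check on that back-of-the-envelope math (the 648W baseline for two 925 MHz 7970s and the ~12% dual-card saving are the poster's assumptions, not measured figures):

Code:
# Assumptions from the post above: two 7970s draw ~648W combined, and a
# single-board dual-GPU design saves roughly the same ~12% that the GTX 690
# saves over GTX 680 SLI.
cf_7970_watts = 648
dual_card_saving = 0.12

estimated_7990_watts = cf_7970_watts * (1 - dual_card_saving)
print(f"Estimated 7990 power: ~{estimated_7990_watts:.0f}W")  # ~570W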
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
With the current hit-and-miss driver support, what would be the point of a dual card?

I wouldn't buy one if I were in the market for a dual-GPU card... unless AMD launches some magic drivers with it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The 690 is slower than 680 SLI; the 690 is closer to 670 SLI, or between 670 SLI and 680 SLI.

7990 > 690. PERIOD. Overall, that is - individual victories depend on the game.

The 7990 will compete with/beat 680 SLI, which is faster than the 690.

Do you have a source to back that up, one which states that the HD7990 will have 1GHz clock speeds as opposed to the rumored 850MHz clocks?

Also, the performance difference between GTX690 and GTX680 SLI is almost non-existent:
http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-690/4/

Further, if you don't cherry-pick 1-2 benchmarks, GTX690/GTX680 SLI are at least as fast as HD7970 CF (see link above). HD7970 CF will be faster if you overclock each 7970 to 1100MHz and beyond, but how is that related to the single-card HD7990? Single-GPU 7970 overclocking is irrelevant for the HD7990 unless AMD uses 3x 8-pin power connectors.

Until there is confirmation that AMD can squeeze in 1GHz 7970 Tahiti chips and cool them effectively, you are just guessing.

Don't forget that the HD7970's power consumption is not any better than the HD6970's. Actually, a lot of reviews show the 7970 having worse power consumption:
http://www.bit-tech.net/hardware/graphics/2011/12/22/amd-radeon-hd-7970-3gb-review/8
and
[power consumption chart]


So how in the world will they fit 2x 1GHz 7970 chips and cool them effectively when the HD6990 was the 2nd loudest video card of ALL TIME?

AMD can be competitive if the card has a cheaper price and wider availability. Will it be as quiet as the 690? I wouldn't bet with 100% certainty on it beating the 690 as you are so confidently stating.

I thought 7970s were rated at 210W TDP and 6970s at 250W...

It's 250W. AMD has not really improved the power consumption of the 7970 over the 6970. Performance/watt increased because the HD7970 is 40-45% faster out of the box. :)

 
Last edited:

chimaxi83

Diamond Member
May 18, 2003
5,649
61
101
Which is what I was trying to express in the OP before I got gobbled up by all the boys in red.

Let's say the 28nm process has matured and power consumption has decreased. We can safely assume that a 1000 MHz 7970 is going to use about the same power as a 925 MHz one.

If we assume the same power savings going from GTX 680 SLI to GTX 690 (around 12%), that puts the 7990 at 648W minus 12%, which is roughly 570W.

That's 100W more (and probably many dB of noise more) than the GTX 690, for roughly the same performance.

Don't flatter yourself bro, you think anyone who brings an opposing viewpoint is a "boy in red"? Grow up.

Anyway, why do people act like 590/6990/690/7990 buyers complain about power draw? Who cares. Show me the FPS.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
If AMD trumps Nvidia with higher-performing single- and dual-GPU video cards, I fully expect Nvidia to respond in kind with a GTX 685 (higher core and memory clocks) as well as a GTX 695. Nvidia made it clear that GK110 will not see the light of day in 2012 as a GeForce product unless they absolutely have to, which is all but guaranteed NOT to happen since Sea Islands is 100% confirmed for 2013.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
If AMD trumps Nvidia with higher-performing single- and dual-GPU video cards, I fully expect Nvidia to respond in kind with a GTX 685 (higher core and memory clocks) as well as a GTX 695. Nvidia made it clear that GK110 will not see the light of day in 2012 as a GeForce product unless they absolutely have to, which is all but guaranteed NOT to happen since Sea Islands is 100% confirmed for 2013.

You keep mentioning this GTX 685 - where is it, what is it, why is it? Wishful thinking?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
You keep mentioning this GTX 685 - where is it, what is it, why is it? Wishful thinking?

It's not "wishful" thinking. It doesn't exist yet, and I really personally don't care if they make it or not. I'm just stating what I think will happen. If it does come, all it will be is a higher clocked GK104 chip with hopefully faster vram. Regardless Nvidia will not sit idle if AMD re-releases the hd7970 as a faster part. Nvidia is much more reactive to competition than AMD is, and there is a history of Nvidia releasing higher clocked parts (8800gtx ultra, gtx285) with all other specs the same as the model below it.
 
Last edited:

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
It's not "wishful" thinking. I really don't care if they do or not. I'm just stating what I think will happen. Nvidia will not sit idle if AMD re-releases the hd7970 as a faster part. Nvidia is much more reactive to competition than AMD is, and there is a history of Nvidia releasing higher clocked parts (8800gtx ultra, gtx285) with all other specs the same as the model below it.

The 8800 Ultra was unnecessary, as AMD had no competition; it just milked money from stupid people.

The GTX 285 was a die shrink with a cheaper PCB than the 280, meant to save on production costs and increase margins.

Neither of those examples was a reaction to AMD.

NV doesn't really need to do anything if AMD releases a higher-clocked Tahiti card. AMD will just be closing the gap, and NV can still sell at a slight premium because they offer more features with their product.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Anyway, why do people act like 590/6990/690/7990 buyers complain about power draw? Who cares. Show me the FPS.

He is not talking about power draw from the perspective of users complaining about it. It's about how the power draw will impact the design of the board and the heatsink needed to dissipate that much heat. Because 2x HD7970 draw a lot more power than a GTX690, it's going to be far more difficult for AMD to release 1GHz 7970 Tahiti chips in a pair. This is why all the rumors so far have pointed to AMD lowering the 7970's clocks to 850MHz for the dual-GPU card.

This entire thread is mis-titled.

The article roughly translates to:

"Some manufacturers do not want to leave the field empty against Nvidia's GeForce GTX 690"

There is no evidence in this source that it is AMD launching the HD7990. All it says is that some board partners are thinking of doing custom 7970 X2 cards rather than waiting for AMD.

AMD may or may not launch the HD7990 at some point, but that has nothing to do with what this article is about.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Given how well known Crossfire's issues are, anyone who buys a 7990 after doing any research at all can't say they weren't warned.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
Aaaannnddd AMD releases a 7990 that has two Pitcairn chips with the same amount of bandwidth and SPs as the 7970, clocked at 1100 MHz :D

Obviously not going to happen, but it makes next gen much more interesting, with AMD currently holding the compute crown and Nvidia the gaming crown.
 
Last edited:

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
It's not "wishful" thinking. It doesn't exist yet, and I really personally don't care if they make it or not. I'm just stating what I think will happen. If it does come, all it will be is a higher clocked GK104 chip with hopefully faster vram. Regardless Nvidia will not sit idle if AMD re-releases the hd7970 as a faster part. Nvidia is much more reactive to competition than AMD is, and there is a history of Nvidia releasing higher clocked parts (8800gtx ultra, gtx285) with all other specs the same as the model below it.

Hmm, it doesn't exist yet then? :eek: Maybe you'll talk about it less then (I highly doubt it).
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Hmm, it doesn't exist yet then? :eek: Maybe you'll talk about it less then (I highly doubt it).

Or you could just hit the ignore button and you won't have to worry about it anymore. I know it must be really irritating to see that I've mentioned it 4 times in the past month, especially since this entire thread is speculating about future products.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
Or you could just hit the ignore button and you won't have to worry about it anymore. I know it must be really irritating to see that I've mentioned it 4 times in the past month.

Worry? And ignore you? Chill out, dude, and amuse me some more :)
Sorry tv, I'm bored - you've probably noticed :p
 
Last edited: