Nvidia GTX 690 = 2 x Nvidia GTX 680!!


blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Until now nobody has done anything objective regarding MS except their subjective findings. Like I said in my previous post, there are tools for the job, so why not use them?

Well, people have been trying to objectively measure microstutter prior to the TR article:

http://www.xtremesystems.org/forums...stutter-in-latest-gen-cards-examples-included

http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995-4.html

http://www.xtremesystems.org/forums/showthread.php?243190-HD5970-Microstuttering-tests
 
Feb 19, 2009
10,457
10
76
I have a hunch that AMD is gonna go balls to the wall with the 7990. Seeing as NV gave them the bird for their very conservative clocks...

I'm expecting a 1GHz edition: a 350W beast with a 2nd BIOS with a "Turbo" setting at 1.2GHz and 400W. Top-bin chips only.

Think about it: single 6970 = ~250W; 6990 = ~350-400W. This time, single 7970 = ~210W. They have TDP to spare, so crank up the clocks.
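
(A quick sanity check on that TDP arithmetic. This is only a back-of-the-envelope sketch: the P ~ f*V^2 scaling rule and the voltage figures are my assumptions, not AMD specs, but it shows why the "top-bin chips only" part matters.)

```python
# Rough dual-GPU power estimate, assuming P ~ f * V^2.
# The ~210 W @ 925 MHz base point comes from the post above; the voltages
# (1.17 V stock, 1.05 V for hand-picked chips) are illustrative guesses.

def scaled_power(watts, f0_mhz, f1_mhz, v0, v1):
    """Estimate GPU power after a clock/voltage change (P ~ f * V^2)."""
    return watts * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

naive  = 2 * scaled_power(210, 925, 1000, 1.17, 1.17)  # two stock-voltage GPUs @ 1 GHz
binned = 2 * scaled_power(210, 925, 1000, 1.17, 1.05)  # top-bin GPUs at lower voltage

print(f"naive doubling: ~{naive:.0f} W")   # ~454 W -- past any sane board limit
print(f"top-bin chips:  ~{binned:.0f} W")  # ~366 W -- near the rumored 350 W figure
```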

As to the micro-stutter, did techreport ever add a high-speed camera capturing their monitor to actually detect IRL microstutter? I read their original article, and their conclusion = all the measurements they did were pointless due to the way drivers output the frames to the monitor.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I have a hunch that AMD is gonna go balls to the wall with the 7990. Seeing as NV gave them the bird for their very conservative clocks...

I'm expecting a 1GHz edition: a 350W beast with a 2nd BIOS with a "Turbo" setting at 1.2GHz and 400W. Top-bin chips only.

Think about it: single 6970 = ~250W; 6990 = ~350-400W. This time, single 7970 = ~210W. They have TDP to spare, so crank up the clocks.

As to the micro-stutter, did techreport ever add a high-speed camera capturing their monitor to actually detect IRL microstutter? I read their original article, and their conclusion = all the measurements they did were pointless due to the way drivers output the frames to the monitor.

And it will get bashed in every review just like Fermi. AMD doesn't have the perf or perf/watt this generation, and if they push clocks they're going to be slower or on par while using grossly more power. Imagine if the 480 had used more power without actually being faster than the 5870, instead of being faster while using more power... That would be the worst mistake AMD could make: a hot, loud, power-hungry card that only offers similar performance... They just need to lick their wounds and move on, and considering how awful CF is right now with the 7xxx series, there is no reason to even release hardware CF since their software support is terrible atm.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I have a hunch that AMD is gonna go balls to the wall with the 7990. Seeing as NV gave them the bird for their very conservative clocks...

I'm expecting a 1GHz edition: a 350W beast with a 2nd BIOS with a "Turbo" setting at 1.2GHz and 400W. Top-bin chips only.

Think about it: single 6970 = ~250W; 6990 = ~350-400W. This time, single 7970 = ~210W. They have TDP to spare, so crank up the clocks.

Throwing more horsepower at a problem, rather than directly addressing it in a more elegant manner, is old-school thinking. I hope AMD tries to do more than throw "moar GPU horsepower" at the problem. Think outside the box, AMD! Get adaptive Vsync in place, fix microstutter, fix third-screen tearing on DisplayPort (and jettison the need for DP adapters entirely), and give the central monitor more GPU horsepower in Eyefinity. Be more efficient with your GPU resources and think about how to address human perception.

Once upon a time Intel tried the "moar clock frequencies!" tactic with the NetBurst architecture. That strategy reached the point of diminishing returns, and Intel's rival took advantage.


And it will get bashed in every review just like Fermi. AMD doesn't have the perf or perf/watt this generation, and if they push clocks they're going to be slower or on par while using grossly more power. Imagine if the 480 had used more power without actually being faster than the 5870, instead of being faster while using more power... That would be the worst mistake AMD could make: a hot, loud, power-hungry card that only offers similar performance... They just need to lick their wounds and move on, and considering how awful CF is right now with the 7xxx series, there is no reason to even release hardware CF since their software support is terrible atm.

To be fair, about wattage, the Tahiti arch is more about HPC. If you want apples to apples you need to compare it with the stripped-down architecture of Pitcairn. Not that any of that helps AMD right now, since they went with the Tahiti design for their flagship, unlike Nvidia, which apparently is saving BigK for HPC only by the looks of it.
 

DeeJayeS

Member
Dec 28, 2011
111
0
0
A bit OT, but any news on 660ti/670 release/pricing/performance? I gotta buy before Diablo III hits shelves...
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I think NV should push it. I bet AMD has similar tools; they should push it as well. We are dealing with science here, not fiction. So this MS debate should be dealt with once and for all.

Neither NVIDIA nor AMD is too happy to talk about microstutter...but perhaps that will change.
The sad part is some people still believe that microstutter is a myth...if they got wiser, that might mean fewer dual-GPU configs = fewer sales...so it's all hush-hush ;)
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
A bit OT, but any news on 660ti/670 release/pricing/performance? I gotta buy before Diablo III hits shelves...

D3, like all Blizzard games, is not stressful on GPUs. If you are at 1080p or lower, something like a GTX 460 1GB should be more than enough for high settings, especially with overclocking.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
To be fair, about wattage, the Tahiti arch is more about HPC. If you want apples to apples you need to compare it with the stripped-down architecture of Pitcairn. Not that any of that helps AMD right now, since they went with the Tahiti design for their flagship, unlike Nvidia, which apparently is saving BigK for HPC only by the looks of it.

Reviewers don't care, just like they didn't care when Fermi was the only real HPC solution.

Fair or not, it's still done, just like the 5870 vs 480 comparisons, where Fermi destroyed the 5xxx series in tessellation and compute programmability and scaled much better into DX11, even as reviewers used Furmark to compare power draw despite the 5xxx having a power limiter that Fermi did not.

A high-clocked 7990 will get crucified for having similar performance at the cost of hundreds of extra watts.

Neither NVIDIA nor AMD is too happy to talk about microstutter...but perhaps that will change.
The sad part is some people still believe that microstutter is a myth...if they got wiser, that might mean fewer dual-GPU configs = fewer sales...so it's all hush-hush ;)

Most people who have dual GPUs are aware of it; the question becomes whether you notice it or not. I notice it in Heaven, but I do not in BF3. I've run my own tests with the MS program: Heaven exhibits MS even at 100+ fps, while BF3 does not even at lower fps.

The real misconception is that only dual cards have MS, which is simply not true; even single cards have it.
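
(For anyone wondering what these MS programs actually measure, here's a minimal sketch of one common approach: adjacent-frame-time variation computed from a FRAPS-style frametimes log. The metric and the CSV layout are my assumptions about how such tools generally work, not the source of any specific utility.)

```python
import csv

def microstutter_index(frametimes_ms):
    """Mean |t[i+1] - t[i]| between consecutive frame times, as a fraction
    of the average frame time. 0 means perfectly even pacing."""
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    avg = sum(frametimes_ms) / len(frametimes_ms)
    return (sum(deltas) / len(deltas)) / avg

def load_fraps_frametimes(path):
    """FRAPS logs cumulative timestamps (ms); convert to per-frame times."""
    with open(path) as f:
        stamps = [float(row[1]) for row in csv.reader(f)
                  if row and row[0].strip().isdigit()]
    return [b - a for a, b in zip(stamps, stamps[1:])]

# Alternating 10/23 ms frames average ~60 fps but pace like ~43 fps:
print(f"{microstutter_index([10, 23] * 50):.2f}")  # ~0.79 -> heavy microstutter
print(f"{microstutter_index([16.7] * 100):.2f}")   # 0.00 -> perfectly even
```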
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Neither NVIDIA nor AMD is too happy to talk about microstutter...but perhaps that will change.
The sad part is some people still believe that microstutter is a myth...if they got wiser, that might mean fewer dual-GPU configs = fewer sales...so it's all hush-hush ;)

I take it from your lack of response to my post that you were not referring to Lava or me then. Good, because that TR report only affirmed the microstutter issue.
 
Feb 19, 2009
10,457
10
76
I've seen plenty of CF reviews; 7970s are not lacking at all. A pair of them @ 1GHz will beat a gtx690, easily. Not to mention if it has a turbo BIOS like previous dual-GPU cards from AMD.

AMD knows this too. They only have one option currently: max it out and win on performance while losing on power use. Look at the 6990 vs gtx590: they specifically added the 2nd turbo BIOS just so they could claim "world's fastest graphics". NV left them no choice; without a halo card and the ability to claim "the best in graphics", their entire brand suffers. It's critical considering they are trying to gain market share in the mobile space relying on their brand name and Fusion graphics.
 

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
Don't give a *bleep* about it *shrugs*

"*shrugs*"? What do you meant "shrugs", you expect people to believe you had no opinion on the hd7970 launch price?

Let me simplify the question: Do you think the hd7970 $550 launch prices were too high, or do you think that they were reasonable?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
"*shrugs*"? What do you meant "shrugs", you expect people to believe you had no opinion on the hd7970 launch price?

Let me simplify the question: Do you think the hd7970 $550 launch prices were too high, or do you think that they were reasonable?

You must be mistaking me for one of the loud "perf/watt" or "sweet spot" advocates.
I'm not.
Look through my posts.

I advocate performance...and don't care about the rest.
(The only thing I dislike is stuff like microstutter, for example.)

But nice way of raising doubts about my credibility :thumbsdown:

So let me repeat, since it didn't get through the last time:

*shrugs*
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Nvidia has lost their minds...


I bet they sell out of every single one at launch, at that price... and probably for at least until the 7990 launches. But I won't be buying one; even with my $550 (what I paid) video card, this card costs very nearly as much as my entire system, and at 19x12 I doubt I'd notice any real difference between it and my card. The Diablo III beta has reminded me that, while pretty graphics are nice, I don't need Crysis graphics to really enjoy a game. :)
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I've seen plenty of CF reviews; 7970s are not lacking at all.

You must have missed this one:

While we know not a lot people run triple-card video configurations, looking at the gameplay experiences between NVIDIA GeForce GTX 680 3-Way SLI and AMD Radeon HD 7970 Tri-Fire CrossFireX sure gives us an interesting topic for discussion and surely brings some things to light when it comes to Red vs. Green. This is the ultimate build in terms of gaming performance, but the experiences between both couldn't be more different.

There is an overall smoothness and consistency advantage to SLI that is superior to CrossFireX

[H] verbalized what every article on the subject states: CF is worse than SLI when it comes to multi-GPU, noticeably worse.

Look at the words he used, SLI = Smoothness & Consistency = Superior

I know Anand has a strong pro-AMD user base, which explains why it is so strongly against multi-GPU setups; you won't find this trait nearly as common in the more diverse online communities.

Again for the record, SLI isn't the same as CF, never has been.
 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
And it will get bashed in every review just like Fermi, AMD doesn't have the perf or perf/watt this generation and if they try to pull clocks they're going to be slower or on par while using grossly more power. Imagine if the 480 wasn't actually faster than the 5870, yet used more power instead of being faster while using more power... That would be the worst mistake AMD could make, a hot, loud, power hungry card that only offers similar performance... They just need to lick their wounds and move on, and considering how awful CF is right now with 7xxx series, there is no reason to even release hardware CF since their software support is terrible atm.

On that subject, the noise issue could be solved by an H100-like setup. With an ultra-enthusiast card like the 7990, the use of those types of coolers shouldn't be a deal breaker for many. It would allow AMD to clock the card high and cut down on noise considerably.

It wouldn't solve the power problem, but most enthusiasts care less about power draw than they do about noise. I think H100-style coolers are a real option for these ultra-high-end cards.
 
Feb 19, 2009
10,457
10
76
You must have missed this one:

HI.

CF vs SLI. Not Tri-Fire vs Tri-SLI (and it's bugged on 3 monitors only, unlike NV's crap-all scaling at 1600p with 3- and 4-way SLI).

Last I heard, the gtx690 is 2 GPUs and the 7990 will be 2 GPUs... not 3.

If you're gonna reference [H], pick the right article: CF vs SLI
http://www.hardocp.com/article/2012/03/28/nvidia_kepler_geforce_gtx_680_sli_video_card_review/5

[benchmark graphs from the linked [H] GTX 680 SLI review]


Those are turbo gtx680s, which [H] claims boost up to 1.2GHz, vs STOCK 7970s.

Back to my point: a 7990 @ 1GHz will beat a gtx690, easy. Yes, I'm aware it all depends on the game selection when a reviewer benches 4-5 games. Refer to bigger lists; CF 7970s already nearly match gtx680s that turbo.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
HI.

CF vs SLI. Not Tri-Fire vs Tri-SLI (and it's bugged on 3 monitors only, unlike NV's crap-all scaling at 1600p with 3- and 4-way SLI).

Last I heard, the gtx690 is 2 GPUs and the 7990 will be 2 GPUs... not 3.

:thumbsup:

Three GPUs help AMD here, sadly not enough. The problem is the same in dual configs, as [H] noted in their SLI vs CF review of the 7970 and 680.

The result of SLI feeling smoother than CrossFireX is that in real-world gameplay, we can get away with a bit lower FPS with SLI, whereas with CFX we have to aim a little higher for it to feel smooth. We do know that SLI performs some kind of driver algorithm to help smooth SLI framerates, and this could be why it feels so much better. Whatever the reason, to us, SLI feels smoother than CrossFireX.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
HI.

CF vs SLI. Not Tri-Fire vs Tri-SLI (and it's bugged on 3 monitors only, unlike NV's crap-all scaling at 1600p with 3- and 4-way SLI).

Last I heard, the gtx690 is 2 GPUs and the 7990 will be 2 GPUs... not 3.

If you're gonna reference [H], pick the right article: CF vs SLI
http://www.hardocp.com/article/2012/03/28/nvidia_kepler_geforce_gtx_680_sli_video_card_review/5

[benchmark graphs from the linked [H] GTX 680 SLI review]


Those are turbo gtx680s, which [H] claims boost up to 1.2GHz, vs STOCK 7970s.

Back to my point: a 7990 @ 1GHz will beat a gtx690, easy.

You should shift your focus from FPS to frame times...like I posted before. Please read the link, because you are not making your case...only proving my previous point.

Hint:
Microstutter is the missing part you seem to want to ignore.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
You must have missed this one:



[H] verbalized what every article on the subject states: CF is worse than SLI when it comes to multi-GPU, noticeably worse.

Look at the words he used, SLI = Smoothness & Consistency = Superior

I know Anand has a strong pro-AMD user base, which explains why it is so strongly against multi-GPU setups; you won't find this trait nearly as common in the more diverse online communities.

Again for the record, SLI isn't the same as CF, never has been.

So it's come down to sweeping statements backed by no evidence? Do you feel persecuted here for having multi-GPU or something? A ton of posters here are also on [H], which you cite, myself included, though I mostly just lurk there these days. Maybe AT Forums are more anti-SLI/CF because we're smarter and don't want to deal with problems such as greater PSU wattage, more noise and heat, and incompatibilities, bugs, and driver problems. SLI may have fewer compatibility problems and possibly less microstutter, but the other problems are the same regardless. So yes, SLI and CF share many common traits, especially power/heat/noise.

By the way, take a look at the lower literacy of posters at certain other forums. It will make you appreciate AT Forums more. Not that ATF is the best... B3D and XS probably have more technically savvy user bases, for instance.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Nothing wrong with having an affinity for one side or the other; just stating the obvious that Anand has a strong, active user base of pro-AMD posters.

I'll just ignore the rest as it was inflammatory :)
 
Last edited:
Feb 19, 2009
10,457
10
76
You should shift your focus from FPS to frame times...like I posted before. Please read the link, because you are not making your case...only proving my previous point.

Hint:
Microstutter is the missing part you seem to want to ignore.

Hi2U.

U miss the entire mind blowing conclusion of your very own reference:

http://techreport.com/articles.x/21516/11

Go there, see this??

"In fact, in a bit of a shocking revelation, Petersen told us Nvidia has "lots of hardware" in its GPUs aimed at trying to fix multi-GPU stuttering. The basic technology, known as frame metering, dynamically tracks the average interval between frames. Those frames that show up "early" are delayed slightly—in other words, the GPU doesn't flip to a new buffer immediately—in order to ensure a more even pace of frames presented for display. The lengths of those delays are adapted depending on the frame rate at any particular time. Petersen told us this frame-metering capability has been present in Nvidia's GPUs since at least the G80 generation, if not earlier. (He offered to find out exactly when it was added, but we haven't heard back yet.)

Poof. Mind blown.

"Now, take note of the implications here. Because the metering delay is presumably inserted between T_render and T_display, Fraps would miss it entirely. That means all of our SLI data on the preceding pages might not track with how frames are presented to the user. Rather than perceive an alternating series of long and short frame times, the user would see a more even flow of frames at an average latency between the two."

I've already said this ages ago for people who use that article to claim its methodology (FRAPS) has any significance: it does not. Until they use an ultra-high-speed camera to capture the output on their monitor, their inter-frame measurements are MEANINGLESS and do NOT REPRESENT REALITY.
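
(To make that concrete, here's a toy simulation of the frame metering TR describes. The pacing rule below, holding each buffer flip until at least a running-average interval has passed, is my guess at the general idea, not NVIDIA's actual hardware logic; the point is that the render-side timestamps FRAPS sees stay jagged while the displayed intervals even out.)

```python
from itertools import accumulate

def meter(ready_times_ms, avg_interval_ms):
    """Return display times: never flip sooner than avg_interval after the last flip."""
    shown, last = [], float("-inf")
    for t in ready_times_ms:
        last = max(t, last + avg_interval_ms)  # hold "early" frames until their paced slot
        shown.append(last)
    return shown

render_intervals = [8, 25] * 10             # classic alternating AFR pattern
ready = list(accumulate(render_intervals))  # frame-ready times = what FRAPS timestamps
avg = sum(render_intervals) / len(render_intervals)

shown = meter(ready, avg)
display_intervals = [b - a for a, b in zip(shown, shown[1:])]

print("render :", render_intervals[2:8])                          # [8, 25, 8, 25, 8, 25] -- jagged
print("display:", [round(d, 1) for d in display_intervals[2:8]])  # [16.5, 16.5, ...] -- even
```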

Edit: Until reviews bench MS with ultra-high-speed cameras, it's ultimately down to the users themselves: sit in front of the screen, and if you notice micro-stuttering, don't use it. I've used CF for a long time and did not notice MS, therefore my opinions may be biased.
 
Last edited:

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81

XtremeSystems is using their own program, "microstutter", which may or may not be the best way to judge MS. NV's PerfHUD already does that in a far better way. Regarding THG, it's unfortunately subjective analysis again.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Hi2U.

U miss the entire mind blowing conclusion of your very own reference:

http://techreport.com/articles.x/21516/11

Go there, see this??

"In fact, in a bit of a shocking revelation, Petersen told us Nvidia has "lots of hardware" in its GPUs aimed at trying to fix multi-GPU stuttering. The basic technology, known as frame metering, dynamically tracks the average interval between frames. Those frames that show up "early" are delayed slightly—in other words, the GPU doesn't flip to a new buffer immediately—in order to ensure a more even pace of frames presented for display. The lengths of those delays are adapted depending on the frame rate at any particular time. Petersen told us this frame-metering capability has been present in Nvidia's GPUs since at least the G80 generation, if not earlier. (He offered to find out exactly when it was added, but we haven't heard back yet.)

Poof. Mind blown.

"Now, take note of the implications here. Because the metering delay is presumably inserted between T_render and T_display, Fraps would miss it entirely. That means all of our SLI data on the preceding pages might not track with how frames are presented to the user. Rather than perceive an alternating series of long and short frame times, the user would see a more even flow of frames at an average latency between the two."

I've already said this ages ago for people who use that article to claim its methodology has any significance: it does not. Until they use an ultra-high-speed camera to capture the output on their monitor, their inter-frame measurements are MEANINGLESS and do NOT REPRESENT REALITY.

You can deny microstutter all you will...it's your choice.
But since both NVIDIA and AMD acknowledge that it is real, and since several reviews reiterate this...you are not making sense.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Nothing wrong with having an affinity for one side or the other; just stating the obvious that Anand has a strong, active user base of pro-AMD posters.

I'll just ignore the rest as it was inflammatory :)

Cool, I'll ignore your inflammatory speculation and your omission that ATF also has an active pro-NV user base.