Nvidia GTX 690 = 2 x Nvidia GTX 680!!


blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
ye__s, s___li is _____ t__h___e bo__m__b.

sorry my quick reply was a slow read but I microstuttered a bit when I wrote it.

Based on what I heard (which may not be true), Nvidia's solution is to slow things down to the slowest common denominator between the cards to minimize perceived microstutter, at the cost of framerates. Should be interesting to see if it works.
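
If that's the plan, it's basically frame metering: hold early frames back so the presentation cadence is even, trading a little average fps for consistency. A toy sketch of the idea (the numbers and the pacing rule are made up for illustration, not NVIDIA's actual algorithm):

```python
# Toy frame metering: AFR across two GPUs yields uneven frame-to-frame
# intervals (microstutter); metering holds frames to an even cadence.
raw_intervals_ms = [10, 22, 10, 22, 10, 22]    # hypothetical alternating GPUs

pace = max(raw_intervals_ms)                   # slowest common denominator
metered = [max(iv, pace) for iv in raw_intervals_ms]

print("raw:    ", raw_intervals_ms)            # jerky 10/22 ms alternation
print("metered:", metered)                     # even 22 ms cadence, lower fps
```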

In related news, tonight I learned for the first time that the framerates in NV Surround may be uneven because the side monitors get lower framerates and the GPU resources get poured into the central monitor. Sounds like a good idea to me.

This coupled with adaptive Vsync.

It seems Nvidia is concerned with how humans perceive things. Trying to eliminate microstutter, trying to smooth things out with adaptive vsync, and trying to focus on central-monitor fps while letting the sides get the leftovers: all of it is welcome. For too long NV and AMD have not addressed these "perception" issues, but simply thrown more and more fps at us without thinking about how to throw GPU resources at the problem more effectively and efficiently.

I applaud the increased focus on power and GPU resource efficiency and increased focus on perceived smoothness. AMD could learn a lesson or three, there.
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
263W @ default power target!!!!!!!!
Nicely binned.

AMD and its fans may want to sit this one out.
No point wasting resources/emotions on mission impossible, amirite?

Maybe focus on 7930, 7790... or something.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
263W @ default power target!!!!!!!!
Nicely binned.

AMD and its fans may want to sit this one out.
No point wasting resources/emotions on mission impossible, amirite?

Maybe focus on 7930, 7790... or something.

As you are aware, you can have higher power targets, but in that case your card won't be PCIe compliant and you won't be able to advertise this on the box. But if AMD wants to play tough, they can bring a non-PCIe-compliant card to battle the 690.
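
For reference, the budget in question is simple arithmetic (a rough sketch using the spec connector ratings; the two 8-pin connectors match the 690's actual board):

```python
# Rough PCIe power arithmetic for a GTX 690-style board.
slot_w = 75                          # W available from the x16 slot
eight_pin_w = 150                    # W per 8-pin PEG connector
ceiling = slot_w + 2 * eight_pin_w   # 375 W of connector headroom
default_target = 263                 # W stock power target, inside the spec's 300 W card cap
print(ceiling, default_target)
```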
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I can see some people still don't get the difference between FPS... and frametimes... that would be okay... if they didn't post based on their lack of knowledge.

I suggest this read:
http://techreport.com/articles.x/21516
Several posters in this thread sadly need to read up.

That way we won't clutter the thread with false information needing to be debunked.

Unless cluttering the thread with false arguments is the plan?

Lon, don't start with this crap unless you want another vacation. Stick to the tech; what other people are doing is for the mods to deal with.
-ViRGE
 
Last edited by a moderator:

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
I can see some people still don't get the difference between FPS... and frametimes... that would be okay... if they didn't post based on their lack of knowledge.

I suggest this read:
http://techreport.com/articles.x/21516
Several posters in this thread sadly need to read up.

That way we won't clutter the thread with false information needing to be debunked.

Unless cluttering the thread with false arguments is the plan?
Agreed. Actually, I was surprised that reviewers didn't use PerfHUD (it doesn't support Kepler) on Fermi GPUs to measure various metrics (it can show whether a game is shader- or texture-bound, the differences in frame times, and much more). Now you can do the same with Nsight 2.2. Does AMD offer the same kind of tools?
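
Under the hood, that kind of frame-time capture is just bookkeeping around each present call; a minimal stand-in sketch (the simulated render work is obviously fake):

```python
import random
import time

# Log frame-to-frame deltas the way FRAPS/Nsight-style tools do.
frame_times_ms = []
prev = time.perf_counter()
for _ in range(120):
    time.sleep(random.uniform(0.010, 0.025))   # stand-in for variable render work
    now = time.perf_counter()
    frame_times_ms.append((now - prev) * 1000.0)
    prev = now

print(f"avg frame time: {sum(frame_times_ms) / len(frame_times_ms):.1f} ms")
```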
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
IMO the best looking card EVER. I'm a fan of black & silver. The aesthetics scream industrial. Wow, what a beauty:
[Image: GeForce_GTX_690_3qtr.jpg]


We may love custom cooled models, but I'll continue to brag on nVidia's reference design prowess. The black cooling plate covers the entire PCB, contacting all components while adding to PCB rigidity. GPU heatsinks are removable without breaking the seal on the cooling plate. Great for TIM tweakers and water fans. Attention to detail screams out with this card, folks. Sure, the card may not be for everyone, but it's still a beauty from many perspectives:
[Image: GeForce_GTX_690_3qtr_no_shroud_575px.jpg]
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
NV is responsible for their own pricing, just like AMD is... NV can do what it wants, and it's chosen to "go premium"!

Well, as a company, you tend not to have any desire to keep margins low if the other guy's price/performance ratio allows you to price competitively, basically match performance, and still provide insane profit margins.

I cannot imagine Nvidia is doing anything but raking in record profits (as are the vendors) thanks to AMD's inability to produce a card that actually challenged Nvidia.
That's the whole reason we aren't seeing BigK (which, to bring to market, would most likely require a return to "normal" margins, because BigK is all but guaranteed to be a costly chip at this point in time); instead, Nvidia has competitively matched AMD's offering with a chip that is hardly stressing Nvidia's engineering prowess at the moment.

Honestly, if I were Nvidia (a publicly traded company, and the only major "independent" GPU firm in the desktop space - one that is desperately trying to reach into more markets), I'd do the exact same thing.
AMD launches something underwhelming compared to what you can offer? Well, use what parts you have that can effectively match it, price it around the same, and try to ride the wave of higher-than-usual margins until the competition forces your hand and makes you pull out the ace.
Once AMD does this, if they do, I expect Nvidia will launch BigK. If it's during this generation (for AMD), they'll probably cut BigK down a little and launch it with some funky model-numbering scheme like usual: shave a little off BigK, drop it in at 685, blow away the 680 but come up a little shy of the 690. It would be strange if they released a 695 as a single GPU following this, but I think an "all you can eke out of that bad boy" BigK could give an underclocked dual-GK104 card a run for its money. But this would have to be very, VERY late, and honestly, it seems very unexpected now that I've typed it out - a sort of "the swine have taken flight" scenario.

And if it's that late, Nvidia might, just might, launch a tweaked Kepler as the GTX 700 series (akin to the 400/500 series). I wouldn't expect the next major architectural change to arrive until Kepler has been worn out - and with the top-end models already slotted with smaller GPU designs, I think Nvidia has some time to wring Kepler for all it's worth across two "generations".
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
If it is indeed equal to 680s in SLI, it is indeed an awesome price.

Many SLI users would gladly take the same performance on ONE card, even at a slight price premium.

Let's face it, people are willing to pay $30 more for an overclocked single card, or more for one with bling. SLI users are no different.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
263W @ default power target!!!!!!!!
Nicely binned.

AMD and its fans may want to sit this one out.
No point wasting resources/emotions on mission impossible, amirite?

Maybe focus on 7930, 7790... or something.

You are completely wrong; just look at the 5970 and 6990.

AMD designed both to be overclocked and run outside the ATX limit. The 5970 came with a little software tool to overvolt it, while the 6990 did much the same with its BIOS switch.

It's just nice to see that now that Nvidia is finally able to make a working dual-GPU card, dual-GPU cards suddenly count again when comparing which company has the fastest overall card.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
They are throwing the pitchforks in the ocean :p

Imho,

The key: there really isn't a strong defense of nVidia's premium pricing to such levels, as if the sweet-spot strategy never existed for all those years; but there was one when AMD raised prices by 50 percent. I've never seen anything like it in over ten years of posting -- the strong defense of AMD's premium pricing.

Good looking product -- compelling product -- but too much premium so far with 28nm. If a GTX 680 can command $499, well, it isn't surprising this product SKU is $999.

It is what it is, but the GTX 690 sure has an efficient, sleek design.

I suppose lower-than-expected yields, constraints, the market, competition, and higher costs have something to do with the less-than-stellar 28nm price/performance, but predator and aggressor belong in there as well.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Lest we forget that Passion Has a Price!

Gotta bolt to Home Depot. I go to one 2.5 miles farther than another because I can open up my silver and black 2012 Camaro SS on this killer parkway. A simple errand enhanced by passion. The Price..
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
LOL at blaming AMD for nvidia's pricing when nvidia has had the most expensive cards of every generation since the AMD buyout of ATI. So this comes as no surprise.



Imho,

They're competitors and both bring strong competition. When AMD was so disruptive with the 4870's and 4850's pricing, it forced nVidia to lower their prices or risk losing vast amounts of market share -- so, in essence, crediting AMD for nVidia's lower prices was fair. So why can't one raise a similar point when AMD raises prices by 50 percent?
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
Let me put things in perspective for you:
http://www.anandtech.com/show/2222

8800 Ultra:
SPs: 128
ROPs: 24
Core clock: 612MHz
Shader clock: 1.5GHz
Memory data rate: 2.16GHz
Bus width: 384-bit
Memory size: 768MB
Price: $830+

If we then take inflation into account... from 2007 to now... $830+ translates into $900+... for a single GPU.
You now get 2 GPUs... for a little more.

So what was your point?

That people here have no memory?

Relative to the performance gains given, that doesn't really mean much. We all know dual-GPU cards were manufactured; as GPUs advanced, the performance difference from one generation to the next has diminished. Complaints about the price are perfectly valid because the only metric that really counts is price:performance. The fact that AMD and Nvidia now need a dual-GPU card to bring the same performance increases that we used to see from one flagship single-GPU card to the next is irrelevant.
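
For what it's worth, the inflation figure in the quote roughly checks out (a quick estimate using approximate CPI-U annual averages):

```python
# Approximate CPI-U annual averages: 207.3 (2007), 229.6 (2012).
price_2007 = 830
price_2012 = price_2007 * 229.6 / 207.3
print(f"~${price_2012:.0f}")    # about $919, in line with the "$900+" above
```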
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
ye__s, s___li is _____ t__h___e bo__m__b.

sorry my quick reply was a slow read but I microstuttered a bit when I wrote it.

You'll be pleased to know I'm going Tri SLI then :thumbsup:


Is GK110 ever coming, or will the refresh be GK104 with two more SMXs tacked onto it?
 

lOl_lol_lOl

Member
Oct 7, 2011
150
0
0
Dammit, EVGA and co. are gonna have a hard time improving on the reference design.

They will all paint it different colors and call it a day.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Imho,

They're competitors and both bring strong competition. When AMD was so disruptive with the 4870's and 4850's pricing, it forced nVidia to lower their prices or risk losing vast amounts of market share -- so, in essence, crediting AMD for nVidia's lower prices was fair. So why can't one raise a similar point when AMD raises prices by 50 percent?

Because there is no basis for blaming AMD. Nvidia has always priced cards high depending on the competition.

Plenty of people around here just seem to have short or selective memories.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
You'll be pleased to know I'm going Tri SLI then :thumbsup:


Is GK110 ever coming, or will the refresh be GK104 with two more SMXs tacked onto it?

Tri and Quad offer much less micro-stuttering than dual, too. Even though multi-GPU may not be as seamless as a single GPU -- does anyone remember when there wasn't a multi-GPU choice? When one was stuck with just one GPU and had to wait 6-18 months for more performance? It was like a nightmare.

Today, simply add GPUs for more performance and features -- Good Lord, multi-GPU is so very welcome.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Dammit, EVGA and co. are gonna have a hard time improving on the reference design.

They will all paint it different colors and call it a day.

There won't be any AIB boards... all GTX 690 cards will come from NVIDIA.
AIBs can then try and put a sticker on... I wish they wouldn't.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I can see some people still don't get the difference between FPS... and frametimes... that would be okay... if they didn't post based on their lack of knowledge.

I suggest this read:
http://techreport.com/articles.x/21516
Several posters in this thread sadly need to read up.

That way we won't clutter the thread with false information needing to be debunked.

Unless cluttering the thread with false arguments is the plan?

Lon, don't start with this crap unless you want another vacation. Stick to the tech; what other people are doing is for the mods to deal with.
-ViRGE

Lon, if that was directed at me and/or Lava, could you be more specific? Because the TR article, like others I've seen on microstuttering, vindicates those who criticize SLI/CF for microstutter rather than debunking the microstutter argument.

I haven't had time to sit through a recording of the NV presentation, but I thought someone on here said NV has a plan for fixing AFR timings by smoothing them out, even at the expense of measured fps. If this is NOT true, it's no big deal to me, since I don't like SLI/CF for reasons beyond microstutter anyway (power draw, heat, noise, space, the need for a bigger PSU, driver and bug issues, perf/price, etc.).

If it IS true, I am very happy about Nvidia taking the lead on this, similar to how NV's adaptive VSYNC and its focus on the central monitor in Surround probably help the user experience, too. Possibly also GPU Boost. Smoother gameplay is something everyone should be happy about.

Also, thanks for the techreport heads-up. I stopped going there a while back; glad to know they stepped up their GPU analysis. It's also interesting that a stock 7970 does pretty well against a stock 680 under TR's 99th-percentile and 50ms tests. An overclocked 7970 ought to do even better. I didn't get my card just for gaming, but it's nice to know it's good for games, too.
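
For anyone curious, TR's two headline numbers are easy to reproduce from any frame-time log; a rough sketch (the sample data is made up):

```python
# TechReport-style metrics: 99th-percentile frame time and time beyond 50 ms.
frame_times_ms = [16.7, 18.2, 15.9, 52.3, 17.1, 16.4, 61.0, 16.9]

ordered = sorted(frame_times_ms)
p99 = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]
beyond_50 = sum(t - 50.0 for t in frame_times_ms if t > 50.0)

print(f"99th percentile: {p99:.1f} ms, time beyond 50 ms: {beyond_50:.1f} ms")
```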

And thanks mods: bad attitudes/rudeness have no place on this forum.
 
Last edited:

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Until now, nobody has done anything objective regarding MS, just subjective findings. Like I said in my previous post, there are tools for the job; why not use them?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Until now, nobody has done anything objective regarding MS, just subjective findings. Like I said in my previous post, there are tools for the job; why not use them?

It's probably not in the reviewer's guide ;)
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
It's probably not in the reviewer's guide ;)
I think NV should push it. I bet AMD has similar tools; they should push them as well. We are dealing with science here, not fiction, so this MS debate should be dealt with once and for all.
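
One objective measure that would help settle it: the average swing between consecutive frame times, which is near zero for an even cadence and large under AFR microstutter. A minimal sketch with illustrative numbers:

```python
# Crude microstutter score: mean absolute delta between consecutive frames.
frame_times_ms = [10, 22, 10, 22, 10, 22]      # hypothetical AFR pattern

swings = [abs(a - b) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
print(f"mean swing: {sum(swings) / len(swings):.1f} ms")   # 12.0 ms here
```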