Nvidia "it's coming..." Facebook tease!

Page 5 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Yup. My GTX580 3GB SLI cannot maintain 60fps in a heavily modded Skyrim with SGSSAA...at 1280x1024. Up the resolution, go 3D and prepare for single digits with one card.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
IMO, at this point people who say these things either A) do know the point of the cards and just want to say they don't because they are jealous or B) seriously don't know the point and are behind the times on current graphics options.

I have been saying this for over a year now: these cards were not made to be marketed to you if you do not know what you would do with the graphics power. They are marketed towards larger resolutions - which will see a great benefit from these cards. I assume your resolution is not larger than 1920x1200, in which case the card is not marketed towards you. If your resolution were higher, you would understand the NEED for this type of horsepower.

This.

I am currently running 560 Ti SLI (2GB models), because in some games (basically racing sims and BF3), I run a tri-monitor 5760x1080 resolution (or 6060x1080 bezel-corrected).
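For a sense of how much heavier that pixel load is than single-monitor gaming, a quick back-of-the-envelope comparison (assuming three standard 1920x1080 panels and ignoring bezel correction):

```python
# Pixels shaded per frame: one 1080p panel vs. a 5760x1080 triple-monitor surround.
single = 1920 * 1080      # 2,073,600 pixels on a single monitor
surround = 5760 * 1080    # 6,220,800 pixels across three monitors
print(surround // single)  # -> 3: three times the pixels to render every frame
```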

I really, really hope to see BigK released soon. Based on the current architecture, it's basically guaranteed BigK will be able to smoke every video card out there, which when you are looking for performance, is a nice thing. :D (this isn't about fanboyism)
When looking at what the 680 is capable of, and then looking at the architecture and realizing it is far smaller and less performance-packed than what the massive GPU version will likely feature, it shouldn't be hard at all to imagine the performance capability of their top-end massive chip.
A 7 Billion Transistor Kepler?
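As a rough upper bound on what that could mean, here is a naive sketch comparing GK104's published 3.54 billion transistors against the rumored 7 billion figure. Real scaling is limited by clocks, power, and memory bandwidth, so treat the ratio as a ceiling, not a prediction:

```python
# Naive best-case scaling estimate from transistor budgets alone.
gk104_transistors = 3.54e9  # GTX 680 (GK104), per Nvidia's published spec
bigk_transistors = 7.0e9    # rumored "BigK" transistor count
ratio = bigk_transistors / gk104_transistors
print(round(ratio, 2))  # -> 1.98: at best, roughly double the resources
```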

If I can get a card that is A) a single-GPU card, that can run triple-monitor, and B) absolutely smashes my current setup and gives me even more eye candy for games at that resolution, I'd really, really want it.
I'm fairly confident BigK will allow me to replace a SLI setup with a single-GPU, and increase visual details at the same time. That is a Win/Win all around - SLI is great most of the time, but micro-stutter pops up from time to time between different games and driver versions, and removing that issue would be awesome.

edit:

If this pans out, it could be an interesting switch to Nvidia's previous product strategy. The x80 cards used to be the top-end single-GPU card featuring their largest GPU (by transistor count/die size), and the x90 cards were dual-GPU typically featuring slightly smaller and tweaked GPUs.
But with Nvidia readily admitting they were underwhelmed by AMD's competition this generation, they are capitalizing on it as best they can. They made their 680 using a smaller GPU that likely would have been their 660 or 670, because that ensured the 680 was around the same performance level as the competition's top card. And they could price it at a point that probably allowed record profit margins, since it wasn't their biggest chip sitting at the $500 price tag. BigK, if released as the 680, probably would have been $600, and would have brought in less of a profit margin just so they could hit a competitive price point (but still, probably a healthy margin any way you look at it).
Now, since they are simply pricing and adjusting strategy according to the competition, instead of keeping everything the same and blowing them out of the water (with less profits), they've reserved their largest GPU for the "go big or go home" video card. They'll be able to pull off a beastly 690 if they use BigK for that product, and have the same performance differentials between the x80/x90 products yet do so with a single GPU this time around.

Hopefully they don't price it at $800 like recent x90 cards, although it would be justified for sure based on the current market, and would allow them the same very healthy margins they are likely seeing with the GK104 in the 680. They could stand to gain far more market share if they price it a little more competitively, and still see better margins than if they stuck to their original product strategy. A $650 GTX 690 would be HUGE, especially if the performance matches up to expectations for that chip. Depending on how much I can sell my 560 Ti 2GB cards for, I might even settle for $700 out of the gate.
Otherwise I'll probably wait until AMD launches a new generation, and settle for a [hopefully] reduced entry cost for the 690.


Of course, they could do something totally opposite to what I'm predicting, but so far this strategy just seems far too fitting.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Yeah I'm really hoping BigK comes soon, at 1080p I was well beyond any need for more power, however at 5900x1080 it's a whole different story.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Yeah I'm really hoping BigK comes soon, at 1080p I was well beyond any need for more power, however at 5900x1080 it's a whole different story.

I'd love to crank up BF3 settings to High or Ultra for my surround setup. It looks pretty good at mostly Low/Med settings, but I remember testing these cards at 1080p and I was just shy of maxing out on Ultra settings (dropped a few settings to High) and BF3 was jaw-dropping gorgeous. I'd kill for that capability at this surround resolution, but without absolutely breaking the bank - which is why I've settled for now... don't feel like dropping over $1000/1200 in GPU-power alone just to obtain eye candy, current price of entry was already more than I've usually spent for GPUs. I knew triple-monitor would require some sacrifices in my usually hell-bent search for eye candy.

To that end - this setup, plus single-monitor 1080p allows a nearly maxed-out Witcher 2 experience, and it's beautiful. :D (just picked that game up)
 

Meaker10

Senior member
Apr 2, 2002
370
0
0
Shifting from 32-bit floating point units to 64-bit is not going to help gaming performance much.
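As a quick illustrative sketch of the precision involved (a general point about float formats, not anything specific to Kepler): game vertex and shader math is done in 32-bit floats, and the rounding error at that precision is already far below anything visible on screen, so wider 64-bit units buy accuracy games don't need:

```python
import struct

# Round-trip a coordinate through 32-bit float storage (the precision GPUs
# use for vertex and shader math) and measure the rounding error.
x = 0.1
x32 = struct.unpack('f', struct.pack('f', x))[0]
err = abs(x32 - x)
print(err)  # on the order of 1e-9, dwarfed by a pixel's worth of precision
```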
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I can get away with decent settings in BF3, just no AA, FXAA on, no AO, my biggest problem is vram… Otherwise I'd just go TRI honestly, but I see no point with 1.28GB :(
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
You just axed two of the most demanding settings.


At 5900x1080 I can do without deferred AA. I wish I could turn AO back on, but not because it looks bad without it, for the epeen, yo.

When you've spent less than $500 two years ago, you have to make some sacrifices on IQ, since nothing in that price range would do any better.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
The supply situation is getting ridiculous. It's been a month now since the gtx680 has been released, and there is still virtually zero stock anywhere. I do not understand what the rush is to release the dual-part, especially when the margins are going to be a little lower vs. selling two gtx680's.

They need and should get the gtx670 out, and then get the channel supplied with both of those products before moving on.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
The supply situation is getting ridiculous. It's been a month now since the gtx680 has been released, and there is still virtually zero stock anywhere. I do not understand what the rush is to release the dual-part, especially when the margins are going to be a little lower vs. selling two gtx680's.

They need and should get the gtx670 out, and then get the channel supplied with both of those products before moving on.

It looks worse than it is. For instance, look at the number of reviews on Newegg for the gtx 680 cards, especially this one: http://www.newegg.com/Product/Product.aspx?Item=N82E16814130768 and compare that to the number of reviews for the 7970 cards, which have been out for much longer. What's happening apparently is that the cards are dribbling in and selling out immediately; it's not a situation where zero cards are coming in at all if the number of reviews is any indication.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
The supply situation is getting ridiculous. It's been a month now since the gtx680 has been released, and there is still virtually zero stock anywhere. I do not understand what the rush is to release the dual-part, especially when the margins are going to be a little lower vs. selling two gtx680's.

They need and should get the gtx670 out, and then get the channel supplied with both of those products before moving on.

Are you unable to obtain a GTX680? Are you even in the market for one?
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
The supply situation is getting ridiculous. It's been a month now since the gtx680 has been released, and there is still virtually zero stock anywhere. I do not understand what the rush is to release the dual-part, especially when the margins are going to be a little lower vs. selling two gtx680's.

They need and should get the gtx670 out, and then get the channel supplied with both of those products before moving on.

It's not even confirmed to be a dual-part, so... ?

There is a very strong chance the 690 will be BigK, because normally the large chip is the x80. If not used in the x80, what else will it be used for? Unless they plan to not even use BigK in consumer products for the 6XX-series, but that would be very, very strange.

With this precedent, I'm not really sure what to expect for the cheaper products of the series; normally, the x70 is a cut-down version of the GPU in the x80, as opposed to using an entirely different GPU like the x60 parts.

If they do it like that, I wouldn't be clamoring for the 670 just yet, as that will mean more of the GK104 being tied up across multiple products, as opposed to only thus far going into the 680.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Unless they plan to not even use BigK in consumer products for the 6XX-series, but that would be very, very strange.

I disagree. Making BigK a dedicated GPGPU chip and very expensive while letting the gtx 680/90 (stripped of most GPGPU so less wasted silicon) take care of gaming would be a reasonable decision.
 

Don Karnage

Platinum Member
Oct 11, 2011
2,865
0
0
(attached image: 2131c665_211a.jpeg)


(attached image: my_body_is_ready.png)
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Are you unable to obtain a GTX680? Are you even in the market for one?


I've obtained 2 so far.

One I got the day after release, the second I snagged a little later on, and it took a week to ship. NewEgg has one or two versions in stock for about 15 minutes each day and they immediately sell out. There's no real way to know if they get 200 or 2 each though.

Amazon has them in stock every now and then as well (newegg taxes my state, amazon does not, despite having a new distribution center here).
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
I disagree. Making BigK a dedicated GPGPU chip and very expensive while letting the gtx 680/90 (stripped of most GPGPU so less wasted silicon) take care of gaming would be a reasonable decision.

Not for those of us who want the performance Nvidia was supposed to bring out this generation.

The x80 has always been the big silicon, with whatever performance they were able to wring out of it. This time, because the competition failed to craft anything remotely capable of what Nvidia can engineer (they have the benefit of a few generations with this engineering method; AMD did not, and it shows), Nvidia capitalized by basically matching price and performance using a "lesser" part than usual for that tier in their lineup.

Nvidia has a big chip they engineered exactly for this purpose, and you think they should keep it out of our hands because they don't need to use it to stay competitive?

I don't want a dual GPU card if they have a single GPU card that can do what I need and replace a dual GPU setup (SLI for multimon plus performance - this would perform better and do the same with a single GPU). I can avoid the pitfalls of SLI and still win.
 

MichaelD

Lifer
Jan 16, 2001
31,528
3
76
GTX670 for me, thanks. I'm still only at 19x12 though I likes me ALL of the eye candy. I've got a brand new system (MB/CPU/Memory) waiting for a new GPU...for months now. :(
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Nvidia capitalized by basically matching price and performance using a "lesser" part than usual for that tier in their brand.

I've seen references to this idea before, that Nvidia is holding back or something, but I missed the detailed discussion/justification/explanation. Is there something in the architecture of the 680, or was it insider/employee information that provided the support for this idea? I don't understand how such a position can be maintained without something more than speculation, but I think I missed the 'something more'. I like Nvidia, so I'm not trying to bash; I just keep seeing the idea and feel bad because I missed the detailed explanation.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
We also saw speculation that AMD was holding back (on clock speed) due to being first to market.

I think we always see speculation?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
It looks worse than it is. For instance, look at the number of reviews on Newegg for the gtx 680 cards, especially this one: http://www.newegg.com/Product/Produc...82E16814130768 and compare that to the number of reviews for the 7970 cards, which have been out for much longer. What's happening apparently is that the cards are dribbling in and selling out immediately; it's not a situation where zero cards are coming in at all if the number of reviews is any indication.

I have seen them in stock myself (and it was awfully tempting to purchase), but it's still bad enough that the e-shelf life can be appropriately measured in minutes.

Are you unable to obtain a GTX680? Are you even in the market for one?

I have seen it in stock, but I am currently not in the market to upgrade my GPU. The points I raised are still entirely valid.