7950 vs GK-104


Arzachel

Senior member
Apr 7, 2011
903
76
91
@Vulgar Display: Thanks, I'm guessing it will be right around GTX580 performance. Maybe 5% over. Either one of us could be wrong though. :D

I kinda doubt they would leave a 20% hole in their lineup; ~14% fewer SPs and a slightly lower clock speed would put it around 15% slower than an HD 7970.
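As a rough sanity check, treating performance as roughly proportional to shader count times clock - a minimal sketch using the 7970's known specs and illustrative rumored numbers for the 7950 (the exact cut-down isn't confirmed):

Code:
# Back-of-the-envelope scaling: performance ~ shader count * core clock.
# HD 7970 specs are known; the 7950 figures below are illustrative rumored values.
hd7970_sp, hd7970_clk = 2048, 925   # shaders, core clock in MHz
hd7950_sp, hd7950_clk = 1792, 900   # assumed cut-down part

ratio = (hd7950_sp * hd7950_clk) / (hd7970_sp * hd7970_clk)
print(f"Estimated 7950 throughput: {ratio:.0%} of a 7970 (~{1 - ratio:.0%} slower)")
# -> roughly 85%, i.e. about 15% slower, in line with the estimate above.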

Crap Daddy said:
Was talking about AMD and the last few years.
Yeah that's true, but they have been fighting an uphill battle since the HD 4XXX series. Looks like they'd rather compete on performance instead of price.
 

Justinat0r

Member
Dec 18, 2011
41
0
0
Ok, so the information I've taken from this thread is that the 7950 and the GK-104 are presumed to be rough equivalents of each other in the pricing "scheme", and there will be a 7950 1.5GB model and a 7950 3GB model. The 3GB model of the 7950 is where the $450 figure came from, but the $400 1.5GB version should roughly line up with the GK-104's rumored price.

I guess now the only 'hypothetical' :p decision to make is 3GB vs 1.5GB, and to find out if the GK-104 will have a 3GB version as well.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Really the only decision you need to make regarding the VRAM on the cards is whether you plan on using either 3 monitors or a 2560x1600 screen.

Anything less than that should be fine at 1.5GB.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
VulgarDisplay said:
Really the only decision you need to make regarding the VRAM on the cards is whether you plan on using either 3 monitors or a 2560x1600 screen.

Anything less than that should be fine at 1.5GB.

And by the time you really need more, the card will probably be too slow to make use of it anyway.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Justinat0r said:
Ok, so the information I've taken from this thread is that the 7950 and the GK-104 are presumed to be rough equivalents of each other in the pricing "scheme", and there will be a 7950 1.5GB model and a 7950 3GB model. The 3GB model of the 7950 is where the $450 figure came from, but the $400 1.5GB version should roughly line up with the GK-104's rumored price.

I guess now the only 'hypothetical' :p decision to make is 3GB vs 1.5GB, and to find out if the GK-104 will have a 3GB version as well.

I'm still pretty confused why you're comparing anything to the GK-104, which as far as we know doesn't exist. If you're going to wait for something that nobody has any information on, EXCEPT fake slides and rumors, heck you might as well wait on 8970 or the GTX 780. The only thing anyone has seen on it is a slide at fudzilla which was proven to be a fake, fan made slide at chiphell and OCN.

I personally go all out when waiting on rumored / non existent parts :p
 

hclarkjr

Lifer
Oct 9, 1999
11,375
0
0
Anybody have any idea when the NDA will be lifted on the 7950 so the reviews can be published?
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
February most likely.

I thought the rumors were that the 7950 would launch at the end of this month?
So like 2 weeks from now or less, there should be a 7950 on the market.

I think it was from Fudzilla... but any news is better than none.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Actually 768 cores is quite unreasonable for the GK104 part barring huge changes to the arch. AMD got 1.7x scaling going from 40nm to 28nm. A midrange part wouldn't be slightly smaller than a GTX580 with slightly lower power draw.

Look at the history. Things are "usually" doubled. Or at the very least increased by 50%.

Start with 5800Ultra with 8 pipes. 6800Ultra with 16 pipes. 7800/7900 with 24 pipes.
Then on to CUDA arch:
8800Ultra 128 CUDA cores. GTX280 240 CUDA cores. GTX480 480 CUDA cores.
Die shrink scaling be damned. Know what I mean?
And as far as this drastic architectural change? We know nothing about it so I would (for now) put a pin in that for later when it's closer to launch and leaks start to manifest.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
blackened23 said:
I'm still pretty confused why you're comparing anything to the GK-104, which as far as we know doesn't exist. If you're going to wait for something that nobody has any information on, EXCEPT fake slides and rumors, heck you might as well wait on 8970 or the GTX 780. The only thing anyone has seen on it is a slide at fudzilla which was proven to be a fake, fan made slide at chiphell and OCN.

I personally go all out when waiting on rumored / non existent parts :p

It's probably a safe bet there will be the following SKUs for GK series. Mimicking the GF series.
GK100, GK104, GK106, GK108
and then the refresh
GK110, GK114, GK116, GK118

Same as the GTX4xx and GTX5xx series. No reason to think next gen won't be the same. And after Kepler:

GM100, GM104, GM106, GM108
refresh
GM110, GM114, GM116, GM118 etc. etc.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Just buy the HD 7950 and be done with it. It's useless waiting for a product that doesn't exist yet.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
Keysplayr said:
Look at the history. Things are "usually" doubled. Or at the very least increased by 50%.

Start with 5800Ultra with 8 pipes. 6800Ultra with 16 pipes. 7800/7900 with 24 pipes.
Then on to CUDA arch:
8800Ultra 128 CUDA cores. GTX280 240 CUDA cores. GTX480 480 CUDA cores.
Die shrink scaling be damned. Know what I mean?
And as far as this drastic architectural change? We know nothing about it so I would (for now) put a pin in that for later when it's closer to launch and leaks start to manifest.

The doubling of execution units isn't free - 175W to 236W to "250W" TDP, with hefty amounts of power tune for the last one. They simply can't afford a 250W TDP on their midrange part, because that would leave no space for a GK100/110. I'd love to see Nvidia pull a rabbit out of their hat, but it seems that they too will have to be conservative this round, just like AMD is with Tahiti.
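To put a number on that headroom argument - a quick sketch assuming the ~300W ceiling for a single PCIe card (75W from the slot plus 75W six-pin plus 150W eight-pin):

Code:
# TDP headroom a big GK100/110 would have over a hypothetical 250W midrange part,
# given the ~300W limit for a single PCIe card (75W slot + 75W 6-pin + 150W 8-pin).
pcie_ceiling_w = 300
midrange_tdp_w = 250  # the hypothetical 250W GK104 discussed above

headroom_w = pcie_ceiling_w - midrange_tdp_w
print(f"Room left for the big chip: {headroom_w}W "
      f"(only {headroom_w / midrange_tdp_w:.0%} more power budget)")
# -> 50W, i.e. only ~20% above the midrange part before hitting the spec limit.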
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Really can't say much until we learn if they really did remove hot clocks, and of course get more concrete spec leaks.

Until then the only thing we can assume is that Nvidia will bring the best they can within the time allotted to them.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Arzachel said:
The doubling of execution units isn't free - 175W to 236W to "250W" TDP, with hefty amounts of power tune for the last one. They simply can't afford a 250W TDP on their midrange part, because that would leave no space for a GK100/110. I'd love to see Nvidia pull a rabbit out of their hat, but it seems that they too will have to be conservative this round, just like AMD is with Tahiti.

How do you explain GF104 or GF114? ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Arzachel said:
The doubling of execution units isn't free - 175W to 236W to "250W" TDP, with hefty amounts of power tune for the last one. They simply can't afford a 250W TDP on their midrange part, because that would leave no space for a GK100/110. I'd love to see Nvidia pull a rabbit out of their hat, but it seems that they too will have to be conservative this round, just like AMD is with Tahiti.

Not sure where you're getting your numbers, or if you're excluding the die shrink to 28nm. At any rate, generation after generation we have the same conversation, more or less: "They simply can't afford to do this or that, or not do this or that." But guess what, they almost always end up fine with what they do. Even with the GTX480 and its die size and TDP. It pushed limits for sure, but performed accordingly.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
BallaTheFeared said:
Really can't say much until we learn if they really did remove hot clocks, and of course get more concrete spec leaks.

Until then the only thing we can assume is that Nvidia will bring the best they can within the time allotted to them.

Agreed. But I can tell you that they'll take as much time as they need, and are on their own schedule regardless of other companies' schedules. Shouldn't be too long to wait as long as there are no major hiccups.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
8800 GTX - 90nm - 128 cores - 690M transistors - TDP 155W
8800 Ultra - 90nm - 128 cores - 690M transistors - TDP 175W

GTX 280 - 65nm - 240 cores - 1.4B transistors - TDP 236W
GTX 285 - 55nm - 240 cores - 1.4B transistors - TDP 204W

GTX 480 - 40nm - 480 cores - 3.2B transistors - TDP 250+W
GTX 580 - 40nm - 512 cores - 3.0B transistors - TDP 244W

Edit:
GK100 - 28nm - 1024 cores - ~6B transistors - 250+W ??????
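Reading that list as generation-over-generation ratios makes the trend clearer - a quick sketch using the numbers above (the GK100 row is speculative, as in the post):

Code:
# Generation-over-generation scaling from the flagship specs listed above.
# The GK100 row is speculative.
gens = [
    ("8800 Ultra", 128, 0.69, 175),   # (name, cores, transistors in billions, TDP in W)
    ("GTX 280",    240, 1.40, 236),
    ("GTX 480",    480, 3.20, 250),
    ("GK100 (?)", 1024, 6.00, 250),
]

for prev, nxt in zip(gens, gens[1:]):
    print(f"{prev[0]} -> {nxt[0]}: "
          f"cores x{nxt[1] / prev[1]:.2f}, "
          f"transistors x{nxt[2] / prev[2]:.2f}, "
          f"TDP x{nxt[3] / prev[3]:.2f}")
# Cores and transistors roughly double each step, while TDP growth has already
# flattened out against the ~300W PCIe ceiling.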
 
Last edited:

Arzachel

Senior member
Apr 7, 2011
903
76
91
Keysplayr said:
Not sure where you're getting your numbers, or if you're excluding the die shrink to 28nm. At any rate, generation after generation we have the same conversation, more or less: "They simply can't afford to do this or that, or not do this or that." But guess what, they almost always end up fine with what they do. Even with the GTX480 and its die size and TDP. It pushed limits for sure, but performed accordingly.

Look at AtenRa's list. While the 300W PCIe limit seemed far off just a few generations ago, it hits hard now and will shape this and upcoming generations. It's actually funny that people are bemoaning the lack of a G8X-like generational leap, when that leap increased power draw by 45%. That kind of shit just won't fly anymore.
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Arzachel said:
Look at AtenRa's list. While the 300W PCIe limit seemed far off just a few generations ago, it hits hard now and will shape this and upcoming generations. It's actually funny that people are bemoaning the lack of a G8X-like generational leap, when that leap increased power draw by 45%. That kind of shit just won't fly anymore.

It's all about pushing the limits, and I'm all for that 110%. Bring it.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
Yeah, because gtx 460 got smoked by gtx 285...oh, wait...

The GTX 460 didn't smoke the GTX285 either.

[Relative performance charts at 1920x1200 and 2560x1600]


I think it was BFG10K who had a review where he was annoyed that the GTX 460 was actually slower at exotic AA levels.

So I wouldn't be shocked if the GK104 was at GTX 580 performance levels.

But all this depends on the actual architecture and the state of the 28nm process.

And on whether AMD's cards are actually a gauge for the 28nm process, or whether AMD is paying the GPGPU toll NVIDIA paid a few years ago, which would make them a poor guide to the scaling and power consumption NVIDIA can attain from the 28nm process.
 
Last edited:

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Arzachel said:
Look at AtenRa's list. While the 300W PCIe limit seemed far off just a few generations ago, it hits hard now and will shape this and upcoming generations. It's actually funny that people are bemoaning the lack of a G8X-like generational leap, when that leap increased power draw by 45%. That kind of shit just won't fly anymore.

Right, but you're not accounting for a GTX760 being produced on a process node a full level lower, i.e. 40nm -> 28nm. With specs that should put it around ~10% faster than a 580, there's no way it'll come close to a 250W TDP unless they give it some absolute monster clock rates and/or actually spec it to be that much better than the 580 - and either of those scenarios would mean the 760 is that much faster than the 580.

Which wouldn't really make sense, as it would leave very little room for the 780 to stretch its legs - unless nVidia pulls a stunt where the 780 is an absolute monster chip given very conservative clock rates just to meet TDP, leaving it up to consumers to clock it beyond spec for mind-blowing performance while shattering traditional TDP levels... not unlike what we've seen with dual-GPU cards in the past.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
BallaTheFeared said:
Really can't say much until we learn if they really did remove hot clocks, and of course get more concrete spec leaks.

Until then the only thing we can assume is that Nvidia will bring the best they can within the time allotted to them.

Yup. I think the successor to Fermi will be worthy, even if it takes a long time to get here. The success of the 5870 in 2009 didn't seem to faze NV much, although a couple of AIB makers went under or switched.