AMD HD8000 Series [Or: Here we go again...]


Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
AMD continued with their small-die strategy and switched from a graphics-focused architecture to a more versatile one. This is why it wasn't a huge boost over the 6970 in gaming. In compute, however, the difference was enormous. I think it's a bit silly that you seem to think gamers are the only people that use GPUs.

Nvidia's original flagship GPU outright failed. Thus, they were left with GK104 as their high end instead of their GK100 chip -- unless you think that millions of dollars of R&D were thrown away without good reason.

In both instances, neither company "postponed their flagship" in the sense you speak of. AMD gained compute at the expense of gaming, while Nvidia's chip was faulty.

Compute has become especially important now that Intel has entered the HPC market on the GPGPU side of things. The clash between GK110, Venus XTX (if that does end up being the codename) and Intel's MIC is going to be huge.

Yes, the prices this generation are outright absurd, but this is largely an issue on two fronts: TSMC's inability to keep up with 28nm demand from all of its partners, and Nvidia's failure to launch competitors to AMD's 7700 and 7800 series. It also seems like AMD's change in management may have had a hand in the high prices.

So no, your position is completely erroneous. You are pointing your finger in the wrong direction. This was not the result of AMD and Nvidia "postponing their flagships." It's really just a "perfect storm" -- lots of negative factors affecting consumers at the same time. With any luck, TSMC's supply issues will be resolved by the end of the year, we'll have GTX 660 launched this month (and it will hopefully be competitive), and the large die chip(s) we've been waiting for will surface at the beginning of next year.

I'm confused. The GK100 existed? It was faulty? Where are you getting all this info from?

And you seem to harp on nVIDIA failing to launch their mid-range parts, but what is the point of releasing those cards if the previous-gen cards with price cuts do the job perfectly fine? You seem to argue that one must release new mid/low-end products for the sake of completing the "product lineup," and because the competition has new cards, even when financially it doesn't make much sense.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
I'm confused. The GK100 existed? It was faulty? Where are you getting all this info from?
If it didn't exist, then why is GK104 named GK104 and not GK100? Do you not understand Nvidia's chip nomenclature?

Then there's this:
http://semiaccurate.com/2012/02/07/gk110-tapes-out-at-last/

And you seem to harp on nVIDIA failing to launch their mid-range parts, but what is the point of releasing those cards if the previous-gen cards with price cuts do the job perfectly fine? You seem to argue that one must release new mid/low-end products for the sake of completing the "product lineup," and because the competition has new cards, even when financially it doesn't make much sense.
Er, financially it makes tons of sense. The margins on these 28nm GPUs are much larger. Since Nvidia apparently has a lot of old inventory to clear, they're still at fault for overshooting.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Because GK104 is based on GF104, and GK100/110 would be based on GF100, which is essentially the same thing as GF110.

If GK100 failed, then I don't see why they'd go to GK110 since GK110 would be exactly the same as GK100 with some revisions. I'm pretty sure the GF100 that hit stores in 2010 wasn't rev. 1.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
GK110 is going to be incredibly memory bandwidth/ROP bound. It's strongly focused on compute at the expense of gaming.

How do you know for sure? The released white paper for Tesla K20 doesn't indicate such. The memory subsystem is going to be 150% of GK104's, and the bandwidth of the consumer GK110 will likely depend on the speed of the GDDR5 available at the time of manufacturing.

If anything is going to keep GK110 from reaching its full potential on the desktop, it will be the power constraint (250W).

And the same goes for AMD as well. Since they're both on 28nm for the foreseeable future, it will be tough to squeeze out even 20% more without redesigning Tahiti to fit in the power envelope (again, 250W; beyond 250W for a single GPU, people will not take it too kindly, and you can forget about OEMs).

I'd think Pitcairn may be a better candidate for such a redesign, if AMD wants to be at the top on the desktop.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I'd expect GK100 to be a many-core, low-clock card.

No "GHz" edition. But that should aid in overclocking so long as Nvidia doesn't gimp it on purpose, the high leakage and notorious power draw of GF100 is what lead to people popping off 50%+ overclocks on water and 35-40% on air, or getting a space heater at stock (sorry for your loss).
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
How do you know for sure? The released white paper for Tesla K20 doesn't indicate such. The memory subsystem is going to be 150% of GK104's, and the bandwidth of the consumer GK110 will likely depend on the speed of the GDDR5 available at the time of manufacturing.
Math. Shader count is increasing 90%, bus width is increasing 50%, ROP count is increasing 50%. The bandwidth problem can be mitigated by not lowering the memory clock as much as the core clock (reducing clocks is inevitable), but ROPs are tied to the core clock.
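
For anyone who wants that math spelled out, here's a rough back-of-the-envelope sketch. The shader/bus/ROP counts are the commonly cited GK104 and full GK110 configurations; the clock speeds are placeholder guesses on my part, not confirmed figures:

```python
# Rough scaling comparison between GK104 and a full GK110.
# Unit counts are the commonly cited configurations; clocks are illustrative guesses.
gk104 = {"shaders": 1536, "bus_bits": 256, "rops": 32,
         "core_mhz": 1006, "mem_mtps": 6000}   # GTX 680 reference clocks
gk110 = {"shaders": 2880, "bus_bits": 384, "rops": 48,
         "core_mhz": 850,  "mem_mtps": 6000}   # hypothetical consumer clocks

for name, c in (("GK104", gk104), ("GK110", gk110)):
    # bandwidth = effective transfer rate (MT/s) * bus width (bits) / 8 bits per byte
    bw_gbs = c["mem_mtps"] * 1e6 * c["bus_bits"] / 8 / 1e9
    # pixel fill rate scales with ROP count and core clock, not memory clock
    fill_gpix = c["rops"] * c["core_mhz"] / 1e3
    # bytes of bandwidth per shader-clock -- a crude "starvation" metric
    bytes_per_shader_cycle = bw_gbs * 1e9 / (c["shaders"] * c["core_mhz"] * 1e6)
    print(f"{name}: {bw_gbs:.0f} GB/s, {fill_gpix:.1f} Gpix/s, "
          f"{bytes_per_shader_cycle:.2f} B per shader-cycle")
```

With those placeholder clocks, bandwidth per shader-cycle barely moves, which is exactly the point: the 50% wider bus only keeps pace with the 90% extra shaders if the memory clock doesn't drop along with the core clock.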

If GK100 failed, then I don't see why they'd go to GK110 since GK110 would be exactly the same as GK100 with some revisions. I'm pretty sure the GF100 that hit stores in 2010 wasn't rev. 1.
Er, there's a much larger difference between GK10x and GK11x than a simple leakage-stomping revision.

No "GHz" edition.
Please, it's bad enough seeing that moniker applied to AMD cards.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
If it didn't exist, then why is GK104 named GK104 and not GK100? Do you not understand Nvidia's chip nomenclature?

Then there's this:
http://semiaccurate.com/2012/02/07/gk110-tapes-out-at-last/

Er, financially it makes tons of sense. The margins on these 28nm GPUs are much larger. Since Nvidia apparently has a lot of old inventory to clear, they're still at fault for overshooting.

What is there to understand about a nomenclature that doesn't mean anything? For all I know, GK100/GK110 or whatever could be called "big daddy sexy" by nVIDIA employees starting from today. These names can change at any time and hold no real value/meaning. Being so obsessed with the idea that because it's named GK104, GK-whatever must exist, is pointless speculation.

Margins for 28nm GPUs are much larger? How so? Just reading your last statement makes me question your understanding of how the economics work...
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Oh geez. Can we talk about hardware, instead of whatever the behind-the-scenes story of these corporations is?
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
What is there to understand about a nomenclature that doesn't mean anything? For all I know, GK100/GK110 or whatever could be called "big daddy sexy" by nVIDIA employees starting from today. These names can change at any time and hold no real value/meaning. Being so obsessed with the idea that because it's named GK104, GK-whatever must exist, is pointless speculation.
You expending so much effort on your silly assertion is far more pointless. Why did you bother to write something so devoid of intelligence?

Margins for 28nm GPUs are much larger? How so? Just reading your last statement makes me question your understanding of how the economics work...
It's pretty damn obvious from yours that you have no idea how wafer economics works.
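
Since the wafer point keeps getting hand-waved past, here's the kind of arithmetic it rests on. The die areas are the commonly cited figures for GF110 and GK104; the wafer prices are made-up placeholders purely to show the shape of the argument, not real TSMC numbers:

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order approximation of gross dies per wafer
    (ignores defects, scribe lines and reticle limits)."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# (die area in mm^2, hypothetical wafer cost in $) -- wafer costs are placeholders
chips = {
    "GF110 @ 40nm": (520, 5000),
    "GK104 @ 28nm": (294, 7000),
}

for name, (area, wafer_cost) in chips.items():
    dies = gross_dies_per_wafer(area)
    print(f"{name}: ~{dies} gross dies per wafer, ~${wafer_cost / dies:.0f} per die")
```

Even if a 28nm wafer costs considerably more, the much smaller die yields roughly twice as many candidates per wafer, and GK104 still sold at flagship prices despite its mid-range die size. That's what the margin claim is about.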
 

borisvodofsky

Diamond Member
Feb 12, 2010
3,606
0
0
Try gaming at 5760x1200. See, the thing is, this card is not marketed to you. You don't need it because you probably game at 1080p and you are okay with medium settings or no AA sometimes.

These top-of-the-line cards are marketed to people running 1440p+.

Also, what you have stated isn't really true. Skyrim with all the mods, AA, and increased draw distance does a pretty good job - and that engine is basically ancient.

Max Payne 3, Metro 2033 (although I do agree with the statement below that some of the options are small improvements at a high cost), and Battlefield 3 are some games that can put a hurting on a single high-end card even at 1080p currently.

Now try using 3 of those screens and see if the next generation hardware is necessary or not.

What a joker. I've had 2560x1600 since the x1900xtx days. I'm an OG, you brat. :D

You and your ghetto triple monitor can suck it.

Higher resolution is great, but we hit diminishing returns on that subject long ago.
 

borisvodofsky

Diamond Member
Feb 12, 2010
3,606
0
0
Why do you guys even listen to the "we don't need more powerful stuff" whiners? They are on the wrong side of history and they know it. Nobody forces anyone to buy new stuff.
I for one would like to play my games in all their glory with SSAA at a constant 60fps. Go figure :)

I'm not whining about "we don't need the stuff."

I'm whining about the mismatch between hardware and software.

It's great to have the hardware, but no company can currently support the creation of software that would take advantage of it. :D
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Math. Shader count is increasing 90%, bus width is increasing 50%, ROP count is increasing 50%. The bandwidth problem can be mitigated by not lowering the memory clock as much as the core clock (reducing clocks is inevitable), but ROPs are tied to the core clock.


Er, there's a much larger difference between GK10x and GK11x than a simple leakage-stomping revision.


Please, it's bad enough seeing that moniker applied to AMD cards.

You're assuming, of course, that they'll ship 6000MHz on the new bus.

People are getting gains from GK104 with core overclocks without touching memory, so it can't be totally starved.

Where are you getting your information on GK100 and GK110?
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
You're assuming, of course, that they'll ship 6000MHz on the new bus.
I'm not assuming that -- my comment explicitly stated otherwise. It is possible that Nvidia is able to maintain the high memory clocks, but going higher than 1500MHz is unlikely for both AMD and Nvidia.
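
For anyone tripped up by 1500MHz vs. "6000MHz": GDDR5 transfers four bits per pin per memory-clock cycle, so both numbers describe the same part. A quick sketch of what that means for bandwidth, assuming the consumer card keeps GK110's full 384-bit bus:

```python
# GDDR5 is quad-pumped: a 1500 MHz memory clock gives a 6000 MT/s effective
# data rate, which is where the "6000MHz" shorthand comes from.
mem_clock_mhz = 1500
effective_mtps = mem_clock_mhz * 4                        # 6000 MT/s

bus_bits = 384                                            # assumed consumer GK110 bus width
bandwidth_gbs = effective_mtps * 1e6 * bus_bits / 8 / 1e9
print(f"{effective_mtps} MT/s x {bus_bits}-bit = {bandwidth_gbs:.0f} GB/s")
# -> 288 GB/s, vs. 192 GB/s for GK104's 256-bit bus at the same memory clock
```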
People are getting gains from GK104 with core overclocks without touching memory, so it can't be totally starved.
There's no such thing as a hard bottleneck.
Where are you getting your information on GK100 and GK110?
GK110 has been fully detailed, with the exception of clock speeds. It's highly likely that clock speeds will be lower in order to keep power consumption and thermals in check.

As far as GK100 goes, there's very little information on it. It was scrapped long before Tahiti ever paper-launched.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The 2nd generation of the GCN architecture ought to be a fair bit better.
I'd be happy with a 40% across-the-board performance increase over Tahiti XT. :thumbsup:

I'd actually be surprised. Pleasantly surprised, but surprised nonetheless.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Nvidia's original flagship GPU outright failed. Thus, they were left with GK104 as their high end instead of their GK100 chip -- unless you think that millions of dollars of R&D were thrown away without good reason.

Just because there was a GF100 doesn't mean that there also was a GK100. There is no evidence to support it one way or the other.
 

Aeiou

Member
Jan 18, 2012
51
0
0
I was just thinking the other day that we need some new rumours to spark things up into a frenzy; I've been getting bored as hell.

About goddamn time (also, AMD is terrible; there, let's get going).
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
I was just thinking the other day that we need some new rumours to spark things up into a frenzy; I've been getting bored as hell.

About goddamn time (also, AMD is terrible; there, let's get going).

haha
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
If this 8000 series GPU is going to be so good, why doesn't CEO Rory move some of AMD's GPU engineers to the CPU side so their high-end CPUs can compete with Intel's?
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
That's what has gone through my head every time I've bought a lottery ticket for the past 5 years.
If you're equating a 40% generational leap with "winning the lottery," then you have absolutely no concept of GPU design.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Funny watching people troll themselves.

1080p 120Hz gaming is of real benefit, and hardware is lacking in modern titles. You can get close with 2 GPUs, but we don't have the CPUs to push those frames anyway in some modern titles (BF3 especially). For the first time in my gaming life, I am not very interested in these new cards. I can see it now: I install two of them, get super excited, load up BF3 and watch my framerate drop by a few FPS due to slightly more CPU overhead than with the GK104 cards.
Instead of getting 60% GPU usage, I'll be getting 30%. Sounds like something to get real excited about.

Give me a better CPU please.

My favorite quote of all time. It was the CEO of Nvidia who said this several years ago regarding CPUs: "You don't need a fast one anymore." Lawlerz.
 