AMD HD8000 Series [Or: Here we go again...]

Feb 19, 2009
10,457
10
76
Actually, perf/W decreased quite a bit with RV870 vs RV970. It will be interesting to see if AMD will commit to 250+W real-world consumption (although the GHz edition already reaches 250W at times).

Oi...

perfwatt_1920.gif


It's a huge improvement in perf/W.

Average gaming load:
power_average.gif


Peak gaming load:
power_peak.gif


Pretty damn reasonable outside of Furmark or some sort of power virus.
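
For anyone who wants to sanity-check a perf/W chart themselves, it's just average frame rate divided by average board power. A quick sketch with made-up numbers, purely to show the arithmetic (nothing below comes from the charts above):

```cpp
// Made-up numbers, only to show how a perf/W comparison is computed.
#include <cstdio>

int main()
{
    const double fpsA = 60.0,  wattsA = 190.0;  // hypothetical card A
    const double fpsB = 40.0,  wattsB = 150.0;  // hypothetical card B

    const double ppwA = fpsA / wattsA;          // frames per second per watt
    const double ppwB = fpsB / wattsB;

    std::printf("card A: %.3f FPS/W, card B: %.3f FPS/W, A vs B: %+.0f%%\n",
                ppwA, ppwB, (ppwA / ppwB - 1.0) * 100.0);
    return 0;
}
```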
 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
Metro max with DoF.

Dirt Showdown ultra lights.

There are a few others; essentially, some games have performance viruses as "features" that, if you enable them, tank performance for very little visual gain. Only do it if you are a masochist.

Adding another: The Witcher 2's Ubersampling. My god, does that stuff look good though. It was almost playable with two 7970s at 1920x1200.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
I know the specs outlined in the OP are plain speculation; it's just plain horrid even at that. There are some things we already know, or can be pretty sure about, that are outright ignored.

The die size is painfully wrong. For one, AMD themselves have said that Tahiti was developed with redundancy in mind, to minimize process defects and get the chip out as soon as possible. Another thing that goes against it is the fact that chip design begins around three years before release. While Rory Read is pushing for more aggressive marketing and pricing, we won't really see that influencing design for at least a generation, if not two.

But that's the boring part, I want to know what architectural changes AMD has in store. ;)
 

Spjut

Senior member
Apr 9, 2011
933
163
106
Funny watching people troll themselves.

1080p 120Hz gaming is of real benefit and hardware is lacking in modern titles. You can get close with two GPUs, but we don't have the CPUs to push those frames anyway in some modern titles (BF3 especially).
Give me a better CPU please.

My favorite quote of all time. It was the CEO of Nvidia who said this several years ago regarding CPUs: "You don't need a fast one anymore." Lawlerz.

I've gotten the impression that the APIs are a big culprit.

This article about DirectX's overhead mentions the PC's problems with draw calls, something the consoles don't seem to have:
http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2

Since you mentioned BF3: Repi confirmed that BF3 doesn't use DX11 multithreaded rendering (which was supposed to help with the draw call problem) because it apparently didn't fit their use case.
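
For context, "DX11 multithreaded rendering" means deferred contexts and command lists. A minimal sketch of that path, assuming a device and immediate context already exist (the draw call is just a placeholder, not anything from DICE's engine):

```cpp
// Minimal sketch of DX11 deferred-context rendering. Assumes `device` and
// `immediateContext` were created elsewhere; the single Draw() stands in for
// real scene submission.
#include <d3d11.h>

void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediateContext)
{
    // A worker thread creates a deferred context and records commands into it.
    ID3D11DeviceContext* deferred = nullptr;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return;

    // Set state and issue draw calls on `deferred` as if it were the immediate
    // context; nothing reaches the GPU yet, it is only recorded.
    deferred->Draw(3, 0); // placeholder draw

    // Bake the recorded work into a command list.
    ID3D11CommandList* commandList = nullptr;
    if (SUCCEEDED(deferred->FinishCommandList(FALSE, &commandList)))
    {
        // The main thread replays it on the immediate context. In theory this
        // spreads draw-call submission cost across cores; in practice the
        // driver still serializes much of it, which is the overhead problem
        // the bit-tech article describes.
        immediateContext->ExecuteCommandList(commandList, FALSE);
        commandList->Release();
    }
    deferred->Release();
}
```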
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
HD5870 was below the 200W mark, HD7970 is above. They simply don't have the luxury this time to go any higher, unless they'd like to follow the NVIDIA way: huge monolithic dies of 250W+.

But 20% more shaders over the 389mm² of Tahiti will not make a huge 500mm²+ die.

Just because "they simply don't have the luxury this time to go any higher" doesn't mean we are going to get free performance without a power draw increase. The gtx480 -> gtx580 is the only time when a new architecture on the same node produced higher speeds and lower power draw. But GF100 was a dog, and everyone including Nvidia knew that. I don't think GCN is a dog, or is "broken" in anyway. Sure, there are probably improvements AMD can make, but adding functionality and increasing die size within the same node makes it really, really, really difficult to keep power draw the same. Wishing really hard simply won't be good enough.

That said, I think they can and will definitely improve their performance per watt metric.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
8,501
9,931
136
There isn't much difference between GF100 and GF110; FP16 is faster, but it has limited effect on actual results since its use is limited.

Other than that they fixed some leaky transistors by cutting them out and slapping a higher tech cooler on the reference card instead of an old 427 with header pipes coming out the hood.

Only if you count a PR stunt that sounds like a jet on takeoff. If that were the case, Nvidia could just release a couple of cherry-picked 1350 cards to reviewers and call it the "1.21GW" edition. AMD flip-flops so hard it's like watching a politician: one gen it's dual-GPU cards that count; the next, since they can't feasibly produce a decent dual card due to high power, they switch over to single with a PR stunt card that will never go on Newegg.

-A company markets on its strengths, I don't suppose you expect them to do any different. We don't see NV parading around "crappy tacked-on physics effects that rob you of performance for marginal visual gain!". Also it's important to be able to distinguish what the AMD fanatics are saying vs. what the company line is. Sometimes it's hard to tell the difference.

You could make the same argument that power draw didn't matter with the GTX280 through GTX580, where we had multi-page threads with proofs of how little it would cost to run a GTX480 over a HD5870 (a 100W delta, no less!), but now there is a 30W power delta in favor of the GTX680 and the 7970 is a power whore that will bankrupt you.

Holy crap, you're ignorant.

Not mid-range card, mid-range die. ~300mm² is not large at all for a modern GPU. Since GT200, Nvidia flagship dies have been 500mm²+. ~300mm² is an x60-level die size.

If you knew anything about GPUs, you'd know this.

-I completely understand where you're coming from, we all know the GK110 Tesla beast was probably meant to be the "real" GTX680. NV couldn't make it commercially viable, so they dropped down to their next part. Great, we all get it. GK104 then became their high-end part, and was released as the GTX680. Therefore, GK104 is the current generation's high-end part. Where it stands relative to prior NV high-end chips is irrelevant.

The HD3800/HD4800/HD5800/HD7900s are all mid-range dies too then, since the 2900XT was 420mm² and they haven't gotten back up to that die size yet. Disagree?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
-A company markets on its strengths, I don't suppose you expect them to do any different. We don't see NV parading around "crappy tacked-on physics effects that rob you of performance for marginal visual gain!". Also it's important to be able to distinguish what the AMD fanatics are saying vs. what the company line is. Sometimes it's hard to tell the difference.

You could make the same argument that power draw didn't matter with the GTX280 through GTX580, where we had multi-page threads with proofs of how little it would cost to run a GTX480 over a HD5870 (a 100W delta, no less!), but now there is a 30W power delta in favor of the GTX680 and the 7970 is a power whore that will bankrupt you.

Apples and oranges oh my.

PhysX is amazing; the only problem is that not enough games take advantage of it.

Even the lower-end Bullet and Havok physics are awesome and game-changing; BFBC2 was more fun than BF3 because of the building destruction.

Knock it cause you don't got it.


Check the performance numbers of the 480 vs. the 5870 in modern games; the 5870 is getting trashed by the 470 in several big AAA titles released recently, let alone the 480. Apples and oranges oh my.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
8,501
9,931
136
Apples and oranges oh my.

PhysX is amazing; the only problem is that not enough games take advantage of it.

Even the lower-end Bullet and Havok physics are awesome and game-changing; BFBC2 was more fun than BF3 because of the building destruction.

Knock it cause you don't got it.


Check the performance numbers of the 480 vs. the 5870 in modern games; the 5870 is getting trashed by the 470 in several big AAA titles released recently, let alone the 480. Apples and oranges oh my.

-Finally you and I agree. What DICE's in-house engine did for BFBC2, PhysX has never done for any game that has used it. Please, show me a game where PhysX did more than simply add visual clutter and actually changed fundamental gameplay. I have used PhysX, as you might have noticed from the GTX460 in my rig (amazing card), but I wasn't terribly impressed.

And you're just flat-out wrong on the 470 vs. 5870 point. Anand's GPU2012 bench shows the 5870 winning more than it loses to the 470. Using the 570 as a stand-in for the GTX480, the performance lines up more or less as you'd expect. Arkham City and Civ 5 are the only games that go heavily in favor of NV.

Anyhow, I got suckered into a pointless debate that's wildly off topic. I'll save you the time of writing a response to this topic and I'll just say that you win this test of wills.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
BFBC2 was more fun than BF3 because of the building destruction.

BF3 has building destruction. But DICE's building destruction in BC2 and BF3 has nothing to do with physics; they are pre-built destruction sequences. The same wall will fall the same way every time, unlike real physics or NVIDIA PhysX, where the wall and the environment interact and physics plays a role in how the wall falls.
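
To put that distinction in code terms, here's a toy sketch (invented types, not DICE's or NVIDIA's code): a pre-built sequence replays identically every time, while a simulated fragment's path depends on the impulse it actually received.

```cpp
// Toy illustration of pre-built vs simulated destruction.
#include <cstdio>

struct Vec3 { float x, y, z; };

// Pre-built destruction: a fixed keyframed path. Only time goes in, so the
// same wall falls the same way in every match.
Vec3 ScriptedFragmentPos(int fragmentId, float t)
{
    return { fragmentId * 0.5f, 3.0f - 2.5f * t, 0.0f };
}

// Simulated destruction: the fragment carries state and integrates forces, so
// where and how hard it was hit changes the outcome.
struct Fragment { Vec3 pos, vel; };

void StepFragment(Fragment& f, float dt)
{
    f.vel.y -= 9.8f * dt;              // gravity
    f.pos.x += f.vel.x * dt;
    f.pos.y += f.vel.y * dt;
    f.pos.z += f.vel.z * dt;
    if (f.pos.y < 0.0f) {              // crude ground bounce
        f.pos.y = 0.0f;
        f.vel.y *= -0.3f;
    }
}

int main()
{
    Fragment frag{ {0, 3, 0}, {2.5f, 1.0f, 0} }; // initial velocity = impulse from the hit
    for (int step = 0; step < 10; ++step)
        StepFragment(frag, 0.1f);
    std::printf("scripted y at t=1: %.2f, simulated y at t=1: %.2f\n",
                ScriptedFragmentPos(0, 1.0f).y, frag.pos.y);
    return 0;
}
```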
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
-Finally you and I agree. What DICE's in-house engine did for BFBC2, PhysX has never done for any game that has used it. Please, show me a game where PhysX did more than simply add visual clutter and actually changed fundamental gameplay. I have used PhysX, as you might have noticed from the GTX460 in my rig (amazing card), but I wasn't terribly impressed.

And you're just flat-out wrong on the 470 vs. 5870 point. Anand's GPU2012 bench shows the 5870 winning more than it loses to the 470. Using the 570 as a stand-in for the GTX480, the performance lines up more or less as you'd expect. Arkham City and Civ 5 are the only games that go heavily in favor of NV.

Anyhow, I got suckered into a pointless debate that's wildly off topic. I'll save you the time of writing a response to this topic and I'll just say that you win this test of wills.

This is fun; we'll take them one at a time.

Ghost Recon Advanced Warfighter - Ageia Island:
http://www.youtube.com/watch?v=OqldT1BkCXo

Better physics than any BF game... despite being old.

Fully destructible architecture, dynamic wind (interacting with foliage), shrapnel that kills... this could have been common in all games... if AMD (and their followers) hadn't decided the grapes were sour...
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
This is fun; we'll take them one at a time.

Ghost Recon Advanced Warfighter - Ageia Island:
http://www.youtube.com/watch?v=OqldT1BkCXo

Better physics than any BF game... despite being old.

Fully destructible architecture, dynamic wind (interacting with foliage), shrapnel that kills... this could have been common in all games... if AMD (and their followers) hadn't decided the grapes were sour...

Better graphics than any BF or PhysX title:

http://www.youtube.com/watch?v=_paAM9bhBNI
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
- Please, show me a game where PhysX did more than simply add visual clutter and actually changed fundamental gameplay. I have used PhysX, as you might have noticed from the GTX460 in my rig (amazing card), but I wasn't terribly impressed.

Imho,

Understandable based on your grading context. You may need to see drastic changes in fundamental gameplay -- but how is that going to be possible? You're asking a developer to literally lock out everyone without the hardware in order to deliver that drastic change; I doubt very much we'll see drastic changes in fundamental gameplay so much as fidelity, realism and modest gameplay enhancements.

Maybe, just maybe, as GPU physics matures and evolves it will become possible to fundamentally change gameplay, but for now I still appreciate fidelity, realism and modest gameplay enhancements while I wait for the industry to come together and forge some kind of standard that can mature and evolve.

To me, GPU PhysX is just a vehicle to try to get the ball rolling.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
GPU PhysX is "visual physics" not "gameplay physics". It is a graphical effect, nothing less, nothing more. I thought by now everyone had understood that.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The days of HavokFX, with both nVidia and ATI evangelizing GPU Physics strongly!

AMD certainly did a 180° after Intel acquired Havok and killed off HavokFX.

Ever wondered WHY Intel did that?
(Hint: Intel's CPUs were getting seriously kicked in that arena... and they had no immediate counter... AMD's sudden rise was still fresh in Intel's mind. A wise company comes up with a plan...)

The first step was slowing down GPU physics.
Buy Havok, remove HavokFX from the public, leave NVIDIA as the sole player... with AMD suddenly opposing GPU physics... due to having no physics API.

Step complete.

Second step:

Develop their own "GPU"-like architecture and integrate it with Havok (which has had the new HavokFX (2.0) slipstreamed into it).

Larrabee failed as a GPU, but don't rule out the notion of Intel using their "Larrabee" IGPs for physics... especially since most gamers have a discrete GPU and aren't using the IGP in their CPU when gaming... so it sits idle, just waiting for something to do.
Hardware physics could be such a task.

And thus kicking both AMD and NVIDIA in the groin... while they wait for "Terascale" CPUs to be the norm... and thus killing off the need for a GPU entirely.

Need more power?
Just add another CPU.

Intel could sell mobos with 4 sockets, where most people only used one socket.
Got gaming needs?
Install one more CPU.
Got workstation needs?
Install 4 CPUs.

But then again, Intel might have something totally different planned... only time will tell...
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
BF3 has building destruction. But DICE's building destruction in BC2 and BF3 has nothing to do with physics; they are pre-built destruction sequences. The same wall will fall the same way every time, unlike real physics or NVIDIA PhysX, where the wall and the environment interact and physics plays a role in how the wall falls.

I guess it's just the lack of destructibles in BF3 that got me then.
 

flopper

Senior member
Dec 16, 2005
739
19
76
I guess it's just the lack of destructibles in BF3 that got me then.

BC2 was more fun by miles than BF3.
Even if scripted, it adds to immersion and gameplay.
I think DICE dropped the ball big time on BF3; I've been a fan since BF1942 and pinball, but dudes, they blew it on BF3's design.
Wish they'd hired me as a consultant there.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,818
1,553
136
False, try looking into PhysX APEX 1.2:

http://youtu.be/9lCkB77it-M

GPU rigid bodies...notice how the debris...destroys objects?

GPU tier 1 physics in action...sorry.

You'll only ever see that level of PhysX in tech demos though. Nobody is going to make a game that is only playable by the portion of the PC market using a certain brand of graphics cards. By making PhysX a closed standard, Nvidia doomed it to irrelevance.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
AMD certainly did a 180° after Intel acquired Havok and killed off HavokFX.

Ever wondered WHY Intel did that?
(Hint: Intel's CPUs were getting seriously kicked in that arena... and they had no immediate counter... AMD's sudden rise was still fresh in Intel's mind. A wise company comes up with a plan...)

The first step was slowing down GPU physics.
Buy Havok, remove HavokFX from the public, leave NVIDIA as the sole player... with AMD suddenly opposing GPU physics... due to having no physics API.

Step complete.

Second step:

Develop their own "GPU"-like architecture and integrate it with Havok (which has had the new HavokFX (2.0) slipstreamed into it).

Larrabee failed as a GPU, but don't rule out the notion of Intel using their "Larrabee" IGPs for physics... especially since most gamers have a discrete GPU and aren't using the IGP in their CPU when gaming... so it sits idle, just waiting for something to do.
Hardware physics could be such a task.

And thus kicking both AMD and NVIDIA in the groin... while they wait for "Terascale" CPUs to be the norm... and thus killing off the need for a GPU entirely.

Need more power?
Just add another CPU.

Intel could sell mobos with 4 sockets, where most people only used one socket.
Got gaming needs?
Install one more CPU.
Got workstation needs?
Install 4 CPUs.

But then again, Intel might have something totally different planned... only time will tell...

Imho,

I don't know if ATI/AMD did a 180; I still feel they find GPU processing and GPU physics important, but they may not feel investing resources into IHV-specific proprietary technologies is a wise move -- a very valid view.

While proprietary may bring chaos, division and fragmentation, it also brings innovation, awareness and choice. nVidia may be doing the right thing for their strategies, and AMD may be doing what is right for theirs.

I agree with nVidia but certainly respect anyone who agrees with AMD's strategy. I personally believe choice, innovation and trying to get the ball rolling are worth some of the negatives of proprietary.