GTX 470 pics *EDIT* possible benchmarks


Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
I don't know if I buy those Crysis numbers at all. Heaping grain of salt, for sure. I can't find any review of the 5870 that gives me even the slightest indication they are accurate.

Even if it is a legit review done poorly, it will not be representative of real performance. That is one hell of a dip in minimum FPS, and it isn't repeated on any top-line review site.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
So do you have any particular example of a GPU that has good averages but then is horrible in minimums?

Here is an example:

http://www.hardocp.com/image.html?image=MTI1OTUzMDk0MzJrbFc4SXVKVVhfNl8yX2wuZ2lm

The 5770 has an average frame rate of 55.
The GTS 250 has an average frame rate of 59.

So based on the averages, the performance delta is within 7%. But when you look at that graph, the 5770 is often much slower than that 7% delta would lead you to believe.

Here's another one: http://www.hardocp.com/image.html?image=MTI1NTg4NjU5MDdkSGZvS05ZZ21fNV8yX2wuZ2lm

In this one the 5850's average is less than 10% higher than the 4890's, yet look at the graph. The 4890 has several moments where its frame rate dips far lower than the 10% average difference would indicate.
 

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
Wow, disappointing indeed. Everyone in need of a video card might as well get a 5850 when the price drops.

For those of you with SLI boards, be sure to get 2 GTX480s :D
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Here is an example:

http://www.hardocp.com/image.html?image=MTI1OTUzMDk0MzJrbFc4SXVKVVhfNl8yX2wuZ2lm

The 5770 has an average frame rate of 55.
The GTS 250 has an average frame rate of 59.

So based on the averages, the performance delta is within 7%. But when you look at that graph, the 5770 is often much slower than that 7% delta would lead you to believe.

Sure, but the 5770's lower minimum (28 fps, to be precise) happened at a single point. The rest of the time it was above 30 fps.

In that case all the cards are quite erratic.

Here's another one: http://www.hardocp.com/image.html?image=MTI1NTg4NjU5MDdkSGZvS05ZZ21fNV8yX2wuZ2lm

In this one the 5850's average is less than 10% higher than the 4890's, yet look at the graph. The 4890 has several moments where its frame rate dips far lower than the 10% average difference would indicate.

Curiously, the 5850 has the lower minimum frame rate in that graph.



In both cases, just looking at the averages you would think the GTS 250 is faster and the 5850 is faster in those games.

Which is true, and the averages show it.

Now if you looked only at the minimums, you would wrongly think the 4890 was faster than the 5850.
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
It will be kind of sad if it tessellates extremely well but otherwise matches or underperforms the 5870. Severe price to pay to be 'ahead of its time,' as tessellation is only on the minds of a few right now. Maybe Fermi 2 will be the ass kicker if the current scenario plays out badly for them.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Don't get me wrong here, I'm not putting the idea down like so many others did when the 5800 Ultra introduced the then-absurd notion of taking up more than one slot for cooling.

I just find the situation humorous, even if the idea is great and innovative.

Heck, we have new 5970 "ultras" coming with purposefully designed 3-slot coolers.

It makes one question whether or not there's a better way.

Why is it so absurd that a video card (with up to a 300 watt spec) has a triple-slot cooler?

People put much larger tower coolers on their lower-wattage CPUs all the time.
 

DefRef

Diamond Member
Nov 9, 2000
4,041
1
81
Um, folks... are we "sure" that's a good score? The screenshot shows the GTX 470 getting 13264 3DMarks in 3DMark 06 at default settings on a stock E8600.

http://service.futuremark.com/compare?3dm06=12604682 - an E6750 at stock speed with a Radeon 4890 pulls 12685 in 3DMark 06. Can't say I'm impressed.
I just benched the rig in my sig and got 21076. (Proof.) How is that ~13000 score supposed to be anything other than fail?
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Remember that the tessellation performance is better in a tessellation benchmark. NV hardware uses shaders to do the tessellation heavy lifting; ATI has dedicated hardware. So in a game the overall frame rate may not be higher on an NV card -- the increased tessellation performance would come at a direct cost in "normal" performance, since the same silicon is doing both jobs.

Can't wait for benchmarks from more trustworthy sources. However, I'm not very optimistic about the price, performance, or price/performance of this generation's NV offering. Too bad; I really need a decent upper-midrange GPU that doesn't force me to give up Linux.
 

ScorcherDarkly

Senior member
Aug 7, 2009
450
0
0
If accurate, the benches posted in this thread...



Granted, it is debatable whether these are accurate and we don't have a FRAPS graph, but given the limited data we have, the GTX 470 looks like the better card for Crysis specifically with 8xMSAA. A dip into the teens is still pretty bad, but single digits is definitely a slide show.

The point he's trying to make is that just being given a number for the minimum frame rate isn't enough to draw any conclusions about the quality of the card. What if the period of 4.96 fps lasted for a total of 1 second out of the entire test, with every other moment being significantly faster than that minimum? If that were the case, the minimum frame rate means nothing, because it isn't representative of the card's most common "slow" speed. This is why having a graph of fps over time is important: you can see how long the absolute minimum lasts, and also what the "normal" frame rate of the card is.

The best piece of information in these benchmarks is the average, as it represents the entire length of the test. The max and min may each represent as little as a single moment. Without seeing a graph, there's no way to know whether they represent more than that.
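To make that concrete, here's a minimal sketch (Python) of how you'd pull the average, the minimum, and the time actually spent slow out of a per-frame time log (e.g. a FRAPS frametimes dump). The trace below is invented for illustration; it is not the leaked Crysis data.

```python
# Minimal sketch: why a lone minimum-fps number can mislead.
# The frame-time trace below is invented for illustration only; it is not
# the leaked benchmark data discussed in this thread.

frame_times_ms = [40.0] * 600 + [200.0] * 5 + [40.0] * 600  # ~25 fps run with a 1-second dip to 5 fps

total_time_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_time_s
min_fps = 1000.0 / max(frame_times_ms)

# How much of the run was actually spent below 20 fps (frame time over 50 ms)?
slow_time_s = sum(t for t in frame_times_ms if t > 50.0) / 1000.0

print(f"average: {avg_fps:.1f} fps")   # ~24.6 fps
print(f"minimum: {min_fps:.1f} fps")   # 5.0 fps
print(f"time below 20 fps: {slow_time_s:.1f}s of {total_time_s:.0f}s")  # 1.0s of 49s
```

The average barely notices a one-second dip, which is exactly why a minimum needs a timeline (or at least a duration) attached to mean anything.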
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I just benched the rig in my sig and got 21076. (Proof.) How is that ~13000 score supposed to be anything other than fail?

My system scores about 15,000. So I guess my 5750 is faster than a GTX 470? :rolleyes::\

3DMark is worthless with higher-end cards.
It is NO indication of how fast a card can game.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I just benched the rig in my sig and got 21076. (Proof.) How is that ~13000 score supposed to be anything other than fail?

3DMark06 is (for all practical purposes) a CPU benchmark, since the trial version only runs the tests at a very low resolution.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Remember that the tessellation performance is better in a tessellation benchmark. NV hardware uses shaders to do the tessellation heavy lifting; ATI has dedicated hardware.

Are those same shaders responsible for anti-aliasing performance? (As you can tell, I am no hardware expert.)
 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
I'm with you on this. I think this fall will have a much more exciting and competitive GPU field. If it does not, then I may start believing all the NVIDIA-is-abandoning-PC-gaming naysayers.

Which is funny because a few of my friends have been asking me what they should do for a new computer. I told them to make do with what they have until around October when everything new is coming out in full swing. I think this fall is going to be a good time to build a new PC.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
It could be that the GTX 470 was at 15 fps for, say, 15 seconds and the 5870 was at 5 fps for 1 second, leaving the GTX 470 with the higher minimum (I know my math is bad).

What card would you go for then?
As you say, minimums are more important to you than averages.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
It could be that the GTX 470 was at 15 fps for, say, 15 seconds and the 5870 was at 5 fps for 1 second, leaving the GTX 470 with the higher minimum (I know my math is bad).

What card would you go for then?
As you say, minimums are more important to you than averages.

Only problem with that is the GTX 470 would have a much lower average if it dipped to 15 fps for 15 seconds.
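A quick back-of-the-envelope check of that, using made-up numbers (a hypothetical 60-second run that otherwise holds 25 fps, not anything from the leaked bench):

```python
# Hypothetical 60-second run, 25 fps outside the dip -- invented numbers,
# only meant to show how the length of a dip drags the average down.

def avg_fps(normal_fps, normal_secs, dip_fps, dip_secs):
    """Average fps over a run split into a normal stretch and a dip."""
    frames = normal_fps * normal_secs + dip_fps * dip_secs
    return frames / (normal_secs + dip_secs)

print(avg_fps(25, 45, 15, 15))  # 15 s at 15 fps -> 22.5 fps average
print(avg_fps(25, 59, 5, 1))    # 1 s at 5 fps   -> ~24.7 fps average
```

So a card that spends 15 seconds at its minimum gives up a couple of fps off its average, while a one-second dip to 5 fps barely moves it.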
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Yeah, I get you, I was just exaggerating to make a point. The 470 would have to spend more time at its higher minimum to end up with a similar average.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
The real issue is that a minimum means nothing without a plot (or at least more info). The mean represents what it is, and we all understand its significance.

I'm not sure why some reviewers started throwing min and max in there (likely to differentiate themselves from the other guy), but it is too vague to be of value unless it's part of a set of values with some information on timing and method.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
So due to a single bench you are declaring that the GTX 470 is a much stronger performer than a 5870...

What is the reason for the dip? Was it a single frame, or did it last for a period of time?

That is not at all what I said. What I said was: given the very small amount of unconfirmed data, the GTX 470 looks to be better than the 5870 specifically in Crysis with 8xMSAA.

We don't know the reason for the dip, but if the benches are valid it's there for the 5870 and not for the GTX 470. Draw whatever conclusions you want to from that.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
That is not at all what I said. What I said was: given the very small amount of unconfirmed data, the GTX 470 looks to be better than the 5870 specifically in Crysis with 8xMSAA.

We don't know the reason for the dip, but if the benches are valid it's there for the 5870 and not for the GTX 470. Draw whatever conclusions you want to from that.

I can't draw any conclusion, other than that the 470 seems to take less of a hit with more AA in Crysis, which could be explained, for example, by the extra memory.

Considering those frame rates, I wouldn't be playing at those settings with any of these cards: 5 fps or 15 fps is unplayable for me, and so is the 23 fps average both get.

Additionally, that 5 fps dip can't be sustained for long periods; otherwise the average would be considerably lower, or the maximum of 37 would have to be sustained for a long time.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Remember that the tessellation performance is better in a tessellation benchmark. NV hardware uses shaders to do the tessellation heavy lifting; ATI has dedicated hardware. So in a game the overall frame rate may not be higher on an NV card -- the increased tessellation performance would come at a direct cost in "normal" performance, since the same silicon is doing both jobs.
2 things:

1) I'm not sure where this "NVIDIA doesn't have a tessellator" meme comes from. It's very clear in the GF100 articles that NVIDIA has a hardware tessellator; 16 of them, in fact.

2) DX11 tessellation impacts the shaders by design. Hull and domain shading in the tessellation process run on the shaders, so you're going to load the shaders on any architecture when using tessellation.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
It would be nice if every game had a benchmark like FEAR's, telling you the % of time the card spent above or below X fps. Then we would know whether those 5 fps are "real" or just a half-second thing.
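That breakdown is straightforward to compute if a reviewer publishes per-frame times (e.g. a FRAPS frametime dump). A rough sketch, with invented input since no such log exists for these leaked benches, using thresholds along the lines of what FEAR's built-in test reported (below 25 fps, 25-40 fps, above 40 fps):

```python
# FEAR-style breakdown: share of total run time spent below, between, and
# above two fps thresholds, computed from per-frame times in milliseconds.
# Input data is invented; no per-frame log exists for the leaked benches.

def time_share(frame_times_ms, low_fps=25.0, high_fps=40.0):
    total = sum(frame_times_ms)
    below = sum(t for t in frame_times_ms if 1000.0 / t < low_fps)
    above = sum(t for t in frame_times_ms if 1000.0 / t >= high_fps)
    return below / total, (total - below - above) / total, above / total

sample = [40.0] * 600 + [200.0] * 5 + [40.0] * 600  # invented trace: ~25 fps with a 1 s dip to 5 fps
below, middle, above = time_share(sample)
print(f"below 25 fps: {below:.1%}, 25-40 fps: {middle:.1%}, above 40 fps: {above:.1%}")
# -> below 25 fps: 2.0%, 25-40 fps: 98.0%, above 40 fps: 0.0%
```

A report like that would tell you immediately whether a 5 fps minimum was a momentary hitch or a real stretch of slide show.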