This whole R600/G80 benchmarks thing is nonsense.

Page 2 - AnandTech community forums

terentenet

Senior member
Nov 8, 2005
387
0
0
Yes, the 8800 Ultra WILL be available. Maybe in limited quantities, but it will be available for those who want to pay for it. I'm not saying who or how, but I was given the opportunity to buy 2 Ultras. Don't ask. Will I take the plunge? I don't know; I have to see what that card can and can't do.
So it's not just for reviewers, but for end users too.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: terentenet
We haven't really seen R600 benchmarks yet. Perhaps it's not that bad; perhaps it will give the 8800GTX a run for its money. Perhaps it will beat it in all benchmarks. Or perhaps not. Let's not jump to the assessment that "G80 is better than R600". The NDAs haven't been lifted yet; surprises may come.
Let's wait and see more benchmarks from different sources. Only then can we say who won the high-end market.

8800 Ultra? A speed-bumped GTX, or that's what I was led to believe. I would have been more interested if they had mentioned extra shader units. That's where the power is.
As a poster above said, all GTX owners can volt-mod their cards and reach Ultra speeds. Just have it cooled properly. Water recommended, I'd say.

When will the 8900GTX come out? I believe it won't be long; it's been 6 months since the 8800 came out. 8900 cards should be ready and in testing by now. That's really what I'm waiting for. More shader units, higher clocks, a smaller process: 80nm or even 65nm.
What do we know? Nvidia said G80 is capable of a 512-bit memory interface. The 8900 might bring that to us, forced by R600.

Actually, the R600 is rumored to have only 16 TMUs. That is where it really falls short against G80 (and it's a major reason why G7x remained so competitive against R5xx). ATi and their fans have long held the view that shaders are more important than texturing units, something I've never agreed with and which can already be seen in real life (the R600 will drive home the reality of the situation, though, through lack of performance).
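For a rough sense of what a 16-TMU part would mean, peak bilinear texel fillrate is just TMU count times core clock. A quick sketch; note the clocks and unit counts below are rumored/assumed figures for illustration, not confirmed specs:

```python
# Peak bilinear texel fillrate ~= TMU count x core clock.
# All figures below are rumored/assumed, not confirmed specs.
def texel_rate_gtexels(tmus: int, core_mhz: float) -> float:
    """Peak texel fillrate in gigatexels per second."""
    return tmus * core_mhz / 1000.0

r600_rate = texel_rate_gtexels(16, 740)  # rumored: 16 TMUs at ~740 MHz
g80_rate = texel_rate_gtexels(32, 575)   # assumed: 32 texture address units at 575 MHz

print(f"R600 (rumored): {r600_rate:.1f} GTexel/s")
print(f"G80 (assumed):  {g80_rate:.1f} GTexel/s")
```

Even if the rumored numbers are off, the point stands: with half the texture units, R600 would need a very large clock advantage just to match G80's texturing throughput.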
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
(and it's a major reason why G7x remained so competitive against R5xx).
No, G7x remained "competitive" because it employed sub-par filtering out of the box. When reviewers started using HQ, G7x fell flat on its face in modern shader-based games.

(the R600 will drive home the reality of the situation, though, through lack of performance).
We'll see about that.
 

terentenet

Senior member
Nov 8, 2005
387
0
0
The TMU thing really leaves me in the fog. Why would they implement that few TMUs if that is a bottleneck? I'd think that a chip manufacturer tests its chips much differently than we can. I mean, they must somehow be able to see where the bottleneck is inside the chip and balance it.
They've been developing the R600 for some years now. If R600 turns out not to perform as expected, it means one of 2 things.

1. They can't design a chip.
2. R600 is good, and while designing it, they thought it would be the next best thing since the wheel was discovered. G80 was a kick to the nuts that they didn't expect, and since it launched, they've only pushed back their own launch and begun respinning the silicon to get higher speeds out of it.

Is G80 90nm or 80nm? Which brings me to this question: why does ATI/AMD want the 2900 on 65nm so badly? I guess lower temps and higher clocks.
I don't want to see ATI/AMD take the same road Intel did back in the P4 days - push the clock speeds.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
IMO it's caused by the engineering culture at ATi and the company's origins (founded by mathematicians, producing chips that specialised in math operations). Texturing has never been the strongest part of any ATi chip.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
I'm pretty sure this 8800 Ultra is just another 7800GTX 512: very limited in quantity and created just to try to steal ATi's R600 thunder. After all, the money is NOT in the high end, it's in the midrange. Not everybody can drop more than 300 bucks on a card. Anyone with an 8800GTX thinking of upgrading to such a card is just a fanatic. Even with that overclock, the difference isn't worth the huge amount of money, but hey, it's your money. 0_o
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
http://www.theinquirer.net/default.aspx?article=39262
The Nvidia 8800U samples we saw look just like Theo said with OEM parts going out at 612/1080 clock. The cooler is designed for quieter not cooler operation.

The core OCs decently with about 10% higher achievable, with memory not getting anywhere near that.

On a highly OCed Kentsfield running at about 3.5GHz - all numbers slightly blurred to protect the guilty - 2G of DDR2 running at 1200MHz on a 680i board, the score I can talk about is 22,000+ in 3DMark05.

In the end, it looks like the card is memory bound, not core bound. The Ultra scores better than a GTX with a massive core OC, so it looks like the main upgrade here is the memory. Call me overly skeptical, but is this really worth that much of a premium?



http://www.theinquirer.net/default.aspx?article=39265
YESTERDAY WE said we would keep 8800 Ultra numbers short and sweet, but we got a little long winded.

This time we won't.

Nvidia has told OEMs that the MSRP of the Ultra is $829, and don't expect many. And they all lived happily ever after.

The end.
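The "memory bound" conclusion in that first article is easy to sanity-check with back-of-envelope bandwidth math. A hedged sketch; the 384-bit bus and the GTX's 900 MHz memory clock are commonly reported figures I'm assuming here, with only the 1080 MHz Ultra clock coming from the quote:

```python
# Peak memory bandwidth = effective transfer rate x bus width in bytes.
def mem_bandwidth_gbs(effective_mts: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s, given effective rate in megatransfers/s."""
    return effective_mts * 1e6 * (bus_bits // 8) / 1e9

ultra = mem_bandwidth_gbs(2 * 1080, 384)  # 1080 MHz GDDR3, double data rate
gtx = mem_bandwidth_gbs(2 * 900, 384)     # assumed stock GTX memory clock

print(f"8800 Ultra: {ultra:.1f} GB/s")
print(f"8800 GTX:   {gtx:.1f} GB/s")
```

That works out to roughly 20% more bandwidth from the memory bump alone, which lines up with the observation that the Ultra beats even a heavily core-overclocked GTX.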

 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: apoppin
[quoted INQ articles snipped; see apoppin's post above]

LOL best INQ article ever

Here we go 7800GTX 512 all over again
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ShadowOfMyself
Originally posted by: apoppin
[quoted INQ articles snipped; see apoppin's post above]

LOL best INQ article ever

Here we go 7800GTX 512 all over again

that's why i posted it

--they are parodying themselves ... and it IS funny
[this time]

it *appears* that nvidia is SO confident ... they don't really care to answer the HD2900XT[x]
:p
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
It gets even better with the new CFAA anti-aliasing modes.

Just when you thought blurry AA like quincunx was gone forever, it shows up again.

6x 8x New AA slide CFAA

Note the samples in the outer circles come from outside the current pixel having AA applied (same concept as quincunx, just more samples). This will almost surely lead to blurring (just like quincunx).
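The blurring argument is easy to demonstrate: any resolve filter that mixes in samples from neighboring pixels smears color across edges. A toy sketch of a quincunx-style resolve; the weights (center 1/2, four diagonal neighbors 1/8 each) are Quincunx's published ones, but I'm applying them to whole pixels rather than sub-pixel samples, so this only illustrates the principle:

```python
import numpy as np

def quincunx_style_filter(img: np.ndarray) -> np.ndarray:
    """Blend each pixel with its four diagonal neighbors (wraps at borders)."""
    out = 0.5 * img.astype(float)
    for dy, dx in ((-1, -1), (-1, 1), (1, -1), (1, 1)):
        out += 0.125 * np.roll(np.roll(img.astype(float), dy, axis=0), dx, axis=1)
    return out

# A hard vertical edge: black on the left, white on the right.
edge = np.zeros((4, 4))
edge[:, 2:] = 1.0
blurred = quincunx_style_filter(edge)
# In the interior columns the once-sharp 0 -> 1 edge becomes 0.25 -> 0.75:
# contrast bleeds into pixels that were never part of the edge.
```

The same arithmetic is why Quincunx looked soft: any weight given to out-of-pixel samples is weight taken away from detail inside the pixel.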

 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Originally posted by: swtethan
Some people are willing to pay for high end cards.

Nvidia wants to steal ATI's thunder.

I'd DEFINITELY pay $350+ for a kicka$$ graphics card... and already have, quite often... actually more than that.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Gstanfor
The following images are apparently using R600 16x CFAA Anti-aliasing

image1
image2

I did notice what appears to be some sort of rendering artifact on the images (marked with red circle) - discrepancy

LOL, when not magnified, it looks pretty damn nice actually.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Shame about that polygon I circled that stands out like a dog's d*** though. (I'm not responsible for uploading or in any way manipulating the first two images, BTW - my only contribution is the discrepancy shot.)

source of images used

We could probably dub CFAA "Joan Collins" AA - she used to insist on softening filters when filming closeups as she aged.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: keysplayr2003
Originally posted by: Gstanfor
The following images are apparently using R600 16x CFAA Anti-aliasing

image1
image2

I did notice what appears to be some sort of rendering artifact on the images (marked with red circle) - discrepancy

LOL, when not magnified, it looks pretty damn nice actually.

You're right... it looks plain awesome, as it would look when playing games. Using magnification to ruin it is just stupid.
 
Oct 4, 2004
10,515
6
81
Originally posted by: Gstanfor
The following images are apparently using R600 16x CFAA Anti-aliasing

I did notice what appears to be some sort of rendering artifact on the images (marked with red circle) - discrepancy

Originally posted by: Gstanfor
Shame about that polygon I circled that stands out like a dog's d*** though.

I must be blind because after a lot of zooming and peering, I still don't see a doggy d*** in that image :confused:
What kind of supreme vision do you have?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Just use the discrepancy image as a guide for where to look on the original image. If you can't see a triangle, apex pointed downward, textured with sky and grille, noticeably lighter than the surrounding sky, then you are well and truly overdue for a checkup with your eye doctor.

You don't have to zoom anything, in fact you can completely disregard the zoomed image.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
Just when you thought blurry AA like quincunx was gone forever, it shows up again.
Yet I don't seem to recall you lambasting Quincunx.

Note the samples in the outer circles come from outside the current pixel having AA applied (same concept as quincunx was, just more samples). This will almost surely lead to blurring (just like quincunx)
Quincunx did the sampling from the frame buffer after the image was finished. It seems likely ATi's method will AA during rendering so it should have less blur.

Also, assuming the shots weren't taken with AAA, this'll be serious competition for nVidia's SSAA modes, as ATi's method might well have better image quality and run faster too.

The fence is clearly getting treatment and also ATi's slides are saying CFAA isn't restricted to polygon edges like MSAA/CSAA is.

I did notice what appears to be some sort of rendering artifact on the images (marked with red circle)
A rendering artifact you say? You mean like this one?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
The lighter patch isn't meant to be there.

Quincunx was OK at a time when it took two accelerator chips to get 4x AA semi-playable. The world has moved on since then (and the competitor has a superior AA scheme).

Perhaps ATi's drivers aren't quite as perfect as they like to preach that they are.
 

BlizzardOne

Member
Nov 4, 2006
88
0
0
And ATi put that lighter patch there..?? I suppose they put it in the non-AA'd pic too, just for consistency's sake..

And where on earth do they preach about perfect drivers? If they were perfect, wtf would they bother with release notes every release? :confused: