This whole R600/G80 benchmarks thing is nonsense.


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: josh6079
sure ... i have 2 million bookmarks all organized into instant access :p

:roll:
Why the sarcasm? I didn't ask for 2 million links, just one. If you don't have it, it's no problem. Just thought I'd ask since I had never thought that still images could reveal shimmering.
if you like, i could spend the next six months looking for a photo for you
Such devotion.

:heart:

The sad part is, I actually believe you would do something like that. I know you wouldn't let a topic (whatever the topic may be) fade away. It's just not your style. ;)

i have 2,000,000 bookmarks ... *one* of them may have your pic

--the *best* i could do was that hexus.net review ... with an *iffy* shimmering pic
- i am *done* looking ...

who knows ? ... perhaps i was mistaken :p
:confused:

i know i didn't see much shimmering with my 7800GS -in January
... as i don't see much [any] with my x1950p now

i think the shimmering issue is mostly resolved ;)

--and by x850xt ... Sacrifice ran pretty well on ATi cards :p
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Gstanfor
Why has the talk about shimmering just popped up out of the blue?
BFG10K in troll mode.

It's a good thing ATi is finally (supposedly) going to implement decent AF, given they were the ones who cheapened it with R200 and bilinear rather than trilinear texture stage samples, and we consumers have had to live with the fallout ever since.

but 'shimmering' notwithstanding, when comparing g7x to r5xx, ati's AF was far superior to nvidia's, so let's not pretend nvidia has a perfect history either... not to mention the performance dive g7x's took when you removed all their default optimizations....
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: CaiNaM
Originally posted by: Gstanfor
Why has the talk about shimmering just popped up out of the blue?
BFG10K in troll mode.

It's a good thing ATi is finally (supposedly) going to implement decent AF, given they were the ones who cheapened it with R200 and bilinear rather than trilinear texture stage samples, and we consumers have had to live with the fallout ever since.

but 'shimmering' notwithstanding, when comparing g7x to r5xx, ati's AF was far superior to nvidia's, so let's not pretend nvidia has a perfect history either... not to mention the performance dive g7x's took when you removed all their default optimizations....

You don't have to tell me that. If you search my posts @ B3D you'll see where I expressed my disappointment in nv4x's AF implementation very early on, and I'm very pleased indeed it's been abandoned (hopefully for good) in G8x.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: apoppin
Originally posted by: Lord Banshee
Not sure if this was posted yet but here are some tech slides

http://forums.vr-zone.com/showthread.php?t=148492

no one wants to see anything that is remotely "on topic"
what is r600?
:confused:

this IS "video" :p

:roll:

keep them *flames* coming
:laugh:

No, no apoppin, the ATi fanboys (not including you here) don't want to see anything remotely on topic. They keep getting brutal reminders of the old saying "the truth hurts". Frankly I look at what R600 brings to the table and the bandwidth on offer from the memory subsystem and I just wonder what the hell were the ATi engineers thinking?!? Were they thinking? or were they smoking something hallucinogenic? :evil:
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
You don't have to tell me that. If you search my posts @ B3D you'll see where I expressed my disappointment in nv4x's AF implementation very early on, and I'm very pleased indeed it's been abandoned (hopefully for good) in G8x.
Then why did your previous statement make it out to be "ATi's fault" by saying that:
It's a good thing ATi is finally (supposedly) going to implement decent AF, given they were the ones who cheapened it with R200 and bilinear rather than trilinear texture stage samples, and we consumers have had to live with the fallout ever since.
Besides, you shouldn't even be talking about AF since you think the more rings the better. :roll:

Shimmering aside, you never answered my question. Is this small, almost insignificant rendering error in HL2's chain-link fence the only graphical problem you can dig up with ATi's unreleased hardware with unreleased drivers?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Originally posted by: apoppin
Originally posted by: Lord Banshee
Not sure if this was posted yet but here are some tech slides

http://forums.vr-zone.com/showthread.php?t=148492

no one wants to see anything that is remotely "on topic"
what is r600?
:confused:

this IS "video" :p

:roll:

keep them *flames* coming
:laugh:

No, no apoppin, the ATi fanboys (not including you here) don't want to see anything remotely on topic. They keep getting brutal reminders of the old saying "the truth hurts". Frankly I look at what R600 brings to the table and the bandwidth on offer from the memory subsystem and I just wonder what the hell were the ATi engineers thinking?!? Were they thinking? or were they smoking something hallucinogenic? :evil:

read the last page of http://translate.google.com/translate?u...=UTF-8&oe=UTF-8&prev=%2Flanguage_tools

and you may know what they were thinking ;)

very interesting little OnT link ... thanks Lord Banshee
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
No, no apoppin, the ATi fanboys (not including you here) don't want to see anything remotely on topic.
The OP started this thread discussing their dislike for soft launches that do nothing but impress people with benchmarks.

You then tried to make the claim that the R600 won't impress people with its performance due to a lack of TMUs, a piece of information you yourself claimed to be a rumor.

In fact, you labeled nVidia's G7 series' less efficient TMUs as "the reason why g7x remained so competitive against R5xx", when in fact it was because the default settings nVidia told websites to run the games at were of terrible IQ and, therefore, not very taxing on performance (i.e., default driver settings with horrible AF).

Then you abandoned that conversation and started linking pics of a "dog's d***" of a rendering error.

The way I see it, you don't want to see anything on topic because the topic is somewhat about how ridiculous a card the G80 Ultra is.
Try including "with R200" in your highlight Josh, then go lookup R200's launch date...
Why should I, when you said "ever since"? :confused:
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
You are pathetic Josh.

Apoppin, I don't think clock speed will save ATi & R600 any more than it saved Intel & P4 or nvidia & nv3x.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
You are pathetic Josh.
:roll:

I answer the questions you ask me, something you still have yet to do.

I'll ask again: Is this small, almost insignificant rendering error in HL2's chain-link fence the only graphical problem you can dig up with ATi's unreleased hardware with unreleased drivers?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
No Josh, you don't answer the questions - at least not properly, but then again, I'd expect nothing less from you. As to other problems, I think R600 has quite enough of them to be getting on with for the moment, don't you? When we start seeing images from more credible sources I'll look again, but the rendering error in that HL2 image is something I'm confident no-one has faked in photoshop.

Oh yes, about those unreleased drivers... I (and many others, I'm sure) am going to be fascinated to discover why ATi needs a driver 350MB in size. I have a few theories as to why they might be that size - theories that I don't think will please the fanatics...

Apoppin, trust me, clock speed won't save the R600. We've been there multiple times in the past with PC hardware and it's never an effective solution. In fact I'd say it's usually a sign of desperation.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
No Josh, you don't answer the questions - at least not properly, but then again, I'd expect nothing less from you.
What are you even talking about?
As to other problems, I think R600 has quite enough of them to be getting on with for the moment, don't you?
If that is the kind of rendering that it is doing on HL2 immediately before its launch, then I'd have to say it's better than the G80's rendering of HL2 immediately after its launch.

Hell, there are still major rendering errors with certain games and the G80, and I don't see anyone else here blaming them on the card's specs such as temperature, overclocked cores, or "out of spec" memory. But lo and behold, a speck of an error in a rendered frame drawn only by unreleased utilities and you're quick to the trigger with nothing but a load of crap for ammo.

As for the R600 having problems, all I'm going to say is that we're not sure if the R600 was delayed because of respins or delayed due to AMD's desire to launch more products together - or both for that matter. I wouldn't be surprised if it was delayed for being faulty, but I also wouldn't be surprised if AMD really did want to launch more products together. Jumping on either theory is simply guessing.
When we start seeing images from more credible sources I'll look again, but the rendering error in that HL2 image is something I'm confident no-one has faked in photoshop.
Firstly, I never said it was a Photoshopped image.

Secondly, so your answer is yes then? That tiny rendering error was all that you could find to serve as a derailment of this thread? A thread that was instead concentrating on the ridiculous launch of the G80 Ultra.

To reiterate, the OP's primary complaint was:
Originally posted by: coolpurplefan
What I'm pointing out is that I think it's a total farce, pure PR (public relations), for these companies to release the kind of information they want, when they want. I just think these moves imply that they think we're unbelievably stupid and we don't know what's going on. Come on, a SOFT LAUNCH to impress people with benchmarks? Come on.
While I don't totally agree, it does seem naive to think that gamers don't expect competition. The vendor doing the best wants to launch a card - however ridiculous a card it may be - to "steal" the competitor's "thunder". The G80 Ultra launching very close to the R600 is a prime example of this. Gamers know this, the other vendor knows this, and the vendor of preference should know this too. However, the part I don't agree with the OP on is this: even if everyone's not stupid and knows what kind of cards will be launching, why does that mean they shouldn't continue to do it? If they didn't launch a card simply for benchmarks with limited availability, they wouldn't be carrying the status of, or the option for, the best, however ridiculous it is. It's just that nVidia seems to have failed to "steal the thunder" from even their own previous high-end card, the 8800 GTX.

While the Ultra fell short in that regard, the "previews" for the X2900XTX that I've seen also fell short of expectations. Whether or not it does perform well, we shall see when the real reviews pour out.

Gstanfor, it's not that your claims may be wrong or right. I honestly don't know if that small rendering error could mean an overclocked core, unstable temperatures, or out of spec memory. But I don't pretend to know either. A visual hiccup as small as that could also very well be a driver problem. Troubleshooting almost insignificant errors such as that can give you a wide range of possibilities. But clutching to a minor problem while ignoring all others from a different manufacturer is what is annoying about you. That and the fact that you are merely clutching to that problem to distract from nVidia's recent launch.

Also, the mere fact that a simple artifact like that was all that website could find is, I think, rather a good sign considering what's normal for new hardware and new drivers. After all, it could be worse. And a problem that small is negligible.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Actually, artifacting of any kind tends to be a very large problem indeed for video cards since it normally indicates they are dying. Given R600's power and thermal envelopes this can hardly be encouraging news for anyone insane enough to be considering purchasing one of these cards.

No, *you* didn't say the images had been tampered with; however, there are R600 images out there (such as the stalker ones) that look as though they have had their tonal range tampered with. You'll notice I've said nothing on that subject and won't until I'm certain of what is going on.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Gstanfor
clock speed won't save the R600. We've been there multiple times in the past with PC hardware and it's never an effective solution. In fact I'd say it's usually a sign of desperation.

It's not an effective solution when the architecture is designed to sell purely on MHz specs. Or when the ASIC is cut down on too many processing units and is designed to sell based on marketing. But clock speeds are a part of the engineering decisions made in the design of the architecture, and are a valid way to improve performance. Maybe it took a 500MHz r420 to equal a 400MHz nv40, but the r420 can reach 500MHz and higher, while the nv40 can't. And in the end, nv40 lost in performance.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
It seems like the double-speed shading processors on the G80 took ATI by surprise and they were unable to respond effectively.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I'd say many aspects of the g80 took most people by surprise. ;)

But Ati must have their own performance targets for next gen HW, and if they weren't planning on making the r600 ~2x as fast as the r580, then it's a mistake on their part, with or without the g80.
 

Lord Banshee

Golden Member
Sep 8, 2004
1,495
0
0
ok it is stupid to say clock speed won't help them.

Are you an electrical engineer? My guess is no.

Let me enlighten you: there are many design approaches to take, and one way isn't the only way. If we didn't care about clock speed at all, we wouldn't even have pipelined processors. That idea came directly from wanting a processor that executes the same number of instructions per cycle as a single-cycle design, but at a much greater clock speed.
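(To put rough numbers on that point: below is a minimal toy sketch, with entirely hypothetical figures that are not from this thread, of why splitting the work into pipeline stages lets the clock rise while per-cycle throughput stays roughly the same.)

```python
# Toy throughput comparison: single-cycle vs. pipelined design.
# All numbers are hypothetical, for illustration only.

def throughput_mips(clock_mhz: float, instructions_per_cycle: float) -> float:
    """Millions of instructions per second = clock (MHz) * IPC."""
    return clock_mhz * instructions_per_cycle

# Single-cycle: a ~5 ns critical path caps the clock at ~200 MHz, 1 instruction/cycle.
single_cycle = throughput_mips(200, 1.0)

# Pipelined into 5 stages of ~1 ns each: the clock can reach ~1000 MHz while
# steady-state throughput stays near 1 instruction per (now much shorter) cycle;
# 0.8 here stands in for losses to hazards and stalls.
pipelined = throughput_mips(1000, 0.8)

print(f"single-cycle: {single_cycle:.0f} MIPS, pipelined: {pipelined:.0f} MIPS")
```

In practice hazards and stalls eat into the ideal figure, which is why the pipelined case above assumes a bit less than one instruction per cycle.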

I am not going to say I know much about engineering a GPU, but the design is relatively new compared to CPUs and the only real contributors to this technology are ATi/AMD and nVidia, so I have faith in both of their engineering abilities and you should too.

If the reviews show otherwise, well, it isn't like all technologies are perfect the first or even the second time around. I remember the P4 going through many phases before they got the "Core" CPU out for desktops, workstations and laptops.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Why has the talk about shimmering just popped up out of the blue?
Gstanfor telling us he has superior vision when in reality he's as blind as a bat to nVidia problems.

When we start seeing images from more credible sources I'll look again, but the rendering error in that HL2 image is something I'm confident no-one has faked in photoshop.
You mean like nVidia confirming the HL2 fog issue that was only fixed in drivers about a month ago?

Actually, artifacting of any kind tends to be a very large problem indeed for video cards since it normally indicates they are dying.
So all G80s are dying due to past fog issues in HL2?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
nvidia doesn't have IQ problems when the appropriate settings and drivers are used. Some of us aren't allergic to beta drivers. You are being retarded (as per usual) regarding the artifacts - please post up some images of g80's displaying incorrectly rendered polygons relative to those around them as the result of a source engine bug nvidia was forced to fix because lard-arse Newell was too busy sucking up to Orton and co.

ok it is stupid to say clock speed won't help them.
No it isn't. They are already on the thermal and power edge with R600, just trying to get it where it currently is. One can only imagine the heat and power consumption of an 800MHz R600. I suspect by that point it could probably embarrass a small steel foundry on both metrics.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
nvidia doesn't have IQ problems when the appropriate settings and drivers are used. You are being retarded (as per usual)
Utter lies.

And before you claim it's somehow a "system issue" I would like to point out that multiple people have been able to replicate the various problems I've posted up.

Now be a good little troll and crawl back under your AEG bridge.

please post up some images of g80's displaying incorrectly rendered polygons relative to those around them as the result of a source engine bug nvidia was forced to fix because lard-arse Newell was too busy sucking up to Orton and co.
Leading question logical fallacy.