This whole R600/G80 benchmarks thing is nonsense.

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Gstanfor
3dmark isn't a game, Doom3 is. I'm not disputing that nvidia replaced shaders there - they made several public statements about their intention to do so at the time.

Um... No, they didn't. After ExtremeTech discovered their static clipping planes, skipped back-buffer clears, and vertex/pixel shader replacements, Nvidia claimed it was simply a "driver bug" that mysteriously disappeared entirely when the 3DMark executable was renamed. Nvidia had added EIGHT separate detection routines that were only activated when 3DMark03 was running under its original filename.
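
(Illustrative aside: a minimal sketch of how executable-name detection of this sort generally works on Windows. This is not Nvidia's actual driver code; the helper name is hypothetical. A check like this is exactly why renaming the benchmark executable made the "optimizations" vanish.)

    /* Sketch only: gate app-specific "optimizations" on the process's exe name. */
    #include <windows.h>
    #include <stdbool.h>
    #include <string.h>

    static bool running_as_3dmark03(void)
    {
        char path[MAX_PATH] = {0};
        GetModuleFileNameA(NULL, path, sizeof(path)); /* full path of the current process */

        const char *exe = strrchr(path, '\\');        /* keep only the file name */
        exe = exe ? exe + 1 : path;

        return _stricmp(exe, "3DMark03.exe") == 0;    /* fails once the exe is renamed */
    }

    /* A driver-side component could then enable replacement shaders or static
       clip planes only when the benchmark is detected:
       if (running_as_3dmark03()) { ...apply benchmark-specific path... }       */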

And yes, before you start blathering on about ATI doing it too: they did have one suspect shader replacement in one test, which resulted in a net 1.9% increase in overall 3DMark points, unlike Nvidia, which enjoyed a 24% overall increase with its driver "optimizations" in place. ATI also had the infamous Quack "optimizations".

So please stop trying to whitewash Nvidia. There are those of us who know better.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Have I also mentioned that the fact that this problem affects ATi's beloved Half-life 2 makes the problem all the more entertaining to me?
Should we bring up the problems that were plaguing Splinter Cell: Double Agent on G80's? Would that be a valid derailment, since its issues were great and it was a TWIMTBP title?
No, I don't [have the driver]
Then your claim about it being 320 MB is simply bullshit.
As for Red Faction, I've got that installed on one of my drives somewhere. Haven't touched it in years. Post up a savegame in a problem area and I'll test it out for you.
You're right Gstanfor, loading up a game on your 7 series rig will really show that the G80's won't have a problem with it...

By the same logic we might as well load up HL2 on any X19k series and see if that same chain-link artifact is there.
You really should prove such claims too (something the fanatics never have).
Like how you "proved" the driver being used is 320 MB?

Like how you "proved" that the more color rings the better the AF?

Like how you "proved" that HDR+AA can be achieved in Far Cry on any 6 or 7 series nVidia card? etc, etc.
I hate idiots.....
Me too, especially the ones who don't know what they're talking about and constantly try to pretend like they do.

And the ones who try to say that the G80's don't have problems just because the G71's can render the same thing without issue.

And the ones that try to put up false images to support their HDR+AA theory when using hardware that can't do it in said title.

And the ones that derail countless threads in a vendetta to misinform, whether it be out of spite or out of marketing commission.

And the ones that can't count colored rings. (But I should really use the singular version and say "one" because there's only one of us here who can't do that)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
quote:
Have I also mentioned that the fact that this problem affects ATi's beloved Half-life 2 makes the problem all the more entertaining to me?


Should we bring up the problems that were plaguing Splinter Cell: Double Agent on G80's? Would that be a valid derailment, since its issues were great and it was a TWIMTBP title?
was nvidia involved in a grotesque love affair with the developer like ATi was with HL2?

No, I don't [have the driver]


Then your claim about it being 320 MB is simply bullshit.
It's not my claim, it's DailyTech's.

You are very welcome to post an x1950 comparison picture if you so desire (I'll bet you don't, though).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
quote:
Have I also mentioned that the fact that this problem affects ATi's beloved Half-life 2 makes the problem all the more entertaining to me?


Should we bring up the problems that were plaguing Splinter Cell: Double Agent on G80's? Would that be a valid derailment, since its issues were great and it was a TWIMTBP title?
was nvidia involved in a grotesque love affair with the developer like ATi was with HL2?

oh yes ... the entire TWIMTBP programme is a *grotesque* attempt to get ALL the developers optimizing for nvidia

grotesque is dependent on PoVs ;)

*everything* ATi does is "grotesque" --to you
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Show me the TWIMTBP game that received $5 million in funding and had the developer slagging off the opposing IHV's GPUs? (Let's not forget a certain other action RPG based upon the earlier Source build that oh so mysteriously got leaked around the time this deal took place - funnily enough, it demonstrably had no issues coping with things like _PP etc... That title was forced to wait over a year after being completed because Valve's ego would not permit any Source engine title other than their own to be released first. Those who lament the demise of Troika should bear that in mind.)

For that matter show me *ANY* TWIMTBP title where cash changed hands between developer/publisher and nvidia (good luck with that...)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Oh, and for those clinging to the hope that overclocking/MHz will somehow save R600:

Link
Brent_Justice ([H] Video Card Managing Editor):
All I'm gonna say is, 3dmark scores mean jack in relation to real gaming performance that I am experiencing.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Show me the TWIMTBP game that received $5 million in funding and had the developer slagging off the opposing IHV's GPUs? (Let's not forget a certain other action RPG based upon the earlier Source build that oh so mysteriously got leaked around the time this deal took place - funnily enough, it demonstrably had no issues coping with things like _PP etc...)

For that matter show me *ANY* TWIMTBP title where cash changed hands between developer/publisher and nvidia (good luck with that...)

so now it's nvidia's "cash trail" you want us to follow?
:roll:

you think their TWIMTBP program is *free*?
:confused:

===================
Originally posted by: PC Surgeon
Before long, Gstanfor will surpass Rollo's reputation and become a legend of his own.

he had a unique reputation established long before Rollo

what he *doesn't get* is that his one-sidedness and rabid unreasoning hatred for "things ATi" ... including wishing for death and harm to ATi employees -

has *Driven* many of the "formerly undecided" at ATF right into ATi's "camp"
:Q

ATi has got to *love* Gstanfor ...
:heart:

i know nvidia wishes he would StFU :p
:shocked:

--think about it ;)

:laugh:

... and i am going out on a limb here ... from what i am gathering ...

... i am willing to bet that the HD2900XT is gonna kick the GTX butt in DX10

--few more days
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
so now it's nvidia's "cash trail" you want us to follow?


you think their TWIMTBP program is *free*?
No, that's not what I said, try reading it again. I said *unlike* ATi, nvidia doesn't pay developers cash; sometimes they don't even provide as much equipment (developer cards) as others do.

educate yourself
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: apoppin
blabla...

... i am willing to bet that the HD2900XT is gonna kick the GTX butt in DX10

--few more days

I agree... Judging by how forward thinking ATi is, that makes a lot of sense
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
so now it's nvidia's "cash trail" you want us to follow?


you think their TWIMTBP program is *free*?
No, that's not what I said, try reading it again. I said *unlike* ATi, nvidia doesn't pay developers cash; sometimes they don't even provide as much equipment (developer cards) as others do.

educate yourself

thank you for brightening my day with your nonsense ... that *others* comment is "rich"

yeah, nvidia runs their TWIMTBP program on a shoestring budget
RotFL

maybe you missed it ... HD2900xt is gonna DESTROY the GTX in DX10

you can pick up the pieces when crysis comes out
-if you can find any :p


 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Who cares (if it's even true)? As you are so fond of pointing out, there are no Dx10 games, and when there are, there will be faster refreshes or even a new gen of cards to run them on.

People buy G80 for its Dx9 performance.

Just in case you missed it before:
Originally posted by: Gstanfor
Oh, and for those clinging to the hope that overclocking/MHz will somehow save R600:

Link
Brent_Justice ([H] Video Card Managing Editor):
All I'm gonna say is, 3dmark scores mean jack in relation to real gaming performance that I am experiencing.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Who cares (if it's even true)? As you are so fond of pointing out, there are no Dx10 games, and when there are, there will be faster refreshes or even a new gen of cards to run them on.

People buy G80 for its Dx9 performance.

not when they see the DX10 benches

and there *will* be DX10 benches - unlike the sloppy-ass "review" at DT

http://www.theinquirer.net/default.aspx?article=39448
we learned that [Call of Juarez] the company will ship a DirectX 10 benchmark really soon, probably in the next couple of weeks.

Given the current lack of DirectX 10 titles that would enable hacks to benchmark the bloody graphics cards with an API they support natively, this is welcome news.

Since the patch should be free, if you own a DX10 graphics card you could consider buying the game.

Perhaps the 'ultra' doesn't look so good - for over $800 :p


 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
If R600 can't convincingly best g80 in Dx9 I'd love to know what makes you think it will in Dx10. Nevermind that I still think nvidia has yet to play its full hand on several fronts. I personally think the Vista "troubles" are little more than a ruse to keep ATi guessing.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
lol.. an elaborate ruse? conspiracy!

i highly doubt there's anything more to it than that it required more man-hours than nvidia could throw at it.. new architecture, new api, and too many combinations (directx, ogl, sli, backward compatibility and so on) and frankly, i haven't had any vista "troubles" with my GTS in a month, and vista is now my only OS on this pc.

as for r600, i find it amusing people cling to the conjecture that somehow it will miraculously excel in DX10.... there's absolutely not a damn thing to base that on. logic should dictate it will be a little better or a little worse than G80, but not substantially different one way or the other.
 

soybeast

Senior member
Apr 26, 2006
255
0
76
Originally posted by: CaiNaM
lol.. an elaborate ruse? conspiracy!

i highly doubt there's anything more to it than that it required more man-hours than nvidia could throw at it.. new architecture, new api, and too many combinations (directx, ogl, sli, backward compatibility and so on) and frankly, i haven't had any vista "troubles" with my GTS in a month, and vista is now my only OS on this pc.

as for r600, i find it amusing people cling to the conjecture that somehow it will miraculously excel in DX10.... there's absolutely not a damn thing to base that on. logic should dictate it will be a little better or a little worse than G80, but not substantially different one way or the other.

I'm not sure what kind of logic you're appealing to. Unless you can dissect all the engineering behind the two architectures, as well as the API, it would seem your assumption based on that logic is just as empty as that of those who believe the r600 will excel in dx10.
 

palindrome

Senior member
Jan 11, 2006
942
1
81
Originally posted by: Gstanfor
Originally posted by: palindrome
Originally posted by: Gstanfor
The following images are apparently using R600 16x CFAA Anti-aliasing

image1
image2

I did notice what appears to be some sort of rendering artifact on the images (marked with a red circle) - discrepancy

Please tell me you really aren't THAT stupid....nvm this is another R600 thread, I guess I should come to expect this now...

(hint: the "artifact" you see in the 16xCSAA is actually a white object in the background; it can clearly be seen in both pictures)

I hate trolls...

Keep reading, palindrome. Check out my comparison image. There is a savegame to go along with that, as well as other images, if you care to actually read...

There is no "white object in the background", and if there were, it would be extremely unlikely to be composed of only one polygon.

I hate idiots.....

I saw your comparison image, it's very pretty. But you were talking about how the CSAA is fvcked up. The white spot is CLEARLY in both pictures. Keep trolling, buddy, just keep trolling...
 

palindrome

Senior member
Jan 11, 2006
942
1
81
Originally posted by: Gstanfor
If R600 can't convincingly best g80 in Dx9 I'd love to know what makes you think it will in Dx10. Nevermind that I still think nvidia has yet to play its full hand on several fronts. I personally think the Vista "troubles" are little more than a ruse to keep ATi guessing.

Who pays you to write this crap? Hey, if you are getting free stuff to write this, just PM me, I wanna join too. ;)
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Holy cow, he thinks NV is creating driver problems on purpose? That's just, well, dumb.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Gstanfor
If R600 can't convincingly best g80 in Dx9 I'd love to know what makes you think it will in Dx10. Nevermind that I still think nvidia has yet to play its full hand on several fronts. I personally think the Vista "troubles" are little more than a ruse to keep ATi guessing.

DirectX 10 is a very different beast than DirectX 9; arguably this is the biggest change in DX we have seen in a while. It would only be a repeat of history if the R600 were significantly faster than nVidia in DX10 games. The GeForce FX wasn't terrible at DX8 games, but in DX9 games it was useless. I'm not saying the GeForce 8xxx's are going to be nearly as bad as the FX series, but I still believe that while they lead the pack in DX9 performance, they will not lead in DX10 performance.

ATI has consistently made architectural decisions based on NOT ONLY current performance, but also future performance, and I can't see any reason this would be different. Take the X1800 vs 7800 for example... the 7800GTX 512MB destroyed the X1800XT in 2005 games, but the X1800XT is significantly faster than either the 7800GTX 256MB or 512MB version in 2006 games such as Oblivion. nVidia is all about winning benchmarks to make their cards sound good.

As for nVidia not fixing the problems many users experience with Vista drivers to "keep ATI guessing," if that's the truth, then I would never want to do business with nVidia.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: Ackmed
Holy cow, he thinks NV is creating driver problems on purpose? That's just, well, dumb.

ROFLMAO
oh boy. i can always rely on gstanfor after a long hard day. thanks buddy
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: apoppin
maybe you missed it ... HD2900xt is gonna DESTROY the GTX in DX10

Link, please?
Contrary to what you may think, I really need to know and am not just trying to disprove you because I'm an NVIDIA fanboy, like some people on here are. :confused:

I really need to know, because if it's true, I need to sell my GTX ASAP to make enough money for an R600.

Although, I am curious as to why you went from being completely disappointed in and dumping on AMD/ATI just a week ago to saying that an XT will destroy a GTX today. Were you just so sour that your old computer couldn't even physically take either an R600 or a G80 that you looked for scapegoats that weren't promoting competition in any way? Probably an ATI fanboy all along, as I suspected... :disgust:
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Nightmare225
Originally posted by: apoppin
maybe you missed it ... HD2900xt is gonna DESTROY the GTX in DX10

Link, please?
Contrary to what you may think, I really need to know and am not just trying to disprove you because I'm an NVIDIA fanboy, like some people on here are. :confused:

I really need to know, because if it's true, I need to sell my GTX ASAP to make enough money for an R600.

Although, I am curious as to why you went from being completely disappointed in and dumping on AMD/ATI just a week ago to saying that an XT will destroy a GTX today. Were you just so sour that your old computer couldn't even physically take either an R600 or a G80 that you looked for scapegoats that weren't promoting competition in any way? Probably an ATI fanboy all along, as I suspected... :disgust:

There's no hard evidence that the R600XT will "destroy" the GTX in DX10, but there is info to suggest it will be ahead. According to Crysis developers the XT is slightly faster than the 8800GTX in the current build of Crysis. That's pretty much the only info, but there is evidence that the R600 architecture is more suited to DX10 than DX9. It doesn't sound like much, but considering the HD 2900XT will retail for $400 and below, the fact that it will prove to be faster than a $500+ GTX is pretty good IMO.

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
I wouldn't make any bets, or claims, on DX10 performance either way. If the past has told us anything, it's that both cards will be too slow to run it very well. But to make any claim that one card will run it faster at this juncture is pretty foolish.