Something in the HDR article...

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
We've also shown that ATI seems to handle Valve's implementation of HDR better than NVIDIA, and if we could have tested with one of the next-gen ATI cards, the 7800 GTX would have assuredly been beaten out for the highest fps.

I was wondering: does Anand already have an X1800 XT, meaning they actually tested it and are trying to give us some info without actually saying it, or is this speculation?
 

Steelski

Senior member
Feb 16, 2005
700
0
0
With 5 days to go till the NDA lifts, you bet your behind they have a card, especially if ATI is aiming to ship the cards.
Can anyone here break into someone's warehouse and verify this for us?
 

MBrown

Diamond Member
Jul 5, 2001
5,726
35
91
It's just one game. Does ATI usually do better with Valve anyway? I'm not a fanboy. I almost waited longer to get one of these, but I couldn't wait.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
So what can we infer from this statement as far as the R520 is concerned, besides the obvious? If they're so sure it would beat the 7800 GTX at 16x12, no less, that contradicts what Sassen's "benchmarks" have shown. So I'm guessing that in shader-heavy situations the 16-pipe R520 is more than able to keep up with the 24-pipe GTX; a rough pipes-times-clock sketch is below. Granted, HL2 is bound to run better on ATI hardware, but in other modern games I'd expect the R520 to offer no worse performance than the GTX.
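For a rough sense of why that's plausible, here's a back-of-the-envelope sketch: raw pixel-shader throughput scales roughly with pipelines times core clock, ignoring per-clock architecture differences. The 24 pipes / 430 MHz are the 7800 GTX's published specs; the 16 pipes / 625 MHz are what the X1800 XT eventually shipped with (not public at the time of this thread).

```python
# Rough throughput proxy: pipelines * core clock.
# Ignores per-clock efficiency, which matters a lot in practice.
cards = {
    "7800 GTX (24 pipes @ 430 MHz)": (24, 430e6),
    "X1800 XT (16 pipes @ 625 MHz)": (16, 625e6),
}

for name, (pipes, clock_hz) in cards.items():
    print(f"{name}: {pipes * clock_hz / 1e9:.2f}G shader ops/s (rough proxy)")
```

By this crude measure the two land within a few percent of each other, which is consistent with a higher-clocked 16-pipe part keeping up.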
 

shabby

Diamond Member
Oct 9, 1999
5,782
45
91
Originally posted by: linkgoron
We've also shown that ATI seems to handle Valve's implementation of HDR better than NVIDIA, and if we could have tested with one of the next-gen ATI cards, the 7800 GTX would have assuredly been beaten out for the highest fps.

I was wondering: does Anand already have an X1800 XT, meaning they actually tested it and are trying to give us some info without actually saying it, or is this speculation?

The problem with that statement is the "we've also shown" part. What exactly have they shown in that article that would show ATI beating NVIDIA in HDR? It was the other way around.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Yeah, I saw that too... there was little to actually back up the statement that ATi handled it better, since there wasn't really a comparable setup in the benchmarks... the 7800 GT/GTX were out in front and obviously CPU limited, since enabling HDR barely cost them anything, and the only other nV card was the 6600 GT, which in my opinion isn't a card you buy to enable big-performance-hit features anyway. I just don't quite understand Anand's reasoning in not having a 6800 GT or Ultra to compare, yet having both an x800XT and an x850XT... it's the same card + 20 MHz, so there's no reason to double that up and then leave out the competing product entirely (unless, considering the prices on the 7xxx series, Anand considers them the direct competitors to the x8xx series... but that's somewhat silly).
 

compgeek89

Golden Member
Dec 11, 2004
1,860
0
76
ATI's next gen will only keep up with the GTX if it has an efficient SM 3.0 implementation. NVIDIA's 6 series loses quite a few more fps than the 7 series (like 10-15 instead of 3-5), which is down to the 7 series' more efficient HDR architecture.

Will ATi's first batch of SM 3.0 chips already have this?
 

MBrown

Diamond Member
Jul 5, 2001
5,726
35
91
I wanna see some other games with HDR and have them go head to head. Obviously ATI is going to be better in this game, right?
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126

We've also shown that ATI seems to handle Valve's implementation of HDR better than NVIDIA, and if we could have tested with one of the next-gen ATI cards, the 7800 GTX would have assuredly been beaten out for the highest fps.

They said "if", but it most likely would have beaten it out, just going by the performance gap between the X800 and the 7800 GTX.

The problem with that argument is that at that resolution and those settings the 7800 barely gets to show its muscle. Just looking at the numbers, I'm guessing the 7800 is CPU limited: it drops a whole 4% going from no HDR to full HDR, while the X850 XT drops 23% in the same test.

So in reality we have no idea how the R520 compares, because it's obvious the 7800 is CPU limited.
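For anyone who wants the arithmetic behind that inference, here's a minimal sketch. Only the rough drop percentages (~4% and ~23%) come from the article; the fps values below are invented placeholders.

```python
# "Is it CPU limited?" heuristic: how much fps do you lose by
# enabling HDR? A near-zero drop suggests the GPU wasn't the
# bottleneck to begin with.

def hdr_cost_pct(fps_hdr_off: float, fps_hdr_on: float) -> float:
    """Relative fps drop (percent) from enabling HDR."""
    return (fps_hdr_off - fps_hdr_on) / fps_hdr_off * 100

# Invented placeholder numbers; only the ~4% / ~23% drops are real.
cards = {
    "7800 GTX": (100.0, 96.0),
    "X850 XT": (90.0, 69.3),
}

for card, (off, on) in cards.items():
    print(f"{card}: {hdr_cost_pct(off, on):.0f}% slower with HDR")
```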
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
Originally posted by: Genx87

We've also shown that ATI seems to handle Valve's implementation of HDR better than NVIDIA, and if we could have tested with one of the next-gen ATI cards, the 7800 GTX would have assuredly been beaten out for the highest fps.

They said "if", but it most likely would have beaten it out, just going by the performance gap between the X800 and the 7800 GTX.

The problem with that argument is that at that resolution and those settings the 7800 barely gets to show its muscle. Just looking at the numbers, I'm guessing the 7800 is CPU limited: it drops a whole 4% going from no HDR to full HDR, while the X850 XT drops 23% in the same test.

So in reality we have no idea how the R520 compares, because it's obvious the 7800 is CPU limited.
They're under NDA (that's what it's called, right?), so they can't say "Well, the R520 ownz the G70." So they add an "if"...

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Genx87

We've also shown that ATI seems to handle Valve's implementation of HDR better than NVIDIA, and if we could have tested with one of the next-gen ATI cards, the 7800 GTX would have assuredly been beaten out for the highest fps.

They said "if", but it most likely would have beaten it out, just going by the performance gap between the X800 and the 7800 GTX.

The problem with that argument is that at that resolution and those settings the 7800 barely gets to show its muscle. Just looking at the numbers, I'm guessing the 7800 is CPU limited: it drops a whole 4% going from no HDR to full HDR, while the X850 XT drops 23% in the same test.

So in reality we have no idea how the R520 compares, because it's obvious the 7800 is CPU limited.

I don't understand why ppl think the 7800 GTX is CPU limited at 16x12. If you look at these HL2 benches http://www.techreport.com/reviews/2005q2/geforce-7800gtx/index.x?pg=8, you can tell that the 7800 GTX isn't stressed a lot, but nevertheless its fps does go down at 16x12. That's not what I'd call CPU limited. It might be limited by other parts of the gfx pipeline, like ROP fillrate, and although the game is CPU-heavy, you can't blame the fps at 16x12 on the CPU anymore. (A rough fillrate ceiling is sketched below.)
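For illustration, here's a back-of-the-envelope ROP fillrate ceiling at 1600x1200. The 16 ROPs at 430 MHz are the 7800 GTX's published specs; the overdraw factor is an assumption, and FP16 HDR blending typically cuts ROP throughput further, so treat this as a loose upper bound rather than a prediction.

```python
# Theoretical ROP fillrate ceiling at 1600x1200.
# 16 ROPs @ 430 MHz: published 7800 GTX specs.
# Overdraw and the HDR penalty are assumptions for illustration.
ROPS = 16
CORE_CLOCK_HZ = 430e6
peak_fill = ROPS * CORE_CLOCK_HZ          # ~6.9 Gpixels/s theoretical

width, height = 1600, 1200
overdraw = 4.0                            # assumed average overdraw
pixels_per_frame = width * height * overdraw

print(f"fps ceiling: {peak_fill / pixels_per_frame:.0f}")
# Assume FP16 HDR blending roughly halves ROP throughput:
print(f"fps ceiling with HDR: {peak_fill / 2 / pixels_per_frame:.0f}")
```

The theoretical ceiling comes out far above real HL2 frame rates, which is the usual caveat with paper fillrate numbers: shading, blending, and memory bandwidth eat into them long before the ROPs saturate.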
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: munky
Originally posted by: Genx87

We've also shown that ATI seems to handle Valve's implementation of HDR better than NVIDIA, and if we could have tested with one of the next-gen ATI cards, the 7800 GTX would have assuredly been beaten out for the highest fps.

They said "if", but it most likely would have beaten it out, just going by the performance gap between the X800 and the 7800 GTX.

The problem with that argument is that at that resolution and those settings the 7800 barely gets to show its muscle. Just looking at the numbers, I'm guessing the 7800 is CPU limited: it drops a whole 4% going from no HDR to full HDR, while the X850 XT drops 23% in the same test.

So in reality we have no idea how the R520 compares, because it's obvious the 7800 is CPU limited.

I don't understand why ppl think the 7800 GTX is CPU limited at 16x12. If you look at these HL2 benches http://www.techreport.com/reviews/2005q2/geforce-7800gtx/index.x?pg=8, you can tell that the 7800 GTX isn't stressed a lot, but nevertheless its fps does go down at 16x12. That's not what I'd call CPU limited. It might be limited by other parts of the gfx pipeline, like ROP fillrate, and although the game is CPU-heavy, you can't blame the fps at 16x12 on the CPU anymore.

Losing ~3.5% by turning on HDR tells me it is limited by the CPU; a delta that small could simply be sampling error. One way to sanity-check that is sketched below.
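A minimal sketch of the check: repeat the timedemo several times and compare the HDR on/off gap to the run-to-run spread. The run numbers here are invented for illustration.

```python
# Is a small fps delta real, or within run-to-run noise?
# The run numbers are invented for illustration.
from statistics import mean, stdev

hdr_off = [102.0, 97.5, 101.0, 98.9]   # hypothetical timedemo runs
hdr_on = [98.2, 95.0, 99.1, 96.3]

gap = mean(hdr_off) - mean(hdr_on)
spread = max(stdev(hdr_off), stdev(hdr_on))

print(f"gap: {gap:.1f} fps, run-to-run spread: ~{spread:.1f} fps")
# When the gap is on the order of the spread, a single pair of
# runs can't distinguish a real HDR cost from noise.
```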

This is right from your link:

"Half-Life 2 is largely CPU or engine bound, not graphics bound. Any of these cards can play Half-Life 2 smoothly and easily. The SLI rigs even turn out to be slower some of the time, probably due to the additional overhead of driving a second graphics card."