
If you could have either the X800 XT PE or 6800 Ultra, which would you take?

Hindsight and blind speculation still equal zero in the end. The only real education for a newbie is to trust what this, the greatest hardware site, tries to extrapolate out of constantly varying results. It's simple: whether it's a 5800/9700 debate or an X800PE/6800UE one, you're not going to regret your purchase as long as you trust what Anand and his boys say.

I'd buy a 6800 now just because it is a fresh development, with hopefully exceedingly tangible performance increases forthcoming (speculation). I make sure my decision is one that will keep me enthralled no matter what E.O.Cers do or what any naysayer may prophesy to dissuade me.

Thanks to AT and its valuable contributors.
 
Originally posted by: Rollo

LOL, how many guys who paid $500 for a 9800XT are going to still have it this fall, when my $300 6800NU will likely own it in HL2 (and likely cost $250 by then)? Or how many guys with 9600XTs are going to keep those sad little cards when they'll be able to buy $150 9800Pros by then?


The people who paid for a 9800XT, 9600XT, or any other card will still get the game free. Better than paying for an FX card that's slower and won't give you the game for free. I still have my coupon, but sold my 9800XT. The only FX card I recall that came with a good game was the BFG 5900; CoD was in some of them. I got it with my 5900NU, but not before I had already paid $50 for it.

The fact is, it will still save the people with the coupon around $50 when the game comes out. How you see that as a bad thing, I don't know.
 
I would think you would be all about the 6800 BFG, just to have the most advanced tech if for no other reason.
The XT is still faster in Far Cry, and to me that game directly represents the future. Also, ATi doesn't require such an extravagant cooling or power supply solution.

On the flip-side nVidia has better drivers, a better control panel and SM 3.0.

You forget about temporal AA.
With good reason. Really, that feature blows chunks.

The truth about who has the features you'll want to have in the next 12 months
Rewind a year and you'd see identical comments posted about SM 2.0, yet those comments were dismissed by both you and others. So before, features were irrelevant, and now suddenly you can't live without them.
 
Originally posted by: BFG10K
I would think you would be all about the 6800 BFG, just to have the most advanced tech if for no other reason.
The XT is still faster in Far Cry, and to me that game directly represents the future. Also, ATi doesn't require such an extravagant cooling or power supply solution.

On the flip-side nVidia has better drivers, a better control panel and SM 3.0.

You forget about temporal AA.
With good reason. Really, that feature blows chunks.

The truth about who has the features you'll want to have in the next 12 months
Rewind a year and you'd see identical comments posted about SM 2.0, yet those comments were dismissed by both you and others. So before, features were irrelevant, and now suddenly you can't live without them.


We don't know yet if the speed enhancements of SM3 will make the 6800U as fast as the X800XT in Far Cry.

A year ago, the others and I were right about SM2. I'm only speculating, but as "Wallet Raider: Angel of Square Cans" was the only SM2 game available in 2003, it won't be hard for SM3 to have a greater impact in 2004.
 
Originally posted by: MemberSince97
Hindsight and blind speculation still equal zero in the end. The only real education for a newbie is to trust what this, the greatest hardware site, tries to extrapolate out of constantly varying results. It's simple: whether it's a 5800/9700 debate or an X800PE/6800UE one, you're not going to regret your purchase as long as you trust what Anand and his boys say.

I'd buy a 6800 now just because it is a fresh development, with hopefully exceedingly tangible performance increases forthcoming (speculation). I make sure my decision is one that will keep me enthralled no matter what E.O.Cers do or what any naysayer may prophesy to dissuade me.

Thanks to AT and its valuable contributors.

Philosophy major?

I always just answered "why" to every question and got A's.😛

Seriously, you're right: one can't go wrong with either choice, and the proof is the close poll.
 
My vote is for the X800 XT PE...

Although I do wonder: what are everyone's opinions on the longevity of the two cards? I mean, in one year, which card will most likely have taken the lead? I keep hearing about the 6800U and SM3.0, but no games currently use it (I may be wrong...). I still think I will go with the X800 XT, just because it's cheaper.

Thanks for any info
 
Originally posted by: marffeus
My vote is for the X800 XT PE...

Although I do wonder: what are everyone's opinions on the longevity of the two cards? I mean, in one year, which card will most likely have taken the lead? I keep hearing about the 6800U and SM3.0, but no games currently use it (I may be wrong...). I still think I will go with the X800 XT, just because it's cheaper.

Thanks for any info

That's because there are more X800 XT PEs out now than 6800 Ultras. They have the same MSRP.
 
Originally posted by: GeneralGrievous
Half-Life and its mods were by far the biggest and best-selling game of all time. Quality over quantity, I say.
Actually, I think The Sims or Myst is the best-selling game of all time. Also, Valve didn't write Counter-Strike, and wasn't the original HL based on the Quake 2 engine?
 
Originally posted by: Rollo
Originally posted by: keysplayr2003
I say let the ATI fans chant about 3Dc. It is, after all, all they have.


They can chant about it all they want, but what do the guys who actually make the games they'll be playing say? Oh yeah:

The truth about who has the features you'll want to have in the next 12 months
"As DOOM 3 development winds to a close, my work has turned to development of the next generation rendering technology. The GeForce 6800 is my platform of choice due to its support of very long fragment programs, generalized floating point blending and filtering, and the extremely high performance."

John Carmack, President and Technical Director, id Software

"We expect the NVIDIA GeForce 6 Series to become an integral part of our visual effects arsenal. Not only is it extremely fast, it allows us to take our games to the next generation of cinematic realism. The advanced shader capabilities alone open up a whole new realm of possibilities for us."

Steven Lux, Vice President of US Marketing, Codemasters



"With the GeForce 6800, NVIDIA brings long-awaited true high dynamic range imaging to gaming. The ability to perform blending and filtering on floating-point formats (combined with enormous processing power!) really makes for a wonderful feature that we're definitely looking forward to fully exploit in our Serious Engine?."

Dean Sekulic, Programmer, Croteam



"We have been eagerly awaiting this newest generation of graphics technology because it allows us to do things we could never do before. The GeForce 6800 is a flexible platform that allows us to fully exploit DirectX 9.0 shader technology. We were able to achieve a very beautiful, polished game with Far Cry, but now we have the ability to push technologies like high-dynamic range, floating point blending and dynamic branching to render pixels at double the speed, and that's a particularly exciting prospect as we begin to focus on the advancement of the CryEngine for new big games!"

Cevat Yerli, CEO and President, Crytek.



"The Grafan game engine's use of high dynamic range lighting, multiple real-time shadows, and multipass rendering techniques requires a high performance graphics card. We're currently working with the GeForce 6800 Ultra GPU and using pixel shader 3.0; all we can say is 'wow'."

Herb Marselas, Co-founder and Chief Executive Officer, Emogence, LLC

"The GeForce 6800 is a great leap forward in PC graphics, bringing next-generation DirectX 9.0 rendering performance to new heights while continuing forward with the high-definition floating-point rendering, driver performance and stability NVIDIA is famous for. As the first GPU supporting the new pixel shader 3.0 programming model, the GeForce 6800 enables games to use entirely new rendering approaches and obtain higher-performance per-pixel lighting, shadowing, and special effects than was ever possible before."

Tim Sweeney, Founder and President, Epic Games



"With 16 pixel pipelines and comprehensive support of Shader Model 3.0, the GeForce 6800 offers a wide spectrum of new features at the highest level of performance. We're looking forward to combining two new capabilities, vertex texture sampling and stream frequency dividing, to pack our mesh data, thus minimizing our video memory usage. Furthermore, OpenEXR support in texture filtering and pixel blending will finally allow us to efficiently implement high dynamic range rendering, while also allowing our artists to fully express themselves in this extended domain."

Janos Boudet, Senior Engine Programmer, Eugensystems



"We are really impressed by NVIDIA's GeForce 6800. Eutechnyx current technology, as seen in the forthcoming SRS Street Racing Syndicate for Namco, has made High Dynamic Range Rendering a realistic proposition. Developer tools, such as FX Composer, will no doubt prove to be invaluable to us with their full support for the most advanced shading features of Microsoft DirectX 9.0, such as vertex shader 3.0/pixel shader 3.0. Being at the forefront of technology, NVIDIA is an ideal partner."

Brian Jobling, CEO, Eutechnyx



"The GeForce 6800 is actually where dreams become reality. We've been waiting for a hardware platform to arrive that supports pixel and vertex 3.0 shaders so we can take the Torque Shader Engine to its fullest potential and showcase technology that we've only imagined before."

Jay Moore, Evangelist, Garage Games

"The vertex stream instancing enabled by the GeForce 6800 is a killer feature that allows our SpeedTree technology to drive forests of unprecedented depth and density. Coupled with raw performance, we fully expect our customers' games to boast far richer and more immersive environments."

Chris King, President, IDV, Inc.


There's more, but you get the idea. How about a list of big game developers from ATI's page saying how they can't wait to code for 3Dc?

<crickets chirping>

LOL

Seriously, does ATI have anyone other than Gabe to trot out and say he likes their 2-year-old tech? Literally every developer I've heard of besides Gabe is hyped about SM3/DX9c. I haven't seen any of them saying "We'll be working hard to make your gaming experience just as good in PS2!".

ok, rollo.. i have but one question to ask. you didn't buy into any of the sm2 marketing propaganda, why on earth would you buy into sm3 propaganda?

it's just as likely (perhaps more so) that it will be of little or no relevance in this generation of cards as that it will be of benefit. it's all speculation at best, and the reality of it is that while sm3 could offer some benefits, it simply does not offer substantial benefit over sm2.

from all the reports/tech previews it's "possible" it may lower the performance cost of rendering features, or the time cost of programming them, but it remains to be seen whether we, as end users, will see any noticeable benefits.
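
For what it's worth, the concrete version of the "lower the performance cost" claim is per-pixel dynamic branching. Under SM 2.0 a pixel shader has no real flow control, so every pixel pays for every light (or the scene gets re-rendered per light); under SM 3.0 a pixel can skip lights that don't reach it. Here's a toy model of that argument in Python; every cost number is invented for illustration, not benchmarked:

```python
import random

NUM_PIXELS = 100_000
NUM_LIGHTS = 8
COST_PER_LIGHT = 10   # invented ALU cost of shading one light
COST_OF_BRANCH = 1    # invented overhead of the per-light branch test

# Pretend each pixel is actually reached by only 1-3 of the 8 lights.
lights_hitting_pixel = [random.randint(1, 3) for _ in range(NUM_PIXELS)]

# SM 2.0 style: no flow control, so every pixel shades every light.
sm2_cost = NUM_PIXELS * NUM_LIGHTS * COST_PER_LIGHT

# SM 3.0 style: pay a branch test per light, shade only the ones that hit.
sm3_cost = sum(NUM_LIGHTS * COST_OF_BRANCH + n * COST_PER_LIGHT
               for n in lights_hitting_pixel)

print(f"SM 2.0-style cost: {sm2_cost}")
print(f"SM 3.0-style cost: {sm3_cost}")
print(f"speedup: {sm2_cost / sm3_cost:.1f}x")
```

The catch, and the reason the skepticism above is fair, is that real GPUs branch on groups of pixels at once, so divergent branches can eat much of that theoretical win.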
 
Originally posted by: darXoul
6800 Ultra.
Why?
1. I still trust nVidia more in terms of compatibility.
2. Drivers could still offer more headroom for improvements, and performance is very close even now.
3. SM 3.0 might become useful in some games.
4. I'm sure many manufacturers will make really quiet 6800s, so noise is not an issue.
5. Cooling size is not an issue either; I just have one PCI slot occupied anyway.
6. Power is not an issue since nVidia lowered the official recommendation and I'm going to get a good Antec case with a quality PSU.


However, since money IS an issue for me in Europe, where PC parts are really expensive, I'll probably be getting a 6800 GT.

Now those are good reasons to go with nVidia this round 🙂
 
Ok, just a few things:

1. I'd choose the X800XT PE for the sole reason that I've had ATi and it hasn't treated me wrong yet. The 6800 looks to be a good card; I just like to stick with what I know. I'm sure most of the other fanATIcs and Nvidiots feel the same.

2. If you can't get more than 20fps in Quake 2, you are retarded and the cause is an id10t error. As for the CS thing, the sub-30fps issue happens to about 1 in every 1000 people, if that. I had it once, but I fixed it with a fresh install of Windows and the drivers. Now I get 100fps no matter what.

3. DOOM3 is hard-capped at 60fps. It won't matter which card can render 100fps, because they only need to do 60, and odds are they can both do it at max detail.

4. Odds are SM3.0 will not catch on as fast as SM2.0 for the sole fact that less than 5% of the market would be affected by it.

5. Rollo will never give up, he will never surrender.

6. 3Dc does make games look better, but ATi isn't the only one that can do it. As far as I know, anyone can use it. Could be wrong, so don't quote me on this.

7. Can't remember what else I was gonna say, so I'll leave it at this. Please forgive any spelling/grammatical errors you may find. 😛 😀
 
Originally posted by: CaiNaM
ok, rollo.. i have but one question to ask. you didn't buy into any of the sm2 marketing propaganda, why on earth would you buy into sm3 propaganda?
I find the complete 180 quite remarkable myself.

Last year, when ATi had PS 2.0 and nV had PS 1.4, it was:

Why bother with PS 2.0 when no current games (that I like) support it? You shouldn't buy hardware based on future games.

PS2.0 is no big deal anyway since the visuals are not impressive. "ooohhh shiny water....ooohhh shiny pipes.....spank...spank..."

It doesn't matter what technology the cards use as long as the performance is the same. A 128-bit PS 1.4 card is just as good as a 256-bit PS 2.0 card.

No complaints about nVidia's optimizations, including brilinear (which could not be disabled when it first came out).

Fast forward to today:
PS 3.0 for future games (that don't exist today) is a critical part of the equation. You'd be crazy to buy a card based on older technology (even though performance is ~ the same).

Outrage over ATi's similar (but a little better) brilinear-type optimization.

Seems to boil down to:

Whatever nVidia does or does not do = good
Whatever ATi does or does not do = bad.


Rollo used to claim to be, and acted, somewhat unbiased, but unfortunately that went away quite a while ago.
 
Originally posted by: oldfart
Originally posted by: CaiNaM
ok, rollo.. i have but one question to ask. you didn't buy into any of the sm2 marketing propaganda, why on earth would you buy into sm3 propaganda?
I find the complete 180 quite remarkable myself.

Last year, when ATi had PS 2.0 and nV had PS 1.4, it was:

Why bother with PS 2.0 when no current games (that I like) support it? You shouldn't buy hardware based on future games.

PS2.0 is no big deal anyway since the visuals are not impressive. "ooohhh shiny water....ooohhh shiny pipes.....spank...spank..."

It doesn't matter what technology the cards use as long as the performance is the same. A 128-bit PS 1.4 card is just as good as a 256-bit PS 2.0 card.

No complaints about nVidia's optimizations, including brilinear (which could not be disabled when it first came out).

Fast forward to today:
PS 3.0 for future games (that don't exist today) is a critical part of the equation. You'd be crazy to buy a card based on older technology (even though performance is ~ the same).

Outrage over ATi's similar (but a little better) brilinear-type optimization.

Seems to boil down to:

Whatever nVidia does or does not do = good
Whatever ATi does or does not do = bad.


Rollo used to claim to be, and acted, somewhat unbiased, but unfortunately that went away quite a while ago.

Is it any wonder Rollo switched sides when a lot of you ATidiots started slagging him off for purchasing a video card he wanted? 😉
 
Funny how people claim 3Dc is meaningless and PS3.0 is the holy grail. We won't know what either one, or both, of them means for us, the consumers, for some time.

Which technology do you expect will eventually have widespread support, and which won't?
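
For anyone wondering what 3Dc actually is: it's a normal-map compression format. It stores only the X and Y components of each tangent-space normal (block-compressed, two channels instead of three), and the shader rebuilds Z from the fact that a normal has unit length. A minimal sketch of the reconstruction math in Python; the function name is mine, and this ignores the block-compression details:

```python
import math

def reconstruct_normal(x: float, y: float) -> tuple:
    """Rebuild a unit normal from the two components 3Dc stores.

    Because |N| = 1 and tangent-space normals point out of the
    surface (z >= 0), z = sqrt(1 - x^2 - y^2).
    """
    z_sq = max(0.0, 1.0 - x * x - y * y)  # clamp away rounding error
    return (x, y, math.sqrt(z_sq))

# A normal tilted slightly off straight-up:
print(reconstruct_normal(0.3, 0.1))  # -> (0.3, 0.1, 0.9486...)
```

Whether that buys anything depends entirely on developers shipping 3Dc-compressed normal maps, which is exactly the widespread-support question.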

Scam? So the source hack was BS?

Ten days before the release of the game, Gabe told people at Shader Day to buy ATI video cards because the game was about to be released. He never once made any mention of not making the deadline. Judging by the quality of the stolen code, there was no way they were going to make the deadline.

Code stealing might have contributed to it. But do you honestly think it would have added a full year?

Last year, when ATi had PS 2.0 and nV had PS 1.4, it was:

I must have missed the memo on the FX not being SM2.0 compliant?
 
Originally posted by: nemesismk2
Is it any wonder Rollo switched sides when a lot of you ATidiots started slagging him off for purchasing a video card he wanted? 😉
I don't understand the whole "sides" (fanboy) thing. Why must you be on a "side"? It's just a video card, after all. Why the constant negative posting about one company while claiming the other can do no wrong? I swear, if nVidia made the X800 and ATi the 6800, some around here would have the exact opposite opinion of the cards just because of the fanboy factor. It's actually worse than the Intel vs. AMD bickering.

Just judge the cards based on technology, performance, and price. Pretty simple.
 
We don't know yet if the speed enhancements of SM3 will make the 6800U as fast as the X800XT in Far Cry.
SM 3.0 probably won't make the baseline faster. It'll simply make the new effects run faster than those same new effects would run under SM 2.0.

A year ago, the others and I were right about SM2.
No, you weren't. You simply dismissed the features with inept comments, but now that nVidia has the feature advantage you're acting like nobody can live without SM 3.0.

I'm only speculating, but as "Wallet Raider: Angel of Square Cans" was the only SM2 game available in 2003,
(1) I'm afraid you're sadly mistaken.
(2) There are zero games that currently use SM 3.0.
 
Sigh.

I am no more biased today than I was then, which is "not at all".

You're right: a year ago I said over and over that PS2 was irrelevant because there were no PS2 games. I also said that by the time there were PS2 games, the cards of a year ago would seem slow at PS2.

Hmmmm. That all became historical truth. Until Far Cry and Colin McRae came out this April, there really weren't any PS2 games that mattered. The cards that you can buy now far outperform the first-gen PS2 parts.

The difference with SM3 is that there are two good games out right now (Far Cry/Painkiller) that will be patched to SM3 in a couple of months. There are nine more games in development that will be SM3, some of which will come out in the next year.

At the very high performance level of the R420 and nV40 cards, I don't see a reason to deny yourself the ability to see what SM3 offers just to get a few more frames at a level of performance where you won't be able to tell the difference anyway.

I don't see anything "biased" about that; it seems like common sense.

As far as brilinear goes, I still don't care much about it. I think it's nice that nVidia gives you the option to turn it off in the control panel. I think it's deceptive that ATI advised reviewers to disable it for comparison and, when called on their fraud, stated they consider their brilinear true trilinear.

More radical stuff. :roll:
 
I've voted 6800U, preferably an MSI. 😀

Originally posted by: ShawnD1
A few days ago I tested my GF2 Ti against my 9600XT in Neverwinter Nights: SOU. All in-game settings were at their highest (except antialiasing), AA was off, AF was 2x quality. Resolution was 1024x768. Setup was an Athlon 3200+ with 1GB RAM.
[...]
That is pathetic performance for a card that should be at least 4x as fast.
This kinda shows your ignorance about the game engine. It is NOT a first-person shooter. It does NOT aim to get 100+ FPS on latest-gen hardware.
Perhaps you shouldn't buy ANY new graphics cards, as NONE of them are 4x as fast.

The rest of your post is seemingly a collection of posts about people who have UNVERIFIED problems. By unverified I mean that one or two people have an issue, but three or four others do not.

Originally posted by: Rollo
The difference with SM3 is that there are two good games out right now (Far Cry/Painkiller) that will be patched to SM3 in a couple of months. There are nine more games in development that will be SM3, some of which will come out in the next year.
So, applying your commentary on SM2, obviously buying a 6800 is a waste of time, since the 2nd-gen SM3 parts will stomp the 6800s, right? The inconsistency of your logic is why people think you are biased.

At the very high performance level of the R420 and nV40 cards, I don't see a reason to deny yourself the ability to see what SM3 offers just to get a few more frames at a level of performance where you won't be able to tell the difference anyway.
The same logic applied before. We had "SM2.0 games coming," and you said don't bother, the next gen will be way better at SM2.0. Now we again have "SM3.0 games coming," but this time you are saying "buy NVidia! They support SM3.0 now!" It doesn't seem to track, and yes, it does make you seem biased, whether that is the case or not.

As far as brilinear goes, I still don't care much about it. I think it's nice that nVidia gives you the option to turn it off in the control panel. I think it's deceptive that ATI advised reviewers to disable it for comparison and, when called on their fraud, stated they consider their brilinear true trilinear.
ATI's and NVidia's approaches to trilinear are different in their base function. If ATI's statements were incorrect, some actual hard evidence would have been found about it. All we've seen are people misconfiguring their games and complaining, or driver bugs. I'm all for ripping on a company for something they deserve to be ripped on, but IMO there is nothing dishonest about saying "look, we have a brand new AF algo; we want you to disable our opponent's mixed-mode AF in order to compare real trilinear performance on our two cards, because our trilinear algorithm is now way better". Algorithm optimizations can make a significant difference without cost to the end result; this is why algorithms get copyrighted/patented.
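
To make the trilinear-versus-"brilinear" distinction concrete: full trilinear blends between the two nearest mip levels across the entire fractional-LOD range, while a brilinear-style optimization uses plain bilinear from the nearest mip over most of that range and only blends in a narrow band around the transition, saving texture fetches. A rough sketch of the two weighting curves; the band width here is a made-up number, since neither vendor's actual algorithm is public:

```python
def trilinear_weight(lod_frac: float) -> float:
    """Full trilinear: blend weight between mip N and mip N+1
    ramps linearly across the whole 0..1 fractional-LOD range."""
    return lod_frac

def brilinear_weight(lod_frac: float, band: float = 0.2) -> float:
    """Brilinear-style: pure bilinear from the nearest mip over most
    of the range; blend only within +/- band of the midpoint. Fewer
    blended texels means fewer texture fetches, hence the speedup."""
    lo, hi = 0.5 - band, 0.5 + band
    if lod_frac <= lo:
        return 0.0   # sample mip N only
    if lod_frac >= hi:
        return 1.0   # sample mip N+1 only
    return (lod_frac - lo) / (hi - lo)   # narrow blend zone

for f in (0.1, 0.4, 0.5, 0.6, 0.9):
    print(f"lod_frac={f:.1f}  tri={trilinear_weight(f):.2f}  "
          f"bri={brilinear_weight(f):.2f}")
```

The narrower the band, the faster it runs and the closer it gets to visible mip transitions; the whole argument in this thread is about whether either vendor's band is narrow enough to see.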
 
Originally posted by: chsh1ca
Originally posted by: ShawnD1
A few days ago I tested my GF2 Ti against my 9600XT in Neverwinter Nights: SOU. All in-game settings were at their highest (except antialiasing), AA was off, AF was 2x quality. Resolution was 1024x768. Setup was an Athlon 3200+ with 1GB RAM.
[...]
That is pathetic performance for a card that should be at least 4x as fast.
This kinda shows your ignorance about the game engine. It is NOT a first-person shooter. It does NOT aim to get 100+ FPS on latest-gen hardware.
Now let me paraphrase what you just said. You just said it's ok that ATI drops the ball and half-asses the performance since it's not a shooter game. You must have a really crappy computer that isn't capable of rendering more frames per second than the monitor can show, because if you didn't, you wouldn't be saying that 36 FPS is reasonable. 36 FPS is never ok.

Don't be an idiot. My GF2 gets 27.8fps at 1024x768 with no AA and 2x AF. The first benchmark you linked to shows cards rendering 66% more pixels, with 4x AA instead of none and 8x AF instead of 2x. The GF2 can't even use settings that high, but if it could, it probably wouldn't be able to render even 5 frames per second.

My original complaint was that ATI cards really suck at OpenGL. The link you posted actually confirms that. At 1280x1024, the 5950U is 30% faster than the 9800XT. At 1600x1200, the 5950U is 32% faster than the 9800XT. In D3D games, the 9800XT is marginally faster than the 5950U, but in OpenGL games, the 5950U crushes the 9800XT. Why do you think that is?

Originally posted by: chsh1ca
The rest of your post is seemingly a collection of posts about people who have UNVERIFIED problems. By unverified I mean, one or two people have an issue, but three or four others do not.
I've already explained the reason for that: it's due to different driver versions. When I first got my 9600XT around February or so, the latest ATI drivers had a major problem with Half-Life and lag. Since then, ATI has fixed the problem with Half-Life but created problems with other games. For example, Neverwinter Nights is fine with the 3.10 drivers but lags really badly on the 4.6 drivers.
 
Originally posted by: ShawnD1
Now let me paraphrase what you just said. You just said it's ok that ATI drops the ball and half-asses the performance since it's not a shooter game. You must have a really crappy computer that isn't capable of rendering more frames per second than the monitor can show, because if you didn't, you wouldn't be saying that 36 FPS is reasonable. 36 FPS is never ok.
Good twisting of the words and lack of reading comprehension. I said nothing about ATI whatsoever; what I said is that 36FPS is reasonable in a game that is not built around framerate. Sure, it'd be nice if the 9600XT were faster at OpenGL, but to say that they're dropping the ball because it isn't getting 108 FPS (your claim of 4x) is a little bit ignorant. It's quite obvious you don't have a good grasp of how different game engines are written.

Don't be an idiot. My GF2 gets 27.8fps at 1024x768 with no AA and 2x AF. The first benchmark you linked to shows cards rendering 66% more pixels, with 4x AA instead of none and 8x AF instead of 2x. The GF2 can't even use settings that high, but if it could, it probably wouldn't be able to render even 5 frames per second.
Yes, and they are nowhere close to being capable of the 108FPS goal you have set for a mid-range card. I'm doubtful the engine itself could achieve 108FPS; it is likely capped at 60. I will check after I have posted this.
EDIT:
Okay, so after playing with NWN for a couple of minutes (standing still, albeit in the Temple of Tyr in Beorunna's Well, with the full NWN HotU 1.62 patch applied), my average FPS at each setting was:
800x600 0xAA/2xAF: 34.5FPS
1024x768 0xAA/2xAF: 29.8FPS
1280x1024 0xAA/2xAF: 18.5FPS
800x600 4xAA/2xAF: 15.5FPS
1024x768 4xAA/2xAF: 9.6FPS
1280x1024 4xAA/2xAF: 5.5FPS
The rig is Optimus in my sig.
Even if the 6800 didn't follow a delta similar to my GeForce 3's and instead doubled its FPS going from 1280x1024 to 1024x768 (extremely doubtful, given the nature of the game), at best that's 92FPS, still nowhere near your 4x-the-FPS claim -- and that's on hardware that is supposed to be a generation ahead and one class above (two steps more powerful a core) the 9600XT. Framerate matters up to a minimum, but it doesn't matter in all games for the same reasons, nor is the minimum the same. I play at 1024x768 and the game is perfectly playable. It's quite rare that the framerate drops below 10, but it does happen.
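
One way to sanity-check the scaling argument: if a card were purely fill-bound, FPS would scale with the inverse of pixel count, and 1280x1024 -> 1024x768 is only a 1.67x drop in pixels, not 2x. Running that model against the 0xAA/2xAF numbers quoted above (a rough check; it assumes fill rate is the only limit):

```python
# Measured NWN results from the post above (0xAA/2xAF, GeForce 3).
measured = {(800, 600): 34.5, (1024, 768): 29.8, (1280, 1024): 18.5}

base_res = (1280, 1024)
base_fps = measured[base_res]
base_pixels = base_res[0] * base_res[1]

for (w, h), fps in sorted(measured.items()):
    ideal = base_fps * base_pixels / (w * h)  # pure fill-bound prediction
    print(f"{w}x{h}: measured {fps:5.1f} fps, fill-bound model {ideal:5.1f} fps")
```

At 1024x768 the card lands almost exactly on the fill-bound prediction (~30.8 fps), but at 800x600 it falls far short (~50.5 predicted vs. 34.5 measured), i.e. the game goes CPU-bound, which is why "drop a resolution, double your FPS" doesn't happen here.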

My original complaint was that ATI cards really suck at OpenGL. The link you posted actually confirms that. At 1280x1024, the 5950U is 30% faster than the 9800XT. At 1600x1200, the 5950U is 32% faster than the 9800XT. In D3D games, the 9800XT is marginally faster than the 5950U, but in OpenGL games, the 5950U crushes the 9800XT. Why do you think that is?
Neverwinter Nights has been shown time and again to run significantly better on NVidia offerings; that is specific to the game or possibly the developer (as KOTOR exhibits the same bias). One OpenGL-focused developer giving 30% to one platform over another doesn't equate to ATI sucking at OpenGL. That is a rather naive view of the situation. Each game will behave differently based on the optimizations applied to it.
 
In D3D games, the 9800XT is marginally faster than the 5950U, but in OpenGL games, the 5950U crushes the 9800XT.

Check out Call of Duty: the 5950 gets beaten by the 9800 Pro

With AA/AF, it still loses to the XT

Gets shat on in TRAOD

Beaten by the 9800 Pro again

and again.

and again, this time in a synthetic benchmark

Owned in Half-Life 2

same with AA/AF on


I think it's more like: in some OpenGL games the 5950 is marginally faster than the 9700/9800s, but in Direct3D it gets owned by those cards. In Far Cry the 5xxx runs a lower shader path and it still gets beaten.
 