HardOCP reviews the 5600 and 5200 Ultra in an apples-to-apples comparison to ATI

Snoop

Golden Member
Oct 11, 1999
1,424
0
76
Browsing through the benchmark results, it seems ridiculous to see last year's GeForce4 Ti4200 remaining so competitive with the GeForce FX 5600 Ultra. Granted, the only case where this happens is at various resolutions where no enhanced image quality settings are enabled. In all fairness, NVIDIA did place the large majority of their focus upon FSAA and Anisotropic performance.
This is exactly what I think. I realize Nvidia is focusing on image quality, but to have an (over?) one-year-old card match and beat your new card in around 50% of the benchmarks is crazy. For me, since I tend not to mess with the standard settings (which I think is true of 90% of the video card market), what incentive do I have to pay 60-70 dollars more for this card instead of a Ti4200?
 

thorin

Diamond Member
Oct 9, 1999
7,573
0
0
Originally posted by: Snoop
Browsing through the benchmark results, it seems ridiculous to see last year's GeForce4 Ti4200 remaining so competitive with the GeForce FX 5600 Ultra. Granted, the only case where this happens is at various resolutions where no enhanced image quality settings are enabled. In all fairness, NVIDIA did place the large majority of their focus upon FSAA and Anisotropic performance.
This is exactly what I think. I realize Nvidia is focusing on image quality, but to have an (over?) one-year-old card match and beat your new card in around 50% of the benchmarks is crazy. For me, since I tend not to mess with the standard settings (which I think is true of 90% of the video card market), what incentive do I have to pay 60-70 dollars more for this card instead of a Ti4200?
I kinda agree with you. My plan is either to wait for the FXs to hit and buy a 4200/4600, or wait till the fall when the NV35 (NV30 refresh) comes out, because that's when nVidia will fix the performance of the FX series, now that they've got the image quality pretty much nailed.

The whole ATi vs nVidia thing seems moot to me. ATi is about to go through all the 0.13um problems nVidia just went through, and nVidia is now taking the performance steps that ATi just finished (if you get what I'm saying).

Thorin
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
Doubt ATI will go through the same problems, since it wasn't exactly an nVidia problem either; it was TSMC's problem, which is now fixed, so ATI made a smart move to wait.
 

GTaudiophile

Lifer
Oct 24, 2000
29,776
31
81
ATi is ready to roll with its 0.13um RV350 (Radeon 9600 and M10). Granted, nVidia and ATi will probably face more issues as they proceed with NV40 and R400, especially if TSMC doesn't have their low-k process up and running.

Edit: I wouldn't call this an apples-to-apples comparison. Not until the Radeon 9600 is thrown in the mix.
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
Whoops..... that 9500 Pro sure delivered quite a spanking! Imagine a 9700 or better in those tests..... :p
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.

Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:p

:Q

 

mjolnir2k

Senior member
Apr 25, 2001
862
0
0
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.

Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:p

:Q

Have YOU ever been happier? This must be like Christmas to you...;)

Ps. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"? :p
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
Originally posted by: mjolnir2k
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.

Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:p

:Q

Have YOU ever been happier? This must be like Christmas to you...;)

Ps. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"? :p
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.


The real test is the ATi 9600 against the FX 5600 Ultra; remember, the 9500 Pro is being phased out.

 

Czar

Lifer
Oct 9, 1999
28,510
0
0
Originally posted by: Mem
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.


The real test is the ATi 9600 against the FX 5600 Ultra; remember, the 9500 Pro is being phased out.

Yes, and it would be sad for the whole industry if the 9600 were slower than the 9500; it's bad enough when one company does that. ;)
 
Feb 10, 2000
30,029
66
91
Jeez louise - what a slaughter! I'm not sure what is more disappointing, the fact that nVidia is so distantly in second place, or the fact that they have shamelessly encouraged reviewers to use the "Aggressive" mode that produces dreadful image quality, in an effort to mislead the public into believing their performance is on par with ATi's. I must say, even though I acquired my 9700 Pro months after its release, I am feeling better than ever about my acquisition. I skipped right from a GF3ti200, and at this point I literally can't imagine buying an nVidia card.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Yes, and it would be sad for the whole industry if the 9600 were slower than the 9500; it's bad enough when one company does that.

The interesting thing is nobody knows how well it will do; my bet is it will be slightly slower than the 9500 Pro but probably edge out the FX 5600 Ultra.

In the end you've plenty of models from both sides to choose from, not like the old days when there was only one model from each company ;).

:)
 

PraetorianGuards

Golden Member
Oct 1, 2002
1,290
0
0
I was thinking about getting the 5600, especially because of its price. But now that the 9500 performs roughly the same and will be going down in price, it's ATI for me!
Also, I liked HardOCP's review. Very, very thorough, IMHO.
 

0roo0roo

No Lifer
Sep 21, 2002
64,862
84
91
Not too horrible with the 5200. I don't like spending much more than 80 dollars on a video card anyway, you see :p Have a 9000 Pro right now :p
 

First

Lifer
Jun 3, 2002
10,518
271
136
Originally posted by: Czar
Originally posted by: mjolnir2k
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.

Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:p

:Q

Have YOU ever been happier? This must be like Christmas to you...;)

Ps. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"? :p
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.

Not really. When the 5600 Ultra is in Aggressive mode it completely overtakes the 9500 Pro. Remember, not all people care about IQ, it all depends on the game. When I play first person shooter games, for example, I could really care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro. But if I want to play an RTS (like Age of Empires for example), IQ is more important, and the 9500 Pro would be the best card for me. It all depends on the game, and of course the preference of the gamer. In general though, you aren't going to find a lot of gamers that care about IQ as much as the HardOCP article implies.

Granted, I am (as is Anand) disappointed that NVIDIA can barely match ATi's Quality mode in Application mode, where fps takes a huge hit. This is especially true considering NVIDIA's whole "cinematic" GeForceFX marketing campaign, which is clearly not accurate when you look at ATi's Quality mode.

In fact, I saw a great thread over at Aceshardware today on this exact same issue, here.
 
Feb 10, 2000
30,029
66
91
Your point is well-taken, Evan, but the fact remains that nVidia's Aggressive mode produces pug-fugly images, and presumably, if ATi decided to offer a similarly kludgy IQ setting, the 9500 would eat the 5600's lunch there.

For what it's worth, I am a FPS nut, and still really appreciate the increased IQ available with my 9700 Pro (I had owned nothing but nVidia cards until now, except for my Voodoo 2 SLI setup back in the day). FPSs tend to pave the way with cutting-edge graphics anyway, and I find the games more immersive when the IQ settings are ratcheted up.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,414
8,356
126
Originally posted by: Evan Lieb
Originally posted by: Czar
Originally posted by: mjolnir2k
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.

Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:p

:Q

Have YOU ever been happier? This must be like Christmas to you...;)

Ps. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"? :p
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.

Not really. When the 5600 Ultra is in Aggressive mode it completely overtakes the 9500 Pro. Remember, not all people care about IQ, it all depends on the game. When I play first person shooter games, for example, I could really care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro. But if I want to play an RTS (like Age of Empires for example), IQ is more important, and the 9500 Pro would be the best card for me. It all depends on the game, and of course the preference of the gamer. In general though, you aren't going to find a lot of gamers that care about IQ as much as the HardOCP article implies.

Granted, I am (as is Anand) disappointed that NVIDIA can barely match ATi's Quality mode in Application mode, where fps takes a huge hit. This is especially true considering NVIDIA's whole "cinematic" GeForceFX marketing campaign, which is clearly not accurate when you look at ATi's Quality mode.

In fact, I saw a great thread over at Aceshardware today on this exact same issue, here.

Do what?
Since there's a separate test for AA and aniso, what is that test running?
 

NFS4

No Lifer
Oct 9, 1999
72,647
26
91
Remember, not all people care about IQ, it all depends on the game. When I play first person shooter games, for example, I could really care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro.

I don't know about the rest of you, but image quality became a BIG factor to me once these games started breaking 100 - 200 FPS. Who buys a $200+ video card JUST to watch frames go by fast?? If you want that, go buy a Ti4200 or a cheap Ti4400.

If you want fast AND pretty, ATI's DX9 lineup is hard to beat.
 

m1ke101

Platinum Member
Mar 30, 2001
2,825
0
0
Originally posted by: NFS4
Remember, not all people care about IQ, it all depends on the game. When I play first person shooter games, for example, I could really care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro.
I don't know about the rest of you, but image quality became a BIG factor to me once these games started breaking 100 - 200 FPS. Who buys a $200+ video card JUST to watch frames go by fast?? If you want that, go buy a Ti4200 or a cheap Ti4400. If you want fast AND pretty, ATI's DX9 lineup is hard to beat.

well put
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: mjolnir2k
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.

Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:p

:Q

Have YOU ever been happier? This must be like Christmas to you...;)

Ps. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"? :p
Actually, I never cared much for Xmas (too commercial) . . .

But no, I am NOT happy with nVidia's inability to compete with ATI this product cycle . . . it really doesn't mean anything - YET - as I think nVidia will be back "in the game" certainly by NV40. But it meant ATI didn't have to "work hard" to refresh the 9700 into the (not so impressive) 9800, as the Ultra DustBuster didn't even put up much of a fight against the 9700 Pro.

It also means that there is no "incentive" for ATI to drop their prices to compete - they are only concerned with selling their old stock to make way for new product.

As to my sig - it means my opinion is "just as stupid as yours" and "I am just as proud of my meaningless opinions as you are of yours". ;)



:D
 

First

Lifer
Jun 3, 2002
10,518
271
136
Originally posted by: ElFenix
Originally posted by: Evan Lieb
Originally posted by: Czar
Originally posted by: mjolnir2k
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.

Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:p

:Q

Have YOU ever been happier? This must be like Christmas to you...;)

Ps. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"? :p
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.

Not really. When the 5600 Ultra is in Aggressive mode it completely overtakes the 9500 Pro. Remember, not all people care about IQ, it all depends on the game. When I play first person shooter games, for example, I could really care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro. But if I want to play an RTS (like Age of Empires for example), IQ is more important, and the 9500 Pro would be the best card for me. It all depends on the game, and of course the preference of the gamer. In general though, you aren't going to find a lot of gamers that care about IQ as much as the HardOCP article implies.

Granted, I am (as is Anand) disappointed that NVIDIA can barely match ATi's Quality mode in Application mode, where fps takes a huge hit. This is especially true considering NVIDIA's whole "cinematic" GeForceFX marketing campaign, which is clearly not accurate when you look at ATi's Quality mode.

In fact, I saw a great thread over at Aceshardware today on this exact same issue, here.

Do what?
Since there's a separate test for AA and aniso, what is that test running?

Are you asking what mode the 5200 Ultra and 5600 Ultra were running in? (FYI it's Balanced mode).

Originally posted by: NFS4
Remember, not all people care about IQ, it all depends on the game. When I play first person shooter games, for example, I could really care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro.

I don't know about the rest of you, but image quality became a BIG factor to me once these games started breaking 100 - 200 FPS. Who buys a $200+ video card JUST to watch frames go by fast?? If you want that, go buy a Ti4200 or a cheap Ti4400.

Well, which first person shooter games are you speaking of? You won't be getting anywhere near 200 fps in ANY fairly recent first person shooter game with AA and AF enabled, no matter what card you're using. At 1024x768 with AA and AF enabled in UT2K3, for example, smoothness becomes much more important to me because I'm not getting 200 fps. In this situation, fps is king. The same idea can be applied to many other first person shooter games.

Originally posted by: NFS4
If you want fast AND pretty, ATI's DX9 lineup is hard to beat

Absolutely, without question.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,414
8,356
126
Originally posted by: Evan Lieb
Originally posted by: ElFenix

Do what?
Since there's a separate test for AA and aniso, what is that test running?

Are you asking what mode the 5200 Ultra and 5600 Ultra were running in? (FYI it's Balanced mode).

It looks like the page I linked to was with AA and AF off, and the 5600 got hammered.