This is exactly what I think. I realize Nvidia is focusing on image quality, but to have an (over?) one-year-old card match and beat your new card in around 50% of the benchmarks is crazy. For me, since I tend not to mess with the standard settings (which I think is true of 90% of the video card market), what incentive do I have to pay 60-70 dollars more for this card instead of a Ti4200?
Browsing through the benchmark results, it seems ridiculous to see last year's GeForce4 Ti4200 remaining so competitive with the GeForce FX 5600 Ultra. Granted, the only case where this happens is at various resolutions where no enhanced image quality settings are enabled. In all fairness, NVIDIA did place the large majority of their focus upon FSAA and anisotropic performance.
I kinda agree with you. My plan is either to wait for the FXs to hit and buy a 4200/4600, or wait till the fall when the NV35 (NV30 refresh) comes out, because that's when nVidia will fix the performance of the FX series now that they've got the image quality pretty much nailed.
Originally posted by: Snoop
This is exactly what I think. I realize Nvidia is focusing on image quality, but to have an (over?) one-year-old card match and beat your new card in around 50% of the benchmarks is crazy. For me, since I tend not to mess with the standard settings (which I think is true of 90% of the video card market), what incentive do I have to pay 60-70 dollars more for this card instead of a Ti4200?
Browsing through the benchmark results, it seems ridiculous to see last year's GeForce4 Ti4200 remaining so competitive with the GeForce FX 5600 Ultra. Granted, the only case where this happens is at various resolutions where no enhanced image quality settings are enabled. In all fairness, NVIDIA did place the large majority of their focus upon FSAA and anisotropic performance.
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.
Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.
Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:Q
Originally posted by: mjolnir2k
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.
Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:Q
Have YOU ever been happier? This must be like Christmas to you...
P.S. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"?
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.
Originally posted by: Mem
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.
The real test is the ATi 9600 against the FX 5600 Ultra; remember, the 9500 Pro is being phased out.
Yes, and it would be sad for the whole industry if the 9600 was slower than the 9500; it's bad enough when one company does that.
Originally posted by: Czar
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.
Originally posted by: mjolnir2k
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.
Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:Q
Have YOU ever been happier? This must be like Christmas to you...
P.S. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"?
Originally posted by: Evan Lieb
Originally posted by: Czar
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.
Originally posted by: mjolnir2k
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.
Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:Q
Have YOU ever been happier? This must be like Christmas to you...
P.S. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"?
Not really. When the 5600 Ultra is in Aggressive mode it completely overtakes the 9500 Pro. Remember, not all people care about IQ; it all depends on the game. When I play first-person shooter games, for example, I couldn't care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro. But if I want to play an RTS (like Age of Empires, for example), IQ is more important, and the 9500 Pro would be the best card for me. It all depends on the game, and of course the preference of the gamer. In general, though, you aren't going to find a lot of gamers who care about IQ as much as the HardOCP article implies.
Granted, I am (as is Anand) disappointed that NVIDIA can barely match ATi's Quality mode in Application mode, where fps takes a huge hit. This is especially true considering NVIDIA's whole "cinematic" GeForceFX marketing campaign, which is clearly not accurate when you look at ATi's Quality mode.
In fact, I saw a great thread over at Aceshardware today on this exact same issue, here.
Remember, not all people care about IQ; it all depends on the game. When I play first-person shooter games, for example, I couldn't care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro.
Originally posted by: NFS4
I don't know about the rest of you, but image quality became a BIG factor to me once these games started breaking 100-200 FPS. Who buys a $200+ video card JUST to watch frames go by fast?? If you want that, go buy a Ti4200 or a cheap Ti4400. If you want fast AND pretty, ATI's DX9 lineup is hard to beat.
Remember, not all people care about IQ; it all depends on the game. When I play first-person shooter games, for example, I couldn't care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro.
Actually, I never cared much for Xmas (too commercial) . . .
Originally posted by: mjolnir2k
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.
Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:Q
Have YOU ever been happier? This must be like Christmas to you...
P.S. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"?
Originally posted by: ElFenix
Originally posted by: Evan Lieb
Originally posted by: Czar
It's not stupid, it's actually very true; last year's Radeon 9500 Pro beats the GeForce FX 5600 easily.
Originally posted by: mjolnir2k
Originally posted by: apoppin
The conclusion says it all:
Placing all of the new features aside, how on earth can anyone look at a card whose image quality must be lowered to remain competitive and declare it "cinematic"? Furthermore, it is a bit alarming to see NVIDIA claim that gamers will be "experiencing cinematic graphics the way it's meant to be played" with their $79 card.
Although NVIDIA has taken two steps forward in terms of features and functionality, they have seemingly taken a step backwards for both image quality and performance. As we score things, that would make NVIDIA's current position stagnant at best. Should the "cinematic computing era" be taking place, it is happening with ATI's product line, as we have seen first hand that the current GeForceFX product line simply cannot compete with ATI products released last year.
:Q
Have YOU ever been happier? This must be like Christmas to you...
P.S. In your sig you mention that you are "Proud of your stupid opinion". Don't you really mean that you are "Proud of others' stupid opinions, which you will plagiarize or cut and paste"?
Not really. When the 5600 Ultra is in Aggressive mode it completely overtakes the 9500 Pro. Remember, not all people care about IQ; it all depends on the game. When I play first-person shooter games, for example, I couldn't care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro. But if I want to play an RTS (like Age of Empires, for example), IQ is more important, and the 9500 Pro would be the best card for me. It all depends on the game, and of course the preference of the gamer. In general, though, you aren't going to find a lot of gamers who care about IQ as much as the HardOCP article implies.
Granted, I am (as is Anand) disappointed that NVIDIA can barely match ATi's Quality mode in Application mode, where fps takes a huge hit. This is especially true considering NVIDIA's whole "cinematic" GeForceFX marketing campaign, which is clearly not accurate when you look at ATi's Quality mode.
In fact, I saw a great thread over at Aceshardware today on this exact same issue, here.
Do what?
Since there's a separate test for AA and aniso, what is that test running?
Originally posted by: NFS4
Remember, not all people care about IQ; it all depends on the game. When I play first-person shooter games, for example, I couldn't care less about IQ, which means the 5600 Ultra is a much better choice for me than a 9500 Pro.
I don't know about the rest of you, but image quality became a BIG factor to me once these games started breaking 100-200 FPS. Who buys a $200+ video card JUST to watch frames go by fast?? If you want that, go buy a Ti4200 or a cheap Ti4400.
Originally posted by: NFS4
If you want fast AND pretty, ATI's DX9 lineup is hard to beat
Originally posted by: Evan Lieb
Originally posted by: ElFenix
Do what?
Since there's a separate test for AA and aniso, what is that test running?
Are you asking what mode the 5200 Ultra and 5600 Ultra were running in? (FYI, it's Balanced mode.)