NVIDIA 9800GTX+ Review Thread


ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Here we really see Nvidia's problem though... The thing is, the 9800 GTX and GTX 260 are already too close in performance for them to fit anything in between... They just wanted to milk the new products for as long as possible, but seeing that the new AMD cards are so good, their plans were ruined, so the 9800GTX+ is the best they can do

That's pretty much it
 

shangshang

Senior member
May 17, 2008
830
0
0
Ok, after having tested the 4850 and 8800GT, I can finally make my OWN conclusions! Borrowed the 8800GT from a friend, and bought the 4850 myself from Best Buy.

System specs: quad-core Q6600, 4 GB RAM, WinXP SP2, Intel P35 chipset (Asus P5K mobo)

Game tested: CRYSIS!!!
resolution: 1600 x 1200 (note the res is not widescreen)
game settings: high, 2x AA (unplayable)
game settings: medium, no AA (playable on both cards)

With the 8800GT, framerate was slow, unplayable. With the 4850, framerate was also slow, unplayable. I didn't run any "benchmark" test, because to me, if a card doesn't play in a real game, then it's not what I'm looking to get. And since Crysis is the ultimate indication of what future (next 1.5 years) game releases might demand, I just tested Crysis for now.

I was a bit disappointed, was hoping that the 4850 would be enough to take me over the top in Crysis at 1600x1200 quality high 2xAA.

Hmm, so what Zod96 posted earlier, I have now verified for myself in actual game play. The 4850 may be faster than the 8800GT when looking at their benchmarks on the web, but apparently not enough performance gain to make much of a difference in Crysis.

Hmmm, right now I'm thinking of returning the 4850 altogether, and may even bypass the GTX+ too. Might just wait for the GTX 260 or 4870 to drop in price and jump on those, because as far as I'm concerned the 8800GT, GTS G92, and 4850 are all in the same league of performance, and at 1600x1200 gaming resolution you won't notice much of a performance gain in actual game play (Crysis in my case). Furthermore, the 4850 runs hot. All the reports about it running hot are not exaggerated! Just what the hell was ATI thinking by not allowing the fan to run faster than 14% when it's a single slot cooler???? I understand the chip has an operating temperature tolerance, but a single slot cooler without the ability to adjust the fan to more than 14% is stupidly stupid. Anyway, I'm still thinking about whether I wanna keep this thing. Well there you go.
 

raddreamer3kx

Member
Oct 2, 2006
193
0
0
Originally posted by: shangshang
Ok, after having tested the 4850 and 8800GT, I can finally make my OWN conclusions! Borrowed the 8800GT from a friend, and bought the 4850 myself from Best Buy.

System specs: quad-core Q6600, 4 GB RAM, WinXP SP2, Intel P35 chipset (Asus P5K mobo)

Game tested: CRYSIS!!!
resolution: 1600 x 1200 (note the res is not widescreen)
game settings: high, 2x AA (unplayable)
game settings: medium, no AA (playable on both cards)

With the 8800GT, framerate was slow, unplayable. With the 4850, framerate was also slow, unplayable. I didn't run any "benchmark" test, because to me, if a card doesn't play in a real game, then it's not what I'm looking to get. And since Crysis is the ultimate indication of what future (next 1.5 years) game releases might demand, I just tested Crysis for now.

I was a bit disappointed, was hoping that the 4850 would be enough to take me over the top in Crysis at 1600x1200 quality high 2xAA.

Hmm, so what Zod96 posted earlier, I have now verified for myself in actual game play. The 4850 may be faster than the 8800GT when looking at their benchmarks on the web, but apparently not enough performance gain to make much of a difference in Crysis.

Hmmm, right now I'm thinking of returning the 4850 altogether, and may even bypass the GTX+ too. Might just wait for the GTX 260 or 4870 to drop in price and jump on those, because as far as I'm concerned the 8800GT, GTS G92, and 4850 are all in the same league of performance, and at 1600x1200 gaming resolution you won't notice much of a performance gain in actual game play (Crysis in my case). Furthermore, the 4850 runs hot. All the reports about it running hot are not exaggerated! Just what the hell was ATI thinking by not allowing the fan to run faster than 14% when it's a single slot cooler???? I understand the chip has an operating temperature tolerance, but a single slot cooler without the ability to adjust the fan to more than 14% is stupidly stupid. Anyway, I'm still thinking about whether I wanna keep this thing. Well there you go.

Crysis is a poorly coded game, not the best game to do testing on. I might return my 4850 that has not arrived yet and wait for the 4870, not sure what to do yet.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: raddreamer3kx
Crysis is a poorly coded game, not the best game to do testing on. I might return my 4850 that has not arrived yet and wait for the 4870, not sure what to do yet.

The HD4850 may give you a good boost in performance depending on what card you have now. It is about 30% faster than the 8800GT, but if you only play Crysis and want the highest performance, then you need a GX2, GTX+ SLI, or HD4870 CF :)
 

shangshang

Senior member
May 17, 2008
830
0
0
Ok another test: COD4
game settings: 1600x1200, high quality

Here, it's playable on both cards. So the "observed" end-result gameplay for me is: the 8800GT and 4850 are the same. For my gaming needs at 1600x1200, the 8800GT and 4850 are equals; I didn't observe any improvement in framerate or smoothness from the 4850.

At this point my decision boils down to this: because I got the 4850 for cheap, $149 at BB, I might just keep it for that reason. However, had I paid $199 for it, then I would definitely get the GTX+ for $20-$30 bux more. My reasons:

1) GTX+ runs cooler, big fan
2) 4850 runs hot, fan sucks and can't be modified for now, may not o/c as high as the GTX+ due to fan & heat issues
3) GTX+ has PhysX. May not be important now, but it's a definite value-added factor to consider.
http://www.nzone.com/object/nzone_physxgames_home.html
http://www.evilavatar.com/forums/showthread.php?t=55177

So my personal stance is: if you can get the 4850 for $50+ LESS THAN the GTX+, then get it. Otherwise, get the GTX+ for more value. And if you have an 8800GT already, just skip these two and wait for either the 4870 or GTX 260. I definitely would not get the 4850 at $199 if the GTX+ is to be had at $230 or less. For the additional $20-$30 bux, you get a better fan, a cooler card, maybe higher o/c potential, and PhysX. The GTX+ has a lot of added value and none of the hassle of modding this and modding that! I might just keep the 4850 for 6 months, then eBay it and jump on the GTX 260.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: shangshang
Ok, after having tested the 4850 and 8800GT, I can finally make my OWN conclusions! Borrowed the 8800GT from a friend, and bought the 4850 myself from Best Buy.

System specs: quad-core Q6600, 4 GB RAM, WinXP SP2, Intel P35 chipset (Asus P5K mobo)

Game tested: CRYSIS!!!
resolution: 1600 x 1200 (note the res is not widescreen)
game settings: high, 2x AA (unplayable)
game settings: medium, no AA (playable on both cards)

With the 8800GT, framerate was slow, unplayable. With the 4850, framerate was also slow, unplayable. I didn't run any "benchmark" test, because to me, if a card doesn't play in a real game, then it's not what I'm looking to get. And since Crysis is the ultimate indication of what future (next 1.5 years) game releases might demand, I just tested Crysis for now.

I was a bit disappointed, was hoping that the 4850 would be enough to take me over the top in Crysis at 1600x1200 quality high 2xAA.

Hmm, so what Zod96 posted earlier, I have now verified for myself in actual game play. The 4850 may be faster than the 8800GT when looking at their benchmarks on the web, but apparently not enough performance gain to make much of a difference in Crysis.

Hmmm, right now I'm thinking of returning the 4850 altogether, and may even bypass the GTX+ too. Might just wait for the GTX 260 or 4870 to drop in price and jump on those, because as far as I'm concerned the 8800GT, GTS G92, and 4850 are all in the same league of performance, and at 1600x1200 gaming resolution you won't notice much of a performance gain in actual game play (Crysis in my case). Furthermore, the 4850 runs hot. All the reports about it running hot are not exaggerated! Just what the hell was ATI thinking by not allowing the fan to run faster than 14% when it's a single slot cooler???? I understand the chip has an operating temperature tolerance, but a single slot cooler without the ability to adjust the fan to more than 14% is stupidly stupid. Anyway, I'm still thinking about whether I wanna keep this thing. Well there you go.

I think the benches were very clear that you can't enable any AA in Crysis on a 4850 and keep the game playable. You need either a 4870, a GTX 260, or 2x 4850.
 

shangshang

Senior member
May 17, 2008
830
0
0
Originally posted by: Kuzi
Originally posted by: raddreamer3kx
Crysis is a poorly coded game, not the best game to do testing on. I might return my 4850 that has not arrived yet and wait for the 4870, not sure what to do yet.

The HD4850 may give you a good boost in performance depending on what card you have now. It is about 30% faster than the 8800GT, but if you only play Crysis and want the highest performance, then you need a GX2, GTX+ SLI, or HD4870 CF :)

Now I'm a recreational gamer, not a fanatic gamer, not to the point of justifying spending on 2 video cards! The video card industry will have to improve its multi-GPU architecture a lot more before I'll plunk my money down for 2 cards. Things are moving so fast that in 12 months, a 2-card setup is beaten by 1 card. For now, I think the best value for a consumer is to stick with a "mid-grade" card and then eBay it in 12 months to upgrade to the next mid-grade card! Nvidia has a lot of products in its pipeline, so this fast pace of improvement will continue for at least 18-24 months.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: shangshang
Ok another test: COD4
game settings: 1600x1200, high quality

Here, it's playable on both cards. So the "observed" end-result gameplay for me is: the 8800GT and 4850 are the same. For my gaming needs at 1600x1200, the 8800GT and 4850 are equals; I didn't observe any improvement in framerate or smoothness from the 4850.

At this point my decision boils down to this: because I got the 4850 for cheap, $149 at BB, I might just keep it for that reason. However, had I paid $199 for it, then I would definitely get the GTX+ for $20-$30 bux more. My reasons:

1) GTX+ runs cooler, big fan
2) 4850 runs hot, fan sucks and can't be modified for now, may not o/c as high as the GTX+ due to fan & heat issues
3) GTX+ has PhysX. May not be important now, but it's a definite value-added factor to consider.
http://www.nzone.com/object/nzone_physxgames_home.html
http://www.evilavatar.com/forums/showthread.php?t=55177

So my personal stance is: if you can get the 4850 for $50+ LESS THAN the GTX+, then get it. Otherwise, get the GTX+ for more value. And if you have an 8800GT already, just skip these two and wait for either the 4870 or GTX 260. I definitely would not get the 4850 at $199 if the GTX+ is to be had at $230 or less. For the additional $20-$30 bux, you get a better fan, a cooler card, maybe higher o/c potential, and PhysX. The GTX+ has a lot of added value and none of the hassle of modding this and modding that! I might just keep the 4850 for 6 months, then eBay it and jump on the GTX 260.

PhysX will likely amount to nothing. Developers have been using Havok for years.
 

shangshang

Senior member
May 17, 2008
830
0
0
forgot one more setting in Crysis: high quality, with no AA

In this case, the game is playable on both cards, with some scenes slowing down a bit but still playable enough to get past such sticky points! I was hoping that in these "sticking points" the 4850 would outperform the 8800GT, but it didn't. I think at a lower resolution, like 1680x1050 (typical 22" widescreen), the 4850 may have made a difference in these sticky scenes, but at the 1600x1200 I'm playing at, both cards look the same to me.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: shangshang
Ok another test: COD4
game settings: 1600x1200, high quality

Here, it's playable on both cards. So the "observed" end-result gameplay for me is: the 8800GT and 4850 are the same. For my gaming needs at 1600x1200, the 8800GT and 4850 are equals; I didn't observe any improvement in framerate or smoothness from the 4850.

At this point my decision boils down to this: because I got the 4850 for cheap, $149 at BB, I might just keep it for that reason. However, had I paid $199 for it, then I would definitely get the GTX+ for $20-$30 bux more. My reasons:

1) GTX+ runs cooler, big fan
2) 4850 runs hot, fan sucks and can't be modified for now, may not o/c as high as the GTX+ due to fan & heat issues
3) GTX+ has PhysX. May not be important now, but it's a definite value-added factor to consider.
http://www.nzone.com/object/nzone_physxgames_home.html
http://www.evilavatar.com/forums/showthread.php?t=55177

So my personal stance is: if you can get the 4850 for $50+ LESS THAN the GTX+, then get it. Otherwise, get the GTX+ for more value. And if you have an 8800GT already, just skip these two and wait for either the 4870 or GTX 260. I definitely would not get the 4850 at $199 if the GTX+ is to be had at $230 or less. For the additional $20-$30 bux, you get a better fan, a cooler card, maybe higher o/c potential, and PhysX. The GTX+ has a lot of added value and none of the hassle of modding this and modding that! I might just keep the 4850 for 6 months, then eBay it and jump on the GTX 260.

At 1600x1200, CoD4 won't show much of a difference even if you buy a GTX260. Check the AnandTech review: 8800GT 58 fps, HD4850 78 fps, GTX260 84 fps.

The HD4850 is 20 fps faster than the 8800GT at 1600x1200. You might be the type of person who can't tell a 20 fps difference in this range (58 to 78). In my case I can tell the difference between frame rates in the 50s and frame rates in the 70s; some people can't.

Obviously I won't be able to tell the difference between the GTX260's 84 fps and the HD4850's 78 fps. That's why it would be a horrible value to buy a GTX260 to run at this resolution. The HD4870 will most likely run faster than the GTX260, so again, no one should buy that card just to run at 1600x1200.
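For anyone who wants to see what those averages mean in frame time, here is a quick back-of-the-envelope sketch in Python (the fps numbers are the ones quoted above from the AnandTech CoD4 test at 1600x1200; everything else is just arithmetic):

# Convert the CoD4 averages quoted above into per-frame render times
# and the percentage gap versus the 8800GT.
results = {"8800GT": 58, "HD4850": 78, "GTX260": 84}  # average fps
baseline = results["8800GT"]

for card, fps in results.items():
    frame_time_ms = 1000.0 / fps            # milliseconds spent on one frame
    gain = (fps / baseline - 1) * 100       # percent faster than the 8800GT
    print(f"{card}: {fps} fps = {frame_time_ms:.1f} ms/frame (+{gain:.0f}% vs 8800GT)")

That works out to roughly 17.2 ms per frame for the 8800GT versus 12.8 ms for the HD4850 (about 34% faster) and 11.9 ms for the GTX260, which is where the "can you actually feel 58 vs 78 fps" argument comes from.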
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: shangshang
Originally posted by: Kuzi
Originally posted by: raddreamer3kx
Crysis is a poorly coded game, not the best game to do testing on. I might return my 4850 that has not arrived yet and wait for the 4870, not sure what to do yet.

The HD4850 may give you a good boost in performance depending on what card you have now. It is about 30% faster than the 8800GT, but if you only play Crysis and want the highest performance, then you need a GX2, GTX+ SLI, or HD4870 CF :)

Now I'm a recreational gamer, not a fanatic gamer, not to the point of justifying spending on 2 video cards! The video card industry will have to improve its multi-GPU architecture a lot more before I'll plunk my money down for 2 cards. Things are moving so fast that in 12 months, a 2-card setup is beaten by 1 card. For now, I think the best value for a consumer is to stick with a "mid-grade" card and then eBay it in 12 months to upgrade to the next mid-grade card! Nvidia has a lot of products in its pipeline, so this fast pace of improvement will continue for at least 18-24 months.

Wow, your first post that I agree with.

That's why I see the GTX+ and HD4850/4870 as the best value right now. The GTX260 would be nice too if Nvidia decides to lower the price to the $300-$350 range.
 

shangshang

Senior member
May 17, 2008
830
0
0
Originally posted by: sxr7171
Originally posted by: shangshang
Ok another test: COD4
game settings: 1600x1200, high quality

Here, it's playable on both cards. So the "observed" end-result gameplay for me is: the 8800GT and 4850 are the same. For my gaming needs at 1600x1200, the 8800GT and 4850 are equals; I didn't observe any improvement in framerate or smoothness from the 4850.

At this point my decision boils down to this: because I got the 4850 for cheap, $149 at BB, I might just keep it for that reason. However, had I paid $199 for it, then I would definitely get the GTX+ for $20-$30 bux more. My reasons:

1) GTX+ runs cooler, big fan
2) 4850 runs hot, fan sucks and can't be modified for now, may not o/c as high as the GTX+ due to fan & heat issues
3) GTX+ has PhysX. May not be important now, but it's a definite value-added factor to consider.
http://www.nzone.com/object/nzone_physxgames_home.html
http://www.evilavatar.com/forums/showthread.php?t=55177

So my personal stance is: if you can get the 4850 for $50+ LESS THAN the GTX+, then get it. Otherwise, get the GTX+ for more value. And if you have an 8800GT already, just skip these two and wait for either the 4870 or GTX 260. I definitely would not get the 4850 at $199 if the GTX+ is to be had at $230 or less. For the additional $20-$30 bux, you get a better fan, a cooler card, maybe higher o/c potential, and PhysX. The GTX+ has a lot of added value and none of the hassle of modding this and modding that! I might just keep the 4850 for 6 months, then eBay it and jump on the GTX 260.

PhysX will likely amount to nothing. Developers have been using Havok for years.

This will change. Havok has the might of Intel behind it, hence it draws developers just by the sheer size of Intel. PhysX, before Nvidia acquired it, was under Ageia, a small company nowhere near the size of Intel. With Nvidia promoting PhysX now, you better believe things are gonna change. Nvidia is a well-managed, focused company, and don't think they acquired PhysX to let it rot on the shelf! Microsoft might do that, but not Nvidia.

Nvidia acquired Ageia's PhysX in Feb 2008, and at the time of acquisition, I recall reading that people were saying it would take Nvidia 8-12 months to incorporate PhysX into its products. Here we are 4 months later in June, and Nvidia already officially supports PhysX on the 9800 series and higher cards, and even unofficially on the 8800GT/GTS series with a simple hack.

Time will tell if Nvidia's vision that PhysX will play an important role in visual computing will come true.
 

shangshang

Senior member
May 17, 2008
830
0
0
Originally posted by: Kuzi

At 1600x1200, CoD4 won't show much of a difference even if you buy a GTX260. Check the AnandTech review: 8800GT 58 fps, HD4850 78 fps, GTX260 84 fps.

The HD4850 is 20 fps faster than the 8800GT at 1600x1200. You might be the type of person who can't tell a 20 fps difference in this range (58 to 78). In my case I can tell the difference between frame rates in the 50s and frame rates in the 70s; some people can't.

Obviously I won't be able to tell the difference between the GTX260's 84 fps and the HD4850's 78 fps. That's why it would be a horrible value to buy a GTX260 to run at this resolution. The HD4870 will most likely run faster than the GTX260, so again, no one should buy that card just to run at 1600x1200.

1) I ain't getting the GTX 260, so I don't care about it at the moment. Maybe I'll get it when something faster is demanded later down the road, and when the price drops a bit.

2) Now you can tell the difference between 50 and 70 fps?? The minimum framerate required for the human eye to perceive "smooth" motion is 25 fps. (Hence filming is done mainly at 25-30 fps.) You would need to have a strobe light in front of your eyes. I would agree that in "sticky, heavy scenes" the higher-performance card can make the scene look smoother if the scene is bordering on 23-25 fps, but at 50-70 fps, it's a scientific fact that the human eye can't distinguish them!
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: bryanW1995
Originally posted by: HOOfan 1
my guess is to allow higher overclocking on the 9800GTX+

Apparently Legit Reviews found that it is an insane overclocker

well, it's already clocked at 738 core, a decent oc on a g92(b). I've heard numerous reports of 800 core on g92 cards, but not a lot higher than that and stable. let's assume the 55nm process lets you get another 50 MHz on top of that; that puts it at 850 core, for a 112 MHz oc over stock. that's a 15.18% oc. not bad, but not as nice as a large % of 8800gt owners got.



I'm sitting at ~810 core, and have benched it at ~820. Now the memory clock... that's a different story.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: shangshang
Originally posted by: sxr7171
Originally posted by: shangshang
Ok another test: COD4
game settings: 1600x1200, high quality

Here, it's playable on both cards. So the "observed" end-result gameplay for me is: the 8800GT and 4850 are the same. For my gaming needs at 1600x1200, the 8800GT and 4850 are equals; I didn't observe any improvement in framerate or smoothness from the 4850.

At this point my decision boils down to this: because I got the 4850 for cheap, $149 at BB, I might just keep it for that reason. However, had I paid $199 for it, then I would definitely get the GTX+ for $20-$30 bux more. My reasons:

1) GTX+ runs cooler, big fan
2) 4850 runs hot, fan sucks and can't be modified for now, may not o/c as high as the GTX+ due to fan & heat issues
3) GTX+ has PhysX. May not be important now, but it's a definite value-added factor to consider.
http://www.nzone.com/object/nzone_physxgames_home.html
http://www.evilavatar.com/forums/showthread.php?t=55177

So my personal stance is: if you can get the 4850 for $50+ LESS THAN the GTX+, then get it. Otherwise, get the GTX+ for more value. And if you have an 8800GT already, just skip these two and wait for either the 4870 or GTX 260. I definitely would not get the 4850 at $199 if the GTX+ is to be had at $230 or less. For the additional $20-$30 bux, you get a better fan, a cooler card, maybe higher o/c potential, and PhysX. The GTX+ has a lot of added value and none of the hassle of modding this and modding that! I might just keep the 4850 for 6 months, then eBay it and jump on the GTX 260.

PhysX will likely amount to nothing. Developers have been using Havok for years.

This will change. Havok has the might of Intel behind it, hence it draws developers just by the sheer size of Intel. PhysX, before Nvidia acquired it, was under Ageia, a small company nowhere near the size of Intel. With Nvidia promoting PhysX now, you better believe things are gonna change. Nvidia is a well-managed, focused company, and don't think they acquired PhysX to let it rot on the shelf! Microsoft might do that, but not Nvidia.

Nvidia acquired Ageia's PhysX in Feb 2008, and at the time of acquisition, I recall reading that people were saying it would take Nvidia 8-12 months to incorporate PhysX into its products. Here we are 4 months later in June, and Nvidia already officially supports PhysX on the 9800 series and higher cards, and even unofficially on the 8800GT/GTS series with a simple hack.

Time will tell if Nvidia's vision that PhysX will play an important role in visual computing will come true.

Why would anyone change what they are used to using unless there are serious benefits to doing so? PhysX has been around for a year or two only to be snubbed by the community. Havok has been chugging along without Intel just fine, and now with Intel it's going to get even better. All I'm saying is that Havok at least can hit the ground running; PhysX needs to get over a huge hump.

Even you say "time will tell," hence nVidia supporting PhysX is not a real selling point... yet.

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: ShadowOfMyself
Here we really see Nvidia's problem though... The thing is, the 9800 GTX and GTX 260 are already too close in performance for them to fit anything in between... They just wanted to milk the new products for as long as possible, but seeing that the new AMD cards are so good, their plans were ruined, so the 9800GTX+ is the best they can do

That's pretty much it

Since when was the GTX 260 close to the 9800GTX? A GTX260 is faster than the HD3870X2 (i.e., much faster than the 9800GTX).
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: HOOfan 1
Originally posted by: bryanW1995
Originally posted by: HOOfan 1
my guess is to allow higher overclocking on the 9800GTX+

Apparently Legit Reviews found that it is an insane overclocker

well, it's already clocked at 738 core, a decent oc on a g92(b). I've heard numerous reports of 800 core on g92 cards, but not a lot higher than that and stable. let's assume the 55nm process lets you get another 50 MHz on top of that; that puts it at 850 core, for a 112 MHz oc over stock. that's a 15.18% oc. not bad, but not as nice as a large % of 8800gt owners got.

according to Legit Reviews they got the core to 855 MHz with 2.2 GHz on the shaders.

Apparently the overclock also increased the performance by 10%

nice, so I was right. of course, I'm sure that nvidia just sent a random sample to Legit Reviews for inspection; there will probably be many users getting a much higher overclock... either that, or 855 could end up being pretty close to the high end, with a typical clock of 820-840.
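To put numbers on the overclocking talk above: the 9800GTX+ ships at a 738 MHz core, the estimate above was 850, and Legit Reviews reportedly reached 855. A quick sketch of the percentages, using only the clocks already quoted in this thread:

# Overclock headroom relative to the 9800GTX+ stock core clock of 738 MHz.
stock_mhz = 738
samples = {
    "850 MHz estimate": 850,
    "Legit Reviews' reported 855 MHz": 855,
}

for label, core in samples.items():
    headroom = core - stock_mhz
    print(f"{label}: +{headroom} MHz ({headroom / stock_mhz * 100:.1f}% over stock)")

That gives +112 MHz (15.2%) and +117 MHz (15.9%) respectively, which lines up with the 15.18% figure quoted earlier.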
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: shangshang
Ok another test: COD4
game settings: 1600x1200, high quality

Here, it's playable on both cards. So the "observed" end-result gameplay for me is: the 8800GT and 4850 are the same. For my gaming needs at 1600x1200, the 8800GT and 4850 are equals; I didn't observe any improvement in framerate or smoothness from the 4850.

At this point my decision boils down to this: because I got the 4850 for cheap, $149 at BB, I might just keep it for that reason. However, had I paid $199 for it, then I would definitely get the GTX+ for $20-$30 bux more. My reasons:

1) GTX+ runs cooler, big fan
2) 4850 runs hot, fan sucks and can't be modified for now, may not o/c as high as the GTX+ due to fan & heat issues
3) GTX+ has PhysX. May not be important now, but it's a definite value-added factor to consider.
http://www.nzone.com/object/nzone_physxgames_home.html
http://www.evilavatar.com/forums/showthread.php?t=55177

So my personal stance is: if you can get the 4850 for $50+ LESS THAN the GTX+, then get it. Otherwise, get the GTX+ for more value. And if you have an 8800GT already, just skip these two and wait for either the 4870 or GTX 260. I definitely would not get the 4850 at $199 if the GTX+ is to be had at $230 or less. For the additional $20-$30 bux, you get a better fan, a cooler card, maybe higher o/c potential, and PhysX. The GTX+ has a lot of added value and none of the hassle of modding this and modding that! I might just keep the 4850 for 6 months, then eBay it and jump on the GTX 260.

I'm impressed that you actually went out and bought the video card so that you could verify what was already written about it. Do you really think that you would see any difference going from a 4850 to a 9800gtx+ in a game like crysis at that resolution? If anything, the 4850 would probably be faster at marginally playable resolutions/AA levels. Let's examine your "reasons" for wanting a GTX+.
1. cooler with bigger fan: yes it's cooler, but it has a bigger fan because it has a lot more heat to dissipate. a 10 dB difference in noise is HUGE. I'd stick with a 4850 for the lower power consumption and therefore lower fan requirement if I were you.
2. as soon as w1zzard gets rivatuner up and running for the 4850 you'll be able to set the fan at any speed you choose. Because it's at a lower core clock, it doesn't need as much of an overclock to achieve the same level of performance increase. Also, higher clocks won't help the 9800gtx+'s AA issues nearly enough to counter the 4850's better/newer architecture.
3. hardware physx could become important in the future, but it probably won't be important for the current generation of cards. Of much greater concern are g92's architectural deficiencies compared with rv770. Those impact ALL games RIGHT NOW, not theoretical games at some point in the future.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The numbers I posted were from a default-clocked 8800GT SLI setup; it really is that much faster than an HD4850. Benches. They have a part that easily and handily outperforms the HD4850 for $60 more; it isn't like you need to jump to the $400 GTX 260 before you can find an offering that does that. I know a lot of people want to limit all comparisons to the newly released parts, but it still seems that for anyone considering that price range, 8800GTs in SLI have the best price/performance ratio by a considerable margin.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
no doubt those numbers are accurate. however, most users don't have sli mobos, certainly a lot lower % than have xfire mobos. how much extra cost do you figure for an sli mobo, $60-70? $50? let's say it's $39... putting you squarely up against an hd 4870. obviously 8800gt sli will outperform a 4870, but by how much? also, how many people this weekend spent $300 at best buy to get a 4850 crossfire setup? which would you rather have: old-tech g92 sli, or a new-architecture crossfire of cards that average 30% faster in a single-card setup, for $40 total extra?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Probably by a fair bit, I suppose. 8800GT SLI should be close to a 9800GX2. The HD4870 is nowhere near 9800GX2 performance according to rumours/specs and what's confirmed out there on the interweb.

RV770 isn't a new architecture. What's there is based on RV670. I'd say it's more of an RV670 on steroids with a few minor adjustments.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Cookie Monster
Probably by a fair bit, I suppose. 8800GT SLI should be close to a 9800GX2. The HD4870 is nowhere near 9800GX2 performance according to rumours/specs and what's confirmed out there on the interweb.

RV770 isn't a new architecture. What's there is based on RV670. I'd say it's more of an RV670 on steroids with a few minor adjustments.

8xAA performance is significantly improved, and going from 320 shaders to 800 is a lot of steroids. Even with the minor adjustments you mention, ATI's cards are more technologically advanced than NV's when it comes to DX10.1 and being able to support 8-channel HD audio.

At the end of the day, performance per $ is what matters, and ATI delivers a good alternative. Having said that, my friend just picked up an 8800GT for $120 CDN. Nothing on the market can touch that, including the 9800GTX or 4850. 2x 8800GTs for $240 is by far the best video card deal around.
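For what it's worth, here is a rough performance-per-dollar sketch using only numbers already mentioned in this thread: the CoD4 1600x1200 averages quoted earlier (58/78/84 fps) and the prices posters have thrown around (the $120 8800GT figure is in CDN and the $149 4850 was a Best Buy sale price, so treat this strictly as ballpark):

# Ballpark fps-per-dollar using the fps and prices mentioned in this thread.
cards = {
    "8800GT ($120 CDN)":     (58, 120),
    "HD4850 ($149 BB sale)": (78, 149),
    "HD4850 ($199 MSRP)":    (78, 199),
    "GTX260 (~$400)":        (84, 400),
}

for card, (fps, price) in cards.items():
    print(f"{card}: {fps / price:.2f} fps per dollar")

By that crude measure the cheap 8800GT and the $149 4850 land around 0.48-0.52 fps per dollar, the $199 4850 around 0.39, and the GTX260 around 0.21, which lines up with the point that the GTX260 is poor value at 1600x1200.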
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
Originally posted by: bryanW1995
Originally posted by: BenSkywalker
I can't remember where I saw it but NV's own PowerPoint slide showed they had nothing to offer between $230~$400.

What if nVidia did in fact have an offering for, let's say, $260 that beat the HD4850 by, let's make up some random numbers for different games (or something like that) -

51% Crysis

57% COD4

22% ET:QW

47% Assasin's Creed

45% Oblivion

71% The Witcher

25% Bioshock

For a ~30% price premium you would get between a 22% and 71% performance increase. Do you think that 'hypothetical' offering would be tempting?

what resolution is that, does it include AA, AF, etc? That sort of performance bump is not gtx 260, that's not even gtx 280, that's some sort of sli configuration. For $260 it would have to be a couple of oc edition 8800gt cards, right?

 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: shangshang
Originally posted by: Kuzi

At 1600x1200, CoD4 won't show much of a difference even if you buy a GTX260. Check the AnandTech review: 8800GT 58 fps, HD4850 78 fps, GTX260 84 fps.

The HD4850 is 20 fps faster than the 8800GT at 1600x1200. You might be the type of person who can't tell a 20 fps difference in this range (58 to 78). In my case I can tell the difference between frame rates in the 50s and frame rates in the 70s; some people can't.

Obviously I won't be able to tell the difference between the GTX260's 84 fps and the HD4850's 78 fps. That's why it would be a horrible value to buy a GTX260 to run at this resolution. The HD4870 will most likely run faster than the GTX260, so again, no one should buy that card just to run at 1600x1200.

1) I ain't getting the GTX 260, so I don't care about it at the moment. Maybe I'll get it when something faster is demanded later down the road, and when the price drops a bit.

2) Now you can tell the difference between 50 and 70 fps?? The minimum framerate required for the human eye to perceive "smooth" motion is 25 fps. (Hence filming is done mainly at 25-30 fps.) You would need to have a strobe light in front of your eyes. I would agree that in "sticky, heavy scenes" the higher-performance card can make the scene look smoother if the scene is bordering on 23-25 fps, but at 50-70 fps, it's a scientific fact that the human eye can't distinguish them!

You know, for some games, say RPG or RTS games, 30 fps can be perfect and anything higher than that is not needed/noticeable. But trust me, when playing FPS, racing, and action games, many people can tell the difference in the "smoothness" of the game between frame rates in the 50s and frame rates in the 70s.

Remember, 58 fps is an average; it means in many cases the fps can be much lower than that. The frame rate can dip to 30-40 fps, while the other card will dip to 50-60 fps. Now ask any hardcore FPS gamer whether fps affects his gameplay, whether the smoothness is noticeable when the game runs below 50-60 fps versus over 70-80 fps, and they will say yes.
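To illustrate the "averages hide the dips" point with a toy example (these traces are made-up numbers picked to mirror the 30-40 fps vs 50-60 fps dips described above, not benchmark data):

# Two hypothetical one-second fps traces: same shape, different card speed.
slower = [70, 65, 62, 58, 40, 33, 35, 58, 70, 72]
faster = [90, 85, 82, 78, 58, 52, 55, 78, 90, 92]

for name, trace in (("slower card", slower), ("faster card", faster)):
    avg = sum(trace) / len(trace)
    worst = min(trace)
    print(f"{name}: avg {avg:.0f} fps, worst dip {worst} fps "
          f"({1000 / worst:.0f} ms on that frame)")

The averages (roughly 56 vs 76 fps) look like the published numbers, but the dips (33 fps, about 30 ms on that frame, versus 52 fps, about 19 ms) are where the difference in perceived smoothness actually shows up.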
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
the higher-performance card can make the scene look smoother if the scene is bordering on 23-25 fps, but at 50-70 fps, it's a scientific fact that the human eye can't distinguish them!

Not sure where you went to school for science, but you should get a lawyer and sue; you certainly didn't get your money's worth. Get ahold of fpstest and a remotely decent monitor (one that can push 150Hz or higher) and see for yourself. Even people with extremely poor vision have no problem whatsoever spotting the difference between 50 and 70 fps. People with good vision can spot variances well into the 100s.

JPB - You bolded "let's make up some numbers"; I put "or something like that" in brackets. Those numbers were pulled from the most die-hard ATI fan site that I know of. 8800GTs in SLI is what I was comparing to the 4850; in terms of money versus performance, that combination utterly smokes the current pricing on the 4850.