Originally posted by: taltamir
they shouldn't really... because when you're playing at 20fps in Bioshock or CoH at max settings you're never going to enable AA/AF. You'd have to disable way too many other things to make the game playable, and it's just not worth it.
Originally posted by: SlowSpyder
Originally posted by: Rusin
tcool93:
Basically all tests show that there's a huge difference when AA is enabled. If that difference is something like 30-60%... that's a huge difference.
Therk:
Competitive without AA enabled (which is still needed at 1680x1050, at least).
Since 512MB has become too small for a decent frame buffer. In some cases the 8800 GTS 640MB can beat the 8800 GT 512MB just because of that extra 128MB of frame buffer.
I'm sure I'm in the minority, but I never use AA. Even when I'm playing an older game where I wouldn't even see a performance penalty, I don't use it. I use a 22" monitor, 1680x1050. To me the slowdown it adds for the little benefit (my opinion only) is not worth it. I'm sure most people like it, but to me it's just not a big deal.
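The frame-buffer point quoted above can be ballparked. This is only a back-of-envelope sketch: it ignores textures, driver overhead, and compression, and the per-pixel sizes are assumptions, but it shows why AA eats VRAM fast at these resolutions.

```python
def render_target_mb(width, height, msaa_samples, bytes_per_pixel=4):
    """Rough VRAM for one MSAA color surface plus its depth/stencil surface."""
    pixels = width * height
    # color + depth/stencil, each stored per-sample under MSAA
    surface_bytes = pixels * bytes_per_pixel * msaa_samples * 2
    return surface_bytes / (1024 ** 2)

# 1680x1050, no AA vs 4xAA (resolve target and textures not counted)
no_aa = render_target_mb(1680, 1050, 1)   # ~13.5 MB
aa_4x = render_target_mb(1680, 1050, 4)   # ~53.8 MB
```

Roughly a 4x jump in render-target memory at 4xAA, which is where a 512MB card starts feeling the squeeze before a 640MB one does.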
Originally posted by: bryanW1995
let's face it, all of the reviews show that 3870 is slower than 8800gt, the only question is by how much? Based upon all that I've seen/heard for the past month, I think that the following is a good synopsis:
1. 3870 perf was somehow leaked to nvidia late in the development process.
2. 8800gt clocks were bumped up significantly to ensure continued nvidia superiority.
3. cooling solutions for 8800gt were NOT improved when clocks were bumped.
4. amd chose to go the "quiet" and "slow" route instead of getting into a speed war with nvidia.
5. 8800gt stole amd's thunder b/c amd couldn't/wouldn't attempt to compete on performance.
6. 8800gt was a paper launch. 38x0 is allegedly NOT going to be a paper launch, but we'll find out soon enough.
7. Nvidia has, as usual, found a way to look superior to AMD.
I almost convinced myself to go for an 8800gt this afternoon. I'm glad I held out and read the vr zone and tweaktown reviews, however. 3870 is not the same as hd2900xt, it's not supposed to be the uber 1337 card. It's supposed to be the quiet, low power, typical amd offering that will do better than 2900xt in future games (as evidenced by dx10 performance). Also, based on recent history, it is also reasonable to assume that 3870 will get closer to 8800gt in performance as amd releases new drivers. Finally, I can xfire next year if I'm really hungry for more performance. The only thing that might move me back to 8800gt now is price. Even taking the heat/noise into consideration, it would be hard to justify the 3870 if prices were identical or nearly so. If amd really has 250,000 cards available at launch and nvidia continues to have problems meeting demand, however, my decision will be made for me.
Originally posted by: ItsAlive
I'm confused. I was under the impression that a die shrink, and moving down in transistor count from 700 million to 666 million, would decrease operating temps? Those readings show an 18 degree increase in load temps... 😕
Wasn't R600 supposed to be 65nm? And when they had problems with that, they had to make it on the 80nm process, and for obvious reasons clock frequencies were much lower than planned.
Originally posted by: Azn
How much? It should be about the same as the 2900XT. No added texture units, no extra ROPs, similar clocks. It's basically a 2900XT with a 256-bit memory controller. That's what the 2900XT should have been in the first place; the 512-bit memory controller was a big waste on the 2900XT. I don't know what ATI's engineers were thinking at the time.
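The 512-bit vs 256-bit point is easy to sanity-check with the usual peak-bandwidth formula (bus width in bytes × effective memory clock). The clock figures below are approximate launch specs, so treat them as assumptions:

```python
def bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

hd2900xt = bandwidth_gbps(512, 1650)  # ~105.6 GB/s (512-bit, GDDR3)
hd3870   = bandwidth_gbps(256, 2250)  # ~72.0 GB/s (256-bit, faster GDDR4)
```

The narrower bus with faster GDDR4 gives up some peak bandwidth, which is the poster's point: the chip's texture/ROP throughput arguably couldn't use the 512-bit bus anyway.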
For me there isn't a big performance hit when using, for example, 4xAA. In Unreal Tournament 3 at 1680x1050 with everything as high as it goes, 8xAA + 16xAF somehow works fine (HQ AA slows it down right away).
Originally posted by: SlowSpyder
I'm sure I'm in the minority, but I never use AA. [snip - quoted in full above]
Look at the TweakTown review. The last few pages of it show significantly higher fps for the 3870 than the 2900XT in WiC and Crysis. If this is typical of games going forward, it will be a huge boost for AMD.
Originally posted by: Azn
Originally posted by: bryanW1995
let's face it, all of the reviews show that 3870 is slower than 8800gt... [snip - quoted in full above]
How much? It should be about the same as the 2900XT. No added texture units, no extra ROPs, similar clocks. It's basically a 2900XT with a 256-bit memory controller. That's what the 2900XT should have been in the first place; the 512-bit memory controller was a big waste on the 2900XT. I don't know what ATI's engineers were thinking at the time.
wow, 30-40% slower at 1920x1200. That's pretty good for the GT. Too bad it's false except on a few hand-picked titles. The 3870 also handily beats the 8800gt in Call of Juarez, but nobody's claiming it is faster overall. Take a good average difference in frames. Also, compare in DX10 games, which will be pretty much everything going forward. Check out the VR-Zone and TweakTown reviews; both are much more credible than Legion Hardware.
Originally posted by: taltamir
on PAPER the 3870 is a beast... on everything it just screams "more" than the GT (GDDR4 instead of GDDR3, three times the stream processors, DX10.1)... but in reality it's 17% slower than the GT at low res (in ATI-favoring games), 30% slower at 1920x1200, and 40% slower at 1920x1200 with AA/AF enabled...
Also, on MOST things it beats the 2900xt... but in a few tests the 2900xt came out on top...
Currently e-tailers let you preorder the 3870 for $250... the GT is now $270... that's a $20 difference for a 17% gap at BEST, 40% at worst... Oh, and I'm playing on a 24-inch $700 Dell monitor with a native 1920x1200 res... so for me it's closer to the 40%...
At those resolutions the 3850 ($50 less than the 3870) isn't even in the picture... with only 256MB of RAM it simply FAILS... the GT beats it by 170% (27fps vs 10fps!)
It also runs at about 88 degrees Celsius, just like the GT...
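The "170%" figure above is the usual faster-by percentage; a quick sketch using the poster's own fps numbers:

```python
def percent_faster(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage."""
    return (fps_a - fps_b) / fps_b * 100

# poster's numbers: 8800 GT at 27 fps vs 3850 256MB at 10 fps
print(percent_faster(27, 10))  # 170.0
```

So "beats it by 170%" means 2.7x the frame rate, not 1.7x; worth keeping in mind when comparing percentage claims across posts.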
Originally posted by: taltamir
WTF is Call of Juarez? I played the demo for 10 minutes and then deleted it; it's not even worth pirating, much less buying a video card for.
Bioshock, World in Conflict, and Company of Heroes are not "a few hand-picked titles"... they are the best DX10 games out there right now, if not the best games ever made!
Originally posted by: taltamir
Testing on some game that heavily favors ATI but sucks does not qualify as a "good test" for me... it's as artificial as running 3DMark.
Test on 3 of the best (and heaviest) games of all time... that's realistic.
At TweakTown they also hand-picked Half-Life 2: Lost Coast for the AA test, a game known for favoring Nvidia... a 60% difference in favor of the 8800 GT.
Originally posted by: bryanW1995
wow, 30-40% slower... [snip - quoted in full above]
Thing is, with R600 and the HD2900XT, power consumption set the limits on clock frequency. This time that isn't the case.
Originally posted by: Azn
From 80nm to 55nm ATI only increased their clock by 33MHz. I don't see how ATI was going to ramp up the clock speed with a 65nm part. Why doesn't ATI add more texturing units and lower the clock speed some? I really don't understand why ATI is not adding more TMUs. Their engineers can whip out a 512-bit memory controller but not add more TMUs to the Radeon core? Doesn't make sense. Meanwhile, Nvidia is doubling TMU count every year or so. They are whipping AMD in the high-end market.
Originally posted by: Rusin
At Tweaktown they also hand picked for AA test Half-Life 2: Lost Coast... [snip - quoted in full above]
DX10 is the thing, but why haven't there been AA tests for that? In DX9 the 8800 GT "slaughters" the HD3870 with AA enabled.