hd3870 vs 9600gt

Page 4 - AnandTech Forums

AzN

Banned
Nov 26, 2001
4,112
2
0
The 3870 is the stronger card; it only drops to matching the 9600GT with AA on. Currently the 3870 beats the 9600GT in raw performance, and in the end the gap will only get bigger.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
So basically 9600 GT can run games better with best playable settings (AA on)..
 

soonerproud

Golden Member
Jun 30, 2007
1,874
0
0
Originally posted by: Rusin
So basically 9600 GT can run games better with best playable settings (AA on)..

Correct

Turn AA off and the 3870 will be slightly faster than the 9600GT.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Rusin
So basically 9600 GT can run games better with best playable settings (AA on)..

That might be changing though with release of SP1. Look at Assassin's Creed for instance.
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: Azn
Originally posted by: Rusin
So basically 9600 GT can run games better with best playable settings (AA on)..

That might be changing though with release of SP1. Look at Assassin's Creed for instance.

The developers of Assassin's Creed removed the DX10.1 path in the latest patch because of incorrect rendering, so the speed increase from DX10.1 might or might not be there.
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
The point is the benchmarks proved DX10.1 gave a pretty good increase in the shader-based AA of the HD 3870.

To be honest, I'd pick an 8800GT over a 3870 any day, but I'd pick the 3870 over the 9600GT simply because the 9600GT is going to face a serious problem once games need more than 64 shader units, and by the looks of it everyone is unloading operations onto the shaders (AA with DX10.1 and CUDA PhysX, to name two of the most important ones).
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: Piuc2020
The point is the benchmarks proved DX10.1 gave a pretty good increase in the shader-based AA of the HD 3870.


It depends if the benchmark increases came from a bump in speed for shader based AA or if the increase came from skipping a rendering pass. We won't know until more information comes out.
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
Originally posted by: golem
Originally posted by: Piuc2020
The point is the benchmarks proved DX10.1 gave a pretty good increase in the shader-based AA of the HD 3870.


It depends if the benchmark increases came from a bump in speed for shader based AA or if the increase came from skipping a rendering pass. We won't know until more information comes out.

That's true. In any case, I'd still rather have a 3870 than a 9600GT.
 

idiotekniQues

Platinum Member
Jan 4, 2007
2,572
0
76
update:

i picked up a 50ft hdmi cable and hooked up my 3870 to my 42" 1080i panny plasma. worked like a charm. after a lil fiddlin around i got the tv to clone the desktop in theatre mode. a simple switch of the audio device in control panel to ATI REAR HD - fed the audio right out to the tv easy peasy right over the hdmi cable.

v for vendetta (720p) and world in conflict looked gorgeous outputting on that plasma. having the audio on one cable is quite convenient for the usage i want this for. it was also really cool to be able to look at photos from my 30d on the plasma, i can show family when they come over - they can all sit on the couch in the living room and get a photo show. with music.

good stuff.
 

Drsignguy

Platinum Member
Mar 24, 2002
2,264
0
76
Couldn't be happier with my HD 3870 OC Edition. I got it for the sale price of $130.00 at Best Buy right here in its home state of Minnesota. A rare thing for me to do, but for that price I couldn't pass it up. :beer:
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Interesting, I have seen that the latest driver development has pushed the Radeon HD 3870 closer to the 8800GT than in the past; suddenly it looks even more attractive than it used to.

http://www.tomshardware.com/re...gtx-review,1800-6.html

http://www.bjorn3d.com/read.php?cID=1278&pageID=4849 <<Company of Heroes and World in Conflict performance is still anemic though...

http://www.overclockersclub.co...iews/xfx_9800gtx/6.htm <<Abnormally low scores on Crysis with the HD 3870 and abnormally high scores with the same card in Call of Duty 4; strangely, the HD 3870 performs slightly faster than the 8800GT in World in Conflict, the opposite of the bjorn3d review. Both reviews use the same CPU, but bjorn3d's is overclocked, so that might explain the performance differences...

It's quite a feat that the HD 3870 is able to do shader anti-aliasing and remain playable at the same time; look at Call of Juarez, which could run even better if it were DX10.1. It seems that its complex architecture will be more suitable for the future, so I would take the HD 3870 over the 9600GT any day, especially now that it's known the 9600GT will fade and show its weaknesses in future games. Still not a bad card, considering it's almost half of an 8800GT yet its performance is close to it; a great feat.

http://www.beyond3d.com/content/reviews/47/8

So did AMD really build a better filter?
Arguably they didn't, in terms of overall image quality. The linear nature of the tent down-filter and the fact it takes into account contribution from neighbouring pixels means that image blurring is a natural side effect. There are simply better filter kernels to be applied, especially when you're looking outside the pixel, which is the key property of what AMD did. However balancing those filters against computational cost and hardware limitations is likely a key reason why tent was chosen.

That said, the key advantage comes not in terms of being able to apply these filters in the first instance; NVIDIA or anyone else is quite free to implement the same filters to give equivalent image quality, if that's what the customer is looking for.

No, the advantage comes because they built their hardware with a fast decompression path into their unified shader core. Samples and sample locations can be made known to the compute-heavy core of their latest chips in an efficient manner, in order for the filter to be computed and final pixel colour written. Current competing hardware doesn't have that luxury.
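For anyone curious what that "tent" down-filter actually does, here's a rough sketch in Python (my own illustration, not AMD's actual kernel or sample positions): each sample's weight falls off linearly with distance from the pixel centre, and samples from neighbouring pixels are allowed to contribute, which is exactly where the blurring Beyond3D mentions comes from.

```python
# Toy sketch of a tent-filter AA resolve (illustrative only; not AMD's
# real kernel or sample pattern). Weight = max(0, 1 - dist/radius), a
# linear "tent" falloff, so samples outside the pixel still contribute.

def tent_resolve(samples, radius=1.5):
    """samples: list of ((dx, dy), color), offsets from the pixel centre."""
    total_w = 0.0
    accum = 0.0
    for (dx, dy), color in samples:
        dist = (dx * dx + dy * dy) ** 0.5
        w = max(0.0, 1.0 - dist / radius)  # linear falloff with distance
        total_w += w
        accum += w * color
    return accum / total_w if total_w > 0 else 0.0

# Four samples inside a white pixel, four from black neighbouring pixels:
inside = [((0.25, 0.25), 1.0), ((-0.25, -0.25), 1.0),
          ((0.25, -0.25), 1.0), ((-0.25, 0.25), 1.0)]
outside = [((1.0, 0.0), 0.0), ((-1.0, 0.0), 0.0),
           ((0.0, 1.0), 0.0), ((0.0, -1.0), 0.0)]

print(tent_resolve(inside))             # flat interior stays 1.0
print(tent_resolve(inside + outside))   # edge pixel darkens: the blur
```

A box resolve would ignore the outside samples entirely and return 1.0 for both cases; the tent filter trades that edge contrast away for smoother gradients, which is the image-quality compromise the quote describes.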