Nvidia equivalent to x800 pro?

ktehmok

Diamond Member
Aug 4, 2001
4,326
0
76
I wanna get rid of this card. I can't stand the drivers. What is the performance equivalent in an Nvidia card? I've been out of the hardware loop for a few years so I'm clueless.

It's an X800 Pro w/ 256MB of RAM. I'd have to pull it to see who made it.

Thanks
 

evenmore1

Senior member
Feb 16, 2006
369
0
0
6800GS = X850 Pro

But I dunno about yours; probably also about a 6800GS, since the X850 Pro is only faster than the X800 Pro by clock speed.
 

AzNPinkTuv

Senior member
Nov 29, 2005
659
0
76
!@$!@% he got to it before me!


The 6800GS doesn't equal an X850 Pro though...

Locked at stock, the 6800GS is slower than both.
 

Sentry2

Senior member
Mar 21, 2005
820
0
0
Find yourself a used 6800GT; it should even be a bit of an upgrade. I'm assuming you're running AGP. If you want a new AGP card, I'd suggest a 7600GT or a used 7800GS. Either one would be a nice upgrade.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
As a matter of fact, a 6800GT is faster than an X800 Pro overall by 5~10%. One reason NV made a huge comeback.

Now, if you're talking about a 6800GS PCI-e, that card is faster than the 6800GT by a few fps (it's clocked higher and uses the 7800GT PCB).

Not quite sure about the X850 Pro, but both the 6800GS and 6800GT PCI-e are faster than an X800 Pro. The 6800GS AGP is a bit different: it uses either the NV40 chip or the NV42 with the HSI bridge chip, and the AGP version is clocked at 350MHz compared to 450MHz for the PCI-e version.

If you're looking for an AGP card, a 7600GT AGP will do you very nicely. Small, less heat, smaller power draw, and better features/IQ/performance compared to the X800XL.
 

Captante

Lifer
Oct 20, 2003
30,340
10,859
136
The X850XT was faster than the 6800GS/GT (& even the Ultra), but as far as I recall the Pros were lower clocked & only had 12 pipes, so unless they were unlockable they would lose out.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
A good replacement is a 7600GT with 12 pipes at around $110-130. Other than that, you're looking at a $190-200 7900GS with 20 pipes, a 7900GT for $225 with 24 pipes, or a $250 X1900XT 256MB with 48 pixel shaders. Any of them will be faster than an X800 Pro, and ready for Vista with PS3.0.

Cheers.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: akshayt

6800GT/GS will compete that card and IMO are much better due to SM3.0/HDR support.

Not really. In newer and more graphically advanced games the 6800 series is slower than the X8x0 series; you can see it in the Tom's Hardware VGA Charts, in games like F.E.A.R., Oblivion, etc. It's useless having 3.0 and HDR without the horsepower to run them, and anyway there's no difference in image quality between 2.0b and 3.0; you can see that in games like Age of Empires, Far Cry, etc. Anyway, the equivalent to the X800 Pro in AGP should be the 6800GS; the X850 Pro is faster and should compete with the 6800GT. The 6800 Ultra competes with the X800XT and blah blah blah.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: evolucion8
Originally posted by: akshayt

6800GT/GS will compete that card and IMO are much better due to SM3.0/HDR support.

Not really. In newer and more graphically advanced games the 6800 series is slower than the X8x0 series; you can see it in the Tom's Hardware VGA Charts, in games like F.E.A.R., Oblivion, etc. It's useless having 3.0 and HDR without the horsepower to run them, and anyway there's no difference in image quality between 2.0b and 3.0; you can see that in games like Age of Empires, Far Cry, etc. Anyway, the equivalent to the X800 Pro in AGP should be the 6800GS; the X850 Pro is faster and should compete with the 6800GT. The 6800 Ultra competes with the X800XT and blah blah blah.

Hmm, wrong.

AOE3 is a VERY good example of the difference.

Since you mentioned THG:

SM3.0 cards can activate better IQ in Tomb Raider: Legend

Difference between PS 3.0 and PS 2.0

The 6/7 series and X1000 series enjoy HDR, parallax mapping, etc., plus THG has a final comparison of the older cards against the new ones.

The 6 series has MUCH better IQ now, due to many games taking advantage of SM3.0. When SM3.0 was first introduced, it was used to optimise GPUs for performance rather than IQ. Although a card like a 6800GT could be slow taking advantage of the SM3.0 IQ enhancements, it will still be enough to play future games at a lower res (maybe without AA or even AF), e.g. 10x7 with all those features enabled, unlike the X8 series, which will miss out on all of those altogether.

edit - Some will say it's too slow to play with all the SM3.0 eye candy, but this is subjective, as some people are FPS-sensitive while others don't mind playing at lower res without AA or even AF. (The 6 series has been out for quite a long time, so you'd have to expect the level of performance to be quite low.)

Not to mention future games might not even provide an SM2.0 fallback shader and instead be SM3.0-only. They might have a fallback at shader model 1.1, which shows considerable IQ loss, as seen here.

HardOCP's SM3.0 article.

Naturally, future OpenGL games will perform great on NV hardware (even the 6 series), unlike the X8xx series, which gets slaughtered in OpenGL titles. (The X1000 series does pretty well, which is a nice thing.)

Maybe some users with a 6800GT/6800GS can post some benchmarks in today's games.



 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Cookie Monster
Originally posted by: evolucion8
Originally posted by: akshayt

6800GT/GS will compete that card and IMO are much better due to SM3.0/HDR support.

Not really. In newer and more graphically advanced games the 6800 series is slower than the X8x0 series; you can see it in the Tom's Hardware VGA Charts, in games like F.E.A.R., Oblivion, etc. It's useless having 3.0 and HDR without the horsepower to run them, and anyway there's no difference in image quality between 2.0b and 3.0; you can see that in games like Age of Empires, Far Cry, etc. Anyway, the equivalent to the X800 Pro in AGP should be the 6800GS; the X850 Pro is faster and should compete with the 6800GT. The 6800 Ultra competes with the X800XT and blah blah blah.

Hmm, wrong.

AOE3 is a VERY good example of the difference.

Since you mentioned THG:

SM3.0 cards can activate better IQ in Tomb Raider: Legend

Difference between PS 3.0 and PS 2.0

The 6/7 series and X1000 series enjoy HDR, parallax mapping, etc., plus THG has a final comparison of the older cards against the new ones.

The 6 series has MUCH better IQ now, due to many games taking advantage of SM3.0. When SM3.0 was first introduced, it was used to optimise GPUs for performance rather than IQ. Although a card like a 6800GT could be slow taking advantage of the SM3.0 IQ enhancements, it will still be enough to play future games at a lower res (maybe without AA or even AF), e.g. 10x7 with all those features enabled, unlike the X8 series, which will miss out on all of those altogether.

edit - Some will say it's too slow to play with all the SM3.0 eye candy, but this is subjective, as some people are FPS-sensitive while others don't mind playing at lower res without AA or even AF. (The 6 series has been out for quite a long time, so you'd have to expect the level of performance to be quite low.)

Not to mention future games might not even provide an SM2.0 fallback shader and instead be SM3.0-only. They might have a fallback at shader model 1.1, which shows considerable IQ loss, as seen here.

HardOCP's SM3.0 article.

Naturally, future OpenGL games will perform great on NV hardware (even the 6 series), unlike the X8xx series, which gets slaughtered in OpenGL titles. (The X1000 series does pretty well, which is a nice thing.)

Maybe some users with a 6800GT/6800GS can post some benchmarks in today's games.
SM 3.0 is just marketing crap. After all, I was talking about SM 2.0b, not 2.0, which is severely limited to 160 instructions. SM 3.0 and SM 2.0b both support 512 instructions per component in hardware, and they are almost identical; the one and ONLY difference is branching. 2.0b uses static branching, while SM 3.0 uses dynamic branching, which allows the instruction count to grow to effectively unlimited length. But remember that we're talking about floating-point data that is hard to predict, so if a misprediction occurs, the whole shader pipeline must be flushed. In practice there's little or no benefit, and even a loss of performance, from using dynamic branching. That's why developers (and games) don't even reach a 300-instruction shader count.

And remember, Tomb Raider: Legend is an nVidia game; they're not going to optimize it using custom shader profiles like 2.0b, they'll simply go with the defaults: 2.0, 3.0, etc. And if a game developer opts not to provide fallback code, they will lose sales, because most DX9 card owners have a 2.0 shader card. The one and ONLY game that had this kind of issue was Splinter Cell: Chaos Theory, and a patch was released allowing ATi users to run the game with SM 2.0; the image quality is almost identical (except the lighting, which is certainly a bit dimmer).
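The static-vs-dynamic branching trade-off described above can be sketched with a toy cost model (plain Python, not shader code; the lockstep group size of 4 and the per-path costs are made-up numbers for illustration). GPUs of that era ran pixels in lockstep groups, so a dynamic branch whose condition diverges within a group ends up paying for both sides of the `if`:

```python
# Toy model of static vs. dynamic shader branching.
# NOT real shader code - just an illustration of why a divergent
# dynamic branch can cost as much as executing both paths.

def static_branch_cost(constant_cond, cost_if, cost_else, n_pixels):
    """SM 2.0b-style static branch: the condition is a per-draw-call
    constant, so effectively only one path is ever executed."""
    per_pixel = cost_if if constant_cond else cost_else
    return per_pixel * n_pixels

def dynamic_branch_cost(conds, cost_if, cost_else, batch=4):
    """SM 3.0-style dynamic branch: pixels run in lockstep batches.
    If a batch disagrees on the condition, both sides execute."""
    total = 0
    for i in range(0, len(conds), batch):
        group = conds[i:i + batch]
        if all(group):                # whole batch takes the 'if' path
            total += cost_if * len(group)
        elif not any(group):          # whole batch takes the 'else' path
            total += cost_else * len(group)
        else:                         # divergence: pay for both paths
            total += (cost_if + cost_else) * len(group)
    return total

# 8 pixels with alternating conditions -> every batch diverges
worst = dynamic_branch_cost([True, False] * 4, cost_if=10, cost_else=2)
# 8 pixels that all take the cheap path -> dynamic branching saves work
best = dynamic_branch_cost([False] * 8, cost_if=10, cost_else=2)
print(worst, best)  # prints "96 16"
```

This matches the complaint in the post: coherent branches can skip work, but incoherent per-pixel conditions (common with floating-point shader data) wipe out the benefit.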

OpenGL on ATi cards has been improved greatly; that's why you can see an X8x0 outperforming a GeForce 6800 Ultra in Quake 4. That doesn't happen in Doom 3, and both use the same graphics engine; smells like cheating, eh? And the HDR currently implemented in games is the FP16 OpenEXR standard, not the Int16/Int10 Microsoft standard originally implemented in DX9 in games like Half-Life 2 and demos like Paul Debevec's Natural Light. I forgot the name of a game that lets you choose the shader profile (2.0, 2.0a, 2.0b, or 3.0); the difference was shown using an ocean, and they found that everything looks identical between SM 3.0 and SM 2.0b. As soon as I find it, I'll post it here. http://www23.tomshardware.com/graphics.html This will show you that the 6800 Ultra trails far behind the X8x0 in newer games.
http://www.anandtech.com/video/showdoc.aspx?i=2746&p=5 Here you'll see that there's no image quality benefit between SM 2.0 and 3.0 (except HDR, which is not an SM 3.0 feature), and how far behind the 6800 series performs.
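The FP16-vs-integer HDR point above can be demonstrated with a quick round-trip through the two storage formats (a Python sketch using the stdlib's half-precision packing; the 4.5 sample value is arbitrary):

```python
import struct

def fp16_roundtrip(x):
    """Store and reload a value as IEEE 754 half precision (FP16),
    the OpenEXR-style format used for HDR render targets."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def unorm8_roundtrip(x):
    """Store and reload a value as an 8-bit normalized integer channel:
    everything outside [0, 1] is clamped, so overbright data is lost."""
    q = max(0, min(255, round(x * 255)))
    return q / 255

print(fp16_roundtrip(4.5))    # 4.5 survives: FP16 keeps overbright values
print(unorm8_roundtrip(4.5))  # 1.0: clamped to plain white, HDR info gone
```

This is why a floating-point render target matters for HDR: the integer formats simply cannot represent light brighter than "full white."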

http://www.hardocp.com/article.html?art=NjA5 << "The only effect then that is unique to these screenshots in Shader Model 3.0 is Displacement Mapping; it appears to HardOCP that everything else could be done in Shader Model 2.0."

Yeah, because true displacement mapping is a vertex shader feature, not a pixel shader feature.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: postmortemIA
So SM2.0 is better because your card doesn't support 3.0?

Man, I told you to stop smoking crack; it's burning your brain off. I'm just telling the truth: my 4-year-old R360 on steroids outperforms a 2-year-old NV40 in newer games, that's it, whether you like it or not. And like all the reviewers say, there's no image quality benefit between SM 2.0b and 3.0; there is one between 2.0 and 3.0. So I'm not going to upgrade till the DX10 cards debut, because after all the only thing I'm missing is OpenEXR HDR, and most people turn it off anyway because of the performance penalty and the overbright crap.
 

hmorphone

Senior member
Oct 14, 2005
345
0
0
Originally posted by: postmortemIA
So SM2.0 is better because your card doesn't support 3.0?

Dude, you just got done suggesting a 7600GS to the OP. You want him to downgrade? Now, a 7600GT I could maybe understand... :confused:
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: hmorphone
Originally posted by: postmortemIA
So SM2.0 is better because your card doesn't support 3.0?

Dude, you just got done suggesting a 7600GS to the OP. You want him to downgrade? Now, a 7600GT I could maybe understand... :confused:

Yeah, the 7600GT trades blows with the X850XT PE.
 

postmortemIA

Diamond Member
Jul 11, 2006
7,721
40
91
He asked for an equivalent... I told him that's what's close to equivalent. The 7600GT blows it away hands down, and it's a better deal.
 

hmorphone

Senior member
Oct 14, 2005
345
0
0
Originally posted by: postmortemIA
He asked for an equivalent... I told him that's what's close to equivalent. The 7600GT blows it away hands down, and it's a better deal.

Interesting choice of words. ;)
 

ktehmok

Diamond Member
Aug 4, 2001
4,326
0
76
Originally posted by: evenmore1
What is your budget OP?


My budget will be whatever nvidia card I can get for this X800 Pro. This card and the X1800XL in my gaming comp are the first and last ATI cards I'll ever own, unless they seriously revamp their driver "thought process".

The cards seem okay, but I've tried both the ATI and Omega drivers and neither impresses me in the least. Heck, the ATI drivers won't even show the control panel unless they can connect to the internet first... wtf?

This is all old news to you guys, I'm sure. I just want to trade the card for something that suits me better. Which is nvidia.