7600GT vs X850XT, who's faster?

Warren21

Member
Jan 4, 2006
Originally posted by: Acanthus

ATI has faster and higher-quality AF; NVIDIA has Transparency AA.

And ATI has Adaptive AA (ATI's own spin on Transparency AA).

More on-topic, I game on:
Intel 915P chipset w/ Realtek ALC880 Azalia HD audio
3.6 GHz Prescott (2MB L2)
2 x 512MB 3-3-3-8 PC2-4200 DDR2
Maxtor SL300S0 300GB SATA 7200 RPM, 16MB cache
Sapphire X800 GTO2 softmodded n' OC'ed to X850 XT PE+ (590/610)
17" NEC MultiSync 1735NXM (1280 x 1024, DVI)

I play Oblivion, Battlefield 2, Battlefront 2, WoW, SC: Chaos Theory and more...
I am quite happy with the performance of these games considering the age of the card's features and chip (R480).

In Battlefield 2 with all settings on highest w/ 4x AA and 8x AF I get 40-80 FPS (40-60 on foot, 80+ while flying). Battlefront 2 plays wonderfully without hiccups w/ 4x AA, 8x AF and Bloom. For WoW I had to raise my refresh rate from 60 Hz to 75 Hz to get past the 60 FPS vsync cap, and it just hits 75 again w/ all PS effects and 6x AA, 16x AF. Chaos Theory plays very well with all SM 2.0 effects as well.

Oblivion is rather temperamental: in caves one easily reaches 60 FPS, but in open grassy fields it's around 30 and in heavy forests around 20 (still playable). This is with Bloom, specular effects, 2x AA and 8x AF, with max grass LOD and extended terrain, far buildings and far trees enabled (draw distance as far as you can see).

I'm still happy with this investment; it's sure paid off the $250 CAD I spent on it.
 

Sonikku

Lifer
Jun 23, 2005
Still waiting on AnandTech to do an apples-to-apples benchmark on these two cards. Very few people are upgrading to some crazy-ass liquid-cooled X1900XTX CROSSFIRE PWNAGE SETUP. The vast majority of PC gamers in this country deal with mid-range cards and lower. Those are the cards people are thinking about upgrading to. Give us benchmarks of them already.
 

Powermoloch

Lifer
Jul 5, 2005
X850XT.

It plays the games that are out right now with no problem, especially Oblivion :)

Specs:

3100+ Sempron @ 2.069 GHz
2 x 1GB Mushkin Extreme DDR500 @ 230 MHz
BBA X850 XT AGP

Scored 6144 in 3DMark05.
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: openwheelformula1

The X850XT lacks so many features. There are very good reasons why it's so cheap. Efficiency and features are the most important. Unless you don't plan on playing newer games in the next year, I'd get the 7600GT.

The X850XT should have more frames in older games, but the 7600GT will handle them with ease. Future games will suit the 7600GT better.

Originally posted by: Lyfer

So many newer features? Care to list them? Which ones? The one with hardware DivX encoding? ATI was always better in IQ; is NVIDIA the top in IQ now?

Originally posted by: Acanthus

ATI has faster and higher-quality AF; NVIDIA has Transparency AA.

ATI uses Adaptive Anti-Aliasing, which is supersampling on alpha textures and multisampling on the rest of the scene, and there's a registry tweak to enable it on older R3XX and R4X0 hardware. It works for me with a minimal performance hit. As for feature set, there's no SM 3.0 effect that SM 2.0b cannot do; you can see this is just a marketing thing. No video card will run more than 500 shader instructions without struggling. OpenEXR HDR is not an SM 3.0 effect and isn't even a Microsoft standard; it was simply implemented on newer video cards that are SM 3.0 compliant. And HDR can be implemented with Int16 or Int10; it may not look identical to FP16 or FX16, but they're pretty close. Many people can't even see much difference between Bloom and HDR in Oblivion, so can you imagine between Int16 and FP16?
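For a rough feel of the Int16 vs. FP16 point, here's a minimal Python sketch that quantises a few scene luminances both ways and compares the error. The 0..16 luminance range assumed for the fixed-point case is my own illustration, not any particular card's format:

```python
import numpy as np

# Round-trip a few HDR luminances through FP16 (half float) and through
# 16-bit fixed point, then compare the absolute quantisation error.
hdr = np.array([0.003, 0.3, 1.7, 4.2, 12.7])

fp16 = hdr.astype(np.float16).astype(np.float64)  # store as half, read back

SCALE = 65535 / 16.0            # assumed fixed-point mapping of 0..16
int16 = np.round(hdr * SCALE) / SCALE

for v, f, i in zip(hdr, fp16, int16):
    print(f"value={v:8.4f}  fp16 err={abs(f - v):.1e}  int16 err={abs(i - v):.1e}")
```

With this mapping the errors stay small for both: FP16 keeps more precision in the darks, fixed point in the brights, and neither gap is easy to spot on screen, which is the point above.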
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: Eldarion

1280 x 1024 would do fine... I've heard that HDR is awesome in Oblivion, but it also slows you down a lot... did you try the X850XT with HDR, or is that impossible?

BTW, there's a great review at: http://www.bit-tech.net/gaming/2006/03/31/elder_scrolls_oblivion/1.html

Thx

Evidently HDR+Bloom+AA is impossible for the X800 series.

And it is an eXceptional game... I have well over 150 hours into it.
:heart:

Nothing else even comes close. It is my game of this Millennium, so far.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
The XFX 7600GT XXX is a hot deal right now: $168.69 shipped after MIR, you get Ghost Recon: AW (about a $40 value), it comes clocked @ 590/1600 OOB, and it has a double lifetime warranty. For anyone looking to get GRAW anyway, that makes the card a practical value of around $130 :beer:
 

Sonikku

Lifer
Jun 23, 2005
The 128MB memory kind of sucks though. It probably wouldn't make a HUGE difference all things considered, but then again 128MB of RAM is probably a tad more limiting versus 256MB than Shader 3 versus Shader 2 is.
 

kyparrish

Diamond Member
Nov 6, 2003
Originally posted by: Sonikku
The 128MB memory kind of sucks though. It probably wouldn't make a HUGE difference all things considered, but then again 128MB of RAM is probably a tad more limiting versus 256MB than Shader 3 versus Shader 2 is.

No, it has 256MB of memory on a 128-bit interface. The memory is very quick, however. I had a 7600GT and ran it at 1280x1024 on my LCD, and at that resolution the 128-bit memory wasn't really an issue, even in Oblivion.
 

Sonikku

Lifer
Jun 23, 2005
128-bit interface, yeah. My bad. I was just saying that if given a choice between a 128-bit interface card with Shader 3 or a 256-bit interface card with Shader 2, I would go with the latter, for the mid-range cards at least.
 

A554SS1N

Senior member
May 17, 2005
Originally posted by: Sonikku
128-bit interface, yeah. My bad. I was just saying that if given a choice between a 128-bit interface card with Shader 3 or a 256-bit interface card with Shader 2, I would go with the latter, for the mid-range cards at least.

Surely it depends on the speed of the memory, though? At 1400 MHz over a 128-bit bus, you're looking at the same actual bandwidth as 256-bit at 700 MHz, and because the higher frequency is more useful, it could probably feel as good as a 256-bit card at 800 or even 900 MHz. So long as the core is powerful enough, it can overcome deficiencies in actual memory bandwidth and provide decent performance. At this stage in the game I wouldn't want to miss out on SM3 at all, so 128-bit versus 256-bit is a bit too general to base a choice on in terms of performance.
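To put numbers on that bandwidth claim, a quick back-of-the-envelope sketch in Python; the clock figures are just the ones quoted above:

```python
def mem_bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# Narrow-but-fast vs. wide-but-slower:
print(mem_bandwidth_gbs(128, 1400))  # 22.4 GB/s
print(mem_bandwidth_gbs(256, 700))   # 22.4 GB/s -- same peak bandwidth
```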
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
Originally posted by: A554SS1N
Surely it depends on the speed of the memory, though? At 1400 MHz over a 128-bit bus, you're looking at the same actual bandwidth as 256-bit at 700 MHz, and because the higher frequency is more useful, it could probably feel as good as a 256-bit card at 800 or even 900 MHz. So long as the core is powerful enough, it can overcome deficiencies in actual memory bandwidth and provide decent performance. At this stage in the game I wouldn't want to miss out on SM3 at all, so 128-bit versus 256-bit is a bit too general to base a choice on in terms of performance.
The comment about choosing based on bus width seemed silly to me as well. People can use whatever criteria they like when spending their money, but bus width as THE factor? I prefer to evaluate almost every aspect of a purchase: warranty, tech specs, performance in the titles I play or intend to, bundled extras, features like AVIVO & PureVideo, effectiveness and noise level of the stock cooler, overclocking potential, and less important features such as not requiring a PSU connector, SLI & CF ease of setup and use, and end-user experience *viral marketing has to be measured here ;)*. They all add up to help make a decision.

 

apoppin

Lifer
Mar 9, 2000
Originally posted by: DAPUNISHER
Originally posted by: A554SS1N
Surely it depends on the speed of the memory, though? At 1400 MHz over a 128-bit bus, you're looking at the same actual bandwidth as 256-bit at 700 MHz, and because the higher frequency is more useful, it could probably feel as good as a 256-bit card at 800 or even 900 MHz. So long as the core is powerful enough, it can overcome deficiencies in actual memory bandwidth and provide decent performance. At this stage in the game I wouldn't want to miss out on SM3 at all, so 128-bit versus 256-bit is a bit too general to base a choice on in terms of performance.
The comment about choosing based on bus width seemed silly to me as well. People can use whatever criteria they like when spending their money, but bus width as THE factor? I prefer to evaluate almost every aspect of a purchase: warranty, tech specs, performance in the titles I play or intend to, bundled extras, features like AVIVO & PureVideo, effectiveness and noise level of the stock cooler, overclocking potential, and less important features such as not requiring a PSU connector, SLI & CF ease of setup and use, and end-user experience *viral marketing has to be measured here ;)*. They all add up to help make a decision.

For me, price is the biggest factor.

You can get an X850 XT [PCIe] for $150 after a $50 MIR at MicroCenter. ;)
[in today's Hot Deals]

Either card would be a great choice. ;)
:thumbsup: