X1800XT crossfire vs 7800GTX sli (updated with price comparison)

Originally posted by: SickBeast
Originally posted by: KeepItRed
Nice topic..

+1 ATI

Why does ATI get +1? The cards aren't available and will cost more than the 256MB GTXs, not to mention much more than the GTs.

Do you work for ATI or something? I work in Markham, ON. Can you tell me details of R600? If so, we should do lunch. 😉

Maybe he meant +1 paper launch.
 
Originally posted by: Ackmed
Typical responses from some worthless posters.

Typical worthless AT review, again. AT's reviews have gone downhill from one of the top to one of the lowest. No AF, no Super/SLI AA, etc. Of course they mention ATi's problems with getting cards out, and keeping them out. Yet not one word on the severe lack of availability, and sky-high prices of the 512MB GTXs. I want my 10 minutes back from that "review".

if you hate AT so much, leave and stop insulting people... Rollo's post was COMPLETELY unbiased. Idiot....
 
Something seems to be a bit off with the SuperAA mode scores at FS.

Quake4 1600x1200 on the XT hits 102.3FPS running 4x AA and 54.3FPS running 14x AA, a 47% performance hit..... hmmmm. The backbuffer/z-buffer requirement alone at that setting is 205MB. Obviously the XT packs 512MB, but running HL2 the XL is pushing 46.4FPS at the above listed setting while in effect ending up a 64MB part. Something is not right.
 
@ Munky:

"Buggy HW"??? Thats that stupid optimazation that NV put in the screws up the shadows in games like FarCy and BF2...not the hardware.
 
I remember back when Nvidia's biggest partner for retail vid cards, err Visiontek, gave LIFETIME warranties. Sold a ton of LIFETIME-warrantied GeForce3s and GeForce4s; heck, I even got a GeForce3 Ti200 w/ a LIFETIME warranty. Visiontek went belly up and filed bankruptcy. BYE BYE lifetime warranty.

VisionTek dies

Assets liquidated in bankruptcy; Hartford Computers bought the name Visiontek, but not the debt. What about all those awesome GeForce3 & 4 LIFETIME warranties? It's been stated that the executives that ran Visiontek into bankruptcy now work for BFG.
Lifetime Warranty?

If it sounds too good to be true, many times it is. Caveat emptor!
 

Originally posted by: JimmyH
I remember back when Nvidia's biggest partner for retail vid cards, err Visiontek, gave LIFETIME warranties. Sold a ton of LIFETIME-warrantied GeForce3s and GeForce4s; heck, I even got a GeForce3 Ti200 w/ a LIFETIME warranty. Visiontek went belly up and filed bankruptcy. BYE BYE lifetime warranty.

VisionTek dies

Assets liquidated in bankruptcy; Hartford Computers bought the name Visiontek, but not the debt. What about all those awesome GeForce3 & 4 LIFETIME warranties? It's been stated that the executives that ran Visiontek into bankruptcy now work for BFG.
Lifetime Warranty?

If it sounds too good to be true, many times it is. Caveat emptor!

Errr, "it many times is"?

LOL- you gave one example from three years ago! I'd call that "it one time was"!

 
First time shame on you, second time shame on me 😀

I own some 3dfx cards too with LIFETIME Warranties.

And you can't beat 3dfx's lifetime warranty.

TDFX (i.e. 3dfx) also went bankrupt, and those LIFETIME warranties are also WORTHLESS. It's been stated that some executives at 3dfx that ran the company into bankruptcy went on to be employed at Nvidia.

...three years ago!

A reasonable person might assume a LIFETIME warranty is more than 3 years! *Gasp*

Even fellow AnandTech members have been duped by PNY graphics' "LIFETIME" warranty on a Ti4200 and the sweet Nvidia FX5200 Ultra. PNY's lifetime warranty = "lifetime of card in production." This is almost comparable to my 10/10 "lifetime" warranty on stuff sold by me. Ten feet or ten seconds, whichever comes first 🙂

Lifetime warranty = non-quantifiable time
 
Originally posted by: Wreckage
Originally posted by: munky
You forgot my card. Or was that another convenient omission?
As anyone can see here, a GTO gets stomped by a GS
http://www.xbitlabs.com/articles/video/display/powercolor-x800gto.html

As you can see in my sig, a GTO is really an X800XT (or faster) in disguise. Oh wait, but BIOS flashing voids the warranty, so better be a nub and ignore the 4 extra pipes and a nice additional 140+ MHz the card has in it.
 
Originally posted by: JimmyH
First time shame on you, second time shame on me 😀

I own some 3dfx cards too with LIFETIME Warranties.

And you can't beat 3dfx's lifetime warranty.

TDFX (i.e. 3dfx) also went bankrupt, and those LIFETIME warranties are also WORTHLESS. It's been stated that some executives at 3dfx that ran the company into bankruptcy went on to be employed at Nvidia.

...three years ago!

A reasonable person might assume a LIFETIME warranty is more than 3 years! *Gasp*

Even fellow AnandTech members have been duped by PNY graphics' "LIFETIME" warranty on a Ti4200 and the sweet Nvidia FX5200 Ultra. PNY's lifetime warranty = "lifetime of card in production." This is almost comparable to my 10/10 "lifetime" warranty on stuff sold by me. Ten feet or ten seconds, whichever comes first 🙂

Lifetime warranty = non-quantifiable time


Your post is ridiculous, give it up already.

Telling people that because a company went bankrupt once "a lifetime warranty may not mean much" was pretty dumb.

Following it up with "fool me once shame on you" and a reference to a PNY warranty they don't even use any more is worse.

What "news" will you bring us next?

"They make BFG cards in Chicago, and Lake Michigan might flood and leave the whole company under water, incapable of honoring your so called warranty!" :roll:



 
Originally posted by: RampantAndroid
@ Munky:

"Buggy HW"??? Thats that stupid optimazation that NV put in the screws up the shadows in games like FarCy and BF2...not the hardware.

Maybe, but no drivers or tweaking have been able to fix the shadows as far as I know. So either they refuse to turn off the optimization, or it's hardwired into the card; I'm not sure option 1 is any better than option 2.
 
LOL :laugh:

ATI FanATIcs are trying to downplay a lifetime warranty.

I guess ATI could ship a card that makes your whole house smell like cat poop and you would say it was a great feature to have :roll:
 
Originally posted by: Wreckage
LOL :laugh:

ATI FanATIcs are trying to downplay a lifetime warranty.

I guess ATI could ship a card that makes your whole house smell like cat poop and you would say it was a great feature to have :roll:

Yeah we don't see Jimmy H talking about more recent warranty history- like ATI showing us their confidence in BBA products by slashing the warranty from three years to one!
 
Right now, honestly, ATi doesn't have much going for it with crossfire. As with almost any newer, optional technology, I'd give it 6 months or so to mature and get all the kinks out of it before using it...
 
Originally posted by: BenSkywalker
Something seems to be a bit off with the SuperAA mode scores at FS.

Quake4 1600x1200 on the XT hits 102.3FPS running 4x AA and 54.3FPS running 14x AA, a 47% performance hit..... hmmmm. The backbuffer/z-buffer requirement alone at that setting is 205MB. Obviously the XT packs 512MB, but running HL2 the XL is pushing 46.4FPS at the above listed setting while in effect ending up a 64MB part. Something is not right.
Ben, what's your math to come up with that number? I'm coming up with 1600 (w) * 1200 (h) * 4 bytes (color values) * 2 (2 buffers) * 2 (2 texture samples, so 2x SSAA?), which is only 30MB. Now, I'm probably doing something wrong here, but a difference of nearly 7-fold? The only thing I can come up with that matches your math is 1600x1200*4*2*14(14x AA), which if that's what you're using, wouldn't be right since while AA would use bandwidth, MSAA doesn't need extra buffer space. So what are you using to come up with this?
 
I didn't think the DFI Ultra-Ds were supposed to be supported by Nvidia as SLI boards in the 1st place, Munky; that was a hack to enable it, which would explain why you didn't get the SLI bridge.

On another note to the kids on the forum, it only seems to be the ATI fanboys crying; I haven't seen any bragging from the Nvidia crew... Isn't it nice to feel secure!

If the ATI boys support the red team so much, put your money where ya damn mouth is; go buy, scratch that, order one today and shut up, instead of slagging off the site.

My 2 cents
 
Originally posted by: tuteja1986
yeah, the SAPPHIRE X1800XT at Newegg is $489.00; add the $600 Crossfire card and it comes up to ~$1089
http://www.newegg.com/Product/Product.asp?Item=N82E16814102629

2x 7800GTX Evga $469.00 = $938
http://www.newegg.com/Product/Product.asp?Item=N82E16814130254

but I think it can be worth it if ATI reduces the X1800XT Crossfire price by $50

Crossfire X1800XT can keep up with 7800GTX SLI 512MB in some games like

so it should... the X1800XT has 512MB of RAM too, right? So it should keep up with the GTX 512.
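
Quick sanity check on the quoted prices (a rough Python sketch using the Newegg figures above; the $600 Crossfire card number is the poster's, not an official MSRP):

# Rough price comparison using the figures quoted above (not official pricing).
xt_card = 489.00         # Sapphire X1800XT at Newegg, per the post
crossfire_card = 600.00  # Crossfire card price cited in the post
gtx_256 = 469.00         # eVGA 7800GTX at Newegg, per the post

crossfire_total = xt_card + crossfire_card   # $1089.00
sli_total = 2 * gtx_256                      # $938.00
print(crossfire_total, sli_total, crossfire_total - sli_total)  # gap: $151.00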

 
The only thing I can come up with that matches your math is 1600x1200*4*2*14(14x AA), which if that's what you're using, wouldn't be right since while AA would use bandwidth, MSAA doesn't need extra buffer space. So what are you using to come up with this?

MSAA requires buffer space be reserved for any instance where there is variation in Z values. You must have the framebuffer space dedicated the same as SSAA when using MSAA, as you do not know in advance where your variations are going to be unless you are utilizing a deferred renderer (which we haven't seen since the K2).
 
Originally posted by: BenSkywalker
The only thing I can come up with that matches your math is 1600x1200*4*2*14(14x AA), which if that's what you're using, wouldn't be right since while AA would use bandwidth, MSAA doesn't need extra buffer space. So what are you using to come up with this?

MSAA requires buffer space be reserved for any instance where there is variation in Z values. You must have the framebuffer space dedicated the same as SSAA when using MSAA, as you do not know in advance where your variations are going to be unless you are utilizing a deferred renderer (which we haven't seen since the K2).
Humm, alright. I went to Nvidia's site to double-check this, and I found a formula for video memory usage:

Vid_mem = sizeof(Front_buffer) + sizeof(Back_buffer) + num_samples * (sizeof(Front_buffer) +sizeof(ZS_buffer))

So if my numbers are right here, the front buffer would be 5,760,000 bytes (since we only need 24 bits of color), the back buffer 7,680,000 bytes, and the Z buffer is also 7,680,000 bytes. So with that formula, we get 201,600,000 bytes (192MB).

Now, I think the funny business is coming into effect when we start talking about compression. ATI and Nvidia have compressed their Z buffers since ancient times (ATI was first with at least the 8500, I believe), and color compression was added in the NV30 for Nvidia (and I'm not sure for ATI). So really there's no way here for us to know what the actual video memory size is, since we don't know the compression ratio (or even if it's a constant ratio). However, I suppose if we could get 4:1 compression on all buffers, memory usage would drop to something like 50MB, which would explain what you noticed in the first place.
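
For anyone who wants to check that 192MB number, here's a quick Python sketch of my own that just plugs the formula above in (assuming a 24-bit front buffer, 32-bit back and Z/stencil buffers, and 14 samples for the 14x SuperAA mode at 1600x1200):

# Vid_mem = Front_buffer + Back_buffer + num_samples * (Front_buffer + ZS_buffer)
width, height = 1600, 1200
front = width * height * 3    # 24-bit front buffer = 5,760,000 bytes
back  = width * height * 4    # 32-bit back buffer  = 7,680,000 bytes
zs    = width * height * 4    # 32-bit Z/stencil    = 7,680,000 bytes
samples = 14                  # 14x SuperAA

vid_mem = front + back + samples * (front + zs)
print(vid_mem, "bytes,", round(vid_mem / 2**20), "MB")  # 201,600,000 bytes, ~192 MB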
 
Originally posted by: RampantAndroid
Originally posted by: Ackmed
Typical responses from some worthless posters.

Typical worthless AT review, again. AT's reviews have gone downhill from one of the top to one of the lowest. No AF, no Super/SLI AA, etc. Of course they mention ATi's problems with getting cards out, and keeping them out. Yet not one word on the severe lack of availability, and sky-high prices of the 512MB GTXs. I want my 10 minutes back from that "review".

if you hate AT so much, leave and stop insulting people... Rollo's post was COMPLETELY unbiased. Idiot....


Way to be a hypocrite. Tell me to stop insulting people, and you do even worse. Take your own advice.

Who said I was talking about Rollo? I don't read his posts. If you actually read the thread, you would see Wreckage replied, and I told him why his post was worthless. Because you can't get a GTX for $250 less, and the Master card is available. Which is the opposite of what he said.
 
Originally posted by: Leper Messiah
Right now, honestly, ATi doesn't have much going for it with crossfire. As with almost any newer, optional technology, I'd give it 6 months or so to mature and get all the kinks out of it before using it...

Heh, we are talking about new video card technology, plus the mobo chipset technology, plus the Windows or whatever OS driver that is needed to support the video card and the chipset. I'd say 6 months is a bit optimistic.

 
Originally posted by: munky

$50 more than what? I did not see a $150 card in any review beating a GS.
You forgot my card. Or was that another convenient omission?

I don't remember a stock X800GTO (4500 3DMarks) outperforming a stock 6800GS (5100 3DMarks). Your particular card unlocked & overclocked; others got good overclocks out of theirs. That doesn't mean a card ordered TODAY will unlock. And overclocking is also YMMV. And some 6800GS cards are also reported to be overclocking beasts. The end result of cranking an X800GTO is more likely to be higher, but not guaranteed.

Your particular card's unlock and overclock makes it a fantastic value for YOU. The moron who botches the bios flash and ruins their GTO or overclocks it until the magic smoke comes out may differ re: the end value of their non-working $150 video card.

An X850XT is $230 last I checked. A stock X800GTO is *NOT* an X850XT. It has a high chance of becoming one (minus a warranty) but is NOT one.
 
Originally posted by: Ackmed
Who said I was talking about Rollo? I don't read his posts. If you actually read the thread, you would see Wreckage replied, and I told him why his post was worthless. Because you can't get a GTX for $250 less, and the Master card is available. Which is the opposite of what he said.

XT Mastercard not available and GTX256 SLI at least $250 cheaper.

Making your reply worthless.

 
Originally posted by: Wreckage
Originally posted by: Ackmed
Who said I was talking about Rollo? I don't read his posts. If you actually read the thread, you would see Wreckage replied, and I told him why his post was worthless. Because you can't get a GTX for $250 less, and the Master card is available. Which is the opposite of what he said.

XT Mastercard not available and GTX256 SLI at least $250 cheaper.

Making your reply worthless.


Not available? How's about $574 retail from Newegg? http://www.newegg.com/Product/Product.asp?Item=N82E16814102655

Crossfire IS available, so you saying otherwise is spreading FUD. Again.
 
So if my numbers are right here, the front buffer would be 5,760,000 bytes (since we only need 24 bits of color), the back buffer 7,680,000 bytes, and the Z buffer is also 7,680,000 bytes. So with that formula, we get 201,600,000 bytes (192MB).

32-bit color is used by ATi and nVidia; it ends up being faster due to bandwidth usage (32 bits per pixel means two quads of pixels can be written per rise/fall). Edit: Just noticed, your Z buffer size also has to be multiplied by the number of samples you are using; if it isn't running at an oversampled rate compared to the final rendered resolution, then there is no additional data for you to perform SS or MSAA. /Edit

Now, I think the funny business is coming into effect when we start talking about compression. ATI and Nvidia have compressed their Z buffers since ancient times (ATI was first with at least the 8500, I believe), and color compression was added in the NV30 for Nvidia (and I'm not sure for ATI). So really there's no way here for us to know what the actual video memory size is, since we don't know the compression ratio (or even if it's a constant ratio). However, I suppose if we could get 4:1 compression on all buffers, memory usage would drop to something like 50MB, which would explain what you noticed in the first place.

Allocation for framebuffer must factor in no compression as, again, there is no way of knowing in advance what level of compression you are going to be able to use unless you have a deferred renderer. Bandwidth savings are the big issue with compression (i.e., if all color values and Z values are the same for a quad, you save on read/write overhead).
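
Reworking the numbers with these corrections (32-bit color for every buffer, Z sized per sample, no compression assumed), the back/Z footprint alone does land right around the 205MB figure from earlier in the thread. This is a rough Python sketch of my own, not anything official from either vendor:

# Back buffer + Z buffer, both multisampled, 32-bit each, no compression assumed.
width, height, bytes_per_px = 1600, 1200, 4
samples = 14                                      # 14x SuperAA

color = width * height * bytes_per_px * samples   # multisampled color: 107,520,000 bytes
z     = width * height * bytes_per_px * samples   # multisampled Z:     107,520,000 bytes
back_and_z = color + z                            # 215,040,000 bytes
print(round(back_and_z / 2**20), "MB")            # ~205 MB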
 