***OFFICIAL*** ATI R520 (X1800, X1600, X1300) Reviews Thread!


5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Cookie Monster

I've got to agree with 5150Joker on this one. Architecturally, the G70 is in fact quite different from the NV40: different clock domains, and it behaves differently from the NV40 architecture (shown by the Beyond3D review where a 7800 GTX reduced to 16 pipes performs worse than a 6800 Ultra), and I could go on with more differences. Simply put, you can't judge the book by its cover.

The 7800 isn't primitive; it's in fact more efficient, because on a performance/watt basis the 7800 beats the X1800. Not only that, it took ATi 320 million transistors for a 16-pipe card against the 302 million transistors of the 24-pipe GTX. The guys at Nvidia worked on efficiency and used every trick in the book to increase performance, NOT insane clock speeds like the X1800 XT's 625 MHz.

What features? Avivo? The 7 series' PureVideo is fine, if not better (more to come from AT, but Nvidia is in the lead in de-interlacing).

Adaptive AA? Transparency AA is in fact better. Some reviewers mention that the AA on the 7800 GTX is better than on the X1 series (Xbit Labs and HotHardware, for instance).

Big performance hit? The XT has 512 MB at 1500 MHz, compared to the GTX's 256 MB at 1200 MHz. Wait until the 512 MB GTX with faster memory arrives, then conclude which card takes more of a hit.

How can you compare the availability/price of the 512 MB 6800 Ultra to the 512 MB GTX? Looking at yields/availability, Nvidia isn't suffering from any such issue. And since they themselves set the bar high on availability, I don't think they will shoot themselves in the foot.

IQ? Have you tried playing HL2 with 8xS? Of course there are differences, and different reviewers see different IQ, but to most people it's the same. Some say AF on the 7 series is better, some say the X1 series', so I don't see why people argue over IQ when the differences are minimal.

Watch him jump in here and tell us how having HDR+AA is so important these days and that the "primitive" 7800 lacking this is just horrible. Or he may even try to grasp at straws by saying ATi did SM 3.0 "right" because R520 has an ultra-threaded architecture and can execute one flow control instruction per cycle.

Of course none of that means jack sh!t when you look at the hard facts, and those are the benchmark results. In OpenGL the "primitive" G70 wipes the floor with the R520; in HDR both take about the same hit (effectively making HDR+AA too much of a penalty); and in D3D the results go back and forth depending on which review you read, because some used reference-clocked G70 cards (which you can't even buy) while others used "stock" retail products with a 450 MHz core.

Like you already mentioned, nVidia's FSAA IQ is still the winner thanks to 8xS, and the fact that adaptive AA doesn't look any better than TSAA. Finally, Driver Heaven (an ATi stronghold) themselves said 16x HQ AF showed no visual gains over standard 16x AF. So I'm left wondering: what is so special about the X1800 XT aside from its high power consumption, two-slot cooler design, and lack of availability?

Edit: Nice find on the missing vertex texture fetch. So much for the advanced R520.

Edit 2: AH it just hit me, he thinks R520 is advanced because it can do a lot more than just play games. For example, it can double as a compact leaf blower with the proper accessories: http://clan786.com/modules/coppermine/albums/userpics/10002/normal_ibiza2.jpg
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: 5150Joker

Of course none of that means jack sh!t when you look at the hard facts, and those are the benchmark results. In OpenGL the "primitive" G70 wipes the floor with the R520; in HDR both take about the same hit (effectively making HDR+AA too much of a penalty); and in D3D the results go back and forth depending on which review you read, because some used reference-clocked G70 cards (which you can't even buy) while others used "stock" retail products with a 450 MHz core.

http://www.newegg.com/Product/Product.asp?Item=N82E16814143039
http://www.newegg.com/Product/Product.asp?Item=N82E16814170087
http://www.newegg.com/Product/Product.asp?Item=N82E16814127187
http://www.newegg.com/Product/Product.asp?Item=N82E16814133145

All with stock 430/1200 speeds. There are plenty more, but I don't have time to list them.

These are the cards that should be compared against the stock X1800 XT. It's only fair to compare stock vs. stock and OC'd vs. OC'd.

 

Chocolate Pi

Senior member
Jan 11, 2005
245
0
0
Some would feel it is more fair to compare price to price, but seeing as we can't do that for a couple more days...
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: M0RPH
Originally posted by: 5150Joker

Of course none of that means jack sh!t when you look at the hard facts, and those are the benchmark results. In OpenGL the "primitive" G70 wipes the floor with the R520; in HDR both take about the same hit (effectively making HDR+AA too much of a penalty); and in D3D the results go back and forth depending on which review you read, because some used reference-clocked G70 cards (which you can't even buy) while others used "stock" retail products with a 450 MHz core.

http://www.newegg.com/Product/Product.asp?Item=N82E16814143039
http://www.newegg.com/Product/Product.asp?Item=N82E16814170087
http://www.newegg.com/Product/Product.asp?Item=N82E16814127187
http://www.newegg.com/Product/Product.asp?Item=N82E16814133145

All with stock 430/1200 speeds. There are plenty more, but I don't have time to list them.

These are the cards that should be compared against the stock X1800 XT. It's only fair to compare stock vs. stock and OC'd vs. OC'd.



Well, I stand corrected - I didn't think there were any more reference-clocked G70s on the market. However, all the big-name manufacturers like BFG (aside from its Fuzion brand), Leadtek, Asus, XFX and eVGA are clocking higher than the 430 MHz spec, and those are the cards most people tend to buy.
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
The only games the 7800 GTX is beating the X1800 in are OGL titles. The X1800 XT is winning a lot of DX games by a large margin. Since 90%+ of the games out there are DX, the X1800 is looking pretty good. OGL is almost dead as far as games are concerned. Even COD 2 went DX.

A good example is this Hardware.fr review.

                 7800 GTX   X1800 XT   (1280x1024, 4xAA/16xAF)

HL2                103.2      120.1
Doom 3             128.2      109.1
Far Cry             60.3       76.4
SC: CT              48.4       62.7
Colin McRae 05      56.9       72.6
Act of War          69.6       70.2
IL-2: PF            41.8       31.3

The only games the 7800 wins in this review are the two OGL titles (Doom 3 and IL-2).

If you run the 7800 GTX in HQ mode to get rid of the texture shimmering, you can drop its performance 10-15% from what's in those benches. The X1800 looks even better when you consider that. So the X1800 has a bigger lead than those benches suggest if you match IQ.
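Purely as a back-of-the-envelope sketch, here is how those Hardware.fr numbers translate into per-game leads; the 10-15% HQ-mode penalty is the poster's own estimate, applied at its midpoint only for illustration.

# Per-game leads from the Hardware.fr figures above (1280x1024, 4xAA/16xAF).
results = {
    "HL2":            (103.2, 120.1),  # (7800 GTX fps, X1800 XT fps)
    "Doom 3":         (128.2, 109.1),
    "Far Cry":        (60.3, 76.4),
    "SC: CT":         (48.4, 62.7),
    "Colin McRae 05": (56.9, 72.6),
    "Act of War":     (69.6, 70.2),
    "IL-2: PF":       (41.8, 31.3),
}
HQ_PENALTY = 0.125  # assumed midpoint of the claimed 10-15% HQ-mode hit

for game, (gtx, xt) in results.items():
    lead = (xt / gtx - 1) * 100                            # XT lead at default settings
    lead_hq = (xt / (gtx * (1 - HQ_PENALTY)) - 1) * 100    # XT lead if the GTX runs HQ
    print(f"{game:15s} XT lead: {lead:+6.1f}%   vs GTX in HQ: {lead_hq:+6.1f}%")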

ATI needs to come out with a faster, lousier-IQ setting like NV's. If hardware sites insist on benching with lower IQ on the NV cards, then ATI should just follow suit. Put a checkbox somewhere to turn the optimizations off, but leave them on by default so hardware sites bench with them.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Blastman
The only games the 7800 GTX is beating the X1800 in are OGL titles. The X1800 XT is winning a lot of DX games by a large margin. Since 90%+ of the games out there are DX, the X1800 is looking pretty good. OGL is almost dead as far as games are concerned. Even COD 2 went DX.

Agreed. If it weren't for id Software, OpenGL would be dead by now and we'd all be better off. Nvidia and ATi could concentrate on DX and not have to worry about optimizing their chips for two different APIs.

 

hop1hop2

Member
Mar 31, 2005
92
0
0
I've noticed this card looks a bit CPU-limited: when paired with an FX-57 at 2.8 GHz (used at Tom's Hardware) it beats the 7800 GTX in the majority of the benchmarks, but when paired with an X2 4800+ at 2.4 GHz (used at the Inquirer) or an FX-55 at 2.6 GHz (AT) it performs at about the same level as the 7800 GTX, sometimes worse.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Cookie Monster
Originally posted by: 5150Joker
Originally posted by: munky
Originally posted by: DLeRium
Originally posted by: Soviet
ATi has the better card here; if you disagree, then you're either a cheapskate or an Nvidia fan. The benches speak for themselves, as the X1800 XT wins most of them. Besides, who gives a toss about price? If you're gonna spend all that dough on a high-end card, then price shouldn't be THAT big a factor. Also, it's only using 16 pipelines to do more than Nvidia's 24-pipeline card is doing. Well done, ATI.

Are you a FANBOY or what? It's tied most of the way. Hexus gives ATI the edge, but if you look at some other benchies, they're almost a tie with a few wins on each side. So what if it's more efficient? We know AMD is more efficient than Intel, but their CPUs also outperform P4s easily, right? A 3700+ can whip a 3.8 GHz any day, meaning the FX series will demolish anything. Who cares if you're efficient? You also have to wear the performance crown. If you're efficient and only the same speed, no one cares. Price matters. GTXes are on sale all the time and so are GTs, so unless ATI cards are gonna come with crazy deals, I see NV as the winner.


NV is only a winner until the X1800 XT actually becomes available for sale. And if NV releases a 512 MB GTX, I doubt it will be any cheaper than the X1800 XT; just look at the ridiculous price they charged for the 512 MB 6800U.

When the X1800 becomes available for a reasonable price, I see no reason for anyone to buy a primitive 7800 instead. It's slower in most games, suffers a bigger hit from AA/AF, can't deliver the same IQ, and lacks a bunch of features it would have had if NV had actually put the effort into designing a new core instead of rehashing the NV40.



LOL, what a bunch of bologna. There are so many holes in the b.s. you typed, but I'm sure you're already aware of that and just did it to troll.


I've got to agree with 5150Joker on this one. Architecturally, the G70 is in fact quite different from the NV40: different clock domains, and it behaves differently from the NV40 architecture (shown by the Beyond3D review where a 7800 GTX reduced to 16 pipes performs worse than a 6800 Ultra), and I could go on with more differences. Simply put, you can't judge the book by its cover.

The 7800 isn't primitive; it's in fact more efficient, because on a performance/watt basis the 7800 beats the X1800. Not only that, it took ATi 320 million transistors for a 16-pipe card against the 302 million transistors of the 24-pipe GTX. The guys at Nvidia worked on efficiency and used every trick in the book to increase performance, NOT insane clock speeds like the X1800 XT's 625 MHz.

What features? Avivo? The 7 series' PureVideo is fine, if not better (more to come from AT, but Nvidia is in the lead in de-interlacing).

Adaptive AA? Transparency AA is in fact better. Some reviewers mention that the AA on the 7800 GTX is better than on the X1 series (Xbit Labs and HotHardware, for instance).

Big performance hit? The XT has 512 MB at 1500 MHz, compared to the GTX's 256 MB at 1200 MHz. Wait until the 512 MB GTX with faster memory arrives, then conclude which card takes more of a hit.

How can you compare the availability/price of the 512 MB 6800 Ultra to the 512 MB GTX? Looking at yields/availability, Nvidia isn't suffering from any such issue. And since they themselves set the bar high on availability, I don't think they will shoot themselves in the foot.

IQ? Have you tried playing HL2 with 8xS? Of course there are differences, and different reviewers see different IQ, but to most people it's the same. Some say AF on the 7 series is better, some say the X1 series', so I don't see why people argue over IQ when the differences are minimal.

Edit: the X1 series doesn't even have proper SM 3.0, according to The Tech Report.

"Turns out that the vertex shaders in the Radeon X1000 series GPUs don't support a notable Shader Model 3.0 feature: vertex texture fetch. As it sounds, this capability allows the vertex shaders to read from texture memory, which is important because texture memory is sometimes treated as general storage in programmable GPUs. Vertex texture fetch is useful for techniques like displacement mapping, where the vertex and pixel shaders need to share data with one another."

Vertex texture fetch is one of the biggest features in VS 3.0, because it is used for true displacement mapping. The 6 and 7 series support it, and if this is in fact true, it's going to limit the X1 series' capabilities.
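To make the quoted description concrete, here is a rough CPU-side Python sketch of what vertex texture fetch enables in a Shader Model 3.0 vertex shader, i.e. per-vertex displacement mapping. The function names, height map and scale factor are made up for illustration; this is not any vendor's actual API.

def sample_height(height_map, u, v):
    # Nearest-neighbour lookup into a 2D height map, clamped to the edges.
    h, w = len(height_map), len(height_map[0])
    x = min(max(int(u * (w - 1)), 0), w - 1)
    y = min(max(int(v * (h - 1)), 0), h - 1)
    return height_map[y][x]

def displace_vertex(position, normal, uv, height_map, scale=0.1):
    # What a VS 3.0 vertex shader does with vertex texture fetch: read a texel,
    # then push the vertex out along its normal before rasterization.
    height = sample_height(height_map, uv[0], uv[1])
    return tuple(p + n * height * scale for p, n in zip(position, normal))

# Tiny example: a flat vertex displaced upward by the height sampled at its UV.
height_map = [[0.0, 0.5], [0.5, 1.0]]
print(displace_vertex((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (1.0, 1.0), height_map))
# -> (0.0, 0.1, 0.0)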

The 7800 GTX is not a new core; it's just an improved 6800. It's not different "by far," only an evolutionary step. I thought everybody knew this by now. :roll:

And since when is performance/watt an important factor? Are you reading too many of the latest Intel slides? Tech Report shows 225 watts under load for the GTX and 250 for the X1800. Hardly anything to rave about, especially since the GTX has less performance.
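If you do want a rough performance-per-watt figure, here is a quick sketch that mixes the Tech Report system-power numbers just quoted with the Hardware.fr fps from earlier in the thread; two different reviews and test rigs, so treat it as illustration only.

# Illustration only: fps per system watt, mixing figures from two different reviews.
power = {"7800 GTX": 225, "X1800 XT": 250}   # total system watts under load (Tech Report)
fps = {
    "HL2":    {"7800 GTX": 103.2, "X1800 XT": 120.1},   # Hardware.fr, 1280x1024 4xAA/16xAF
    "Doom 3": {"7800 GTX": 128.2, "X1800 XT": 109.1},
}

for game, scores in fps.items():
    for card, f in scores.items():
        print(f"{game:6s} {card}: {f / power[card]:.3f} fps per system watt")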

TRAA on the GTX is only better if you use the 8xS mode, which does impose a big performance drop. In 4x they look similar, and in 2x the ATi looks better. AF is not even a contest, and Quality AF does not cause a huge performance drop either. Looking at the HardOCP screenshots, only a blind person could call NV's AF equal or better. HDR+AA is something I always felt the 7800 should have had; it's hard to pimp one feature when using it makes you give up another.

http://www.extremetech.com/article2/0,1697,1867129,00.asp
Look at this page: both the X1800 XL and the 7800 GT have 1 GHz memory, but even then the XL suffers less of a hit doing AA+AF. Faster memory is not gonna help the GTX as much as you wish.

As for vertex texture fetch, it is possible on the X1800, just not done the same way as on the NV cards.

Only in 3 games have we seen the GTX beat the X1800 - Riddick, Guild Wars, and Doom 3. It loses in every other game benchmark so far.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: munky
Originally posted by: Cookie Monster
Originally posted by: 5150Joker
Originally posted by: munky
Originally posted by: DLeRium
Originally posted by: Soviet
ATi has the better card here; if you disagree, then you're either a cheapskate or an Nvidia fan. The benches speak for themselves, as the X1800 XT wins most of them. Besides, who gives a toss about price? If you're gonna spend all that dough on a high-end card, then price shouldn't be THAT big a factor. Also, it's only using 16 pipelines to do more than Nvidia's 24-pipeline card is doing. Well done, ATI.

Are you a FANBOY or what? It's tied most of the way. Hexus gives ATI the edge, but if you look at some other benchies, they're almost a tie with a few wins on each side. So what if it's more efficient? We know AMD is more efficient than Intel, but their CPUs also outperform P4s easily, right? A 3700+ can whip a 3.8 GHz any day, meaning the FX series will demolish anything. Who cares if you're efficient? You also have to wear the performance crown. If you're efficient and only the same speed, no one cares. Price matters. GTXes are on sale all the time and so are GTs, so unless ATI cards are gonna come with crazy deals, I see NV as the winner.


NV is only a winner until the X1800 XT actually becomes available for sale. And if NV releases a 512 MB GTX, I doubt it will be any cheaper than the X1800 XT; just look at the ridiculous price they charged for the 512 MB 6800U.

When the X1800 becomes available for a reasonable price, I see no reason for anyone to buy a primitive 7800 instead. It's slower in most games, suffers a bigger hit from AA/AF, can't deliver the same IQ, and lacks a bunch of features it would have had if NV had actually put the effort into designing a new core instead of rehashing the NV40.



LOL, what a bunch of bologna. There are so many holes in the b.s. you typed, but I'm sure you're already aware of that and just did it to troll.


I've got to agree with 5150Joker on this one. Architecturally, the G70 is in fact quite different from the NV40: different clock domains, and it behaves differently from the NV40 architecture (shown by the Beyond3D review where a 7800 GTX reduced to 16 pipes performs worse than a 6800 Ultra), and I could go on with more differences. Simply put, you can't judge the book by its cover.

The 7800 isn't primitive; it's in fact more efficient, because on a performance/watt basis the 7800 beats the X1800. Not only that, it took ATi 320 million transistors for a 16-pipe card against the 302 million transistors of the 24-pipe GTX. The guys at Nvidia worked on efficiency and used every trick in the book to increase performance, NOT insane clock speeds like the X1800 XT's 625 MHz.

What features? Avivo? The 7 series' PureVideo is fine, if not better (more to come from AT, but Nvidia is in the lead in de-interlacing).

Adaptive AA? Transparency AA is in fact better. Some reviewers mention that the AA on the 7800 GTX is better than on the X1 series (Xbit Labs and HotHardware, for instance).

Big performance hit? The XT has 512 MB at 1500 MHz, compared to the GTX's 256 MB at 1200 MHz. Wait until the 512 MB GTX with faster memory arrives, then conclude which card takes more of a hit.

How can you compare the availability/price of the 512 MB 6800 Ultra to the 512 MB GTX? Looking at yields/availability, Nvidia isn't suffering from any such issue. And since they themselves set the bar high on availability, I don't think they will shoot themselves in the foot.

IQ? Have you tried playing HL2 with 8xS? Of course there are differences, and different reviewers see different IQ, but to most people it's the same. Some say AF on the 7 series is better, some say the X1 series', so I don't see why people argue over IQ when the differences are minimal.

Edit: the X1 series doesn't even have proper SM 3.0, according to The Tech Report.

"Turns out that the vertex shaders in the Radeon X1000 series GPUs don't support a notable Shader Model 3.0 feature: vertex texture fetch. As it sounds, this capability allows the vertex shaders to read from texture memory, which is important because texture memory is sometimes treated as general storage in programmable GPUs. Vertex texture fetch is useful for techniques like displacement mapping, where the vertex and pixel shaders need to share data with one another."

Vertex texture fetch is one of the biggest features in VS 3.0, because it is used for true displacement mapping. The 6 and 7 series support it, and if this is in fact true, it's going to limit the X1 series' capabilities.

The 7800 GTX is not a new core; it's just an improved 6800. It's not different "by far," only an evolutionary step. I thought everybody knew this by now. :roll:

And since when is performance/watt an important factor? Are you reading too many of the latest Intel slides? Tech Report shows 225 watts under load for the GTX and 250 for the X1800. Hardly anything to rave about, especially since the GTX has less performance.

TRAA on the GTX is only better if you use the 8xS mode, which does impose a big performance drop. In 4x they look similar, and in 2x the ATi looks better. AF is not even a contest, and Quality AF does not cause a huge performance drop either. Looking at the HardOCP screenshots, only a blind person could call NV's AF equal or better. HDR+AA is something I always felt the 7800 should have had; it's hard to pimp one feature when using it makes you give up another.

http://www.extremetech.com/article2/0,1697,1867129,00.asp
Look at this page: both the X1800 XL and the 7800 GT have 1 GHz memory, but even then the XL suffers less of a hit doing AA+AF. Faster memory is not gonna help the GTX as much as you wish.

As for vertex texture fetch, it is possible on the X1800, just not done the same way as on the NV cards.

Only in 3 games have we seen the GTX beat the X1800 - Riddick, Guild Wars, and Doom 3. It loses in every other game benchmark so far.

What do you know about whether the core is new or not? Are you an engineer? Or are the guys at Nvidia telling lies about their G70 core? Who holds more credibility in what they are saying?

Power is always a big issue. Many people live on a tight budget, and you think power consumption doesn't matter? The X1 series idles at 170-ish W while the GTX idles at 140-ish; who likes paying more on their electricity bill?
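For what that power gap is actually worth in money, a quick sketch: the 170 W / 140 W idle figures are the ones quoted above and look like whole-system numbers; the hours per day and electricity rate are assumptions picked purely for illustration.

idle_gap_w = 170 - 140      # idle power gap from the figures quoted above, in watts
hours_per_day = 8           # assumed hours the machine sits at idle per day
rate_per_kwh = 0.10         # assumed electricity price in $/kWh

kwh_per_year = idle_gap_w * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/year, about ${kwh_per_year * rate_per_kwh:.2f}/year")
# -> roughly 88 kWh/year, i.e. on the order of $9/year at these assumptions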

About AF, have you read the other five sites discussing IQ? So most reviewers are wrong compared to your view? To me, the reviewers themselves hold more credibility than what you want to say and believe.

The 7800 GT is faster than the X1800 XL, confirmed at Guru3D, HardOCP, HotHardware, AT, etc. And that's at 1600x1200 with AA/AF.

"Only in 3 games have we seen the GTX beat the X1800 - Riddick, Guild Wars, and Doom 3. It loses in every other game benchmark so far."

Do you count it as a win when the X1 series has a 1-5 fps performance lead?

 

compgeek89

Golden Member
Dec 11, 2004
1,860
0
76
Originally posted by: munky
Originally posted by: Cookie Monster
Originally posted by: 5150Joker
Originally posted by: munky
Originally posted by: DLeRium
Originally posted by: Soviet
ATi has the better card here; if you disagree, then you're either a cheapskate or an Nvidia fan. The benches speak for themselves, as the X1800 XT wins most of them. Besides, who gives a toss about price? If you're gonna spend all that dough on a high-end card, then price shouldn't be THAT big a factor. Also, it's only using 16 pipelines to do more than Nvidia's 24-pipeline card is doing. Well done, ATI.

Are you a FANBOY or what? It's tied most of the way. Hexus gives ATI the edge, but if you look at some other benchies, they're almost a tie with a few wins on each side. So what if it's more efficient? We know AMD is more efficient than Intel, but their CPUs also outperform P4s easily, right? A 3700+ can whip a 3.8 GHz any day, meaning the FX series will demolish anything. Who cares if you're efficient? You also have to wear the performance crown. If you're efficient and only the same speed, no one cares. Price matters. GTXes are on sale all the time and so are GTs, so unless ATI cards are gonna come with crazy deals, I see NV as the winner.


NV is only a winner until the X1800 XT actually becomes available for sale. And if NV releases a 512 MB GTX, I doubt it will be any cheaper than the X1800 XT; just look at the ridiculous price they charged for the 512 MB 6800U.

When the X1800 becomes available for a reasonable price, I see no reason for anyone to buy a primitive 7800 instead. It's slower in most games, suffers a bigger hit from AA/AF, can't deliver the same IQ, and lacks a bunch of features it would have had if NV had actually put the effort into designing a new core instead of rehashing the NV40.



LOL, what a bunch of bologna. There are so many holes in the b.s. you typed, but I'm sure you're already aware of that and just did it to troll.


I've got to agree with 5150Joker on this one. Architecturally, the G70 is in fact quite different from the NV40: different clock domains, and it behaves differently from the NV40 architecture (shown by the Beyond3D review where a 7800 GTX reduced to 16 pipes performs worse than a 6800 Ultra), and I could go on with more differences. Simply put, you can't judge the book by its cover.

The 7800 isn't primitive; it's in fact more efficient, because on a performance/watt basis the 7800 beats the X1800. Not only that, it took ATi 320 million transistors for a 16-pipe card against the 302 million transistors of the 24-pipe GTX. The guys at Nvidia worked on efficiency and used every trick in the book to increase performance, NOT insane clock speeds like the X1800 XT's 625 MHz.

What features? Avivo? The 7 series' PureVideo is fine, if not better (more to come from AT, but Nvidia is in the lead in de-interlacing).

Adaptive AA? Transparency AA is in fact better. Some reviewers mention that the AA on the 7800 GTX is better than on the X1 series (Xbit Labs and HotHardware, for instance).

Big performance hit? The XT has 512 MB at 1500 MHz, compared to the GTX's 256 MB at 1200 MHz. Wait until the 512 MB GTX with faster memory arrives, then conclude which card takes more of a hit.

How can you compare the availability/price of the 512 MB 6800 Ultra to the 512 MB GTX? Looking at yields/availability, Nvidia isn't suffering from any such issue. And since they themselves set the bar high on availability, I don't think they will shoot themselves in the foot.

IQ? Have you tried playing HL2 with 8xS? Of course there are differences, and different reviewers see different IQ, but to most people it's the same. Some say AF on the 7 series is better, some say the X1 series', so I don't see why people argue over IQ when the differences are minimal.

Edit: the X1 series doesn't even have proper SM 3.0, according to The Tech Report.

"Turns out that the vertex shaders in the Radeon X1000 series GPUs don't support a notable Shader Model 3.0 feature: vertex texture fetch. As it sounds, this capability allows the vertex shaders to read from texture memory, which is important because texture memory is sometimes treated as general storage in programmable GPUs. Vertex texture fetch is useful for techniques like displacement mapping, where the vertex and pixel shaders need to share data with one another."

Vertex texture fetch is one of the biggest features in VS 3.0, because it is used for true displacement mapping. The 6 and 7 series support it, and if this is in fact true, it's going to limit the X1 series' capabilities.

The 7800 GTX is not a new core; it's just an improved 6800. It's not different "by far," only an evolutionary step. I thought everybody knew this by now. :roll:

And since when is performance/watt an important factor? Are you reading too many of the latest Intel slides? Tech Report shows 225 watts under load for the GTX and 250 for the X1800. Hardly anything to rave about, especially since the GTX has less performance.

TRAA on the GTX is only better if you use the 8xS mode, which does impose a big performance drop. In 4x they look similar, and in 2x the ATi looks better. AF is not even a contest, and Quality AF does not cause a huge performance drop either. Looking at the HardOCP screenshots, only a blind person could call NV's AF equal or better. HDR+AA is something I always felt the 7800 should have had; it's hard to pimp one feature when using it makes you give up another.

http://www.extremetech.com/article2/0,1697,1867129,00.asp
Look at this page: both the X1800 XL and the 7800 GT have 1 GHz memory, but even then the XL suffers less of a hit doing AA+AF. Faster memory is not gonna help the GTX as much as you wish.

As for vertex texture fetch, it is possible on the X1800, just not done the same way as on the NV cards.

Only in 3 games have we seen the GTX beat the X1800 - Riddick, Guild Wars, and Doom 3. It loses in every other game benchmark so far.

Holy cow, we still have a disbelieving ATi fanboy.

I'm glad I got my GTX at $450... a month ago...

I'm getting 33% better performance in OpenGL games; the only thing I'm regretting is FEAR performance, but the developers claim that's going to change anyway.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I'm basing my conclusion on 8 different reviews around the web; I've seen many IQ comparisons, not just from HardOCP but from AT, HotHardware, Xbit Labs, etc.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I only see one person getting the X18 series and that person is BFG; he'll want that better anisotropic filtering ATI has to offer.
Better AF would be nice but it's not really a big factor for me as I'm happy with current AF implementations.

My biggest problem with the R520 is the dual slot cooling solution, especially after the 6800U fiasco with respect to noise. The 7800 just has it beat in that department with single slot cooling and less noise. Of course slapping 512 MB onto a 7800 may well make it go dual slot so there's something else to think about.

I think I'll just sit tight with my passively cooled X800 XL for now and see how the situation pans out.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: BFG10K
I only see one person getting the X18 series and that person is BFG; he'll want that better anisotropic filtering ATI has to offer.
Better AF would be nice but it's not really a big factor for me as I'm happy with current AF implementations.

My biggest problem with the R520 is the dual slot cooling solution, especially after the 6800U fiasco with respect to noise. The 7800 just has it beat in that department with single slot cooling and less noise. Of course slapping 512 MB onto a 7800 may well make it go dual slot so there's something else to think about.

I think I'll just sit tight with my passively cooled X800 XL for now and see how the situation pans out.

It's really not a loud card.
http://www.techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=16




 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Cookie Monster
I'm basing my conclusion on 8 different reviews around the web; I've seen many IQ comparisons, not just from HardOCP but from AT, HotHardware, Xbit Labs, etc.

You mean you're ignoring the hard evidence in actual screenshots and choosing to believe comments from reviewers who sound like they only took a cursory glance at IQ.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Ronin
If it's the same fan as the X850 line, it's loud when it boots, and it's loud when you're gaming.

And I should believe you instead of these guys who actually have the cards and compared them with a decibel meter, right?

http://www.techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=16

Edit: I misread your post, but as you can clearly see from the link I posted, it's not the same fan.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
The X1800 XT's load sound level is about three times that of the 7800 GTX, if I'm right that dB is measured on a logarithmic scale with every 3 dB being twice as loud?
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: xtknight
The X1800 XT's load sound level is about three times that of the 7800 GTX, if I'm right that dB is measured on a logarithmic scale with every 3 dB being twice as loud?

You say 3 dB means twice as loud, and there's a 4 dB difference between the cards, so how are you getting 3x as loud?
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Also, shouldn't two 7800 GTXs be twice as loud as one 7800 GTX? The dual-GTX setup comes in at 61 dB.

GTX       55 dB
XT        59 dB
2x GTX    61 dB

If I'm right, then an XT is less than twice as loud as a GTX.
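For anyone who wants the actual arithmetic: the usual rules of thumb are that +3 dB doubles the sound power, while a perceived doubling of loudness is conventionally taken as roughly +10 dB. Applied to the readings above, a quick sketch:

# Rules of thumb: +3 dB ~ double the sound power; ~+10 dB ~ perceived "twice as loud".
readings = {"7800 GTX": 55, "X1800 XT": 59, "2x 7800 GTX": 61}   # dB under load, from above
baseline = readings["7800 GTX"]

for card, db in readings.items():
    delta = db - baseline
    power_ratio = 10 ** (delta / 10)     # ratio of sound power vs a single GTX
    loudness_ratio = 2 ** (delta / 10)   # rough perceived-loudness ratio (10 dB = 2x)
    print(f"{card:12s} +{delta} dB -> {power_ratio:.2f}x sound power, "
          f"~{loudness_ratio:.2f}x perceived loudness")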
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
It's really not a loud card.
Err, according to your own graph it's at least twice as loud as a single 7800 GTX.

Also, shouldn't two 7800 GTXs be twice as loud as one 7800 GTX?
Under load it's 6 dB higher, which is well over twice as loud.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Nobody here has shown me that they really know for sure how decibels work, so until someone produces a link, I'll just go by what the reviewer says:

The XT's dual-slot cooler can be loud when it kicks into high gear as the system powers on, but otherwise, it just whispers.

It doesn't concern me if a card is a bit noisy under load, because if it's under load that means I'm playing a game, which means I'm not going to notice it.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
3 dB is double the sound power, although it can take up to 10 dB for the human ear to register a sound as twice as loud.

However, even ignoring dB for a minute, the fact that the card draws more power and has a dual-slot cooler more than likely means it's going to be louder than a single-slot card that draws less power.

It doesn't concern me if a card is a bit noisy under load, because if it's under load that means I'm playing a game, which means I'm not going to notice it.
Clearly you have never used a loud card.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: BFG10K

Clearly you have never used a loud card.

I've used cards that others would probably consider loud. It's just that I'm not a low-noise fanatic.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
It's just that I'm not a low-noise fanatic.
I'm not a fanatic, but I'm also not deaf, and I know a loud card when I hear one.

NV fanboys were trying to tell me the 6800U is quiet, and yet it's pretty much right at the top of the graphs you linked to. Thank goodness I didn't get an X850, because it appears to be just as bad, if not worse.

A 6800U drove me nuts, especially when gaming, as the fan would turn into a jet engine, so from now on noise (or lack thereof) is going to be a big part of my future purchases. A loud GPU will easily generate more noise than the CPU and PSU combined.