***OFFICIAL*** ATI R520 (X1800, X1600, X1300) Reviews Thread!


Hacp

Lifer
Jun 8, 2005
13,923
2
81
Originally posted by: M0RPH
Originally posted by: Ronin
Want an idea how a 512MB 7800GTX will perform? Try a Quadro 4500.

In regards to:
Originally posted by: M0RPH
What the heck are you talking about?? The MSRP on the X1800XL is $449. The MSRP on the X1800XT 512MB is $549. Get your facts straight.

We'll need to teach you a bit about supply and demand. Economics 101.

The person I was replying to said the XL was gonna cost $600+. He said XL, not XT. So, are you agreeing with him then? If so, how about you put your money where your mouth is. I'll bet you that in a week I can buy an X1800XL for less than its MSRP of $449. If I'm right, you buy me an XL card. If I'm wrong I'll buy you one.

Since you know so much about the supply and demand of these upcoming cards, you should be ready to jump on this bet. So just let me know.


Shipping and tax are included. Shipping has to be overnight with full insurance.
 

booomups

Junior Member
Oct 5, 2005
21
0
0
Originally posted by: Cookie Monster
Originally posted by: booomups
Originally posted by: booomups
very much simplified and i don't know what i am talking about



quote myself....

but higher memory bandwidth does not explain the current FEAR results

or so methinks

"We were interested in testing the FEAR demo, but after learning that the final version would change the performance characteristics of the game significantly, we decided it would be best to wait until we had a shipping game. From what we hear, the FEAR demo favors ATI's X1000 series considerably over NVIDIA hardware, but performance will be much closer in the final version."

You do realise Monolith has been optimising their game for the 7 series for a while. Plus, it's a DEMO.

jo.. easy easy.. i just stated the thesis that the x1800 is about on par with the g70, but with a tradeoff towards shader performance and maybe less pixel fillrate

meaning that the more shader power a game uses, the more of an advantage it might get over the g70.....

then someone says the x1800's sole main advantage is higher memory bandwidth... then i say a newer game like fear... many many shaders.... so it most likely profits from this tradeoff, and the better performance in shader-heavy games is not explainable only by higher bandwidth....

and about fear being optimized for the g70... good for g70 owners.. don't you think?
 
Feb 19, 2001
20,155
23
81
Originally posted by: Hacp
Originally posted by: M0RPH
Originally posted by: Ronin
Want an idea how a 512MB 7800GTX will perform? Try a Quadro 4500.

In regards to:
Originally posted by: M0RPH
What the heck are you talking about?? The MSRP on the X1800XL is $449. The MSRP on the X1800XT 512MB is $549. Get your facts straight.

We'll need to teach you a bit about supply and demand. Economics 101.

The person I was replying to said the XL was gonna cost $600+. He said XL, not XT. So, are you agreeing with him then? If so, how about you put your money where your mouth is. I'll bet you that in a week I can buy an X1800XL for less than its MSRP of $449. If I'm right, you buy me an XL card. If I'm wrong I'll buy you one.

Since you know so much about the supply and demand of these upcoming cards, you should be ready to jump on this bet. So just let me know.


Shipping and tax are included. Shipping has to be overnight with full insurance.

1 week pal. If I can't get an X1800XL in 1 week for less than I got my 7800GT on LAUNCH DAY (8/11 for $334) then ATI really SUX.
 

booomups

Junior Member
Oct 5, 2005
21
0
0
and if this shader-power advantage doesn't happen in ati's and the games' future.... jo.. not my fault ;-)
 

MBrown

Diamond Member
Jul 5, 2001
5,726
35
91
I'd say nVidia won this round even though the XT might be slightly faster.
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
Originally posted by: MBrown
I'd say nVidia won this round even though the XT might be slightly faster.

Surprise, surprise. Obviously a very biased opinion. Only hindsight will tell who won and who lost.
 

booomups

Junior Member
Oct 5, 2005
21
0
0
be happy that ati at least got a technical equivalent to the nvidia cards... could have been worse, and the market wouldn't have been moving for some months....
 

SumYungGai

Banned
Sep 29, 2005
43
0
0
Just to comment on the efficiency thing people keep bringing up: first of all, it doesn't matter about the pipelines, vertex processors, MHz, or anything. The 9000 line was hugely efficient because it took much less power than the GeForce, and was even physically smaller. It didn't need external power or dual-slot cooling.

And if you think that having fewer pipelines makes it more efficient in any regard, you're simply wrong. One could also reverse it and say the GeForce is more efficient because it needs less MHz, since it has more pipelines to alleviate the speed stress.

Although, I do think the X1000 line is more efficient in terms of image quality. Some of the lower settings looked as good as somewhat higher GeForce settings. That means a better image at the same speed, or the same image at a faster speed.

Too bad that probably doesn't transfer over to OpenGL, but it would be cool if driver tweaks fixed that.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Hacp
Originally posted by: MBrown
I'd say nVidia won this round even though the XT might be slightly faster.

Surprise, surprise. Obviously a very biased opinion. Only hindsight will tell who won and who lost.

I do believe Nvidia won this round.
One word: availability.

The funnier thing is Nvidia hasn't even released the real G70. The 7 series right now is the NV47 that was supposed to compete against the X850 XT PE. Remember Nvidia saying "fans will be disappointed by the NV47 because it's only 20~30 MHz clocked higher than the 6800 Ultra"?

The real G70 is the G72, a 90nm refresh; I wouldn't be surprised if its architecture is a bit different. It's planned to be released in early 2006 according to DigiTimes.

So things are getting interesting. :D


 

booomups

Junior Member
Oct 5, 2005
21
0
0
Originally posted by: SumYungGai
Just to comment on the efficiency thing people keep bringing up: first of all, it doesn't matter about the pipelines, vertex processors, MHz, or anything. The 9000 line was hugely efficient because it took much less power than the GeForce, and was even physically smaller. It didn't need external power or dual-slot cooling.

And if you think that having fewer pipelines makes it more efficient in any regard, you're simply wrong. One could also reverse it and say the GeForce is more efficient because it needs less MHz, since it has more pipelines to alleviate the speed stress.

Although, I do think the X1000 line is more efficient in terms of image quality. Some of the lower settings looked as good as somewhat higher GeForce settings. That means a better image at the same speed, or the same image at a faster speed.

Too bad that probably doesn't transfer over to OpenGL, but it would be cool if driver tweaks fixed that.


what's so difficult to understand if i say that the pipes are (seem to be) more efficient? i never said anything about it needing less electricity and being cooler.....

and then i even took into account the higher clock speeds, so maybe the pipes (and thus the whole architecture) aren't more efficient at all...

the efficiency i am talking about: avg fps per (pipes * mhz)

doesn't say anything about power usage...

or did i... nah... i didn't... and if you didn't mean me... ok too.
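booomups's per-pipe-per-clock notion can be sketched in a few lines. The pipe counts and core clocks below are the commonly cited launch specs for the two cards; the fps figure is a placeholder purely to show the calculation, not a benchmark result:

```python
# "Efficiency" in booomups's sense: how much average fps a card delivers
# per pixel pipe per MHz. Says nothing about power draw or heat.
CARDS = {
    # name: (pixel_pipes, core_mhz) -- commonly cited launch specs
    "X1800 XT": (16, 625),
    "7800 GTX": (24, 430),
}

def fps_per_pipe_clock(avg_fps, pipes, mhz):
    """Average fps normalized by (pipes * MHz); higher = more work per clock."""
    return avg_fps / (pipes * mhz)

# Placeholder fps: if both cards hit the same framerate, the 16-pipe/625MHz
# part is doing slightly more work per pipe-clock than the 24-pipe/430MHz one.
for name, (pipes, mhz) in CARDS.items():
    print(name, fps_per_pipe_clock(100.0, pipes, mhz))
```

By this metric, equal framerates would mean 100/(16*625) = 0.0100 for the X1800 XT versus 100/(24*430) ≈ 0.0097 for the GTX, which is the tradeoff being argued about.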
 

SumYungGai

Banned
Sep 29, 2005
43
0
0
Originally posted by: booomups
what's so difficult to understand if i say that the pipes are more efficient? i never said anything about it needing less electricity and being cooler.....

or did i... nah... i didn't... and if you didn't mean me... ok too.
Uhh, yeah.... I think that if you shut up, it would do us all a favor.
 

booomups

Junior Member
Oct 5, 2005
21
0
0
Originally posted by: SumYungGai
Originally posted by: booomups
what's so difficult to understand if i say that the pipes are more efficient? i never said anything about it needing less electricity and being cooler.....

or did i... nah... i didn't... and if you didn't mean me... ok too.
Uhh, yeah.... I think that if you shut up, it would do us all a favor.



please you go ***** and ** **** .... maybe i should you ****...

love
booomups
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: aggressor
The real thing that shocks me about the R520 release is just how much better ATI's AF and AA modes are compared to nVidia's. It almost seems unfair to bench ATI's 4x AA against nVidia's 4x AA when nVidia's implementation looks like ATI's 2x mode. Ditto for anisotropic filtering and transparency AA modes.

http://www.hardocp.com/image.html?image...I4MDE0MEFCVGlYSnBoRUNfOF8xMl9sLmpwZw==
http://www.hardocp.com/image.html?image=MTEyODI4MDE0MEFCVGlYSnBoRUNfOF8zX2wuanBn

It's too bad about the price of these cards, though :(

af: yes. aa: no.

tho all nvidia has to do is go back to the gf5 af method to match ati.. whether they do or not is another matter, but kudos to ati for not forcing us to use adaptive filtering..
 

orangat

Golden Member
Jun 7, 2004
1,579
0
0
Looks like the high-end x1800xt is very close to or a hair faster than the 7800gtx, but also slightly more expensive than the gtx, so it's a tie.
But the x1800xl, x1600 and x1300 models are significantly slower than Nvidia's and also quite a lot more expensive, so Nvidia wins big there.

I see the same problem of ATI not offering a competitive mainstream gamer model (the same problem as the slower x800pro of the previous gen). The x1800xl is slower and more expensive by about $90 compared to the 7800gt.
 

Velk

Senior member
Jul 29, 2004
734
0
0
Originally posted by: Hacp
Originally posted by: Duvie
Originally posted by: Hacp
Originally posted by: Duvie
Originally posted by: crazydingo
Originally posted by: rmed64
wow, ati gets stomped. Their midrange and budget cards are worthless. They are done this gen for sure now.
These death knells are oft-repeated and boring. :clock:

I have to agree...I didn't see anything like that...A bit overdramatic. The card appears to be the overall leader, but not by the margin the hype was building. If an Ultra comes in with 512mb and uses dual-core drivers, it really cuts into this lead.

Aside from ATI being the leader, price and availability are now the main factors holding it back. Yes, it leads, but at what premium??? That is for each user to decide.

That post is speculation.
1) You're speculating that 512 MB will help Nvidia.
2) You're speculating that the dual-core drivers will increase frame rates drastically enough to cut into the XT lead.

We have the facts right before us right now. The XT is projected to be faster than Nvidia's GTX, and has better IQ. But availability = 1 month in the future. The XL also has great image quality, but it is using the bad cores, so frankly it does not look like a viable option. Also, the low clock speed makes it the underdog vs the GT.

Those are not speculations!!! Those are proven facts I can point to reviews to prove. The only speculation is when a 512mb card will be available from Nvidia (which I didn't try to guess), but then again so is saying the ATI cards will be available in 1 month.....

Heck, the AT article even stated the effects of the 512mb card. We know the performance will be better. I didn't say HOW MUCH...that would be speculating. I am stating commonsense things here...join reality....

1) You speculated that the 512mb performance increase will be good.
2) You speculated that dual-core drivers + 512mb will cut into the XT performance lead significantly enough.

Not common sense. We don't know how much the drivers + 512mb will cut into the performance lead.

While it's true that we do not currently know just how much the combination of dual-core drivers and 512MB together will give, we know enough about the dual-core drivers from the beta versions to safely say that they will cut deeply into the performance lead just by themselves.

From the Anand review, the largest lead by the 1800XT is 29% (in Splinter Cell CT), and my own testing of the dual-core drivers has produced framerate increases of up to 18%. Admittedly that is not specific to Splinter Cell, as I do not own the game, but an expected increase of 10% is in no way unreasonable given the results for other games.

For a specific example, the dual-core drivers increased my system's quake3 scores by 13% at 1600x1200 with 4x AA, which from Anand's results would widen the gtx's 44% performance gap to a 69% gap.

Overall I'd have to say that the X1800XT is a bit of a disappointment: given the choice between it and a 7800gtx, both available now and both at the same price, it would still be a bit of a tossup rather than an obvious no-brainer. As for replacing an existing 7800gtx with a new x1800xt, that would be tantamount to piling money up and setting it on fire.
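The compounding in Velk's example can be sanity-checked in a couple of lines, assuming the driver uplift applies multiplicatively on top of the existing lead; the exact widened figure depends on which card's numbers the percentages are measured against, so this is an illustration of the arithmetic, not a restatement of his results:

```python
def widened_lead(lead_pct, uplift_pct):
    """If card A leads card B by lead_pct and then gains uplift_pct more fps,
    return A's new percentage lead over B (gains multiply, not add)."""
    relative = (1 + lead_pct / 100) * (1 + uplift_pct / 100)
    return (relative - 1) * 100

# A 13% driver uplift on top of an existing 44% lead:
print(round(widened_lead(44, 13), 1))  # prints 62.7
```

The point is that any uniform driver-side gain stretches a relative lead multiplicatively, which is why a mid-teens uplift moves a gap by roughly twenty points rather than thirteen.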



 

booomups

Junior Member
Oct 5, 2005
21
0
0
thanks for the info.. nice to hear the second core is being used so well. what's the cpu load on the second core while running a game?
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
Ati has the better card here; if you disagree, then you're either a cheapskate or an nvidia fan. The benches speak for themselves, as the X1800 XT wins most of them. Besides, who gives a toss about price? If you're gonna spend all that dough on a high-end card then price shouldn't be THAT big a factor. Also, it's only using 16 pipelines to do more than nvidia's 24-pipeline card is doing. Well done, ATI.
 
Feb 19, 2001
20,155
23
81
Originally posted by: Soviet
Ati has the better card here; if you disagree, then you're either a cheapskate or an nvidia fan. The benches speak for themselves, as the X1800 XT wins most of them. Besides, who gives a toss about price? If you're gonna spend all that dough on a high-end card then price shouldn't be THAT big a factor. Also, it's only using 16 pipelines to do more than nvidia's 24-pipeline card is doing. Well done, ATI.

Are you a FANBOY or what? It's tied most of the way. Hexus gives ATI the edge, but if you look at some other benchies, they're almost a tie with a few wins on each side. So what if it's more efficient? We know AMD is more efficient than Intel, but their CPUs also outperform P4s easily, right? A 3700+ can whip a 3.8 GHz any day, meaning that the FX series will demolish anything. Who cares if you're efficient? You also have to wear the performance crown. If you're efficient and only the same, no one cares. Price matters. GTXes are on sale all the time and so are GTs, so unless ATI cards are gonna be coming with crazy deals, I see NV as the winner.
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
Originally posted by: DLeRium
Originally posted by: Soviet
Ati has the better card here; if you disagree, then you're either a cheapskate or an nvidia fan. The benches speak for themselves, as the X1800 XT wins most of them. Besides, who gives a toss about price? If you're gonna spend all that dough on a high-end card then price shouldn't be THAT big a factor. Also, it's only using 16 pipelines to do more than nvidia's 24-pipeline card is doing. Well done, ATI.

Are you a FANBOY or what? It's tied most of the way. Hexus gives ATI the edge, but if you look at some other benchies, they're almost a tie with a few wins on each side. So what if it's more efficient? We know AMD is more efficient than Intel, but their CPUs also outperform P4s easily, right? A 3700+ can whip a 3.8 GHz any day, meaning that the FX series will demolish anything. Who cares if you're efficient? You also have to wear the performance crown. If you're efficient and only the same, no one cares. Price matters. GTXes are on sale all the time and so are GTs, so unless ATI cards are gonna be coming with crazy deals, I see NV as the winner.

Cheapskate.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: DLeRium
Originally posted by: Soviet
Ati has the better card here; if you disagree, then you're either a cheapskate or an nvidia fan. The benches speak for themselves, as the X1800 XT wins most of them. Besides, who gives a toss about price? If you're gonna spend all that dough on a high-end card then price shouldn't be THAT big a factor. Also, it's only using 16 pipelines to do more than nvidia's 24-pipeline card is doing. Well done, ATI.

Are you a FANBOY or what? It's tied most of the way. Hexus gives ATI the edge, but if you look at some other benchies, they're almost a tie with a few wins on each side. So what if it's more efficient? We know AMD is more efficient than Intel, but their CPUs also outperform P4s easily, right? A 3700+ can whip a 3.8 GHz any day, meaning that the FX series will demolish anything. Who cares if you're efficient? You also have to wear the performance crown. If you're efficient and only the same, no one cares. Price matters. GTXes are on sale all the time and so are GTs, so unless ATI cards are gonna be coming with crazy deals, I see NV as the winner.


Nv is only a winner until the x1800xt actually becomes available for sale. And if Nv releases a 512mb gtx, I doubt it will be any cheaper than the x1800xt; just look at the ridiculous price they charged for a 512mb 6800u.

When the x1800 becomes available for a reasonable price, I see no reason for anyone to buy a primitive 7800 instead. It's slower in most games, suffers a bigger hit from AA/AF, can't deliver the same IQ, and lacks a bunch of features that it should have had if Nv had actually put the effort into designing a new core instead of rehashing the nv40.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: munky
Originally posted by: DLeRium
Originally posted by: Soviet
Ati has the better card here; if you disagree, then you're either a cheapskate or an nvidia fan. The benches speak for themselves, as the X1800 XT wins most of them. Besides, who gives a toss about price? If you're gonna spend all that dough on a high-end card then price shouldn't be THAT big a factor. Also, it's only using 16 pipelines to do more than nvidia's 24-pipeline card is doing. Well done, ATI.

Are you a FANBOY or what? It's tied most of the way. Hexus gives ATI the edge, but if you look at some other benchies, they're almost a tie with a few wins on each side. So what if it's more efficient? We know AMD is more efficient than Intel, but their CPUs also outperform P4s easily, right? A 3700+ can whip a 3.8 GHz any day, meaning that the FX series will demolish anything. Who cares if you're efficient? You also have to wear the performance crown. If you're efficient and only the same, no one cares. Price matters. GTXes are on sale all the time and so are GTs, so unless ATI cards are gonna be coming with crazy deals, I see NV as the winner.


Nv is only a winner until the x1800xt actually becomes available for sale. And if Nv releases a 512mb gtx, I doubt it will be any cheaper than the x1800xt; just look at the ridiculous price they charged for a 512mb 6800u.

When the x1800 becomes available for a reasonable price, I see no reason for anyone to buy a primitive 7800 instead. It's slower in most games, suffers a bigger hit from AA/AF, can't deliver the same IQ, and lacks a bunch of features that it should have had if Nv had actually put the effort into designing a new core instead of rehashing the nv40.



LOL, what a bunch of bologna. There are so many holes in the b.s. you typed, but I'm sure you're already aware of that and just did it to troll.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: 5150Joker
Originally posted by: munky
Originally posted by: DLeRium
Originally posted by: Soviet
Ati has the better card here; if you disagree, then you're either a cheapskate or an nvidia fan. The benches speak for themselves, as the X1800 XT wins most of them. Besides, who gives a toss about price? If you're gonna spend all that dough on a high-end card then price shouldn't be THAT big a factor. Also, it's only using 16 pipelines to do more than nvidia's 24-pipeline card is doing. Well done, ATI.

Are you a FANBOY or what? It's tied most of the way. Hexus gives ATI the edge, but if you look at some other benchies, they're almost a tie with a few wins on each side. So what if it's more efficient? We know AMD is more efficient than Intel, but their CPUs also outperform P4s easily, right? A 3700+ can whip a 3.8 GHz any day, meaning that the FX series will demolish anything. Who cares if you're efficient? You also have to wear the performance crown. If you're efficient and only the same, no one cares. Price matters. GTXes are on sale all the time and so are GTs, so unless ATI cards are gonna be coming with crazy deals, I see NV as the winner.


Nv is only a winner until the x1800xt actually becomes available for sale. And if Nv releases a 512mb gtx, I doubt it will be any cheaper than the x1800xt; just look at the ridiculous price they charged for a 512mb 6800u.

When the x1800 becomes available for a reasonable price, I see no reason for anyone to buy a primitive 7800 instead. It's slower in most games, suffers a bigger hit from AA/AF, can't deliver the same IQ, and lacks a bunch of features that it should have had if Nv had actually put the effort into designing a new core instead of rehashing the nv40.



LOL, what a bunch of bologna. There are so many holes in the b.s. you typed, but I'm sure you're already aware of that and just did it to troll.


I've got to agree with 5150Joker on this one. The architectural side of the G70 is in fact very different from the NV40: different clock domains, and it functions differently from the NV40 architecture (shown by the review from Beyond3D where a 7800GTX reduced to 16 pipes performs worse than the 6800 Ultra), and I could go on with more differences. Simply put, you can't judge a book by its cover.

The 7800 isn't primitive; it's in fact more efficient, because on a performance/watt basis the 7800 beats the X1800. Not only that, but it took ATI 320 million transistors for a 16-pipe card against a 302-million-transistor 24-pipe GTX. The guys at Nvidia worked on efficiency and used every trick in the book to increase performance, NOT insane clock speeds like the X1800 XT's 625MHz.

What features? Avivo? The 7 series' PureVideo is fine, if not better. (More to come from AT, but Nvidia is in the lead in de-interlacing.)

Adaptive AA? Transparency AA is in fact better. Some reviewers mention that the AA on the 7800GTX is better than the X1 series'. (xbitlabs and hothardware, for instance.)

Big performance hit? The XT has 512mb at 1500MHz, compared to the GTX's 256mb at 1200MHz. Wait till the 512mb GTX with some faster memory, then conclude which card takes more of a hit.

How can you compare the availability/price of the 512mb 6800 Ultra to the 512mb GTX? Looking at yields/availability, Nvidia isn't suffering from such issues. And as they themselves set the standard high on availability, I don't think they will shoot themselves in the foot.

IQ? Have you tried playing HL2 with 8xS? Of course there are differences in IQ between different reviewers, but to most people it's the same. Some say AF on the 7 series is better, some say the X1 series', so I don't see why people argue over IQ when the difference is minimal.

Edit: the X1 series doesn't even have proper SM 3.0 according to The Tech Report.

"Turns out that the vertex shaders in the Radeon X1000 series GPUs don't support a notable Shader Model 3.0 feature: vertex texture fetch. As it sounds, this capability allows the vertex shaders to read from texture memory, which is important because texture memory is sometimes treated as general storage in programmable GPUs. Vertex texture fetch is useful for techniques like displacement mapping, where the vertex and pixel shaders need to share data with one another."

Vertex texture fetch is one of the biggest features in VS 3.0, because it is used for true displacement mapping. The 6 and 7 series support it, and if this is in fact true, it's going to limit the X1 series' capabilities.
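For readers unfamiliar with the feature the quote describes: vertex texture fetch lets a vertex shader sample a texture, which is what makes true displacement mapping possible. Here is a CPU-side sketch of the same idea in Python/NumPy (the function name and the tiny 2x2 heightmap are made up for illustration; on real hardware the sampling happens inside the vertex shader):

```python
import numpy as np

# CPU-side sketch of displacement mapping -- the technique vertex texture
# fetch enables in a vertex shader: each vertex reads a height value from
# a texture and is pushed along its normal by that amount.
heightmap = np.array([[0.0, 0.2],
                      [0.5, 1.0]])  # tiny 2x2 "texture", heights in [0, 1]

def displace(positions, normals, uvs, scale=0.1):
    """Offset each vertex along its normal by the sampled height * scale.
    Nearest-neighbour lookup stands in for the shader's texture fetch."""
    h, w = heightmap.shape
    # map uv coords in [0, 1] to texel indices (nearest neighbour)
    ix = np.clip((uvs[:, 0] * (w - 1)).round().astype(int), 0, w - 1)
    iy = np.clip((uvs[:, 1] * (h - 1)).round().astype(int), 0, h - 1)
    heights = heightmap[iy, ix]
    return positions + normals * (heights * scale)[:, None]

# one flat quad facing +z
pos = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], dtype=float)
nrm = np.tile([0.0, 0.0, 1.0], (4, 1))
uv  = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
print(displace(pos, nrm, uv)[:, 2])  # z offsets: [0.   0.02 0.05 0.1 ]
```

Without vertex texture fetch the shader has no way to do that per-vertex lookup, so the geometry itself cannot react to a texture, which is exactly the limitation the Tech Report quote is pointing at.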