***OFFICIAL*** ATI R520 (X1800, X1600, X1300) Reviews Thread!


Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
From AT:

"We were interested in testing the FEAR demo, but after learning that the final version would change the performance characteristics of the game significantly, we decided it would be best to wait until we had a shipping game. From what we hear, the FEAR demo favors ATI's X1000 series considerably over NVIDIA hardware, but performance will be much closer in the final version."

Plus, IQ is similar, with a slight edge going to ATI.
Check out the screenies at HotHardware, and you can see the 7 series IQ is equal to or slightly behind the ATI R520's.

Not to mention that Transparency AA is better than Adaptive AA.
And according to the screenies at HotHardware, the X series and X1 series produce slightly more jaggies than the NV 7 series.

PureVideo is similar to Avivo (so far NVIDIA has the edge, with a clear lead in de-interlacing according to AT).

The R520 consumes 50W more power at load and idle compared to the GTX, which worries me about the R520 mobile parts.

If you look at the ShaderMark 2.1 tests, the GTX leads the X1800XT in most of the shader tests, by anywhere from 3% to 37%.

OpenGL titles like Quake and Enemy Territory will favor NV more.

I believe it's a tie, ATI = NV, but it's really too close to call who won this round.


Edit: HardOCP's screenies also reinforce HotHardware's, showing that the 7 series produces better IQ than the X1 series when 4xAA plus AF is used.




 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Want an idea how a 512MB 7800GTX will perform? Try a Quadro 4500.

In regards to:
Originally posted by: M0RPH
What the heck are you talking about?? The MSRP on the X1800XL is $449. The MSRP on the X1800XT 512MB is $549. Get your facts straight.

We'll need to teach you a bit about supply and demand. Economics 101.
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
Originally posted by: Ronin
Want an idea how a 512MB 7800GTX will perform? Try a Quadro 4500.

In regards to:
Originally posted by: M0RPH
What the heck are you talking about?? The MSRP on the X1800XL is $449. The MSRP on the X1800XT 512MB is $549. Get your facts straight.

We'll need to teach you a bit about supply and demand. Economics 101.

Well, I don't have the $$$ for a 1k card (I'm not sure if that's the price or not lol)
 

booomups

Junior Member
Oct 5, 2005
21
0
0
Originally posted by: 1Dark1Sharigan1
Originally posted by: munky
I think it's because even though it has 12 pixel shader units, it only has 4 texture units. So in games that are not heavy on shaders, it basically acts like a 4 pipe card when it comes to texturing, and we all know how THAT works out (*cough* FX cards)

Thanks for pointing that out . . . I totally missed that it had only 4 texture units . . . doh! lol

Now it makes perfect sense why its performance was so lackluster in most other games and yet better than the 6800 in F.E.A.R.



So you could say that ATI thought the time was right to make such a tradeoff.

Seeing how current games run fine and future games are really going to be shader intensive, this seems like the right move.


How many pixel versus vertex shaders/pipes does the GTX have again?


Edit: ah... they are not talking about the top cards, it seems... but it should be the same scenario there.
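(A quick illustration of munky's texture-unit point quoted above: per-clock pixel throughput is capped by whichever unit type runs out first. The 12-ALU/4-TMU split is the X1600 configuration discussed above; the per-pixel workload mixes and the 4-ALU comparison part are made-up values purely for illustration.)

```python
# Toy bottleneck model: throughput is limited by whichever unit runs out first.
def pixels_per_clock(alu, tmu, shader_ops, tex_fetches):
    # shader_ops / tex_fetches = work needed per pixel
    return min(alu / shader_ops, tmu / tex_fetches)

x1600     = dict(alu=12, tmu=4)  # 12 pixel shader units, 4 texture units
old_4pipe = dict(alu=4,  tmu=4)  # hypothetical classic "4-pipe" part

# Texture-heavy older game: both are texture-bound -- the X1600 acts 4-pipe.
print(pixels_per_clock(**x1600,     shader_ops=1, tex_fetches=1))  # 4.0
print(pixels_per_clock(**old_4pipe, shader_ops=1, tex_fetches=1))  # 4.0

# Shader-heavy game (FEAR-style): the 12 ALUs finally pull ahead.
print(pixels_per_clock(**x1600,     shader_ops=6, tex_fetches=1))  # 2.0
print(pixels_per_clock(**old_4pipe, shader_ops=6, tex_fetches=1))  # ~0.67
```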
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: Hacp
Originally posted by: Ronin
Want an idea how a 512MB 7800GTX will perform? Try a Quadro 4500.

In regards to:
Originally posted by: M0RPH
What the heck are you talking about?? The MSRP on the X1800XL is $449. The MSRP on the X1800XT 512MB is $549. Get your facts straight.

We'll need to teach you a bit about supply and demand. Economics 101.

Well, I don't have the $$$ for a 1k card (I'm not sure if that's the price or not lol)

You forgot a $ :p

http://www.sharbor.com/products/PNYN5300015.html
http://www.atacom.com/program/atacom.cgi?SEARCH=SEARCH_ALL&KEYWORDS=VIDR_PNYX
 

booomups

Junior Member
Oct 5, 2005
21
0
0
G70 GTX: 8 vertex, 24 pixel

R520: 8 vertex, 16 pixel


Soooo... people here would agree that the ATI architecture is a little more effective per pipeline/shader.


Taking ATI's higher clock speeds into account, one could say the ATI vs. NVIDIA fight looks about like this:

ATI: 11 vertex, 21 pixel
NV: 8 vertex, 24 pixel

Looks like simply shifted priorities regarding vertex/pixel.



Let's wait for unified pipes, and then we can smoke all kinds of whatever we want.


Very much simplified, and I don't know what I am talking about.
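(For the curious, the adjusted figures above come from scaling ATI's pipe counts by the core-clock ratio: the X1800XT's reference core clock is 625MHz versus the 7800 GTX's 430MHz. A minimal sketch of that back-of-the-envelope arithmetic; it is a rough equivalence, not a real performance model.)

```python
# Scale R520 pipe counts by the core-clock ratio vs. the G70.
R520_MHZ, G70_MHZ = 625, 430        # X1800XT / 7800 GTX reference clocks
ratio = R520_MHZ / G70_MHZ          # ~1.45

r520 = {"vertex": 8, "pixel": 16}
g70  = {"vertex": 8, "pixel": 24}

adjusted = {kind: round(count * ratio) for kind, count in r520.items()}
print(adjusted)  # {'vertex': 12, 'pixel': 23} -- close to the 11/21 above
print(g70)       # {'vertex': 8, 'pixel': 24}
```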
 

stelleg151

Senior member
Sep 2, 2004
822
0
0
ATI really got beat out, again. I am interested to see whether any upcoming games will really use ATI's shading capabilities, though, because over at Guru3D the reviewer was really raving about the demo that showed off the card's capabilities.

Just my 2 cents
 
Feb 19, 2001
20,155
23
81
Originally posted by: booomups
G70 GTX: 8 vertex, 24 pixel

R520: 8 vertex, 16 pixel


Soooo... people here would agree that the ATI architecture is a little more effective per pipeline/shader.


Taking ATI's higher clock speeds into account, one could say the ATI vs. NVIDIA fight looks about like this:

ATI: 11 vertex, 21 pixel
NV: 8 vertex, 24 pixel

Looks like simply shifted priorities regarding vertex/pixel.



Let's wait for unified pipes, and then we can smoke all kinds of whatever we want.


Very much simplified, and I don't know what I am talking about.

Efficiency doesn't matter, seriously. You guys make ATI seem so godly because they are efficient. I love ATI, but honestly they were disappointing this time.

Think of it this way: AMD's Athlon 64 is more efficient per clock, but suppose Intel had already hit 5GHz. Obviously the P4s would be winning at 5GHz. So it doesn't matter how efficient AMD's processors are if they can't beat their competitors.... You would still be inclined to buy the faster of the two regardless of efficiency.

What I hope for is that ATI won't have too much trouble transitioning to 24 or 32 pipes. By jumping over the 90nm hurdle earlier and getting their efficiency sorted out, I hope that in the LONG RUN they will be able to come back with a powerful blow against NV once NV starts its own 90nm transition and hits trouble. Otherwise, ATI is quite doomed.
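(That efficiency argument in one line: delivered performance is roughly per-clock efficiency times clock speed, and a buyer only sees the product. A tiny sketch with made-up IPC and clock numbers, purely to illustrate the hypothetical above, not real Athlon 64 / Pentium 4 figures.)

```python
# Made-up numbers: performance ~ work-per-clock (efficiency) x clock (GHz).
def perf(work_per_clock, clock_ghz):
    return work_per_clock * clock_ghz

athlon64   = perf(1.5, 2.4)  # efficient design, modest clock   -> 3.6
p4_at_5ghz = perf(0.8, 5.0)  # less efficient, clocked sky-high -> 4.0

# The buyer compares only the products, not the factors:
print(p4_at_5ghz > athlon64)  # True -- the "less efficient" chip wins anyway
```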
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Originally posted by: Hacp
Originally posted by: Duvie
Originally posted by: crazydingo
Originally posted by: rmed64
wow, ati gets stomped. Their midrange and budget cards are worthless. They are done this gen for sure now.
These death knells are oft-repeated and boring. :clock:

I have to agree...I didn't see anything like that...a bit overdramatic. The card appears to be the overall leader, but not by the margin the hype was building. If an Ultra comes in with 512MB and uses the dual-core drivers, it really cuts into this lead.

Aside from ATI being the leader, price and availability are now the main factors holding it back. Yes, it leads, but at what premium??? That is for each user to decide.

That post is speculation.
1) You're speculating that 512MB will help NVIDIA.
2) You're speculating that the dual-core drivers will increase frame rates drastically enough to cut into the XT's lead.

We have the facts right before us right now. The XT is projected to be faster than NVIDIA's GTX, and has better IQ. But availability = 1 month in the future. The XL also has great image quality, but it is using the bad cores, so frankly it does not look like a viable option. Also, the low clock speed makes it the underdog vs. the GT.

Those are not speculations!!! Those are proven facts I can point to reviews to prove. The only speculation is when a 512MB card will be available from NVIDIA (which I didn't try to guess), but then again, so is saying the ATI cards will be available in 1 month.....

Heck, the AT article even stated the effects of the 512MB card. We know the performance will be better; I didn't say HOW MUCH...that would be speculating. I am stating common-sense things here...join reality....
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
Originally posted by: Duvie
Originally posted by: Hacp
Originally posted by: Duvie
Originally posted by: crazydingo
Originally posted by: rmed64
wow, ati gets stomped. Their midrange and budget cards are worthless. They are done this gen for sure now.
These death knells are oft-repeated and boring. :clock:

I have to agree...I didn't see anything like that...a bit overdramatic. The card appears to be the overall leader, but not by the margin the hype was building. If an Ultra comes in with 512MB and uses the dual-core drivers, it really cuts into this lead.

Aside from ATI being the leader, price and availability are now the main factors holding it back. Yes, it leads, but at what premium??? That is for each user to decide.

That post is speculation.
1) You're speculating that 512MB will help NVIDIA.
2) You're speculating that the dual-core drivers will increase frame rates drastically enough to cut into the XT's lead.

We have the facts right before us right now. The XT is projected to be faster than NVIDIA's GTX, and has better IQ. But availability = 1 month in the future. The XL also has great image quality, but it is using the bad cores, so frankly it does not look like a viable option. Also, the low clock speed makes it the underdog vs. the GT.

Those are not speculations!!! Those are proven facts I can point to reviews to prove. The only speculation is when a 512MB card will be available from NVIDIA (which I didn't try to guess), but then again, so is saying the ATI cards will be available in 1 month.....

Heck, the AT article even stated the effects of the 512MB card. We know the performance will be better; I didn't say HOW MUCH...that would be speculating. I am stating common-sense things here...join reality....

Nice personal attack.
1) You speculated that the 512MB performance increase will be good.
2) You speculated that the dual-core drivers + 512MB will cut deeply into the XT's performance lead.

Not common sense. We don't know how much the drivers + 512MB will cut into the performance lead.

3) Oh yeah, the "if the Ultra comes out" part... Maybe a 512MB version will come out, but a separate Ultra? I dunno.
 

booomups

Junior Member
Oct 5, 2005
21
0
0
I never said that being more efficient per vertex/pixel pipe was good or bad....


Just stating the fact that ATI has fewer pipes and the same performance, so more performance per pipe....




The important efficiency is performance per watt anyway... so long.
 

booomups

Junior Member
Oct 5, 2005
21
0
0
Originally posted by: booomups
Very much simplified, and I don't know what I am talking about.



Quoting myself....

But higher memory bandwidth does not explain the current FEAR results.

Or so methinks.
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
Originally posted by: booomups
Originally posted by: booomups
Very much simplified, and I don't know what I am talking about.



Quoting myself....

But higher memory bandwidth does not explain the current FEAR results.

Or so methinks.


FEAR = demo...
Let's talk about real results in real games.
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Originally posted by: Hacp
Originally posted by: Duvie
Originally posted by: Hacp
Originally posted by: Duvie
Originally posted by: crazydingo
Originally posted by: rmed64
wow, ati gets stomped. Their midrange and budget cards are worthless. They are done this gen for sure now.
These death knells are oft-repeated and boring. :clock:

I have to agree...I didn't see anything like that...a bit overdramatic. The card appears to be the overall leader, but not by the margin the hype was building. If an Ultra comes in with 512MB and uses the dual-core drivers, it really cuts into this lead.

Aside from ATI being the leader, price and availability are now the main factors holding it back. Yes, it leads, but at what premium??? That is for each user to decide.

That post is speculation.
1) You're speculating that 512MB will help NVIDIA.
2) You're speculating that the dual-core drivers will increase frame rates drastically enough to cut into the XT's lead.

We have the facts right before us right now. The XT is projected to be faster than NVIDIA's GTX, and has better IQ. But availability = 1 month in the future. The XL also has great image quality, but it is using the bad cores, so frankly it does not look like a viable option. Also, the low clock speed makes it the underdog vs. the GT.

Those are not speculations!!! Those are proven facts I can point to reviews to prove. The only speculation is when a 512MB card will be available from NVIDIA (which I didn't try to guess), but then again, so is saying the ATI cards will be available in 1 month.....

Heck, the AT article even stated the effects of the 512MB card. We know the performance will be better; I didn't say HOW MUCH...that would be speculating. I am stating common-sense things here...join reality....

Nice personal attack.
1) You speculated that the 512MB performance increase will be good.
2) You speculated that the dual-core drivers + 512MB will cut deeply into the XT's performance lead.

Not common sense. We don't know how much the drivers + 512MB will cut into the performance lead.

3) Oh yeah, the "if the Ultra comes out" part... Maybe a 512MB version will come out, but a separate Ultra? I dunno.

Not a personal attack.....don't be so SEN-SI-TIVE.

1) is common sense...of course it will be good at those higher resolutions, as AT even suggested...
2) I didn't say "cut deeply," so don't put words in my mouth....There have been a few links showing some rather nice FPS jumps with the dual-core and 80-series drivers...This is not rocket science!!!! The lead is not all that big, so a modest 5-10% leap with these drivers could be HUGE!!!

3) I have no idea what you are talking about!!!


It is common sense.....

512MB helps at higher resolutions, and the drivers have shown they help...so two positives don't make a negative, buddy!!!
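(To put illustrative numbers on that last point: if the XT led by, say, 10% in a given benchmark, a 5-10% driver gain on the GTX side would erase most or all of that lead. The 10% figure here is made up purely to show the arithmetic, not a measured result.)

```python
# Illustrative only: how a modest % gain eats into a modest % lead.
xt_fps  = 110.0   # assume the XT leads...
gtx_fps = 100.0   # ...by 10% in some benchmark
for gain in (0.05, 0.10):
    boosted = gtx_fps * (1 + gain)
    lead = (xt_fps / boosted - 1) * 100
    print(f"GTX +{gain:.0%}: XT lead shrinks to {lead:.1f}%")
# +5% -> lead falls to ~4.8%; +10% -> dead even
```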
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
"really cuts"

OK, so replace "cut deeply" with "really cuts." But it is taken out of context if used in my sentence, because the "really" you were using was for emphasis. If you put "really" into my sentence, it might seem like I'm saying the cuts are real rather than fake.

And by "512MB performance will be good," I meant that it will be significant. I did not mean that 512MB will hurt performance.

Finally, from the driver tests that I have seen people run, the performance increase isn't all that it's hyped up to be. Regular driver updates have offered better performance.

I'm not saying that the increases are insignificant, but seeing that the XT is leading and beating a lot of SLI setups in Direct3D games, we'll have to actually see the numbers when the cards arrive to believe that a driver increase + 512MB will "really" cut into the lead, and not speculate.


Oh yeah, finally: you said that an Ultra "might" (key word: might) come out with 512MB. An Ultra implies ramped-up clock speeds/memory speeds/etc., while a 512MB version of the GTX implies just that, a 512MB version (the same as the 512MB versions of the X800XL and 6800 Ultra). We don't know whether an Ultra will come out, or if NVIDIA will just jump directly to a new line in January.
 

booomups

Junior Member
Oct 5, 2005
21
0
0
Originally posted by: Hacp
"really cuts"

OK, so replace "cut deeply" with "really cuts." But it is taken out of context if used in my sentence, because the "really" you were using was for emphasis. If you put "really" into my sentence, it might seem like I'm saying the cuts are real rather than fake.

And by "512MB performance will be good," I meant that it will be significant. I did not mean that 512MB will hurt performance.

Finally, from the driver tests that I have seen people run, the performance increase isn't all that it's hyped up to be. Regular driver updates have offered better performance.

I'm not saying that the increases are insignificant, but seeing that the XT is leading and beating a lot of SLI setups in Direct3D games, we'll have to actually see the numbers when the cards arrive to believe that a driver increase + 512MB will "really" cut into the lead, and not speculate.


Oh yeah, finally: you said that an Ultra "might" (key word: might) come out with 512MB. An Ultra implies ramped-up clock speeds/memory speeds/etc., while a 512MB version of the GTX implies just that, a 512MB version (the same as the 512MB versions of the X800XL and 6800 Ultra). We don't know whether an Ultra will come out, or if NVIDIA will just jump directly to a new line in January.



Don't even know if tomorrow ever comes... but tomorrow we'll know for sure ;-)
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Ronin
Want an idea how a 512MB 7800GTX will perform? Try a Quadro 4500.

In regards to:
Originally posted by: M0RPH
What the heck are you talking about?? The MSRP on the X1800XL is $449. The MSRP on the X1800XT 512MB is $549. Get your facts straight.

We'll need to teach you a bit about supply and demand. Economics 101.

The person I was replying to said the XL was gonna cost $600+. He said XL, not XT. So are you agreeing with him, then? If so, how about you put your money where your mouth is? I'll bet you that in a week I can buy an X1800XL for less than its MSRP of $449. If I'm right, you buy me an XL card. If I'm wrong, I'll buy you one.

Since you know so much about the supply and demand of these upcoming cards, you should be ready to jump on this bet. So just let me know.

 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
It's so sad; ATI really needs to update their OpenGL drivers. Man, the X1800XT is a monster in Direct3D games, but its OpenGL performance is horrible!
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
The card seems to tool a 7800 in high-detail Direct3D settings. It's a shame about the dual-slot cooling.
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
I'm looking forward to their next release, to be honest. I'm hoping they will now have time to refine and optimize the heck out of the architecture to bring down power, noise, etc. Better OpenGL drivers wouldn't hurt either.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Ronin
I'd rather jump on the ban wagon. Who's with me?

Hahahaha

hehehehehee

lolololololo

(yeah, I'm laughing at you just like you laughed at someone else)

Go ahead and try to get me banned. In fact, go ahead and try to get everyone banned who doesn't agree with you.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: booomups
Originally posted by: booomups
Very much simplified, and I don't know what I am talking about.



Quoting myself....

But higher memory bandwidth does not explain the current FEAR results.

Or so methinks.

"We were interested in testing the FEAR demo, but after learning that the final version would change the performance characteristics of the game significantly, we decided it would be best to wait until we had a shipping game. From what we hear, the FEAR demo favors ATI's X1000 series considerably over NVIDIA hardware, but performance will be much closer in the final version."

You do realise Monolith has been optimising their game for the 7 series for a while. Plus, it's a DEMO.