
***OFFICIAL*** ATI R520 (X1800, X1600, X1300) Reviews Thread!

I only want to say that I've heard from a very knowledgeable person that AA+HDR won't be a problem for the 7800GTX in the future. The vast majority of game developers will tend to combine HDR (FP16) + MSAA inside the game, so it won't be a problem whether the card supports it or not.
 
Originally posted by: BFG10K
It's just that I'm not a low noise fanatic.
I'm not a fanatic but I'm also not deaf and I know a loud card when I hear one.

nV fanboys were trying to tell me the 6800U is quiet, and yet it's pretty much right at the top of the graphs you linked to. Thank goodness I didn't get an X850, because it appears to be just as bad, if not worse.

A 6800U drove me nuts especially when gaming as the fan would turn into a jet engine, so from now on noise (or lack thereof) is going to be a big part of my future purchases. A loud GPU will easily generate more noise than a CPU and PSU combined.

I'd have thought you would own a 7800GTX, BFG - isn't it clearly the quietest "fast" card around?
 
Originally posted by: M0RPH
Originally posted by: xtknight
The X1800XT's load sound level is about three times that of the 7800GTX, if I'm right that dB is a logarithmic scale where every 3 dB is twice as loud?

You say 3 dB means twice as loud and there's a 4 dB difference between the cards, so how are you getting 3x as loud?

Must have been looking at the wrong number. 2.33x as loud I think...not sure about that logarithmic scale stuff though.
 
Originally posted by: jim1976
I only want to say that I've heard from a very knowledgeable person that AA+HDR won't be a problem for the 7800GTX in the future. The vast majority of game developers will tend to combine HDR (FP16) + MSAA inside the game, so it won't be a problem whether the card supports it or not.

The G70 core cannot perform MSAA in the FP16 space. What do you mean inside the game? The CPU? 😕 Of course, they could emulate it in the pixel shaders and AA wouldn't be a problem. Maybe you mean something uber ghetto like rendering the HDR onto a texture and then putting that texture through transparency AA. Ugh, though...not really ideal in my opinion (I don't know if that's even possible).
 
Originally posted by: Cookie Monster
What do you know about the core being new or not? Are you an engineer? Or are the guys at Nvidia telling lies about their G70 core? Who holds more credibility in what they are saying?

If you want to pretend the G70 is a new core, go ahead and do that. I'm basing my statements on the bunch of G70 reviews I read last June, so unless it underwent a drastic change between now and then, I'm pretty sure it's not a completely new core or a fresh new design.

Power is always a big issue; many people live on a tight budget, and you think power consumption doesn't matter? The X1 series idles at 170-ish W while the GTX is at 140-ish; who likes paying more on their electricity bill?

I'm thinking if someone can afford a $500 video card, the added cost to the electricity bill won't upset them that much.
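(Back-of-the-envelope sketch of that cost, in Python; the ~30 W delta comes from the figures quoted above, while the 8 hours/day of use and the $0.10/kWh rate are purely illustrative assumptions:)

# Rough yearly cost of a ~30 W idle-power difference (illustrative assumptions).
idle_delta_w = 170 - 140      # W, difference quoted above
hours_per_day = 8             # assumed daily usage
rate_per_kwh = 0.10           # assumed electricity rate, $/kWh

kwh_per_year = idle_delta_w / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")  # ~88 kWh -> ~$8.76/year

(So on those assumptions it works out to single-digit dollars a year, which is the point being made.)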

About AF, have you read the other 5 sites talking about IQ? So most reviewers are wrong in what they say compared to your view? To me the reviewers themselves hold more credibility than what you want to say and believe in.

I have read the other five sites, and as usual they do a pretty skimpy job as far as IQ comparisons go. But nowhere do they say Nv's AF is as good as or better than Quality AF on the x1800, and it would be laughable for them to do so.

The 7800GT is faster than the X1800 XL, confirmed at guru3d, hardocp, hothardware, AT, etc. And that's at 16x12 with AA/AF.

That's a whole different story. I'm talking about the x1800XT. It's pretty obvious that as far as performance goes, the rest of the ATI lineup kinda sucks.

Edit: And the point was that the x1800 series suffers less of a hit doing AA+AF even with the same memory speeds, which is true even on the XL. The fact that it's slower overall is for a different reason.

"Only in 3 games have we seen the gtx beat the x1800 - Riddick, Guild Wars, and Doom3. It looses in every other game benchmark so far. "

Do you count it as a win when the X1 series has a 1-5 fps performance lead?

Why not? I count it as a win for Nv when it's ahead by 1-5 fps, so Ati also gets the same treatment.
 
Originally posted by: compgeek89

Holy cow, we still have a disbelieving ATi fanboy.

I'm glad I got my GTX at $450... a month ago...

I'm getting 33% better performance in OpenGL games; the only thing I'm regretting is FEAR performance, but developers claim that's going to change anyway.

I'm glad you got your GTX only a month ago. It means in another month it will only be the second-best card available. It doesn't matter that you're 256MB of memory short, can't do HDR with AA, get texture shimmering instead of real AF, and that dynamic branching is all of a sudden not important when you can no longer claim superiority in that department. It only matters that it's on a green PCB. :roll:
 
Originally posted by: DLeRium
Originally posted by: Soviet
ATI has the better card here; if you disagree, then you're either a cheapskate or an Nvidia fan. The benches speak for themselves, as the X1800 XT wins most of them. Besides, who gives a toss about price? If you're gonna spend all that dough on a high-end card then price shouldn't be THAT big a factor. Also, it's only using 16 pipelines to do more than Nvidia's 24-pipeline card is doing; well done, ATI.

Are you a FANBOY or what? It's tied most of the way. Hexus gives ATI the edge, but if you look at some other benchies, they're almost at a tie with a few wins on each side.

You can bench frames per second, but maybe you should start looking at image quality, texture filtering, and all the stuff you cannot measure with frapssh!t benchies.
 
Originally posted by: Ronin
If it's the same fan as the X850 line, it's loud when it boots, and it's loud when you're gaming.


So it will be loud when using Vista once it's released, since Vista is very GPU dependent and all the 3D-ness of Vista is done exclusively on the GPU.
 
Originally posted by: ubergeekmeister
Originally posted by: Ronin
If it's the same fan as the X850 line, it's loud when it boots, and it's loud when you're gaming.


So it will be loud when using Vista once it's released, since Vista is very GPU dependent and all the 3D-ness of Vista is done exclusively on the GPU.

Vista is not going to put your GPU under a heavy load like a game would. And we've already established that the X1800XT does not use the same fan as the X850. Check the Techreport link I gave.
 
Originally posted by: Maluno
What is with the "D/T = V" equation under the countdown clock? Is that supposed to be a hint at the new name? Velocity?


It is a convoluted version of the average velocity equation any 8th grader should be familiar with. 🙂 It is usually written as V_av = Δd / Δt (average velocity = delta distance divided by delta time). Velocity is expressed in meters or feet per second, time is usually in seconds, and distance is usually in meters or feet.

So. Here we go. Time = amount by which ATI missed the original launch date, in seconds (estimate) = 9,676,800 seconds. Distance = distance the x1800xt has to travel to reach my house from Newegg = 1,600,000 meters. I guess that snail Newegg sent to deliver my x1800xt back in July should arrive in about 3 weeks. 🙂
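(A quick sketch of that arithmetic in Python; the distance and the delay come from the post above, while the "snail left in early July, about 13 weeks ago" part is my assumption:)

# Back-of-the-envelope check of the snail joke above.
delay_s = 9_676_800        # seconds ATI missed the original launch date (~112 days)
distance_m = 1_600_000     # Newegg to my house, in meters

speed_mps = distance_m / delay_s           # snail speed, ~0.165 m/s
trip_days = delay_s / 86_400               # full trip at that speed: ~112 days
elapsed_days = 13 * 7                      # assumed time since early July (~91 days)
weeks_left = (trip_days - elapsed_days) / 7
print(f"{speed_mps:.3f} m/s, about {weeks_left:.0f} weeks to go")  # ~3 weeks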
 
BFG10K and XTKnight - stay in video, as you suck in audio. 😀 A delta of 4 dB does not make it twice as loud. It does mean it is more than twice the power, but loudness is based on pitch too. The higher the pitch, the louder it seems. Two devices can have the same dB reading, but one can appear to be much louder if the pitch is higher. Measuring dB is almost useless as a benchmark without including the pitch range if you are using it to judge how 'loud' something is.
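(For anyone who wants the arithmetic, a small Python sketch using the usual rules of thumb - roughly 3 dB per doubling of acoustic power and roughly 10 dB per doubling of perceived loudness - applied to the 4 dB delta from the earlier posts:)

# dB rules of thumb applied to the 4 dB delta discussed above.
delta_db = 4
power_ratio = 10 ** (delta_db / 10)     # ~3 dB doubles the power -> ~2.5x here
loudness_ratio = 2 ** (delta_db / 10)   # ~10 dB sounds twice as loud -> ~1.3x here
print(f"power: {power_ratio:.2f}x, perceived loudness: {loudness_ratio:.2f}x")

(So 4 dB is roughly 2.5x the power but only about 1.3x as loud, pitch effects aside.)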

Morph - OpenGL is still a good thing. Although I prefer DX9+ in the apps I have, wishing OGL would go away is, well, silly. There is way too much stuff in the graphics world that uses OpenGL. There is a strong chance that the models for any DX game were created in an OpenGL-based modeling tool.

Random thought... When the heck is ATI going to release the new OpenGL rewrite? I thought they would have done it by now. Or was I sleeping when it happened?

 
Originally posted by: xtknight
Originally posted by: jim1976
I only want to say that I've heard from a very knowledgeable person that AA+HDR won't be a problem for the 7800GTX in the future. The vast majority of game developers will tend to combine HDR (FP16) + MSAA inside the game, so it won't be a problem whether the card supports it or not.

The G70 core cannot perform MSAA in the FP16 space. What do you mean inside the game? The CPU? 😕 Of course, they could emulate it in the pixel shaders and AA wouldn't be a problem. Maybe you mean something uber ghetto like rendering the HDR onto a texture and then putting that texture through transparency AA. Ugh, though...not really ideal in my opinion (I don't know if that's even possible).


The Valve Way !! 😀

Neither of the things you mentioned has to do with how the HDR is implemented. The method is multiple render targets, or in other words deferred shading.
It's not the best example since it's a first try, but it's certainly one way to implement it 😉 .. It's a good start for cards that cannot use 64bpp + MSAA (which means HDR cannot be combined with refraction). Read the article, it's nice.. 🙂
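(Purely as a conceptual sketch of the workaround being discussed - every function name below is a hypothetical placeholder, not Valve's or anyone else's real API; the idea is simply to keep the multisampled target in a format the hardware can handle and do the HDR work in the shader or in a later non-MSAA pass:)

# Conceptual pass ordering for HDR + MSAA on hardware without FP16 multisampling.
# create_target, draw_scene, resolve, tone_map and blit are hypothetical helpers.
def render_frame(gpu, scene):
    # 1. Multisampled colour target in an MSAA-friendly format (RGBA8); the
    #    shader encodes/tone-maps the HDR values as it writes them out.
    msaa_target = gpu.create_target(format="RGBA8", samples=4)
    gpu.draw_scene(scene, target=msaa_target, shader="hdr_encode")

    # 2. Resolve the multisampled target down to a single-sample texture.
    resolved = gpu.resolve(msaa_target)

    # 3. Remaining HDR post-processing (bloom, exposure) runs on the resolved,
    #    non-multisampled texture, so FP16 MSAA is never required.
    final = gpu.tone_map(resolved, exposure=scene.exposure)
    gpu.blit(final, target="backbuffer")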
 
Well, fuk Vista. I don't want a 5 MPG car, and I don't want an OS that needs a gig of RAM.
They need to make an OS that doesn't hog all of my RAM,

well, at least 80 MB at boot,
 