What's with the hype of the X1900s?


Elfear

Diamond Member
May 30, 2004
7,167
824
126
Originally posted by: beggerking

From what I've read, ATI still lags in OpenGL. You can defend ATI all you want, but I'm not going into that argument again because it's just a waste of time; I can't change your preference.

Like I have said in another thread, IMHO ATI will come out with a 24 or 32 texture processor card to match the G71's supposed 32 pipelines in games without heavy shader use.

I don't think you're getting it, beggerking. Almost every post of yours I've seen has been about ATI vs. Nvidia. You assume that a person is pro-ATI (or an ATI fanboy) if they make a comment about ATI that puts it in a good light or points out some shortcomings of Nvidia. Most of the guys posting really don't care which card company they buy from; most either want the best bang for the buck or the highest performance they can afford. It's as simple as that. There is no hidden agenda, no anti-Nvidia conspiracy theory, no blind devotion to a particular company. Sure there are fanboys out there, but don't assume that anyone who makes an educated statement about either Nvidia or ATI falls into that category. I think the majority of the guys that post here are well-grounded people (as well-grounded as computer geeks can be, I guess).
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Originally posted by: Matthias99
Originally posted by: beggerking
From what I've read, ATI still lags in OpenGL. You can defend ATI all you want, but I'm not going into that argument again because it's just a waste of time; I can't change your preference.

A little, but it's much, much closer now, to the point where it's hardly worth shifting your buying decision based on it unless ALL you play is a particular OpenGL game or games.

In AT's X1900 article, with AA/AF enabled in Quake4, the X1800XT is beating the 7800GTX 256MB, and the X1900XT is only slightly slower than the much more expensive 7800GTX "Ultra" 512MB. link. *shrug*

Like I have said in another thread, IMHO ATI will come out with a 24 or 32 texture processor card to match the G71's supposed 32 pipelines in games without heavy shader use.

You're really, really stuck on this idea now.

I don't think they really *need* to (unless G71 is relatively cheap, highly clocked, and has 32 full pipelines; even then, R580 would still be competitive in shader-heavy games), and doing so would either increase the transistor count (driving up costs and power/heat) or reduce the number of pixel/vertex shaders. From the looks of upcoming games like ES4: Oblivion and UT2K7, 'heavy shader use' is going to be the norm rather than the exception going forward, although the way NVIDIA is doing it is clearly a safer bet for legacy/current software.

Yeah, the more I read about the R580 design, the more it seems to me ATI took a pretty big gamble on people seeing things that way. The G71 is probably going to look really good in all the current titles out there except FEAR. Either way, it's going to be interesting as new titles come out to see if ATI's gamble pays off.

 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: Rollo
Originally posted by: Ackmed
Once again, he only talks about speed. As I pointed out before, there is more to video cards than speed.

Not when it was X800 vs 6800U, eh Ackmed? Then it was all about the speed, never mind the missing features. Flip flop.


I think Ackmed was referring to PRICE POINT, as that's what his response was to. Not everyone can afford $750 cards, and if Nvidia keeps releasing their top tier at that price point, they have to have something else to compete at the $500 level.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: Alaa
Nvidia will release a card to run future games when future games are there! I think a unified arch. will be the future route anyway.


Then what was all the hype about SM3.0 on the NV40-series cards...?
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: Elfear
Originally posted by: beggerking

From what I've read, ATI still lags in OpenGL. You can defend ATI all you want, but I'm not going into that argument again because it's just a waste of time; I can't change your preference.

Like I have said in another thread, IMHO ATI will come out with a 24 or 32 texture processor card to match the G71's supposed 32 pipelines in games without heavy shader use.

I don't think you're getting it, beggerking. Almost every post of yours I've seen has been about ATI vs. Nvidia. You assume that a person is pro-ATI (or an ATI fanboy) if they make a comment about ATI that puts it in a good light or points out some shortcomings of Nvidia. Most of the guys posting really don't care which card company they buy from; most either want the best bang for the buck or the highest performance they can afford. It's as simple as that. There is no hidden agenda, no anti-Nvidia conspiracy theory, no blind devotion to a particular company. Sure there are fanboys out there, but don't assume that anyone who makes an educated statement about either Nvidia or ATI falls into that category. I think the majority of the guys that post here are well-grounded people (as well-grounded as computer geeks can be, I guess).

Really? Are you sure? I've seen more fanboyism here than ever. Are you impartial? Would you say Morph is impartial?

So show me those posts.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Alaa
Nvidia will release a card to run future games when future games are there! I think a unified arch. will be the future route anyway.

That's exactly why the FX series failed so miserably: it was designed for then-current games like Quake3 (which it played well), and when more advanced games came out with shaders it performed abysmally. IMO, any high-end card should be designed with a reasonable outlook toward the requirements of future games.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: beggerking
Originally posted by: Elfear
Originally posted by: beggerking

From what I've read, ATI still lags in OpenGL. You can defend ATI all you want, but I'm not going into that argument again because it's just a waste of time; I can't change your preference.

Like I have said in another thread, IMHO ATI will come out with a 24 or 32 texture processor card to match the G71's supposed 32 pipelines in games without heavy shader use.

I don't think you're getting it, beggerking. Almost every post of yours I've seen has been about ATI vs. Nvidia. You assume that a person is pro-ATI (or an ATI fanboy) if they make a comment about ATI that puts it in a good light or points out some shortcomings of Nvidia. Most of the guys posting really don't care which card company they buy from; most either want the best bang for the buck or the highest performance they can afford. It's as simple as that. There is no hidden agenda, no anti-Nvidia conspiracy theory, no blind devotion to a particular company. Sure there are fanboys out there, but don't assume that anyone who makes an educated statement about either Nvidia or ATI falls into that category. I think the majority of the guys that post here are well-grounded people (as well-grounded as computer geeks can be, I guess).

Really? Are you sure? I've seen more fanboyism here than ever. Are you impartial? Would you say Morph is impartial?

So show me those posts.


Which part of this: "Sure there are fanboys out there, but don't assume that anyone who makes an educated statement about either Nvidia or ATI falls into that category. I think the majority of the guys that post here are well-grounded people." did you fail to comprehend???
 

PhatoseAlpha

Platinum Member
Apr 10, 2005
2,131
21
81
Anybody got benchmarks with AF but without AA? I hate AA and never use it, but AF performance does matter to me.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: PhatoseAlpha
Anybody got benchmarks with AF but without AA? I hate AA and never use it, but AF performance does matter to me.

I think maybe digit-life had some...

But how could you "hate" AA? It's making the rendering more accurate, just like AF. :confused:

Next you're going to tell me you like Digital Vibrance or something. :p
 

PhatoseAlpha

Platinum Member
Apr 10, 2005
2,131
21
81
I don't like blurriness, and AA, to my eyes, just makes things blurry, not more accurate. Tack on the performance penalties, and it's not even close to worth considering.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: PhatoseAlpha
I don't like blurriness, and AA, to my eyes, just makes things blurry, not more accurate.

In real life, objects don't have uber-razor-sharp edges like the objects your video card is rendering. It's not "blurring" the image; it's taking extra samples to blend colors around polygon transitions, making it look smoother and more natural. Mathematically, this is also more accurate; it compensates somewhat for only taking one color sample per output pixel in the normal rendering pipeline without AA.
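Just to make the "extra samples" part concrete, here's a minimal Python sketch of ordered-grid supersampling on a toy scene. The scene and render names, the diagonal edge, and the even 2x2 sub-sample grid are all made up for illustration; real cards use their own (often rotated or sparse) sample patterns.

```python
# Toy illustration of supersampling; not any real driver/hardware AA pattern.

def scene(x: float, y: float) -> float:
    """The 'true' colour at a point: a white object above a diagonal edge."""
    return 1.0 if y > x else 0.0

def render(width: int, height: int, samples_per_axis: int) -> list[list[float]]:
    """samples_per_axis=1 is ordinary one-sample-per-pixel rendering;
    2 gives 4 sub-samples per pixel, 3 gives 9, and so on."""
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(samples_per_axis):
                for sx in range(samples_per_axis):
                    # Evenly spaced sub-sample positions inside the pixel.
                    total += scene(px + (sx + 0.5) / samples_per_axis,
                                   py + (sy + 0.5) / samples_per_axis)
            # Blend (average) the sub-samples into the final pixel colour.
            row.append(total / samples_per_axis ** 2)
        image.append(row)
    return image

if __name__ == "__main__":
    no_aa = render(8, 8, 1)  # edge pixels jump straight from 0.0 to 1.0: jaggies
    aa_4x = render(8, 8, 2)  # edge pixels get intermediate values like 0.25/0.75
    print(no_aa[3])
    print(aa_4x[3])
```

The averaged values only show up on pixels the edge actually crosses; everything away from the edges is untouched, which is why it reads as smoothing rather than blurring.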

Tack on the performance penalties, and it's not even close to worth considering.

Unless you want your games to look good or something...

Still :confused:.

Take a look at these comparison screen shots from AT's X1900 review (make sure to zoom them to 100% so you're seeing the exact pixels):

no AA
6xAA

Look at the edges of the pages of the binder on the desk, and the transitions between the panels on the walls. It looks *horrible* without AA; the 'jaggies' stand out much, much more without the colors being blended.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: PhatoseAlpha
I don't like blurriness, and AA, to my eyes, just makes things blurry, not more accurate. Tack on the performance penalties, and it's not even close to worth considering.


Hmm, true, the picture does get a bit more blurry, but you know what? I thought the same when I saw pictures of what games looked like on the Voodoo 1 back in the day. I was like, "wow, that's blurry, what's so special about 3D acceleration?" Once you get used to AA, you won't ever want to go back. I can see where you feel it compromises performance with a 6800 GT, but get yourself a nice X1900XT and I promise you'll be singing a different tune. Since you like AF so much, an nVidia card is the last thing you want right now; their AF is terrible IMO (I just ditched a 7800 GTX recently).
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
2xMSAA does look crappy, gray, and blurry, but I think 4xSSAA looks great. And that really high mode in nHancer called 16x RGMS, wow.
 

PhatoseAlpha

Platinum Member
Apr 10, 2005
2,131
21
81
I don't consider it good-looking. Visually speaking, what's the real difference between extra samples to 'smooth' and blurring? Both achieve the same goal, reduced sharpness, which I don't care for.

If I want my games to look good, I'll pump up the resolution. Is that so hard to understand?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: PhatoseAlpha
I don't consider it good-looking. Visually speaking, what's the real difference between extra samples to 'smooth' and blurring? Both achieve the same goal, reduced sharpness, which I don't care for.

AA does not reduce sharpness. It is ADDING information, while blurring takes it away. Look at the comparison screenshots I added into my post above. It reduces the aliasing you get near the edges of polygons due to only taking one sample per pixel. You can't tell me you actually prefer the more heavily aliased edges? :confused:
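Purely as a made-up numerical example of that distinction (a vertical edge, hypothetical values, no real API): blurring can only mix pixels that were each computed from a single sample, while AA samples the scene again inside the pixel, which is why it recovers the edge coverage instead of smearing the neighbours.

```python
def scene(x: float, y: float) -> float:
    """'True' scene: bright to the left of a vertical edge at x = 1.5."""
    return 1.0 if x < 1.5 else 0.0

# One sample per pixel (no AA) for a 3x3 patch. The centre pixel spans
# x in [1, 2], so half of it is really bright, but its single sample at
# x = 1.5 lands on the dark side and the pixel comes out fully dark.
one_sample = [[scene(px + 0.5, py + 0.5) for px in range(3)] for py in range(3)]

# "Blurring": average the finished centre pixel with its 8 neighbours.
# The result depends on whatever the neighbours happen to be.
blurred = sum(one_sample[y][x] for y in range(3) for x in range(3)) / 9

# AA: take 4 extra scene samples *inside* the centre pixel and average them.
# This estimates the pixel's true coverage (0.5) directly: new information.
aa = sum(scene(1 + dx, 1 + dy) for dx in (0.25, 0.75) for dy in (0.25, 0.75)) / 4

print(one_sample[1][1], blurred, aa)  # 0.0  0.333...  0.5
```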

If I want my games to look good, I'll pump up the resolution. Is that so hard to understand?

...but then you could still add AA to make it look better. :confused:
 

PhatoseAlpha

Platinum Member
Apr 10, 2005
2,131
21
81
The differences in that screenshot are noticeable in all of 2 places. It also causes a loss of differentiation: notice the shadow on the tab of the notebook in the no-AA shot, which isn't visible in the 6xAA one.

Why on earth would I use that when I can bump up the resolution for the same performance hit? If I've already bumped it up, I'll bump it up another notch. I have a 21" CRT for a reason. And in the very, very few games I've encountered that run smoothly with GPU power to spare at 2048x1536, I've always noticed bugs when using AA. That's primarily due to the age of the games, no doubt, but it still makes AA pretty darned useless.
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
I'm sorry if this has already been discussed; I didn't take the time to read all 100+ posts, but I did read some of them. One thing I want to bring up: someone stated that the only real advantage of the X1900 over the 7900 is in shader-intensive games, e.g. F.E.A.R. Correct me if I'm wrong, but aren't shader-intensive games the future of gaming?

Also, I see the X1900 as something like what AMD is for CPUs: it does a lot of shader work per clock. So if these things are true, wouldn't the ATI card be more future-proof?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: PhatoseAlpha
The differences in that screenshot are noticeable in all of 2 places.

Uh... what? It's visible on every edge in the scene. The edges of the monitor mounted on the wall, the coffee cups sitting on the console, the binder, the wall panel on the left... about the only thing it doesn't help is the main part of the console, since it's just a flat textured surface. It's just most noticeable on the binder in this shot because that's a high-contrast area (white paper versus black/grey background).

It also causes a loss of differentiation: notice the shadow on the tab of the notebook in the no-AA shot, which isn't visible in the 6xAA one.

Huh? I admit it's not the greatest picture, since it's a JPEG rather than a PNG or TGA (so the compression artifacts make it hard to compare very small details), but the shadows look the same.

Why on earth would I use that when I can bump up the resolution for the same performance hit?

1) If you can't bump up your resolution any more (possibly due to being on a fixed-resolution monitor like an LCD, or because pushing a CRT monitor to a resolution/refresh that is too high actually DOES just make it blurry and so is undesirable).

2) If you think that turning on AA improves IQ more than bumping up a resolution step (I agree that for a BIG resolution step or a small amount of AA, the resolution bump is better). I'd probably take 1600x1200 without AA over 1024x768 with 4xAA, but I'd definitely use 1280x1024 with 4xAA over 1600x1200 without AA, or 1600x1200 with AA over 2048x1536 without AA. The jaggies are still way too visible at 1600x1200 on a 20-22" monitor, and 2048x1536 isn't a whole lot better.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
NV shaders are superior to ATi's shaders (hence why no-AA benches favor NV most of the time). It's only the AA efficiency that makes it look like ATi is more efficient in shader-heavy games.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: PC Surgeon
I'm sorry if this has already been discussed; I didn't take the time to read all 100+ posts, but I did read some of them. One thing I want to bring up: someone stated that the only real advantage of the X1900 over the 7900 is in shader-intensive games, e.g. F.E.A.R. Correct me if I'm wrong, but aren't shader-intensive games the future of gaming?

Also, I see the X1900 as something like what AMD is for CPUs: it does a lot of shader work per clock. So if these things are true, wouldn't the ATI card be more future-proof?

There is no such thing.

I've been trying to post this where I can in these X1900XT/X threads.

HDR+AA is the main example. HDR+AA runs great on the X1900XT/X in all FIVE games that support HDR (one of which, Far Cry, every PC gamer should have beaten a long time ago). Even then, the benches aren't all that impressive.

I highly doubt that by the time HDR becomes mainstream in games, the X1900XT/X will be able to do HDR+AA at high resolutions or with all the eye candy turned up in the newest, most demanding games.

At this point HDR+AA is a tech showcase by ATI just like HDR was for Nvidia when G70 was released. By the time HDR+AA is really needed, you'll need R600/G80 to do so at a decent resolution.

Buying an X1900XT/X purely because of HDR+AA is just as silly as buying a 6800GT for SM 3.0 or a 7800GTX for HDR was.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Matt2

I highly doubt that by the time HDR becomes mainstream in games, the X1900XT/X will be able to do HDR+AA at high resolutions or with all the eye candy turned up in the newest, most demanding games.

HDR has already become mainstream in games. All new games released this year will use HDR.