What's with the hype over the X1900s?


Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Originally posted by: Matthias99
Originally posted by: PhatoseAlpha
I don't like blurriness, and AA, to my eyes, just makes things blurry, not more accurate.

In real life, objects don't have uber-razor-sharp edges like the objects your video card is rendering. It's not "blurring" the image; it's taking extra samples to blend colors around polygon transitions, making it look smoother and more natural. Mathematically, this is also more accurate; it compensates somewhat for only taking one color sample per output pixel in the normal rendering pipeline without AA.

Tack on the performance penalties, and it's not even close to worth considering.

Unless you want your games to look good or something...

Still :confused:.

Take a look at these comparison screen shots from AT's X1900 review (make sure to zoom them to 100% so you're seeing the exact pixels):

no AA
6xAA

Look at the edges of the pages of the binder on the desk, and the transitions between the panels on the walls. It looks *horrible* without AA; the 'jaggies' stand out much much more without the colors being blended.


Exactly, and you really notice it in gameplay, since the jaggies tend to move along the object, creating a rippling or ripping effect. Methinks this dude doesn't even have a computer, saying stuff like "AA is blurry" and "not more accurate." Either that, or he's a pro gamer; all those guys play 10x7 minimum settings. Third choice: he's clueless about his environment. Fourth: he's blind.
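
For what it's worth, the sample-averaging Matthias99 is describing boils down to something like this toy Python sketch (the "scene" here is made up, just a black/white diagonal, and the helper names are only for illustration; the per-pixel averaging is the point):

# Toy supersampling example: anti-alias a hard diagonal edge by taking
# several color samples per output pixel and averaging them.
# The "scene" is hypothetical (a black/white split along a diagonal);
# only the sample-averaging step reflects what AA actually does.

def scene_color(x, y):
    """Exact scene color at a point: white below the diagonal, black above."""
    return 1.0 if y > x else 0.0

def render(width, height, samples_per_axis):
    """Render the scene, taking samples_per_axis^2 samples per pixel and averaging."""
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(samples_per_axis):
                for sx in range(samples_per_axis):
                    # Sample positions spread evenly inside the pixel footprint.
                    x = px + (sx + 0.5) / samples_per_axis
                    y = py + (sy + 0.5) / samples_per_axis
                    total += scene_color(x, y)
            row.append(total / samples_per_axis ** 2)
        image.append(row)
    return image

if __name__ == "__main__":
    # 1 sample per pixel = no AA: every pixel is pure black or white (jaggies).
    # 4 samples per pixel (2x2) = pixels straddling the edge get intermediate
    # values -- the "blending" around transitions. Smoother, not blurrier.
    for spp_axis in (1, 2):
        img = render(8, 8, spp_axis)
        print(f"{spp_axis * spp_axis} sample(s) per pixel:")
        for row in img:
            print(" ".join(f"{v:.2f}" for v in row))
        print()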
 

Tom

Lifer
Oct 9, 1999
13,293
1
76
__________________________________________________________

MATT2 said-

HDR+AA is the main example. HDR+AA runs great on the X1900XT/X on all FIVE games that support HDR (one of which, Far Cry, every PC gamer should have beat a long time ago).

__________________________________________________________________




If you don't count 5 or so games, there isn't any reason to upgrade from a Geforce4 or ATI 9700. ;)
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Cookie Monster
NV shaders are superior to the ATi shaders. (Hence the reason why no-AA benches favor NV most of the time.) It's only the AA efficiency that makes it look like ATi is more efficient in shader-heavy games.

You must be talking about the x1800, because in games that actually stress the card *cough* FEAR *cough* the x1900 is miles ahead of any competing Nv card, with or without AA.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: M0RPH
Originally posted by: Matt2

I highly doubt that, by the time HDR becomes mainstream in games, the X1900XT/X will be able to do HDR+AA at high resolutions or with all the eye candy turned up in the newest, most demanding games.

HDR has already become mainstream in games. All new games released this year will use HDR.

LOL- ALL of them MORPH? :roll:
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: munky
Originally posted by: Cookie Monster
NV shaders are superior to the ATi shaders. (Hence the reason why no-AA benches favor NV most of the time.) It's only the AA efficiency that makes it look like ATi is more efficient in shader-heavy games.

You must be talking about the x1800, because in games that actually stress the card *cough* FEAR *cough* the x1900 is miles ahead of any competing Nv card, with or without AA.

Well come on. The damn card (X1900) has 48 pixel shaders!!!
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
Originally posted by: Rollo
Originally posted by: M0RPH
Originally posted by: Matt2

I highly doubt that, by the time HDR becomes mainstream in games, the X1900XT/X will be able to do HDR+AA at high resolutions or with all the eye candy turned up in the newest, most demanding games.

HDR has already become mainstream in games. All new games released this year will use HDR.

LOL- ALL of them MORPH? :roll:
Well, I think he means "All of the big titles".
Much better than taking one simulator game that sucked ANYWAY, making a big thread about it, telling people not to buy ATi cards because "future games" will use that feature, and in the end being wrong about why ATi's performance was actually bad. HDR is actually going to be used.
HDR is going to be used in every major release.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: Matt2
...

Buying a X1900XT/X purely because of HDR+AA is just as silly as buying a 6800GT for SM 3.0 or a 7800GTX for HDR was.

Nevertheless, people are doing it. They feel good because they can say: "... it's good, it has HDR+AA, it's futureproof, ..." They need to justify a $500 investment.

 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: Janooo
Originally posted by: Matt2
...

Buying a X1900XT/X purely because of HDR+AA is just as silly as buying a 6800GT for SM 3.0 or a 7800GTX for HDR was.

Nevertheless, people are doing it. They feel good because they can say: "... it's good, it has HDR+AA, it's futureproof, ..." They need to justify a $500 investment.



Or they want the best available single card solution. I doubt very many people are upgrading because of the AA+HDR feature.
 

CP5670

Diamond Member
Jun 24, 2004
5,678
778
126
A single X1900XT looks like an appealing choice to me for its superior AF, but I don't really care about HDR+AA. I hardly ever use AA unless I have maxed out my resolution and still have power to spare, which I doubt will be the case in most HDR-supporting games.
 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
Originally posted by: Zenoth
I like the X1900, but I have no reasons to upgrade now.

I won't just do it because I can afford it, nor because it's new, or "a bit" better.

Actually, I'm not sure if ATi needed to release such a product, even after the X1800 fiasco. They should have just canceled the whole X1800 series and architecture in silence, and released the X1900 family between last December and this very month.

Personally I'll be waiting for R600 anyway.

I am curious enough to see how nVidia will do with G71. But even if it's technically better than R580, I'd have no reason to buy it, since I get a 30% price reduction on any ATi products I buy from my local store, where I have a contact working. So going with ATi is simply more affordable for me.

You work for microbytes? cipc? Frosty?
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: allies
Originally posted by: Rollo
Originally posted by: Ackmed
Once again, he only talks about speed. As I pointed out before, there is more to video cards than speed.

Not when it was X800 vs 6800U, eh Ackmed? Then it was all about the speed, never mind the missing features. Flip flop.


I think Ackmed was referring to PRICE POINT, as that's what his response was to. Not everyone can afford $750 cards, and if nvidia keeps releasing their top tier at that price point, they have to have something else to compete at the $500 level.

Actually, I was referring to the more advanced GPU of the X1K vs. anything else NV has to offer. I have the same stance now that I had then. He's just too ignorant to see that.
 

SparkyJJO

Lifer
May 16, 2002
13,357
7
81
The cycle always goes on and on and on - the thing is, if you wait a bit longer after the G71, you could get the ATI supercard; then wait a bit more, and there's the next nVidia supercard, etc etc etc.... There is a point where you can't look too far ahead with computers and just have to go with what's out there now, and accept that your "latest-and-greatest" will be in 2nd or 3rd (or 4th or 5th....) place not too long down the road.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Do any of these cards support HDCP? Or does any VGA card at all support it? Last I checked, they don't, so anyone planning to keep any current card for a few more years will be sorely disappointed. I doubt the G71 will be any different either...
 

Starbuck1975

Lifer
Jan 6, 2005
14,698
1,909
126
For those of us that don't upgrade our video cards on an annual basis, what is the best future-proof card to get TODAY... even in the face of new releases over the next two years, which card NOW will hold up in terms of GPU gaming goodness?
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: Starbuck1975
For those of us that don't upgrade our video cards on an annual basis, what is the best future-proof card to get TODAY... even in the face of new releases over the next two years, which card NOW will hold up in terms of GPU gaming goodness?

I think that 'era' is dead; I had my 9800 Pro for a good 2+ years and it did everything I asked it to. Nowadays it seems like every new gaming engine that comes out requires a new GPU to power it, either because of features or the horrible toll it takes on framerates.
 

Starbuck1975

Lifer
Jan 6, 2005
14,698
1,909
126
I think that 'era' is dead; I had my 9800 Pro for a good 2+ years and it did everything I asked it to. Nowadays it seems like every new gaming engine that comes out requires a new GPU to power it, either because of features or the horrible toll it takes on framerates.
I have had my 9800 Pro for nearly 3 years, but I am also running an AMD Athlon XP1900+ and only 512MB of RAM...definitely in need of an upgrade.

 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: Starbuck1975
For those of us that don't upgrade our video cards on an annual basis, what is the best future-proof card to get TODAY... even in the face of new releases over the next two years, which card NOW will hold up in terms of GPU gaming goodness?


I bolded the important part because this is the area that is very subjective. Does "gaming goodness" mean high details? I still have a ti4200 running in a kid's machine that plays everything at 1024x768 (SW BF2, LOTR BFME, WC3, HL2, KOTOR 1&2, etc). Granted, there's no AA/AF, but those weren't even useful when the card was brand new.

The only game it WON'T play AT ALL is Battlefield 2.

IMO, the gaming industry has advanced more in the last 6-12 months than in the last 3 years combined. So a 3+ year-old card probably is not a good example.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
If there was one card that I had to pick as the most likely to last the longest in terms of usefulness, it would be the x1900 series. Given the trend in modern games, you can't argue with 48 pixel shaders. This isn't some SM3 BS from last gen that may or may not be important in future games; this is solid hardware on the chip, and unless the game industry makes a 180° turn and heads back into the pre-2001 era, all those shaders will be the card's strongest and most beneficial feature - not HDR+AA or HQ AF.
 

OvErHeAtInG

Senior member
Jun 25, 2002
770
0
0
Originally posted by: Starbuck1975
I think that 'era' is dead; I had my 9800 Pro for a good 2+ years and it did everything I asked it to. Nowadays it seems like every new gaming engine that comes out requires a new GPU to power it, either because of features or the horrible toll it takes on framerates.
I have had my 9800 Pro for nearly 3 years, but I am also running an AMD Athlon XP1900+ and only 512MB of RAM...definitely in need of an upgrade.

Agree with Munky, the X1900 is definitely the most appealing - at the $500 price point. And if you have the PSU to handle it. On the other hand, if you can get a deal on a 7800GT, that may be far preferable since it's half the price.

I think buying a high-end card is like buying a new car. It's totally worth it if you have the DPI and just want to plug it in and forget about it. On the other hand, you do take a huge hit on the depreciation for consumer products like that. If you like to upgrade a lot, get a midrange card and upgrade every 12-18 months. If you don't, get the fastest thing out there and keep it for 3 years, like you did with the 9800 Pro.
 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
I have you all beat. I own ATI stock (bought at 10.50 so please don't laugh too hard, :laugh: it's currently at 17.98) and a 7800GTX. I am a fanboi of both companies. Yiiiiiiiiiiiiiiiiiiiiiipppppppppppppppppppppppppppppeeeeeeeeeeeeeeeeeeeeeeeeeee
eeeeeeeeeeeeeee
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: michaelpatrick33
I have you all beat. I own ATI stock (bought at 10.50 so please don't laugh too hard, :laugh: it's currently at 17.98) and a 7800GTX. I am a fanboi of both companies. Yiiiiiiiiiiiiiiiiiiiiiipppppppppppppppppppppppppppppeeeeeeeeeeeeeeeeeeeeeeeeeee
eeeeeeeeeeeeeee

LOL, I also bought it last summer when it was low. Now I'm trying to decide whether to sell or keep.
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: ayabe
Originally posted by: Starbuck1975
For those of us that don't upgrade our video cards on an annual basis, what is the best future-proof card to get TODAY... even in the face of new releases over the next two years, which card NOW will hold up in terms of GPU gaming goodness?

I think that 'era' is dead; I had my 9800 Pro for a good 2+ years and it did everything I asked it to. Nowadays it seems like every new gaming engine that comes out requires a new GPU to power it, either because of features or the horrible toll it takes on framerates.

Agreed. Anyone wanting to keep up decently in games these days should expect an upgrade every 12 months or so. The trick is to sell the old card before it loses too much value and not hang on to it longer than necessary.

 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
Originally posted by: munky
Originally posted by: michaelpatrick33
I have you all beat. I own ATI stock (bought at 10.50 so please don't laugh too hard, :laugh: it's currently at 17.98) and a 7800GTX. I am a fanboi of both companies. Yiiiiiiiiiiiiiiiiiiiiiipppppppppppppppppppppppppppppeeeeeeeeeeeeeeeeeeeeeeeeeee
eeeeeeeeeeeeeee

LOL, I also bought it last summer when it was low. Now I'm trying to decide whether to sell or keep.

I think I am going to keep it (it is only 2,500 shares), but I have a feeling that ATI has righted their ship. I am hoping for 20.00, but it is probably a dream. I am really kicking myself because I almost bought AMD stock back around 1999 when it was like 3.50 a share. I almost bought 16,000 shares. Now I just want to scream! :laugh:
 

Alaa

Senior member
Apr 26, 2005
839
8
81
I think G80 and R600 are the cards to keep for a while, because new engines will be out soon and neither G71 nor R580 will do well enough in them. I think the next generation might be the start of a new era, even if the cards aren't that good, but they are 8-9 months away..
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: PhatoseAlpha
If I want my games to look good, I'll pump up the resolution. That so hard to understand?
Modern AA, meaning multi-sample AA (MSAA), does exactly that, only it just "pumps up the resolution" of polygon edges to reduce aliasing (jagged edges).

Super-sample AA, which is what very old cards did and modern cards do in a select few modes (NV in 8xS; both NV and ATI in their SLI/SuperAA and Transparency/Adaptive modes), might come off as "blurred" in screenshots, but in reality it reduces aliasing (jaggies) on the entire screen, both polygons and textures. It costs a lot more performance because everything is being rendered at a higher resolution, not just a relatively few geometry edges.

Higher res all by itself improves the whole picture, both edges and textures (because it pushes lower-res MIP-maps further back). But 2xMSAA at 1600x1200 should offer higher-resolution edges than no AA at 2048x1536, because you get twice the edge-equivalent resolution (EER) of 16x12, vs. the plain edge resolution of 20x15.
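
To put rough numbers on that comparison, here's a back-of-the-envelope Python sketch (the "samples multiply edge resolution per axis" rule assumes the sparse/rotated sample grids these cards use, which is the convention behind the EER figure above):

# Back-of-the-envelope edge-equivalent resolution (EER) comparison.
# Assumption: with a sparse/rotated grid, each of the N MSAA samples sits at a
# distinct horizontal and vertical offset, so polygon edges are resolved roughly
# N times finer per axis. Textures and shading stay at the base resolution.

def eer(width, height, aa_samples):
    """Effective edge resolution for sparse-grid MSAA."""
    return width * aa_samples, height * aa_samples

edges_msaa = eer(1600, 1200, 2)   # 2xMSAA at 1600x1200 -> (3200, 2400) for edges
edges_plain = eer(2048, 1536, 1)  # no AA at 2048x1536   -> (2048, 1536) for edges

print("2xMSAA @ 1600x1200 edge resolution:", edges_msaa)
print("no AA  @ 2048x1536 edge resolution:", edges_plain)
# Edges come out finer in the 1600x1200 + 2xMSAA case, even though the base
# resolution (and therefore texture detail) is lower than at 2048x1536.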

Whether you prefer it is obviously personal, but technically more MSAA at one resolution down should offer smoother (*not* blurrier) edges, which is more true to life (you don't see the atoms in a door frame, but the pixels on your screen are a bit bigger).

It's quite simple: MSAA, which is what you get with most AA modes on modern cards, doesn't blur the image, it only increases the resolution of polygon edges. Unfortunately, not every edge in a game is a polygon (leaves, trees, fences, etc.), which is why newer cards offer SSAA on those parts of the image, and use MSAA on the rest. In that case (Transparency/Adaptive AA, meaning SSAA), I suppose you could consider the foliage and fences "blurred."
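
And on the cost point a couple of paragraphs up, here's a rough way to see why full-screen SSAA hurts so much more than MSAA (toy accounting only; it assumes MSAA shades once per pixel while SSAA shades once per sample, which glosses over plenty of real-hardware detail):

# Rough shading-cost accounting for MSAA vs SSAA at the same sample count.
# Assumption: MSAA multiplies only coverage/depth samples, so the pixel shader
# still runs once per pixel; SSAA runs everything (shading included) once per
# sample. Real GPUs complicate this, but the ratio is the point.

def shader_invocations(width, height, samples, supersampled):
    pixels = width * height
    return pixels * samples if supersampled else pixels

w, h, n = 1600, 1200, 4
print("4xMSAA shader work:", shader_invocations(w, h, n, supersampled=False))  # 1,920,000
print("4xSSAA shader work:", shader_invocations(w, h, n, supersampled=True))   # 7,680,000
# Roughly 4x the shading work for SSAA -- which is why cards reserve it for the
# few things MSAA can't touch (alpha-tested fences, foliage, etc.).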