HDR+AA not working on X1800XTs


nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: RobertR1
Originally posted by: Sc4freak
I'm pretty sure the hardware is physically capable of it, but no games (as of yet) support it.

You don't understand Rollo's thinking, freak. It's only OK for nvidia to release features that are not immediately available for play. When nvidia releases future tech it's amazing; when ATI does it, it's a waste and must be scoffed at.
Not true, RobertR1 - I wish that HDR+AA worked for every game that had HDR, because that would benefit owners of X1800 cards (presuming it could do it at a playable framerate, anyway).
To me, it won't matter, because AFAIK there will never be EXR HDR+AA on my current cards.

Rollo,
Are you busy doing due diligence by asking the different gaming studios about the status of their HDR+AA patches? After all, it is fair and balanced news that we expect of you......

Unfortunately, game developers don't really care what I have to say since I'm just one gamer who doesn't even have the hardware to use this feature.

Will be interesting to see what comes of this. There's a long history of developers not adopting/using ATI only features, even the staff at Rage3d is the first to admit that. I would guess ATI has to pay them in some cases to do the work.

 

DaveBaumann

Member
Mar 24, 2000
164
0
0
As I said, this is just the natural path of things - while NVIDIA was the first to support FP16 blending, ATI has followed, and while ATI was the first to support FP16 blending and AA, NVIDIA will follow. Just as HDR via FP16 blending didn't turn up the second it was available, because it requires developer support, neither will HDR and AA (especially for titles that are only just appearing that use FP16 blending, which couldn't use ATI's hardware in development).
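The "requires developer support" point can be modeled as a simple capability negotiation: an engine only gets HDR+AA when the hardware can both blend into and multisample an FP16 target *and* the developer has shipped an HDR render path. A minimal sketch of that decision - all names and flags here are invented for illustration; real engines discover this through API caps queries (e.g. Direct3D 9 format/multisample checks), not booleans:

```python
# Hypothetical model of the HDR+AA capability negotiation described above.
# Real engines query the graphics API for FP16 render-target and MSAA
# support instead of taking flags like these.

def select_render_path(hw_fp16_blend, hw_fp16_msaa, game_has_hdr_path):
    """Pick a render path from hardware caps and engine support.

    hw_fp16_blend:     hardware can blend into FP16 (EXR-style HDR) targets
    hw_fp16_msaa:      hardware can multisample those FP16 targets
    game_has_hdr_path: the developer shipped an HDR-aware render path
    """
    if not game_has_hdr_path:
        return "LDR+AA"       # no dev support: the feature stays dark
    if hw_fp16_blend and hw_fp16_msaa:
        return "HDR+AA"       # R520-class hardware, HDR-aware game
    if hw_fp16_blend:
        return "HDR, no AA"   # NV40-class hardware
    return "LDR+AA"

# An X1800 today: the hardware is capable, but shipping games lack the path.
print(select_render_path(True, True, False))   # LDR+AA
# The same card once a patch adds the path:
print(select_render_path(True, True, True))    # HDR+AA
```

The middle case - hardware capable, game unpatched - is exactly where the X1800 sits until developers ship support.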
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: DaveBaumann
As I said, this is just the natural path of things - while NVIDIA was the first to support FP16 blending, ATI has followed, and while ATI was the first to support FP16 blending and AA, NVIDIA will follow. Just as HDR via FP16 blending didn't turn up the second it was available, because it requires developer support, neither will HDR and AA (especially for titles that are only just appearing that use FP16 blending, which couldn't use ATI's hardware in development).

Hey Wavey- good to see you here!

I'm only pointing this out because:

A. Many mistakenly think this works and are making buying choices based on it

B. I'm hoping some who posted that HDR was irrelevant on the nV40 due to slow performance and lack of support in titles will realize their hypocrisy and perhaps change their ways. Can always hope.

I'm all for the technology being advanced and giving developers tools to do new things, as I'm sure you know. ;) Anyway, good to see you here, hope you post here more often!
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Rollo

B. I'm hoping some who posted that HDR was irrelevant on the nV40 due to slow performance and lack of support in titles will realize their hypocrisy and perhaps change their ways. Can always hope.

Just like the hypocrisy in you touting Nvidia's SM3 before it was used by developers and now writing ATI's HDR+AA off as irrelevant because it's not used by developers yet. Can you realize your own hypocrisy and change your ways? Doubtful.
 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: RobertR1
Originally posted by: Rollo
BTW- what was your name before you were banned? ;)
Put your "Jump To Conclusions" mat back in the closet. I haven't been banned from this board but I'll work on it, just for you.
Rollo resorts to accusations when he runs out of arguments/is cornered. ;) :laugh:


Originally posted by: M0RPH
Originally posted by: Rollo

B. I'm hoping some who posted that HDR was irrelevant on the nV40 due to slow performance and lack of support in titles will realize their hypocrisy and perhaps change their ways. Can always hope.
Just like the hypocrisy in you touting Nvidia's SM3 before it was used by developers and now writing ATI's HDR+AA off as irrelevant because it's not used by developers yet. Can you realize your own hypocrisy and change your ways? Doubtful.
owned. :)
 

DaveBaumann

Member
Mar 24, 2000
164
0
0
Originally posted by: Rollo
A. Many mistakenly think this works and are making buying choices based on it
The feature capability does work - I've used it. It just requires dev support.

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I get what everyone is saying... It's Crytek/EA/Ubisoft that hasn't released a patch yet. However, if the patch isn't available yet, is this sort of advertising ethical?

http://www.ati.com/products/radeonx1k/imagequality.html

A scene from the computer game "Far Cry". The image on the left was created using conventional rendering methods. The screenshot on the right shows the same scene rendered with HDR lighting. Conventional graphics cards can not combine HDR with FSAA. The models of the Radeon X1000 series offer support for all FSAA modes even when HDR is used.

ATI specifically mentions FarCry... Nowhere do you see anything like, "*Coming soon", or "*May require software support". The average enduser doesn't know any better... Most of us didn't know any better, and it doesn't appear that [ H ] did either. So, I would think that at least everyone can agree that these are shady tactics on ATI's part, regardless of your hardware preference or whether you want Rollo, Ackmed, or your dog for mod. I'm not saying that NV is any better with their hardware encoder/decoder antics either, but that isn't an excuse for ATI to tout features that are not yet available to the enduser. Maybe it bugs me more in this case because I watch movies in my living room, not on my PC, but I do play games on my PC.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: nitromullet
I get what everyone is saying... It's Crytek/EA/Ubisoft that hasn't released a patch yet, however, if the patch isn't available yet is this sort of advertising ethical?

http://www.ati.com/products/radeonx1k/imagequality.html

A scene from the computer game "Far Cry". The image on the left was created using conventional rendering methods. The screenshot on the right shows the same scene rendered with HDR lighting. Conventional graphics cards can not combine HDR with FSAA. The models of the Radeon X1000 series offer support for all FSAA modes even when HDR is used.

ATI specifically mentions FarCry... Nowhere do you see anything like, "*Coming soon", or "*May require software support". The average enduser doesn't know any better... Most of us didn't know any better, and it doesn't appear that [ H ] did either. So, I would think that at least everyone can agree that these are shady tactics on ATI's part regardless of your hardware preference or if you want Rollo, Ackmed, or your dog for mod. I'm not saying that NV is any better with their hardware encoder/decoder antics either, but that isn't an excuse for ATI to tout features that are not yet available to the enduser. Maybe it bugs me more in this case because I watch movies in my living room, not on my PC, but I do play games on my PC.

But the features are available in the hardware, only the game devs need to use the features. Sort of like Nv marketing the whole PVP, SM3, HDR feature set - those would be useless without dev support also. Whether or not the devs are gonna use the features is a different story - I remember TruForm on the Radeon 8500 being a marketing feature, and nothing much ever came of it. But in this case I think it will be used soon, since HDR is gaining popularity in modern games, implementing a proprietary HDR+AA method like Valve did is extra work, and playing without AA is a pretty lame option.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nitromullet
I get what everyone is saying... It's Crytek/EA/Ubisoft that hasn't released a patch yet, however, if the patch isn't available yet is this sort of advertising ethical?

http://www.ati.com/products/radeonx1k/imagequality.html

A scene from the computer game "Far Cry". The image on the left was created using conventional rendering methods. The screenshot on the right shows the same scene rendered with HDR lighting. Conventional graphics cards can not combine HDR with FSAA. The models of the Radeon X1000 series offer support for all FSAA modes even when HDR is used.

ATI specifically mentions FarCry... Nowhere do you see anything like, "*Coming soon", or "*May require software support". The average enduser doesn't know any better... Most of us didn't know any better, and it doesn't appear that [ H ] did either. So, I would think that at least everyone can agree that these are shady tactics on ATI's part regardless of your hardware preference or if you want Rollo, Ackmed, or your dog for mod. I'm not saying that NV is any better with their hardware encoder/decoder antics either, but that isn't an excuse for ATI to tout features that are not yet available to the enduser. Maybe it bugs me more in this case because I watch movies in my living room, not on my PC, but I do play games on my PC.

QFT

And I'd own any dog as a mod- dogs can be bought with $.10 worth of bacon and are NOT impartial.

:laugh:
 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: Rollo
And I'd own any dog as a mod- dogs can be bought with $.10 worth of bacon and are NOT impartial.

And you are LOL :laugh:

Turn that bacon into NVIDIA stuff like Reference cards and...

No offense but you are the biggest fanboy I have ever seen.

:laugh:

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: munky
But the features are available in the hardware, only the game devs need to use the features.
That is the key here, isn't it? History tells us that for the most part developers won't code for an ATI-only feature, for whatever reason. I think Dave would even agree with that, and he has "chosen one" status with ATI.

Sort of like Nv marketing the whole PVP, SM3, HDR feature set - those would be useless without dev support also.
That is true, but whatever the reason, nV's developer relations have always been better than ATI's, and nV-only features DO get written into games. (e.g. Far Cry, SC:CT, Riddick, etc.)

Whether or not the devs are gonna use the features is a different story - I remember TruForm on the Radeon 8500 being a marketing feature, and nothing much ever came of it.
This is one good example of an ATI only feature that never went anywhere.

But in this case I think it will be used soon, since HDR is gaining popularity in modern games,
Why do you think that is, Munky? Could it be because nV gave developers an EXR HDR-capable card to work with a year and a half ago, and there's a huge installed user base of EXR HDR-capable nVidia cards out there?
When do you think EXR HDR would be catching on if developers were limited to using ATI hardware? (Hint: they didn't have any till less than two months ago)

implementing a proprietary HDR+AA method like Valve did is extra work,
Which is probably why we've only seen it in a couple levels Valve created.

and playing without AA is a pretty lame option.
In your opinion. What if HDR+AA is unplayable even at 10X7 on an X1800XT, like HardOCP found, while 16X12 HDR with 0XAA/8XAF is playable? Going to give up on the "gaining popularity" HDR altogether, or run it at 800X600 to say you have both? ;)

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nts
Originally posted by: Rollo
And I'd own any dog as a mod- dogs can be bought with $.10 worth of bacon and are NOT impartial.

And you are LOL :laugh:

Turn that bacon into NVIDIA stuff like Reference cards and...

No offense but you are the biggest fanboy I have ever seen.

:laugh:


I don't make any secret of the fact that I have greatly preferred nVidia's hardware for the last year and a half, and that I think the only ATI cards worth consideration are the X1800s - and they've been overpriced.

What I don't understand is why supposedly sentient people such as yourself post every day of the week that I'm "biased", or think there is some reason I should care that I am.

I think everyone here pretty much knows:
1. I never lie or post erroneous information about computer hardware
2. ATI has annoyed me with their 3-year coast on the R300 feature set, numerous paper launches, and strange multi-card hardware they acknowledge they don't care much about
3. That I applaud nVidia for giving devs the nV40 feature set to make the games you'll play on ATIs catch-up card
4. That after having and using 4 SLI sets I think it's the best thing since sliced bread

Everyone knows the above about me, yet for reasons unknown, you and a couple other "valuable contributors" post multiple times a day "Rollo is biased!"

No sh*t. I prefer Browning guns, Shimano reels, Lund boats, TOOL music, Jack Daniels, and Chevy trucks too. Biased!!

If I was the other members of this board, I would post in response to you:
STFU already. We've figured out Rollo likes nVidia cards. He has a right to his preferences, we don't care if you wish he preferred ATI like you.
 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: nts
Originally posted by: Rollo
And I'd own any dog as a mod- dogs can be bought with $.10 worth of bacon and are NOT impartial.

And you are LOL :laugh:

Turn that bacon into NVIDIA stuff like Reference cards and...

No offense but you are the biggest fanboy I have ever seen.

:laugh:
QFT !



Originally posted by: Rollo
Originally posted by: munky
But the features are available in the hardware, only the game devs need to use the features.
That is the key here isn't it? History tells us that for the most part developers won't code for an ATI only feature, for whatever reason.
So in your own little way, you are implying that future Nvidia products won't do HDR+AA. Oh god ...

Originally posted by: Rollo
Originally posted by: nts
Originally posted by: Rollo
And I'd own any dog as a mod- dogs can be bought with $.10 worth of bacon and are NOT impartial.

And you are LOL :laugh:

Turn that bacon into NVIDIA stuff like Reference cards and...

No offense but you are the biggest fanboy I have ever seen.

:laugh:

Originally posted by: Rollo
I think everyone here pretty much knows:

I lie whenever I can, and post erroneous information about competitive hardware (like this thread title suggests).

That I applaud nVidia for giving devs the nV40 feature set to make the games you'll play on ATI's catch-up card, but at the same time I'm a hypocrite and don't accept the same when the tables are reversed.

That after having and using 4 SLI sets I think it's the best thing since sliced bread, but I also accept that it is not the thing for the masses.
Fixed it for you. :laugh:
 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: Rollo
That is the key here, isn't it? History tells us that for the most part developers won't code for an ATI-only feature, for whatever reason. I think Dave would even agree with that, and he has "chosen one" status with ATI.
This isn't exactly ATi only (in the spec) and developers need to change very little to have AA applied.
That is true, but whatever the reason, nVs developer relations have always been better than ATIs, and nV only features DO get written into games. (e.g. Far Cry, SC:CT, Riddick, etc.)
Always better, I disagree with that.

btw thank the FX generation for the NVIDIA developer programs.

How much financial marketing help do companies get by putting NVIDIA logos at startup?

This is one good example of an ATI only feature that never went anywhere.

It was implemented in a few games, Morrowind comes to mind. In the 9700 and beyond the feature is emulated (software). IMO it isn't needed in hardware until Geometry Shaders, was interesting to play with though.

Why do you think that is Munky? Could it be because nV gave developers a EXR HDR capable card to work with a year and a half ago, and there's a huge installed user base of EXR HDR capable nVidia cards out there?
EXR HDR is not the only solution, and the current NVIDIA cards won't be running any new games with EXR HDR on.
Which is probably why we've only seen it in a couple levels Valve created.

Implementing any HDR requires new assets; it isn't something you just throw in.



What is the point you are trying to make Rollo (if you even have one)?

 

reever

Senior member
Oct 4, 2003
451
0
0
Originally posted by: Rollo
Originally posted by: nitromullet
I get what everyone is saying... It's Crytek/EA/Ubisoft that hasn't released a patch yet, however, if the patch isn't available yet is this sort of advertising ethical?

http://www.ati.com/products/radeonx1k/imagequality.html

A scene from the computer game "Far Cry". The image on the left was created using conventional rendering methods. The screenshot on the right shows the same scene rendered with HDR lighting. Conventional graphics cards can not combine HDR with FSAA. The models of the Radeon X1000 series offer support for all FSAA modes even when HDR is used.

ATI specifically mentions FarCry... Nowhere do you see anything like, "*Coming soon", or "*May require software support". The average enduser doesn't know any better... Most of us didn't know any better, and it doesn't appear that [ H ] did either. So, I would think that at least everyone can agree that these are shady tactics on ATI's part regardless of your hardware preference or if you want Rollo, Ackmed, or your dog for mod. I'm not saying that NV is any better with their hardware encoder/decoder antics either, but that isn't an excuse for ATI to tout features that are not yet available to the enduser. Maybe it bugs me more in this case because I watch movies in my living room, not on my PC, but I do play games on my PC.

QFT

And I'd own any dog as a mod- dogs can be bought with $.10 worth of bacon and are NOT impartial.

:laugh:


How is it ok for Nvidia to tout features that are not yet available to the end user? Can you answer that without another question or a blanket statement?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: munky
Originally posted by: nitromullet
I get what everyone is saying... It's Crytek/EA/Ubisoft that hasn't released a patch yet, however, if the patch isn't available yet is this sort of advertising ethical?

http://www.ati.com/products/radeonx1k/imagequality.html

A scene from the computer game "Far Cry". The image on the left was created using conventional rendering methods. The screenshot on the right shows the same scene rendered with HDR lighting. Conventional graphics cards can not combine HDR with FSAA. The models of the Radeon X1000 series offer support for all FSAA modes even when HDR is used.

ATI specifically mentions FarCry... Nowhere do you see anything like, "*Coming soon", or "*May require software support". The average enduser doesn't know any better... Most of us didn't know any better, and it doesn't appear that [ H ] did either. So, I would think that at least everyone can agree that these are shady tactics on ATI's part regardless of your hardware preference or if you want Rollo, Ackmed, or your dog for mod. I'm not saying that NV is any better with their hardware encoder/decoder antics either, but that isn't an excuse for ATI to tout features that are not yet available to the enduser. Maybe it bugs me more in this case because I watch movies in my living room, not on my PC, but I do play games on my PC.

But the features are available in the hardware, only the game devs need to use the features. Sort of like Nv marketing the whole PVP, SM3, HDR feature set - those would be useless without dev support also. Whether or not the devs are gonna use the features is a different story - I remember TruForm on the Radeon 8500 being a marketing feature, and nothing much ever came of it. But in this case I think it will be used soon, since HDR is gaining popularity in modern games, implementing a proprietary HDR+AA method like Valve did is extra work, and playing without AA is a pretty lame option.

I'm not going to dispute NV marketing the PVP last year (and I recall mentioning that in my post), but that doesn't make what ATI is doing any better... ATI has screenshots of an existing game I currently own running HDR+AA, which, to me, implies that I can buy an ATI card, install FarCry, and play with HDR+AA. However, we all know this is not the case, which IMO is pretty shady advertising.

This thread isn't about who can dig up all the things that either ATI or NV have done wrong in the past... I'm sure we can all find examples of each doing things less than appropriate, but that doesn't make any of it right. This thread is about a feature that could potentially differentiate the X1800XT from its competition and justify its high price, yet apparently no enduser can take advantage of it at the moment. I have no problem with the current situation of software not supporting a feature that is supported by hardware. I do, however, have an issue with screenshots taken from a beta/test/unreleased patch being used as a marketing tool with no disclaimer whatsoever that this requires software that isn't out yet.
 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: Rollo
I don't make any secret of the fact I have greatly preferred nVidias hardware for the last year and a half, and that I think the only ATI cards worth consideration are the X1800s, and they've been overpriced.

Not to derail this thread any more - taking this to PMs.

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: reever
Originally posted by: Rollo
Originally posted by: nitromullet
I get what everyone is saying... It's Crytek/EA/Ubisoft that hasn't released a patch yet, however, if the patch isn't available yet is this sort of advertising ethical?

http://www.ati.com/products/radeonx1k/imagequality.html

A scene from the computer game "Far Cry". The image on the left was created using conventional rendering methods. The screenshot on the right shows the same scene rendered with HDR lighting. Conventional graphics cards can not combine HDR with FSAA. The models of the Radeon X1000 series offer support for all FSAA modes even when HDR is used.

ATI specifically mentions FarCry... Nowhere do you see anything like, "*Coming soon", or "*May require software support". The average enduser doesn't know any better... Most of us didn't know any better, and it doesn't appear that [ H ] did either. So, I would think that at least everyone can agree that these are shady tactics on ATI's part regardless of your hardware preference or if you want Rollo, Ackmed, or your dog for mod. I'm not saying that NV is any better with their hardware encoder/decoder antics either, but that isn't an excuse for ATI to tout features that are not yet available to the enduser. Maybe it bugs me more in this case because I watch movies in my living room, not on my PC, but I do play games on my PC.

QFT

And I'd own any dog as a mod- dogs can be bought with $.10 worth of bacon and are NOT impartial.

:laugh:


How is it ok for Nvidia to tout features that are not yet available to the end user? Can you answer that without another question or a blanket statement?

Did you read the post...? Let me put this as simply as I possibly can... ATI's lies don't make NVIDIA's lies any better or worse, and NVIDIA's lies don't make ATI's any better or worse. I don't think I ever claimed otherwise.
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
Ok, I've just taken a few screenshots (thanks Steve) in Far Cry (patched to version 1.3).

With "\r_hdrrendering 7" in effect.

In-game settings were: everything as high as it could be, except for resolution (1024 x 768) and Lighting quality (there is a bug with "Very High"), which was set at "High".

With 6x Adaptive A-A / 8x HQ A-F

Screenshots:

http://img204.imageshack.us/img204/4071/farcry00002ki.jpg
http://img204.imageshack.us/img204/8319/farcry00018or.jpg
http://img204.imageshack.us/img204/9392/farcry00037md.jpg
http://img204.imageshack.us/img204/7093/farcry00055cs.jpg
http://img204.imageshack.us/img204/4884/farcry00069ha.jpg
http://img204.imageshack.us/img204/9307/farcry00079jd.jpg
http://img204.imageshack.us/img204/963/farcry00095ox.jpg
http://img204.imageshack.us/img204/1237/farcry00100rb.jpg
http://img10.imageshack.us/img10/4901/farcry00115wd.jpg
http://img10.imageshack.us/img10/8411/farcry00129uq.jpg
http://img212.imageshack.us/img212/6380/farcry00149km.jpg

The screenshots are in .JPG format, and as I look at them I see there's some lack of detail; they might not do justice to what it's really like to see it in motion.

But it's better than nothing.

My conclusion: AA (or AAA, as in "Adaptive" AA) does indeed work with HDR. Well ... Far Cry's, at least.
 

KeepItRed

Senior member
Jul 19, 2005
811
0
0
Originally posted by: Rollo
2. ATI has annoyed me with their 3 year coast on the R300 feature set, numerous paper launches, strange multi card hardware they acknowledge they don't care much about

Interesting...mind if you link me up?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Zenoth
Ok, I've just taken a few screenshots (thanks Steve) in Far Cry (patched to version 1.3).

With "\r_hdrrendering 7" in effect.

In-game settings were: everything as high as it could be, except for resolution (1024 x 768) and Lighting quality (there is a bug with "Very High"), which was set at "High".

With 6x Adaptive A-A / 8x HQ A-F

Screenshots:

http://img204.imageshack.us/img204/4071/farcry00002ki.jpg
http://img204.imageshack.us/img204/8319/farcry00018or.jpg
http://img204.imageshack.us/img204/9392/farcry00037md.jpg
http://img204.imageshack.us/img204/7093/farcry00055cs.jpg
http://img204.imageshack.us/img204/4884/farcry00069ha.jpg
http://img204.imageshack.us/img204/9307/farcry00079jd.jpg
http://img204.imageshack.us/img204/963/farcry00095ox.jpg
http://img204.imageshack.us/img204/1237/farcry00100rb.jpg
http://img10.imageshack.us/img10/4901/farcry00115wd.jpg
http://img10.imageshack.us/img10/8411/farcry00129uq.jpg
http://img212.imageshack.us/img212/6380/farcry00149km.jpg

The screenshots are in .JPG format, and as I look at them I see there's some lack of detail; they might not do justice to what it's really like to see it in motion.

But it's better than nothing.

My conclusion: AA (or AAA, as in "Adaptive" AA) does indeed work with HDR. Well ... Far Cry's, at least.

Good contribution to this thread. It does appear that HDR+AA is working. Wonder why [ H ] couldn't figure that out...? Can you take some shots from an indoor level that really shows the contrast of HDR?
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
Sure I could. But which level?

Let me try to find one with good effects. I'll try to take the screenshots of indoor levels in a few moments. If not, I'll do it tomorrow.

I need to return the game to my friend tomorrow anyway, so that'll be my limit.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
That is true, but whatever the reason, nVs developer relations have always been better than ATIs, and nV only features DO get written into games. (e.g. Far Cry, SC:CT, Riddick, etc.)
What features might those be?

HDR in Far Cry? A slideshow, and neither SLI nor AA works on nVidia cards. OTOH ATi has always run that game well despite it being a TWIMTBP title. Every version has had some kind of rendering issue on nVidia cards, ranging from banding to blocky shadows.

SC:CT? Are you referring to the SM 2.0 path that was put in for ATi cards that made them run faster than nVidia cards at equal to or better IQ?

Riddick? Are you talking about unplayable soft shadows again? Yet in the same thread you claim ATi's HDR + AA advantage is worthless because it's too slow. Fair and balanced indeed.

and that I think the only ATI cards worth consideration are the X1800s, and they've been overpriced.
Are they overpriced compared to purchasing three 5800 Ultras?

Or how about compared to purchasing fourteen NV4x cards?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: KeepItRed
Originally posted by: Rollo
2. ATI has annoyed me with their 3 year coast on the R300 feature set, numerous paper launches, strange multi card hardware they acknowledge they don't care much about

Interesting...mind if you link me up?

Sorry, not going to search for it. In an interview a month or two ago they said they would not be integrating any Crossfire circuitry into their GPUs, because to do so would add cost to EVERY card, and Crossfire is a very small percent of the market.
(with that attitude you can see why)

He has a point and he doesn't: Yes it's a small market, but it might be bigger if they put out a better solution.

Anyway, Google is your friend, what is in this reply is my reasoning for saying they don't care much about it. It could also be interpreted as they care more about the single card market and want to save you all a few bucks a card if you're not Crossfiring.

nVidia obviously cares a lot about SLI, and has made it one of their main focuses over the last year. The improvements are testament to that.

I'm personally glad to see it, I was a 3DFX V2 SLI person, and am loving having it again.