Farcry 1.2 patch recalled


nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: rbV5
Originally posted by: Rollo
I'll believe this when the Crytek guys say it's true.

Speaking of hacks, how do you enable SM3.0 support in Far Cry without hacking the Nvidia driver .inf file and using an unreleased version of DX? IMO, it is nice to have the option for us enthusiasts, but what's your take on enabling SM3.0 support in Far Cry for Nvidia cards?


My take on it at the time was "LOL-WTF- this is like DOS days and firing up Doom1 for multiplayer". Obviously it would be nice if the game recognized the card's capability and either defaulted to that path, or gave you a check box.
I didn't mind using beta DX9, worth it for the performance increase.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
Yeah, he probably meant something else when he said:
Show me where your quote matches what Carmack said. Where did he say "I couldn't use a R300 in Doom III?"

Your comment was simply a nonsensical troll, much like the rest of the pro-nv drivel you constantly post, Mr "collector".
Sure BFG. When he said he made the tradeoff for speed because he had bumped into the instruction limits of the R300 core, he was really trying to say, "I didn't need a higher instruction count than the R300 core, I just wanted to use the nV30 because the noise annoyed me, and I wanted it to run slower."
Doesn't match my quote at all, you must be Carmack's buddy, you seem to understand him so well. Sorry about my obvious misinterpretation. :roll:

Carmack couldn't even use an R300 on Doom3 due to its feeble instruction limits.
Yeah? And how fast do you think an NV30 would be running when executing instruction counts that exceed an R300's? Or have you changed your tune so that performance now doesn't matter but paper features do?
Hello? The point of what I said was nVidia is getting developers the features to code more advanced games first, not necessarily run them lightning fast first generation? As far as we consumers go, my position has always been "it's better to have it than not" and that we should reward companies' R&D by buying forward-looking cards so they keep giving them to us?
You're confusing my saying DX9 PS2 was not an overwhelming reason to buy a R300 last year with somehow being in conflict with this. The nV35 had DX9 support; its mix of 16/32 bit precision wasn't enough for me to say, "No way! 24 or nothing!" in the absence of PS2 games?

Only a marketing person would call 24 bit full precision
Then I guess Microsoft, the designers of Direct3D, are just marketers. And if FP24 is partial precision then how would you describe nVidia's encouragement of developers to use FP16 on the NV40? What do you think Mr Sweeney would say given he doesn't appear to like FP24?
Please link me to where MS says that after DX9c is released, 24 bit is considered their full precision?
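For anyone keeping score in the precision argument, the raw numbers are easy to work out. A minimal sketch, assuming the commonly cited layouts of the era (FP16 = s10e5, ATI's FP24 = s16e7, FP32 = s23e8), where the representable step near 1.0 is simply two to the minus mantissa-bits:

// Minimal sketch (not from any post above): compares the precision of the
// three pixel-shader formats, assuming the commonly cited layouts
// FP16 = s10e5, FP24 = s16e7 (ATI R3xx), FP32 = s23e8.
// The "step near 1.0" is 2^-(mantissa bits).
#include <cmath>
#include <cstdio>

int main() {
    const char* names[]    = {"FP16", "FP24", "FP32"};
    const int   mantissa[] = {10, 16, 23};
    for (int i = 0; i < 3; ++i) {
        std::printf("%s: %d mantissa bits, step near 1.0 = %g\n",
                    names[i], mantissa[i], std::pow(2.0, -mantissa[i]));
    }
    return 0;
}

That works out to roughly 1e-3 for FP16 versus 1.5e-5 for FP24 and 1.2e-7 for FP32, which is the kind of gap people point to when FP16 shows banding in long shader chains.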

of course, Rollo, ATI enabled feature=hack, Nvidia enabled feature=TWIMTBP
Of course. Rollo is the biggest nv fanboy currently in this forum and the only thing that has changed since his 5800U fetish is that he's become more blatant as the days pass.
BFG, I've purchased over $1700 worth of ATI cards for use in my primary box in the last five years. In other words, I've supported them in the only way they care about at all.
How about you, ol' buddy? I know you've had your 9700Pro for the last two years, and a Ti4600 for the year before that, but were you buying $1400 worth of ATI cards in 1999-2000? I didn't think so- did you buy any?

So who do you think ATI wants as a customer more, BFG? "nVidia's biggest fanboy" who's bought their top tier card every year for the past five at the Best Buy down the road? Or BFG, who buys one lousy card and starts yelling how other people aren't supporting ATI? I've got news for you BFG- you can yell all you like, at the end of the day they'll care more about their paying customers than fanboys.

BTW- by not buying an X800, I'm sending them the message I want to send them- that they can't sell me the same core three years in a row, and they'll have to innovate again before I pay them again. I hope others do the same and help drive the technology forward.
I did the same thing when I skipped the Voodoo 3 that was SLI on one card.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Rollo
BFG, I've purchased over $1700 worth of ATI cards for use in my primary box in the last five years. In other words, I've supported them in the only way they care about at all.
How about you, ol' buddy? I know you've had your 9700Pro for the last two years, and a Ti4600 for the year before that, but were you buying $1400 worth of ATI cards in 1999-2000? I didn't think so- did you buy any?

You buy ATI, use it for a few months, b!tch about their drivers, etc and then turn around and sell them to another would-be new ATI card buyer; you're not helping ATI that much!

So who do you think ATI wants as a customer more, BFG? "nVidia's biggest fanboy" who's bought their top tier card every year for the past five at the Best Buy down the road? Or BFG, who buys one lousy card and starts yelling how other people aren't supporting ATI? I've got news for you BFG- you can yell all you like, at the end of the day they'll care more about their paying customers than fanboys.

I don't see how bragging about unwise purchases makes you a better customer of ATI; again, it's not like you're actually hanging onto most of these cards for any substantial period of time, and the user who buys your card after a couple of months could have been giving ATI his money instead of you.

BTW- by not buying an X800, I'm sending them the message I want to send them- that they can't sell me the same core three years in a row, and they'll have to innovate again before I pay them again. I hope others do the same and help drive the technology forward.
I did the same thing when I skipped the Voodoo 3 that was SLI on one card.

GeForce 1 > GeForce 1 DDR > GeForce 2 GTS > GeForce 2 Pro/Ultra > GeForce 4 MX;
GeForce 3 > GeForce 3 Ti200/Ti500 > GeForce 4 Ti4200/4400/4600

Both companies milk one architecture for lengthy periods of time, and we've all bought them. The GeForce4 Ti series was one of the best cards ever in its heyday, IMO, and it was using "recycled" DX8 features from the GF3, exactly like the X800 series uses the 9800 series' key features.

However, as I said before, when Nvidia comes out with the shiny new design, then all of a sudden a radical new architecture is what you're interested in, Rollo. Last year (and the year before) you didn't think the 9800 Pro was such a slam-dunk choice over the FX5800/5900 cards, despite its more complete DX9 support, yet this year, in a similarly close race, you think the FX6800 series is the only way to go over the X800 series.

Last year, "partial precision" (FP 16 vs FP 24) wasn't such a big issue for you, yet this year, a much smaller leap from FP24 to FP32 makes all the difference in the world to you.

It's not that we are attacking you personally, Rollo, but your arguments are inconsistent and sway with Nvidia's situation, and with what you currently have running in your box. And I know it's tough, but aside from the great price you got on the 6800nu, it's not that great a card compared to the other next-gen offerings. I think I speak for everyone when I say that you don't need to remind us how much of a great purchase the FX6800nu is compared to the 9800XT, either ;) .

Is it that tough to say "Personally, I recommend the GeForce FX series (even the lesser 6800nu, the ugly stepsister to the 6800GT), but I have a personal preference towards Nvidia."
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: jiffylube1024
I think I speak for everyone when I say that you don't need to remind us how much of a great purchase the FX6800nu is compared to the 9800XT, either ;).

Yeah, you could compare just about any graphics card to a 9800XT and it will look like the bargain of the century. Except maybe the Parhelia. Why don't you compare your 6800NU to the Parhelia, Rollo? ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: SickBeast
Originally posted by: jiffylube1024
I think I speak for everyone when I say that you don't need to remind us how much of a great purchase the FX6800nu is compared to the 9800XT, either ;).

Yeah, you could compare just about any graphics card to a 9800XT and it will look like the bargain of the century. Except maybe the Parhelia. Why don't you compare your 6800NU to the Parhelia, Rollo? ;)

The Parhelia cannot hold a candle to a 9800XT, let alone a 6800NU. What's to compare?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: keysplayr2003
The Parhelia cannot hold a candle to a 9800XT, let alone a 6800NU. What's to compare?

I'm just making the point that if he's going to compare a current mid/high-end GPU to a last-gen ultra high-end GPU, he might as well compare it to the Parhelia while he's at it. The 9800XT is an overpriced waste of a graphics card ATM. It costs around the same as an X800PRO. Why doesn't he compare the 6800NU to the X800PRO? Because he doesn't want to look silly when the X800PRO slaps around his graphics card like a rag doll.

I'm basically making the point that if you're going to compare two graphics cards, you should generally compare cards that were aimed at similar markets and are similarly priced. For example, the 6800U should be compared with the X800XT. There is currently nothing to properly compare the 6800NU to, but the 9800XT is the last card I would be using for such a comparison. The closest cards right now are the X800PRO and the 6800GT. At least they're current-gen.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Originally posted by: keysplayr2003
So, I broke down and bought FarCry. I must admit, I was wrong about the game. It is very nicely done. The demo did it no justice. I left my PC as-is and loaded the game up. All settings on default very high, 4x/8x AA/AF, 1024x768, on the 56.72's with DX9.0b. I start the campaign. The very first scene in the game is where you're looking through a type of sewer drain. I didn't move a muscle and just looked at the FRAPS reading in the upper left hand corner and sighed. 25fps. YUCK. Continued to play and it never went below 25 and went as high as 65 to 70.

So I said the hell with it. Installed FW 61.76's, DX9.0c, the hotfix for DX9.0c, the registry edit for DX9.0c, and the FarCry 1.2 patch. I saw a lot of improvement on a 5950U in Tom's FarCry 1.2 article so I decided to try all this stuff.

Lo and behold, I started the game and was looking down the same sewer pipe. There was lighting there that wasn't before, like sunbeams coming through a crack above. I didn't move a muscle and observed FRAPS again. 32fps. Shadows were now cast correctly where they were an unrendered white before the patch. I know my 5900U does not support SM3.0, but I am benefiting from all the other fixes in the patch. I am very pleased that my mid-range card on my mid-range PC can play this game comfortably thanks to Crytek's patch.

Just had to share. If anyone wants benchies for any reason with my setup, let me know.

Keys

Keys,
I would never have imagined what a sweet game it is until I played it. Are you getting into it? Glad to see the patch fixed some visuals.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: SickBeast
Originally posted by: keysplayr2003
The Parhelia cannot hold a candle to a 9800XT, let alone a 6800NU. What's to compare?

I'm just making the point that if he's going to compare a current mid/high-end GPU to a last-gen ultra high-end GPU, he might as well compare it to the Parhelia while he's at it. The 9800XT is an overpriced waste of a graphics card ATM. It costs around the same as an X800PRO. Why doesn't he compare the 6800NU to the X800PRO? Because he doesn't want to look silly when the X800PRO slaps around his graphics card like a rag doll.

I'm basically making the point that if you're going to compare two graphics cards, you should generally compare cards that were aimed at similar markets and are similarly priced. For example, the 6800U should be compared with the X800XT. There is currently nothing to properly compare the 6800NU to, but the 9800XT is the last card I would be using for such a comparison. The closest cards right now are the X800PRO and the 6800GT. At least they're current-gen.
And yet no comparison to a 5950U is made either. Wonder why that is? Is the 5950U not as overpriced as the 9800XT?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: gururu
Originally posted by: keysplayr2003
So, I broke down and bought FarCry. I must admit, I was wrong about the game. It is very nicely done. The demo did it no justice. I left my PC as-is and loaded the game up. All settings on default very high, 4x/8x AA/AF, 1024x768, on the 56.72's with DX9.0b. I start the campaign. The very first scene in the game is where you're looking through a type of sewer drain. I didn't move a muscle and just looked at the FRAPS reading in the upper left hand corner and sighed. 25fps. YUCK. Continued to play and it never went below 25 and went as high as 65 to 70.

So I said the hell with it. Installed FW 61.76's, DX9.0c, the hotfix for DX9.0c, the registry edit for DX9.0c, and the FarCry 1.2 patch. I saw a lot of improvement on a 5950U in Tom's FarCry 1.2 article so I decided to try all this stuff.

Lo and behold, I started the game and was looking down the same sewer pipe. There was lighting there that wasn't before, like sunbeams coming through a crack above. I didn't move a muscle and observed FRAPS again. 32fps. Shadows were now cast correctly where they were an unrendered white before the patch. I know my 5900U does not support SM3.0, but I am benefiting from all the other fixes in the patch. I am very pleased that my mid-range card on my mid-range PC can play this game comfortably thanks to Crytek's patch.

Just had to share. If anyone wants benchies for any reason with my setup, let me know.

Keys

Keys,
I would never have imagined what a sweet game it is until I played it. Are you getting into it? Glad to see the patch fixed some visuals.

Yes, definitely getting into it. Visuals are very nice. I would just like to be able to crank up all the goodies to the max. And holy shiznit does that flashlight eat up frames. I was looking at a pipe and my fps was near 100 (I was close to it and right in front of a wall). I turned the flashlight on and it cut my fps almost in half, to about 58. Dammmm. Well, looks like I'm going to stick with my 5900U for quite some time to come, because by the looks of it, the card I want (besides being PCI-E) doesn't look like it will be coming down in price anywhere in the near future. I don't think I have ever seen the supply and demand factor like it is now. Unless I just didn't pay close enough attention to the last 2 gens.
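On the "benchies" offer, a minimal sketch of how numbers like those could be summarised, assuming a plain text FRAPS-style log with one FPS sample per line ("fps.txt" is just a placeholder filename):

// Minimal sketch: summarise a FRAPS-style benchmark log, assuming a plain
// text file with one FPS sample per line. "fps.txt" is only a placeholder.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::ifstream in("fps.txt");
    std::vector<double> fps;
    double sample;
    while (in >> sample) fps.push_back(sample);
    if (fps.empty()) { std::cerr << "no samples read\n"; return 1; }

    double sum = std::accumulate(fps.begin(), fps.end(), 0.0);
    std::cout << "samples: " << fps.size()
              << "  min: " << *std::min_element(fps.begin(), fps.end())
              << "  avg: " << sum / fps.size()
              << "  max: " << *std::max_element(fps.begin(), fps.end()) << "\n";
    return 0;
}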
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: keysplayr2003
I don't think I have ever seen the supply and demand factor like it is now. Unless I just didn't pay close enough attention to the last 2 gens.
Previous gen launches have not been like this. This has been the worst GPU product launch I have ever seen (from both nVidia and ATi). The cards are still very hard to find, and the prices are still not settled in to what they should be.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: oldfart
Originally posted by: keysplayr2003
I don't think I have ever seen the supply and demand factor like it is now. Unless I just didn't pay close enough attention to the last 2 gens.
Previous gen launches have not been like this. This has been the worst GPU product launch I have ever seen (from both nVidia and ATi). The cards are still very hard to find, and the prices are still not settled in to what they should be.

I think what happened was that nVidia forced ATi's hand this round. R420 was not scheduled for release until the end of the summer; if that had happened, we would not be having these supply problems. Both products were rushed to market. nVidia wanted to quickly move beyond NV30 and regain the performance crown. ATi had to do something to counterattack. In the end they misled us all.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: SickBeast
Originally posted by: oldfart
Originally posted by: keysplayr2003
I don't think I have ever seen the supply and demand factor like it is now. Unless I just didn't pay close enough attention to the last 2 gens.
Previous gen launches have not been like this. This has been the worst GPU product launch I have ever seen (from both nVidia and ATi). The cards are still very hard to find, and the prices are still not settled in to what they should be.

I think what happened was that nVidia forced ATi's hand this round. R420 was not scheduled for release until the end of the summer; if that had happened, we would not be having these supply problems. Both products were rushed to market. nVidia wanted to quickly move beyond NV30 and regain the performance crown. ATi had to do something to counterattack. In the end they misled us all.
Is it something as simple as the scarcity of the DDR3 ram?
 

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
Originally posted by: oldfart
Originally posted by: SickBeast
Originally posted by: oldfart
Originally posted by: keysplayr2003
I don't think I have ever seen the supply and demand factor like it is now. Unless I just didn't pay close enough attention to the last 2 gens.
Previous gen launches have not been like this. This has been the worst GPU product launch I have ever seen (from both nVidia and ATi). The cards are still very hard to find, and the prices are still not settled in to what they should be.

I think what happened was that nVidia forced ATi's hand this round. R420 was not scheduled for release until the end of the summer; if that had happened, we would not be having these supply problems. Both products were rushed to market. nVidia wanted to quickly move beyond NV30 and regain the performance crown. ATi had to do something to counterattack. In the end they misled us all.
Is it something as simple as the scarcity of the DDR3 ram?

Scarcity of DDR3 ram is the main reason behind the supply problems of these cards from what I read.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Naustica
Originally posted by: oldfart
Originally posted by: SickBeast
Originally posted by: oldfart
Originally posted by: keysplayr2003
I don't think I have ever seen the supply and demand factor like it is now. Unless I just didn't pay close enough attention to the last 2 gens.
Previous gen launches have not been like this. This has been the worst GPU product launch I have ever seen (from both nVidia and ATi). The cards are still very hard to find, and the prices are still not settled in to what they should be.

I think what happened was that nVidia forced ATi's hand this round. R420 was not scheduled for release until the end of the summer; if that had happened, we would not be having these supply problems. Both products were rushed to market. nVidia wanted to quickly move beyond NV30 and regain the performance crown. ATi had to do something to counterattack. In the end they misled us all.
Is it something as simple as the scarcity of the DDR3 ram?

Scarcity of DDR3 ram is the main reason behind the supply problems of these cards from what I read.

That's what I heard also.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Jiffy:
Are you related to Apoppin?
:roll:
You buy ATI, use it for a few months, b!tch about their drivers, etc and then turn around and sell them to another would-be new ATI card buyer; you're not helping ATI that much!
BS. Utter, unadulterated claptrap. BS.
I bought my 9700Pro at BB for around $400, used it 8 months, and sold it for $210 after I had owned it a year.
Could Jiffy be wrong?

I bought my 9800Pro for $380 at newegg, traded it to DaPunisher for his 5800U, eight months later.

I didn't "steal any high end ATI sales". :roll: For that matter, I didn't complain about the drivers, and as my posts are an open book you have access to, why don't you link proof to your accusations?
Could it be because you can't?

I don't see how bragging about unwise purchases makes you a better customer of ATI; again, it's not like you're actually hanging onto most of these cards for any substantial period of time, and the user who buys your card after a couple of months could have been giving ATI his money instead of you.
See above, smart guy. I used them all except the 8500 (which wouldn't run on my motherboard, which had won Tom's "Best Of" a couple of months earlier) as long as I use any card. How were they "unwise purchases"? :roll:

GeForce 1 > GeForce 1 DDR > GeForce 2 GTS > GeForce 2 Pro/Ultra > GeForce 4 MX;
GeForce 3 > GeForce 3 Ti200/Ti500 > GeForce 4 Ti4200/4400/4600
Thanks for proving my point, exactly.
GF1 (new core design, adds T&L, etc.) > GF2 (more pipes; an X800-type move)
GF2 > GF3 (new core design: adds single-pass quad texturing, programmable pixel and vertex processors, DX8 compatibility)
Your inexplicable decision to throw the GF4 MX into the middle of that timeline boggles the mind, as it was a budget part at the end of the timeline.
nVidia's core path has been new part, refine, new part, refine for a long time now. Can you tell us about any three consecutive calendar years where they put out basically the same core?

It's not that we are attacking you personally, Rollo, but your arguments are inconsistent and sway with Nvidia's situation, and with what you currently have running in your box. And I know it's tough, but aside from the great price you got on the 6800nu, it's not that great a card compared to the other next-gen offerings.
What other next gen cards at the MSRP $300 price point are better than it, Jiffy? Since all the other next gen cards are MSRP $400 or $500, and compete with each other, this doesn't make much sense?
How great of a card do you think it is compared to your 9800Pro?

Is it that tough to say "Personally, I recommend the GeForce FX series (even the lesser 6800nu, the ugly stepsister to the 6800GT), but I have a personal preference towards Nvidia."
I've said that I have a personal preference for the nV40 line over the R420 line about 858 times in print now, so I don't get your point?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: oldfart
Previous gen launches have not been like this. This has been the worst GPU product launch I have ever seen (from both nVidia and ATi). The cards are still very hard to find, and the prices are still not settled in to what they should be.

That's because the newest video cards are all made out of paper...
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
It's pretty funny, Rollo, how a bunch of folks want you to "confess". I can't get my mind around comments like this:

--------------

quote from Jiffylube:
Is it that tough to say "Personally, I recommend the GeForce FX series (even the lesser 6800nu, the ugly stepsister to the 6800GT), but I have a personal preference towards Nvidia."
end of quote.

--------------

It's like they have this deep-seated lust or need for you to say something like this. Really weird.


You are steered towards NV40 this gen. You were steered towards R300/350 last gen.
You are not pro-Nvidia or pro-ATI. You are anti-fanboy, whether the fanboy is an ATI or an Nvidia one, which can come across as bias depending on who is looking.

It's not too difficult to understand this (as if it really needed understanding in the first place).

Anyway, I heard you say you ordered a GT? Feel like parting with your 6800? Just thought I'd ask.

Oh wait, you can't. Then you would be accused of stealing high-end sales from nvidia. Sorry. ;)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: keysplayr2003
It's pretty funny, Rollo, how a bunch of folks want you to "confess". I can't get my mind around comments like this:

--------------

quote from Jiffylube:
Is it that tough to say "Personally, I recommend the GeForce FX series (even the lesser 6800nu, the ugly stepsister to the 6800GT), but I have a personal preference towards Nvidia."
end of quote.

--------------

It's like they have this deep-seated lust or need for you to say something like this. Really weird.


You are steered towards NV40 this gen. You were steered towards R300/350 last gen.
You are not pro-Nvidia or pro-ATI. You are anti-fanboy, whether the fanboy is an ATI or an Nvidia one, which can come across as bias depending on who is looking.

It's not too difficult to understand this (as if it really needed understanding in the first place).

Anyway, I heard you say you ordered a GT? Feel like parting with your 6800? Just thought I'd ask.

Oh wait, you can't. Then you would be accused of stealing high-end sales from nvidia. Sorry. ;)


LOL- it is pretty weird. I can tell you this though- they were saying I was biased toward nVidia when I was happily using my 9700P/9800P as well.

As far as the GT goes, I've been waffling on it. I want to see how D3 performs on the 6800, and if I do upgrade, I haven't decided whether to go GT or Ultra.

What I should have sold is that 5800U. As cool as it is, my four-year-old's games don't need it, and I never use his computer. (AXP 1700+...ack....ptoooey... ;) )
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Rollo
LOL- it is pretty weird. I can tell you this though- they were saying I was biased toward nVidia when I was happily using my 9700P/9800P as well.

I remember the fiasco when you had the 9700PRO. The entire video forum was up in arms over the ordeal. You wanted to trade/sell it so that you could buy a 5800U (I think it was a 5800U anyways). At that point in time, the reviews were very harsh on NV30 and they basically said that it was slower, hotter, and more expensive than the 9700PRO. On top of that it took up two slots. That's probably why people were saying you were a fanboy. It was a long, drawn out process, and nobody could really understand why you were doing it. There aren't many people who are willing to drop $400 on a graphics card "just to try something new".
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
I don't really care that it doesn't work for ATI cards.
The only mistake I see them having made is not labelling it the "nV40 patch" and telling everyone else not to bother.
This is another thing I don't get- who cares if they have a financial relationship.

If these 3 quotes don't say fanboi I don't know what does. Even if you wanted to try something "new" with your graphics card, I don't see why you would go and buy a 5800 U, penalizing your own gaming experience with a piece of trash video card. You must really like Nvidia to make such a sacrifice.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: GeneralGrievous
I don't really care that it doesn't work for ATI cards.

If I owned a 6800, I wouldn't care either. ATI cards worked beautifully with Far Cry, correct? Nvidia cards did not and had many problems, correct? So who "needed" this patch more than the other? Not ATI.

The only mistake I see them having made is not labelling it the "nV40 patch" and telling everyone else not to bother.

Yes. This would have "maybe" stopped a few ATI'ers from trying it out. But you know, if it's good enough for nvidia.......

This is another thing I don't get- who cares if they have a financial relationship.

Dunno.


If these 3 quotes don't say fanboi I don't know what does. Even if you wanted to try something "new" with your graphics card, I don't see why you would go and buy a 5800 U, penalizing your own gaming experience with a piece of trash video card. You must really like Nvidia to make such a sacrifice.


Sacrifice? Did somebody gut a lamb? Or behead a chicken? General, the 5800U was/is in the same league as a 5900nu in many areas. You think the entire line of 5xxx cards is trash. You tell me. What kind of trash card can play FarCry at 1024x768x32, all settings on High/Very High, 2X quincunx/4X AF, and keep the framerates above 32 at all times? My 5900U can, with a 2.8/533fsb processor.

It's just not as important as you make it out to be, General. Really. If you can't get your mind around a concept, like trying out different hardware even though it is slower in certain things compared to the hardware you are giving up, don't sweat it so much. Let it go. It's not important for you to "accept" it.
And it most certainly is not your call. ;)
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
You think the entire line of 5xxx cards is trash.
I do.

You tell me. What kind of trash card can play FarCry at 1024x768x32, all settings on High/Very High, 2X quincunx/4X AF, and keep the framerates above 32 at all times?
Plenty of cards can do that, with better IQ I might add. A $100 used 9700 could even fit the bill.

It's just not as important as you make it out to be, General.
I simply assumed that someone spending that much money on graphics cards thought that performance was rather important. Thank you for correcting my miscalculation.

If you can't get your mind around a concept, like trying out different hardware even though it is slower in certain things compared to the hardware you are giving up, don't sweat it so much.
Do you really buy into this? I honestly have not heard of anyone else buying slower hardware to "try it out".
But I suppose you are right, this sort of logic is not meant to be understood by the rest of us.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: GeneralGrievous
You think the entire line of 5xxx cards is trash.
I do.

Your prerogative. Did you own one? If so, what did you have?

You tell me. What kind of trash card can play FarCry at 1024x768x32, all settings on High/Very High, 2X quincunx/4X AF, and keep the framerates above 32 at all times?
Plenty of cards can do that, with better IQ I might add. A $100 used 9700 could even fit the bill.

Yes, I imagine a 9700 could do just fine. Although IQ is fine on my card. I'm not unreasonable.

It's just not as important as you make it out to be, General.
I simply assumed that someone spending that much money on graphics cards thought that performance was rather important. Thank you for correcting my miscalculation.

Of course it's important. Just don't get so nuts about it.

If you can't get your mind around a concept, like trying out different hardware even though it is slower in certain things compared to the hardware you are giving up, don't sweat it so much.
Do you really buy into this? I honestly have not heard of anyone else buying slower hardware to "try it out".
But I suppose you are right, this sort of logic is not meant to be understood by the rest of us.

Buy into what exactly? Nobody asked you to "buy" anything, yet you choose to criticize a choice another person made because you didn't like it. Not everyone in the world is just like you, and everyone is perfectly capable "and allowed" to make whatever decisions they wish. I know you aren't going to understand this, but I wouldn't mind owning a 5800 just to see what it could do. But see, that would mean taking out my 5900U and installing the 5800 in its place. I would really like to see what it could do, even if it is a bit slower. It's not all about who is the fastest, dude. It's about who is the happiest with what they purchased. Being the fastest has its rewards, but others like to dabble a bit.
 

fwtong

Senior member
Feb 26, 2002
695
5
81
It's only a matter of time before games are specifically optimized for one particular line of video cards. Video game companies will either be ATI companies and make games specifically for ATI video cards, or Nvidia companies and make games specifically for Nvidia cards. Either that, or they'll have to release 2 versions of each game, one for Nvidia video cards and one for ATI video cards.
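Detecting the vendor is the easy part; the real cost is maintaining two tuned code paths. A minimal sketch of the detection side under Direct3D 9, where the VENDOR_* values are the standard PCI vendor IDs and the printed messages are placeholders rather than anything a real engine does:

// Minimal sketch: how a game of this era could detect the GPU vendor with
// Direct3D 9 and branch to a vendor-tuned render path. VENDOR_* values are
// the standard PCI vendor IDs; the messages are placeholders.
// Build against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main() {
    const DWORD VENDOR_NVIDIA = 0x10DE;
    const DWORD VENDOR_ATI    = 0x1002;

    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3D 9 not available\n"); return 1; }

    D3DADAPTER_IDENTIFIER9 id = {};
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id))) {
        if (id.VendorId == VENDOR_NVIDIA)
            std::printf("%s: NVIDIA adapter, could pick an NV-tuned path\n", id.Description);
        else if (id.VendorId == VENDOR_ATI)
            std::printf("%s: ATI adapter, could pick an ATI-tuned path\n", id.Description);
        else
            std::printf("%s: other vendor, generic path\n", id.Description);
    }
    d3d->Release();
    return 0;
}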