Far Cry 1.2 patch recalled


SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: keysplayr2003
What kind of trash card can play Far Cry at 1024x768x32, all settings on High/Very High, 2X Quincunx AA / 4X AF, and keep the framerates above 32 at all times? My 5900U can, with a 2.8GHz/533FSB processor.

Look here. The 1.2 patch for Far Cry, which eliminates all the visual anomalies on the NV30 cards, also causes them to lose performance. The 9800XT is now doubling the 5950U in every single benchmark for that game. Far Cry isn't the best game to bring up to make your point ATM.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: fwtong
It's only a matter of time before games are specifically optimized for one particular line of video cards. Video game companies will either be ATI companies and make games specifically for ATI video cards, or Nvidia companies and make games specifically for Nvidia cards. Either that, or they'll have to release two versions of each game, one for Nvidia video cards and one for ATI video cards.


Neither company would settle for that. Why code for half the gaming population when you can code for all of it and make more money? The motivation factor here is corporate greed.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: SickBeast
Originally posted by: keysplayr2003
What kind of trash card can play Far Cry at 1024x768x32, all settings on High/Very High, 2X Quincunx AA / 4X AF, and keep the framerates above 32 at all times? My 5900U can, with a 2.8GHz/533FSB processor.

Look here. The 1.2 patch for Far Cry, which eliminates all the visual anomalies on the NV30 cards, also causes them to lose performance. The 9800XT is now doubling the 5950U in every single benchmark for that game. Far Cry isn't the best game to bring up to make your point ATM.

That's not my case. My visual anomalies disappeared with 1.2, and my minimum fps increased from 25 to 32. I don't know what my max framerate is yet because it jumps all over the place. The minimum is what I was more concerned with, so I paid close attention to it. Something is wrong with Xbit's graphs there, because my rig is substantially slower than their test rig (look at my sig), and I'm using AA/AF, albeit at low settings. I'm happy with this result.
Anyway, my point was for the General to chill a bit and not take all of this sooooooo seriously. I like discussing PC hardware, but when the General types chime in, it takes the interest away.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
keys, this is a potentially insulting question, but are you sure you have AA enabled? IIRC, nV required you to set it from either the game or the ctrl panel, but not both. I've also read two reviews that mentioned problems applying AA/AF.

I'm just surprised your 5900 runs FC so well, considering that review.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Pete
keys, this is a potentially insulting question, but are you sure you have AA enabled? IIRC, nV required you to set it from either the game or the ctrl panel, but not both. I've also read two reviews that mentioned problems applying AA/AF.

I'm just surprised your 5900 runs FC so well, considering that review.

Not insulted at all Pete.

I have 2X Quincunx AA and 4X AF turned on in the Nvidia control panel, not the game. There is a noticeable difference in quality: when I do this, things don't "crawl" anymore, or at least substantially less, and objects such as grass are rendered further away instead of only appearing when I get right up to them. Distance rendering, I guess?

I don't benchmark very often, so if you can tell me how to show you my results using FRAPS or another program, let me know.

It's a BFG 5900U. Nothing in my system is overclocked. Umm, what else... I did a 60-second fps capture using the F11 key in FRAPS and checked the .csv file for min/max/avg. Here's what I got.

Frames_____Time (ms)____Min____Max____Avg
2458_______60000_______27_____56_____40.967

This was the map "Treehouse" at the very first checkpoint, if that helps. I never saw the fps go down to 27. The lowest I saw was 32, so it must have dropped to 27 for just a split second and I didn't detect it.
But anyway, I think it's doing pretty damn well. What do you think?
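
(If anyone else wants to pull the same min/max/avg out of a FRAPS capture themselves, here is a minimal Python sketch. It assumes a simple log with one numeric FPS reading per line; the file name is just a placeholder, not what FRAPS actually writes.)

# Minimal sketch: min/max/avg from a per-second FPS log.
# Assumes one numeric FPS reading per line; header or blank lines are skipped.
# "farcry_fps.csv" is a placeholder name, not the file FRAPS actually creates.
def summarize(path):
    samples = []
    with open(path) as log:
        for line in log:
            try:
                samples.append(float(line.strip()))
            except ValueError:
                continue  # skip headers or blank lines
    if not samples:
        raise ValueError(f"no FPS samples found in {path}")
    return min(samples), max(samples), sum(samples) / len(samples)

if __name__ == "__main__":
    lo, hi, avg = summarize("farcry_fps.csv")
    print(f"Min {lo:.0f}  Max {hi:.0f}  Avg {avg:.3f}")

One thing worth keeping in mind: a one-sample-per-second log can easily hide a dip that only lasted a few frames, which would explain never seeing the 27 on screen even though it shows up as the minimum.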

On a side note, here is a pic of things to come: MYDOOM!!! Psyched!!!

Notice the little pewter "Mancubus" figurine from Doom.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Sure BFG. When he said he made the tradeoff for speed because he had bumped into the instruction limits of the R300 core, he was really trying to say, "I didn't need a higher instruction count than the R300 core, I just wanted to use the nV30 because the noise annoyed me, and I wanted it to run slower."
What the hell are you babbling about? Answer the question: how fast do you expect the NV30 to execute shaders that exceed the R300's maximum length?

Hello? The point of what I said was that nVidia is getting developers the features to code more advanced games first, not necessarily running them lightning fast in the first generation?
And what good is that if the card can't run said features at acceptable speeds?

As far as we consumers go, my position has always been "it's better to have it than not" and that we should reward companies' R&D by buying forward-looking cards so they keep giving them to us?
You mean like when you pronounced that nobody needed SM 2.0 and that features were irrelevant as long as the cards ran at similar speeds?

You're confusing my saying that DX9 PS2.0 was not an overwhelming reason to buy an R300 last year with that somehow being in conflict with this.
Confusing? I'm not confused. You're either a blatant troll or you have serious intellectual issues. There's simply no other way to describe your moronic reasoning.

Please link me to where MS says that after DX9c is released, 24 bit is considered their full precision?
Please link me to where MS says FP16 has ever been considered full precision?

Now explain to me why nVidia is encouraging developers to use FP16 on the NV40? Is this your definition of "forward looking", as you were discussing above?
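
(For what it's worth, the precision gap being argued about here is easy to demonstrate outside of any driver or game: FP16 carries a 10-bit mantissa, the R3xx's FP24 a 16-bit one, and FP32 a 23-bit one, so long chains of shader-style math drift much sooner at half precision. A rough, purely illustrative Python/numpy sketch using half and single floats, not actual shader code:)

import numpy as np

# Accumulate a small value many times -- the kind of thing long pixel
# shaders do -- in FP16 vs FP32 storage. The exact answer would be 2.0.
x16 = np.float16(1.0)
x32 = np.float32(1.0)
for _ in range(1000):
    x16 = np.float16(x16 + 0.001)
    x32 = np.float32(x32 + 0.001)
print("FP16:", x16)  # lands visibly short of 2.0 (10-bit mantissa)
print("FP32:", x32)  # stays very close to 2.0 (23-bit mantissa)

FP24 would land somewhere in between, much closer to the FP32 result in a toy case like this, which is roughly the shape of the whole partial- vs full-precision argument.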

BFG, I've purchased over $1700 worth of ATI cards for use in my primary box in the last five years.
And? I've spent thousands of dollars on petrol and electricity over the years but that doesn't mean I like the places I buy it from.

How about you ol' buddy?
:roll:
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
General, the 5800U was/is in the same league as a 5900nu in many areas.
The entire NV3x line was sorely lacking in the shader performance department.

What kind of trash card can play Far Cry at 1024x768x32, all settings on High/Very High, 2X Quincunx AA / 4X AF, and keep the framerates above 32 at all times? My 5900U can, with a 2.8GHz/533FSB processor.
Your 5900 is running the SM 1.x path with inferior IQ and precision compared to that of the standard SM 2.0 path used on Radeon cards.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: BFG10K
General, the 5800U was/is in the same league as a 5900nu in many areas.
The entire NV3x line was sorely lacking in the shader performance department.

What kind of trash card can play Far Cry at 1024x768x32, all settings on High/Very High, 2X Quincunx AA / 4X AF, and keep the framerates above 32 at all times? My 5900U can, with a 2.8GHz/533FSB processor.
Your 5900 is running the SM 1.x path with inferior IQ and precision compared to that of the standard SM 2.0 path used on Radeon cards.

I don't doubt that you're correct, BFG, as I am not an expert in shader paths. NV3x may surely have been lacking in shader performance, but certainly not lacking in gaming performance, I hope you would agree, as I have posted my rudimentary benchmarks in Far Cry. It looks beautiful, although I'm sure not as beautiful as on an R3xx/R420/NV40, but still very impressive nonetheless. Leaves are shiny and detailed (it's what I noticed most), and the water is exquisite. Maybe I should post a few screenies so you can point out how terrible it is for me. I don't see it. Not being sarcastic.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Rollo
I don't see how bragging about unwise purchases makes you a better customer of ATI; again, it's not like you're actually hanging onto most of these cards for any substantial period of time, and the user who buys your card after a couple of months could have been giving ATI his money instead of you.
See above, smart guy. I used them all except the 8500 (which wouldn't run on my motherboard, which had won Tom's "Best Of" a couple of months earlier) as long as I use any card. How were they "unwise purchases"? :roll:

I would call the purchases unwise based on how quickly you sold them for about half of what you paid, but then again I'm one of those types who usually sticks a generation behind to save money.

GeForce 1 > GeForce 1 DDR > GeForce 2 GTS > GeForce 2 Pro/Ultra > GeForce 4 MX;
GeForce 3 > GeForce 3 Ti200/Ti500 > GeForce 4 Ti4200/4400/4600
Thanks for proving my point, exactly.
GF1 (new core design, adds T&L, etc.) > GF2 (more pipes; an X800-type move)
GF2 > GF3 (new core design; adds single-pass quad texturing, programmable pixel and vertex processors, DX8 compatibility)
Your inexplicable decision to throw the budget GF4 MX into the middle of that timeline boggles the mind as it was a budget part at the end of the timeline.
nVidia's core path has been new part, refine, new part, refine for a long time now. Can you tell us about any three consecutive calendar years where they put out basically the same core?

Sorry, I forgot about the famous "rule of three" that you invented, where a GPU company isn't allowed to release a chip three times based on essentially the same core feature set :p . Even though Nvidia has done exactly that, 3dfx made its bread and butter this way, etc.

It's not that we are attacking you personally, Rollo, but your arguments are inconsistent and sway with Nvidia's situation, and with what you currently have running in your box. And I know it's tough, but aside from the great price you got on the 6800nu, it's not that great a card compared to the other next-gen offerings.
What other next-gen cards at the $300 MSRP price point are better than it, Jiffy?
Well, none, obviously, because there are no other next-gen cards currently at the $300 price point; however, I believe that the step up to a $400 GT is well worth the money (if you're going to spend $300 on a card anyway, it had better last...).

Since all the other next gen cards are MSRP $400 or $500, and compete with each other, this doesn't make much sense?
How great of a card do you think it is compared to your 9800Pro?

I think the 6800nu is a vastly superior card to my 9800 Pro, which I paid $175 for. However, if I were to buy a next-gen card, I wouldn't get a 'borderline' card like the 6800nu, even though it's obviously faster than a previous-gen 9800 Pro; I'd opt for the significantly faster GT.

Is it that tough to say, "Personally, I recommend the GeForce 6 series (even the lesser 6800nu, the ugly stepsister to the 6800GT), but I have a personal preference towards Nvidia"?
I've said that I have a personal preference for the nV40 line over the R420 line about 858 times in print now, so I don't get your point?

You've always favoured Nvidia, not just this generation, and that is the point that gets us all annoyed: you constantly deny this fact, even though you unloaded your 9800 Pro for a 5800U (if this doesn't show bias, what does?).
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,039
32,530
146
You've always favoured Nvidia, not just this generation, and that is the point that gets us all annoyed: you constantly deny this fact, even though you unloaded your 9800 Pro for a 5800U (if this doesn't show bias, what does?).
I don't agree with you on this point. The 9800p treated me great, but outside of Far Cry not one game I played benefited enough from the 9800p to make any difference to me. Even Halo ran great on the 5800U at 10x7 (the native res of the LCD I was using). Rollo continued to reiterate that there just weren't many games (at least worthwhile ones) taking advantage of the PS2 superiority of the Radeon, and there really weren't. It's quickly changing now, and I agree he is a bit too pumped about SM3, but the trade we made actually benefits him in the end. His card is rare and will gain value as a collector's item (really, it will), but the 9800 Pro 128MB will only continue to lose value as its performance falls further behind newer cards.

I would take the 9800p over the 5800U now for gaming (and I did ;) ) as more PS2+ games are hitting the market, but at the time we worked the deal there was only one game I found worth playing that the 5800U failed to play acceptably at the same settings I used for the 9800p. I guess I'm saying that the higher FPS and features of the 9800p just didn't make a big difference up until Far Cry, and the extra FPS weren't a big deal because both provided readily playable experiences outside of that one title. Consequently, I don't think swapping them made him biased, just a vid card aficionado :)
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
even though you unloaded your 9800 Pro for a 5800U (if this doesn't show bias, what does?).

A 9800 Pro for a 5800U isn't a bad trade; in fact, going by a thread by Duvie (I think), the 5800U is a monster workstation card waiting to be unleashed (and perhaps the last of its kind)... somebody will always want a card like that.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
keys, I trust the FRAPS info you posted--no need for further proof. Thanks for the info. 30fps minimum sounds good for a single-player FPS with some AA and AF.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Originally posted by: Pete
keys, I trust the FRAPS info you posted--no need for further proof. Thanks for the info. 30fps minimum sounds good for a single-player FPS with some AA and AF.

yea, no disputing that. it is what it is. ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Ok! Very cool. Thanks Pete, Gururu. This is kind of why I was skeptical of the 5950 results on that Xbit article. Something must be off there. Dunno.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
NV3x may surely have been lacking in shader performance, but certainly not lacking in gaming performance, I hope you would agree, as I have posted my rudimentary benchmarks in Far Cry.
Your benchmark is an apples vs oranges comparison. Force the game to use the SM 2.0 path and it'll run at half the speed of the R3xx series.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
It's just a pissing contest in this thread; nothing new is being said. WTG guys.

Yeah, all the graphics cards on the side you don't like only run the game at 640x480x16 with no AA/AF at 15fps.

You got us fanbois, way to go. :roll:
 
Mar 19, 2003
18,289
2
71
Originally posted by: Acanthus
It's just a pissing contest in this thread; nothing new is being said. WTG guys.

Yeah, all the graphics cards on the side you don't like only run the game at 640x480x16 with no AA/AF at 15fps.

You got us fanbois, way to go. :roll:

Yeah...and here I was looking for information about the 1.2 patch. Silly me. :confused:
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
The 5800U has stirred up more controversy for me than my dangling cross earring and red dyed hair style did in the mid 80s!
;)


Oh well. Those of you determined to call me a "nVidia fanboy", enjoy, you're welcome to your opinions. (don't be surprised if I buy and start posting about other brands in the future though)

BTW- Doom3 on 5800U/AXP1700+ benches coming soon. ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: BFG10K
NV3x may surely have been lacking in shader performance, but certainly not lacking in gaming performance, I hope you would agree, as I have posted my rudimentary benchmarks in Far Cry.
Your benchmark is an apples vs oranges comparison. Force the game to use the SM 2.0 path and it'll run at half the speed of the R3xx series.

Would you please tell me why I would want to destroy half my performance? Is my goal here not to be able to play the game? What merit does this suggestion have that will do me any good? Here I just told you that I was happy with my IQ and my performance, and you tell me exactly how to take that all away.
So tell me, are you here for good or evil? Or is it whatever suits your purpose at the time...

Sheesh.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,039
32,530
146
Originally posted by: Rollo
The 5800U has stirred up more controversy for me than my dangling cross earring and red dyed hair style did in the mid 80s!
;)


Oh well. Those of you determined to call me a "nVidia fanboy", enjoy, you're welcome to your opinions. (don't be surprised if I buy and start posting about other brands in the future though)

BTW- Doom3 on 5800U/AXP1700+ benches coming soon. ;)
Heck, I'd have just given you the card if I'd known it was going to be the turd in the swimming pool it turned into :p The point about modding it into a much more expensive workstation card is excellent too.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Would you please tell me why I would want to destroy half my performance?
Are you interested in performing a valid comparison or not?

Here I just told you that I was happy with my IQ and my performance, and you tell me exactly how to take that all away.
No, I told you how to compare it accurately to other cards.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: DAPUNISHER
Originally posted by: Rollo
The 5800U has stirred up more controversy for me than my dangling cross earring and red dyed hair style did in the mid 80s!
;)


Oh well. Those of you determined to call me a "nVidia fanboy", enjoy, you're welcome to your opinions. (don't be surprised if I buy and start posting about other brands in the future though)

BTW- Doom3 on 5800U/AXP1700+ benches coming soon. ;)
Heck, I'd have just given you the card if I'd known it was going to be the turd in the swimming pool it turned into :p The point about modding it into a much more expensive workstation card is excellent too.

Fool! To give out 5800Us, you have to own 5800Us! That would mean you are a nVidiot fanboy, irrespective of the ability to mod them into $1000 workstation cards!

When will you nVidiots ever learn?!?!?!?!

The only decent cards made have pictures of Fred Flintstone's pet dinosaur on the HSF!

;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: BFG10K
Would you please tell me why I would want to destroy half my performance?
Are you interested in performing a valid comparison or not?

Here I just told you that I was happy with my IQ and my performance, and you tell me exactly how to take that all away.
No, I told you how to compare it accurately to other cards.

So in Far Cry, you're saying that my 5xxx card defaults to the 1.1 spec? And that I'd have to force 2.0 manually?

So what did the 61.76 driver, DX9.0c, and the 1.2 Crytek patch do for me? How can I tell if Far Cry is being run in 1.1 or 2.0?

And by the way, I am not concerned with how well R300 cards run this game. I don't own one so there is no reason to compare the R300 to my card. I am not disputing the fact that the R300 runs the game better. All I care about is how it runs on my card.