Renaming F.E.A.R.'s .Exe increases performance on ATi cards ?


Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Rollo
Originally posted by: Creig
Originally posted by: Rollo
LOL
ATI says "it was all just a big misunderstanding- we were trying to cheat on the demo to sell cards- and it hosed the retail version!" and we all believe them.
Not me, Ackmed, I think I'll withhold judgement till some websites analyze the wonder fix.

You're truly amazing, Rollo.
Thanks Creig! Nice of you to say!


Originally posted by: Rollo
I remember Quack, 3DMark, Trylinear, etc. and think that ATI is in a position these days that they'd say or do about anything to try and save some face.

Yes, yes... And nV was caught cheating big time on 3DMark. But as we all know, your memory is amazingly selective.
No, I just like to stay "on topic". The thread is about ATI app-specific driver optimizations, not nVidia app-specific driver optimizations.
You'll notice I didn't mention S3, Matrox, Intel, 3DFX, Rendition, or PowerVR either?


Know why?


Same reason- off topic. ;):beer:

-Amazing Rollo


If the IQ stays the same, there's nothing wrong with app-specific optimizations. This applies to both companies - Ati's FEAR and Doom3 optimizations, or Nv's FartCry and shader replacement optimizations. But the nv30 cheats were a new low for Nv.
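For anyone wondering how renaming an executable can change performance at all: consumer graphics drivers commonly keep a table of per-game tweaks keyed on the executable's file name. The sketch below is purely illustrative (the profile names and flags are invented, and real drivers implement this lookup in C inside the driver binary), but it shows why a renamed exe silently falls back to the generic defaults:

```python
# Illustrative sketch only: the profile names and flags are invented,
# and real drivers implement this lookup in C inside the driver binary.
APP_PROFILES = {
    "fear.exe":  {"shader_replacement": True,  "memory_tweaks": True},
    "doom3.exe": {"shader_replacement": True,  "memory_tweaks": False},
}
DEFAULT_PROFILE = {"shader_replacement": False, "memory_tweaks": False}

def profile_for(exe_name):
    """Return the per-game tweak set for an executable name.

    The lookup is keyed on the file name alone, so renaming FEAR.exe
    to FEER.exe bypasses the FEAR profile and the driver runs the
    game on its generic code path instead.
    """
    return APP_PROFILES.get(exe_name.lower(), DEFAULT_PROFILE)
```

If the FEAR-specific path happens to be slower than the generic one, which is what ATI's driver-bug explanation amounts to, the rename shows up as a free performance gain.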
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Rollo
Originally posted by: Creig
Originally posted by: Rollo
I remember Quack, 3DMark, Trylinear, etc. and think that ATI is in a position these days that they'd say or do about anything to try and save some face.

Yes, yes... And nV was caught cheating big time on 3DMark. But as we all know, your memory is amazingly selective.
No, I just like to stay "on topic". The thread is about ATI app-specific driver optimizations, not nVidia app-specific driver optimizations.
You'll notice I didn't mention S3, Matrox, Intel, 3DFX, Rendition, or PowerVR either?

Know why?

Same reason- off topic. ;):beer:

-Amazing Rollo

Actually, the original topic of the thread was why renaming an app could cause the performance to change. Dragging up old optimization/'cheating' scandals from ATI or NVIDIA is already going off-topic.

In any case, the verdict on this one seems to be that it doesn't change IQ, and it is a driver bug causing a performance hit.
 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: Rollo
Originally posted by: Creig
Yes, yes... And nV was caught cheating big time on 3DMark. But as we all know, your memory is amazingly selective.
No, I just like to stay "on topic". The thread is about ATI app-specific driver optimizations, not nVidia app-specific driver optimizations.
You'll notice I didn't mention S3, Matrox, Intel, 3DFX, Rendition, or PowerVR either?


Know why?


Same reason- off topic. ;):beer:

-Amazing Rollo
Coming from a guy who goes into the X1800 XT thread and brings up the 512MB GTX, this is hilarious. :laugh: One day you will get there Rollo. ;) (becoming turtle mark II) :D
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Rollo
No, I just like to stay "on topic". The thread is about ATI app-specific driver optimizations, not nVidia app-specific driver optimizations.

Actually, no. The thread topic is "Renaming F.E.A.R.'s .Exe increases performance on ATi cards ?", not "Let's dredge up what ATI may or may not have done years ago".

So much for you staying "on topic"...
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: keysplayr2003
Originally posted by: Ackmed
It has been said that it's the same IQ. Unless you have proof otherwise, then it's the same. There is a very good thread about it on the real forums.

How about: unless you have proof it's the same, it's otherwise. That works here, too.
You have a point, but try not to make it one sided all the time. That is one of the biggest problems in this forum. I can't believe you actually said "I take the lead ATI programmers word." That is so un-Ackmed like.

He is a regular there, as were others from ATi. They put not only their rep, but their company's rep, on the line when posting. He has never posted something that wasn't correct, from what I have seen. I would believe the same from the head driver programmer at NV as well, if I knew their posting habits as well. It's not one sided. People have said they can't tell the difference; I believe them and Cat Maker.

Rollo, believe Quack was a cheat all you want; I don't. But I guess you can't stay on topic, even though you claim that you are.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Matthias99

Actually, the original topic of the thread was why renaming an app could cause the performance to change. Dragging up old optimization/'cheating' scandals from ATI or NVIDIA is already going off-topic.

In any case, the verdict on this one seems to be that it doesn't change IQ, and it is a driver bug causing a performance hit.

I disagree, Matthias.

I think it's a little too good to be true that ATI missed something as big as this, and some clever kids on the internet just happened to find a way to make one of the most popular and benchmarked games out right now run 10-15fps faster with no IQ loss.

What I was commenting on is ATI saying "Great find! Thank you kids, although it is a high-paid job for many of our programming staff to make popular games run faster at better IQ on our hardware, none of us even noticed that our foolish app-specific optimizations make the retail version of the game run much SLOWER! Thank God you ingenious kids did our jobs for us, we only designed the hardware and wrote the drivers, how could we be expected to know all of our work was screwing us?!?"


LOL

Yeah, right!

Don't you think it's fairly LIKELY ATI tested the game without any optimizations at some point?

Don't you think they have the retail version before it hits the market?

Don't you think their supposedly professional staff would have noticed something as hokey as this "fix"?

Matthias, you're a nice guy, and a smart guy, but if you honestly think this is just some "big mixup" I've got a KILLER V2 SLI rig to sell you for only half what I paid for it.

That's why I'll believe it when I see it, and that's why I want to see some testing of it.

This "fix" is the equivalent of "You mean we should have put the card in the PCIE slot? THAT'S why it wasn't working?!?!?"- way too easy, way too convenient, way unlikely.




 

route66

Senior member
Sep 8, 2005
295
0
0
Originally posted by: crazydingo
Coming from a guy who goes into the X1800 XT thread and brings up the 512MB GTX, this is hilarious. :laugh: One day you will get there Rollo. ;) (becoming turtle mark II) :D

QFT.

Also, I don't see why Rollo thinks ATi is trying to hide something by purposefully making it slower - ever hear of mistakes?

Now please, go away and play Sacrifice or something...

 

nts

Senior member
Nov 10, 2005
279
0
0
Rollo, do you actually know (read) what you type or do you just tune in and out?

I'm sorry but what you are saying makes very little sense. Go read your post and think it through.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: route66
Originally posted by: crazydingo
Coming from a guy who goes into the X1800 XT thread and brings up the 512MB GTX, this is hilarious. :laugh: One day you will get there Rollo. ;) (becoming turtle mark II) :D

QFT.

Also, I don't see why Rollo thinks ATi is trying to hide something by purposefully making it slower - ever hear of mistakes?

Now please, go away and play Sacrifice or something...


I think you're still misunderstanding me- I'm saying they're hiding something by saying this is a good way to make it FASTER.

I'm saying they needed whatever was in the drivers making it slower to render comparable IQ, and that they knew that and couldn't release it unoptimized for fear of another Quack scandal.

I'm saying that they're covering up to hopefully get close to this performance with a driver that will render close enough to what it should.

I like the part about "Good find! It won't be in our driver release tomorrow, but we'll put it in sometime!"

LOL- how hard is it to change one character in the name of the executable and make the release tomorrow if this is really totally without issues?

Sounds rough- change FEAR.exe to FAR.exe, and now your drivers provide the magic performance increase and everyone is happy.

Or could it be they know why they can't do that and are stalling, letting you do that, and if the reason is uncovered they have no liability?

-Amazing Rollo
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nts
Rollo, do you actually know (read) what you type or do you just tune in and out?

I'm sorry but what you are saying makes very little sense. Go read your post and think it through.


Dude, I work in the software industry. I don't code, but I do plenty of beta testing. If something we wrote could run 15-20% faster by changing one letter in a line of code with no adverse effects:
A. It would have been done before the product saw market.
B. It would be done yesterday upon realization.

You "think about it". You're way too trusting.
 

route66

Senior member
Sep 8, 2005
295
0
0
Originally posted by: Rollo
I'm saying they needed whatever was in the drivers making it slower to render comparable IQ, and that they knew that and couldn't release it unoptimized for fear of another Quack scandal.

Do you have proof that IQ is worse? No, no one does. In fact some people are saying it looks fine. You don't have to believe everything people say, I don't, but your opinion is far from fact. Actually, your crackpot theory is pretty laughable.

LOL- how hard is it to change one character in the name of the executable and make the release tomorrow if this is really totally without issues?

Well, now we know that you don't work in any kind of software development house with a QA department....
 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: Rollo
I think you're still misunderstanding me- I'm saying they're hiding something by saying this is a good way to make it FASTER.

Hiding, how do you come to that conclusion?

I'm saying they needed whatever was in the drivers making it slower to render comparable IQ, and that they knew that and couldn't release it unoptimized for fear of another Quack scandal.

So with no optimizations (memory controller ones, probably) at default quality, IQ will be worse than with optimized quality...

I'm saying that they're covering up to hopefully get close to this performance with a driver that will render close enough to what it should.

I like the part about "Good find! It won't be in our driver release tomorrow, but we'll put it in sometime!"

LOL- how hard is it to change one character in the name of the executable and make the release tomorrow if this is really totally without issues?

You do know that ATi's internal driver is 2 to 3 months/revisions ahead of what is released? You do know that they go through stages in development, and you don't change something in a late stage (unless you want to start testing and everything again)?

Yes, NVIDIA releases their beta of the week, which usually breaks another thing to make something faster.

Sounds rough- change FEAR.exe to FAR.exe, and now your drivers provide the magic performance increase and everyone is happy.

Or could it be they know why they can't do that and are stalling, letting you do that, and if the reason is uncovered they have no liability?

-Amazing Rollo

Yes and we never landed on the moon...


 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: Rollo
Originally posted by: nts
Rollo, do you actually know (read) what you type or do you just tune in and out?

I'm sorry but what you are saying makes very little sense. Go read your post and think it through.


Dude, I work in the software industry. I don't code, but I do plenty of beta testing. If something we wrote could run 15-20% faster by changing one letter in a line of code with no adverse effects:
A. It would have been done before the product saw market.
B. It would be done yesterday upon realization.

You "think about it". You're way too trusting.


Just FYI, I work as a developer (that means coding). Not all changes are as simple as changing a letter or flipping a conditional; you have no idea how their drivers are coded and what effect a change may have somewhere else. Also, you don't just make last-minute changes to drivers that are to be released the next day (unless you want to delay them); they would at least need to run through their base tests again.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: route66
Well, now we know that you don't work in any kind of software development house with a Q&A department....

Dude, if the optimizations weren't needed, they wouldn't be there, simple as that. If simply re-naming the executable in the drivers had no ill effect on running FEAR, "FEAR.exe" would become "FEER.exe" until they had time to remove it entirely.

What catastrophe are you implying would happen if this is as harmless as all claim? Do you think that Doom3 will run worse if they change their optimization for FEAR? Isn't that what the "application specific" is all about? No one is posting "Curses, I'm missing half my desktop because I renamed the FEAR.exe" are they? :roll:

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nts
Originally posted by: Rollo
Originally posted by: nts
Rollo, do you actually know (read) what you type or do you just tune in and out?

I'm sorry but what you are saying makes very little sense. Go read your post and think it through.


Dude, I work in the software industry. I don't code, but I do plenty of beta testing. If something we wrote could run 15-20% faster by changing one letter in a line of code with no adverse effects:
A. It would have been done before the product saw market.
B. It would be done yesterday upon realization.

You "think about it". You're way too trusting.


Just FYI, I work as a developer (that means coding). Not all changes are as simple as changing a letter or flipping a conditional; you have no idea how their drivers are coded and what effect a change may have somewhere else. Also, you don't just make last-minute changes to drivers that are to be released the next day (unless you want to delay them); they would at least need to run through their base tests again.


Great, you're a programmer.

As it's your job to code, can you even conceive of releasing a product to the public as fubared as this supposedly is?

LOL

"What do you know? We shouldn't have done anything at all!"
 

route66

Senior member
Sep 8, 2005
295
0
0
Originally posted by: Rollo
Originally posted by: route66
Well, now we know that you don't work in any kind of software development house with a QA department....

Dude, if the optimizations weren't needed, they wouldn't be there, simple as that. If simply re-naming the executable in the drivers had no ill effect on running FEAR, "FEAR.exe" would become "FEER.exe" until they had time to remove it entirely.

What catastrophe are you implying would happen if this is as harmless as all claim? Do you think that Doom3 will run worse if they change their optimization for FEAR? Isn't that what the "application specific" is all about? No one is posting "Curses, I'm missing half my desktop because I renamed the FEAR.exe" are they? :roll:

You're not answering the question: what do you have to lose by renaming the file to "FEER.EXE"? Apparently nothing; no one has reported any problems or loss of IQ. This is simply ineptitude on ATI's part, nothing malicious. Until you have proof that ATI is hiding something, please STFU. Not much more to say on this...

 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
I'm of the opinion that ATi simply screwed up optimizing for FEAR and the game runs better when it's not detected by the drivers. I don't think there's anything nefarious going on here.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
BTW NTS- what do you think they use as a baseline when they begin their optimizations?

We all knew the retail version of FEAR ran significantly differently than the demo.

Do you think it's possible ATI knew that as well?

Do you think it's possible they started with the retail version, unoptimized, and worked forward from there to create the driver optimizations for that game?

Or are you saying they probably just said "What the heck? This worked well for the demo, and it's only our livelihood riding on this; let's use the same code for the retail version and go drink instead of making sure nothing has changed!"

LOL- I may not code, but if that is the state of affairs in the ATI driver dept, woe unto ATI, they are freaking DOOMED. Nobody does business that slipshod.

(although I suppose you could use the laughable 16X12 60Hz limitation built into X850 Crossfire as evidence they do?)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: 5150Joker
I'm of the opinion that ATi simply screwed up optimizing for FEAR and the game runs better when it's not detected by the drivers. I don't think there's anything nefarious going on here.

You might be right, 5150, but I have a hard time believing ATI's driver dept is that stupid.

No way they would have made that mistake; they have to have QA to prevent it.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rollo
Originally posted by: 5150Joker
I'm of the opinion that ATi simply screwed up optimizing for FEAR and the game runs better when it's not detected by the drivers. I don't think there's anything nefarious going on here.

You might be right, 5150, but I have a hard time believing ATI's driver dept is that stupid.

No way they would have made that mistake; they have to have QA to prevent it.


I dunno, man, I think you're giving them too much credit. Like you mentioned, they did let the 60Hz x-fire limitation squeak by, didn't they?
 

Steelski

Senior member
Feb 16, 2005
700
0
0
Originally posted by: 5150Joker
Originally posted by: Rollo
Originally posted by: 5150Joker
I'm of the opinion that ATi simply screwed up optimizing for FEAR and the game runs better when it's not detected by the drivers. I don't think there's anything nefarious going on here.

You might be right, 5150, but I have a hard time believing ATI's driver dept is that stupid.

No way they would have made that mistake; they have to have QA to prevent it.


I dunno, man, I think you're giving them too much credit. Like you mentioned, they did let the 60Hz x-fire limitation squeak by, didn't they?

That's a hardware problem that can't be fixed at all if the older X800 cards don't support dual-thingy... I can't remember the name of it, but it's the one that is required for the Apple Cinema Display.
It's different for the X1800 series.

 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: Rollo
BTW NTS- what do you think they use as a baseline when they begin their optimizations?

They start getting drops of the game as soon as it begins development (something that can be tested). The demo and retail versions, besides the assets, should be the same (except for the updated game code, of course).

We all knew the retail version of FEAR ran significantly differently than the demo.

Do you think it's possible ATI knew that as well?

Yes, working with the developers they are probably aware of the changes being made, to make sure it runs fine on their hardware.

Do you think it's possible they started with the retail version, unoptimized, and worked forward from there to create the driver optimizations for that game?

The question here is: what version did/do they have? They don't get separate special demo and retail versions. It's one copy with whatever assets.

They may have started optimizing on the demo game code. Low-level code for graphics should not change; usually development of the engine is finished first, then it's adding the assets and game-specific things. If something huge changed from the demo to the retail version, then whatever they were optimizing before may have an adverse effect.

I would be very interested to know what changed, it had to have been something pretty low level and big to have that kind of an effect.

Or are you saying they probably just said "What the heck? This worked well for the demo, and it's only our livelihood riding on this; let's use the same code for the retail version and go drink instead of making sure nothing has changed!"

No.

Since it wasn't specified what IF statement they had messed up, it might have been a date-stamp check on the exe: IF before release, apply optimization; ELSE don't.

LOL- I may not code, but if that is the state of affairs in the ATI driver dept, woe unto ATI, they are freaking DOOMED. Nobody does business that slipshod.

(although I suppose you could use the laughable 16X12 60Hz limitation built into X850 Crossfire as evidence they do?)

That's just flamebait.

I maintain my stance that the 5.11 driver was too late to have that change made; if they want, they can release a quick fix or fix it with the 5.12. This is probably already in their release notes with a solution.
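The date-stamp guess above can be sketched concretely. Everything here is hypothetical: the cutoff date, the function name, and the check itself are invented to illustrate the class of bug, not taken from ATI's actual driver code:

```python
from datetime import date

# Hypothetical reconstruction of the kind of check described above.
# The cutoff date and the logic are invented for illustration only.
RELEASE_CUTOFF = date(2005, 10, 18)  # invented retail-release boundary

def should_apply_demo_tuning(exe_name, exe_build_date):
    """Gate demo-era tuning on the exe name plus a build-date check.

    A renamed exe fails the name match and gets the generic path.
    A matching name whose build date still passes the gate (say,
    because the cutoff was never updated for the retail build)
    keeps firing tuning that was validated only against the demo.
    """
    if exe_name.lower() != "fear.exe":
        return False  # renamed exe: no app-specific tuning at all
    return exe_build_date < RELEASE_CUTOFF
```

Under this reading, the retail slowdown and the rename "fix" would be two sides of the same stale conditional.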

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: 5150Joker
Originally posted by: Rollo
Originally posted by: 5150Joker
I'm of the opinion that ATi simply screwed up optimizing for FEAR and the game runs better when it's not detected by the drivers. I don't think there's anything nefarious going on here.

You might be right, 5150, but I have a hard time believing ATI's driver dept is that stupid.

No way they would have made that mistake; they have to have QA to prevent it.


I dunno, man, I think you're giving them too much credit. Like you mentioned, they did let the 60Hz x-fire limitation squeak by, didn't they?


I suppose you could be right. :(
 

EightySix Four

Diamond Member
Jul 17, 2004
5,122
52
91
Guys chill... It happens... Someone made a mistake.
"He who is without sin among you, let him throw a stone at her first"

;) Don't mean to be religious, but trying to make a point.
 

elkinm

Platinum Member
Jun 9, 2001
2,146
0
71
Does this bug apply to all drivers or just the latest ones? I am still using 5.7 and have not updated. Will I see an improvement?