Renaming F.E.A.R.'s .exe increases performance on ATi cards?


Creig

Diamond Member
Oct 9, 1999
Rollo, it's called a BUG! Both ATI and nVidia have LISTS of them attached to each driver release.

There's no conspiracy, only a backwards "if" statement. If you have proof then by all means share it with the rest of us so that we can evaluate it ourselves. Otherwise, quit your incoherent ATI bashing and let the thread get back on topic.
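
To make the "backwards if" concrete: application-specific driver optimizations are typically keyed to the executable's file name, so a single inverted comparison (or a stale match left over from the demo) can apply a tuned code path to exactly the wrong build. The Python sketch below is purely illustrative; the profile contents and names are invented, since nobody outside ATI has seen the actual Catalyst source.

# Hypothetical sketch of exe-name-keyed profile selection in a driver.
DEMO_PROFILE = {"prefetch_shadow_maps": True}  # invented demo-tuned tweak

def select_profile(exe_name):
    profile = {}
    # Intended: apply the demo-tuned path only where it helps. If this
    # match is wrong (a backwards condition, or a rule written for the
    # demo build), retail FEAR.exe inherits a tweak that now hurts it,
    # and renaming the .exe sidesteps the match entirely -- exactly the
    # symptom reported in this thread.
    if exe_name.lower() == "fear.exe":
        profile.update(DEMO_PROFILE)
    return profile

print(select_profile("FEAR.exe"))   # {'prefetch_shadow_maps': True}
print(select_profile("FEAR2.exe"))  # {} -- renamed exe misses the profile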
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: Rollo
Dude, I work in the software industry. I don't code, but I do plenty of beta testing. If something we wrote can run 15%-20% faster by us changing one letter in a line of code with no adverse effects:
A. It would have been done before the product saw market
B. It would be done yesterday upon realization.
Do you even have the most basic understanding of the concept of device drivers and how they interact with applications? Do you understand what an application specific optimization is and what its limitations are?

Perhaps you should get a clue, "beta tester", before mouthing off about things you know nothing about.
 

jasonja

Golden Member
Feb 22, 2001
Originally posted by: Rollo
Originally posted by: Matthias99

Actually, the original topic of the thread was why renaming an app could cause the performance to change. Dragging up old optimization/'cheating' scandals from ATI or NVIDIA is already going off-topic.

In any case, the verdict on this one seems to be that it doesn't change IQ, and it is a driver bug causing a performance hit.

I disagree Mathias.

I think it's a little too good to be true that ATI missed something as big as this and some clever kids on the internet just happened to find a way to make one of the most popular and benchmarked games out right now run 10-15 fps faster with no IQ loss.

What I was commenting on is ATI saying "Great find! Thank you kids, although it is a highly paid job for many of our programming staff to make popular games run faster at better IQ on our hardware, none of us even noticed that our foolish app-specific optimizations make the retail version of the game run much SLOWER! Thank God you ingenious kids did our jobs for us, we only designed the hardware and wrote the drivers, how could we be expected to know all of our work was screwing us?!?"


LOL

Yeah, right!

Don't you think it's fairly LIKELY ATI tested the game without any optimizations at some point?

Don't you think they have retail before it hits the market?

Don't you think their supposedly professional staff would have noticed something as hokey as this "fix"?

Mathias, you're a nice guy, and a smart guy, but if you honestly think this is just some "big mixup" I've got a KILLER V2 SLI rig to sell you for only half what I paid for it.

That's why I'll believe it when I see it, and that's why I want to see some testing of it.

This "fix" is the equivalent of "You mean we should have put the card in the PCIE slot? THAT'S why it wasn't working?!?!?"- way too easy, way too convenient, way unlikely.


For once Rollo you got something right. It's quite obvious that ATI noticed this well before some kid on the net did. CatalystMaker is a marketing/PR guy, not a lead programmer. He's just making the kiddies feel good.

{edit} Well of course, now I read Rollo thinks this is also a conspiracy to hide IQ cheats, with no proof or reason. You're showing your lack of ANY programming experience, because any number of things could slow a program down that have NO effect on the eventual outcome. Put on your aluminum foil beanie hat, Rollo...
 

nRollo

Banned
Jan 11, 2002
Originally posted by: jasonja
...
For once Rollo you got something right. It's quite obvious that ATI noticed this well before some kid on the net did. CatalystMaker is a marketing/PR guy, not a lead programmer. He's just making the kiddies feel good.

{edit} Well of course, now I read Rollo thinks this is also a conspiracy to hide IQ cheats, with no proof or reason. You're showing your lack of ANY programming experience, because any number of things could slow a program down that have NO effect on the eventual outcome. Put on your aluminum foil beanie hat, Rollo...


LOL- I've never said I was a programmer and never pretended to be. The closest I get is some SQL.

The point is you don't have to be a programmer to see what a farce this whole deal is.

As you seem to agree with me that this wasn't discovered by lovable hackers, is it such a stretch to think Catalyst Maker the PR guy is just buying time till the driver team can replicate that speed while rendering the image correctly?

I think it's pretty obvious to anyone not determined to defend the honor of ATI to the death that if the game ran better without any optimizations, they would have noticed that and not bothered.
 

jiffylube1024

Diamond Member
Feb 17, 2002
Having played the FEAR demo and full version on both an X850XT PE and a 7800GT, I can say that something was off on the X850XT PE. While the demo stuttered on both cards, it was at least playable and framerates were decent otherwise. Going from the demo to the full version of FEAR on the X850XT PE netted absolutely no increase whatsoever; indeed the same section from the demo actually played a bit slower on the X850XT PE. Something felt off, so the fact that a bug has been discovered does not surprise me whatsoever.
 

nts

Senior member
Nov 10, 2005
Originally posted by: Rollo
...
As you seem to agree with me that this wasn't discovered by lovable hackers, is it such a stretch to think Catalyst Maker the PR guy is just buying time till the driver team can replicate that speed while rendering the image correctly?
...

There is absolutely no proof that image quality is affected in any way, and IQ taking a hit doesn't even make sense in this case.

Why do you keep saying IQ is affected when you have no proof?

 

jasonja

Golden Member
Feb 22, 2001
Originally posted by: Rollo
...

As you seem to agree with me that this wasn't discovered by lovable hackers, is it such a stretch to think Catalyst Maker the PR guy is just buying time till the driver team can replicate that speed while rendering the image correctly?

I think it's pretty obvious to anyone not determined to defend the honor of ATI to the death that if the game ran better without any optimizations, they would have noticed that and not bothered.

I think your conspiracy theory is crap... the only thing I agree with you on is that ATI was already aware of the issue and didn't need somebody in the forums to point it out to them. They had access to the final game and to-be-released drivers long before the public does. Why do they need to buy time? There are NO IQ differences. The fact is you just love to turn everything like this that happens to ATI into some sort of scandal, but when nVidia has "bugs" they are just bugs.
 

crazydingo

Golden Member
May 15, 2005
Originally posted by: jasonja
CatalystMaker is a marketing/PR guy, not a lead programmer. He's just making the kiddies feel good.
I thought CatalystMaker is Terry Makedon himself. :confused: You know the guy who broke 5150joker's heart. :brokenheart: :D

Originally posted by: nts
...

There is absolutely no proof that image quality is affected in any way, and IQ taking a hit doesn't even make sense in this case.

Why do you keep saying IQ is affected when you have no proof?
Agreed.

If there was any difference in IQ, we would have had a couple of articles already.

<fixed typos>
 

jasonja

Golden Member
Feb 22, 2001
Originally posted by: nts
...

There is absolutely no proof that image quality is affected in any way, and IQ taking a hit doesn't even make sense in this case.

Why do you keep saying IQ is affected when you have no proof?

Because he's Rollo and slandering ATI is what he does. Don't look for reason in his actions, think of him as one of those rednecks driving a big Ford truck with a Calvin taking a whiz on a Chevy symbol sticker on the rear window.
 

5150Joker

Diamond Member
Feb 6, 2002
Originally posted by: crazydingo

I thought CatalystMaker is Terry Makedon himself. :confused: You know the guy who broke 5150joker's heart. :brokenheart: :D

Broke my heart? You mean when he brushed me and a bunch of others off when we asked for alpha antialiasing? In that case, it's ok, I made enough noise about it and then showed my lack of enthusiasm for ATi by buying a GTX.

 

nRollo

Banned
Jan 11, 2002
Originally posted by: nts
...

There is absolutely no proof that image quality is affected in any way, and IQ taking a hit doesn't even make sense in this case.

Why do you keep saying IQ is affected when you have no proof?

Why do you say it isn't when you have no proof?

IQ taking a hit DOES make sense, remember the deal with nVidia getting faster fps on some bench recently because a bug in the drivers made their cards not render shadows properly? Same could be happening here.

 

5150Joker

Diamond Member
Feb 6, 2002
Originally posted by: jasonja
...

Because he's Rollo and slandering ATI is what he does. Don't look for reason in his actions, think of him as one of those rednecks driving a big Ford truck with a Calvin taking a whiz on a Chevy symbol sticker on the rear window.


So should we likewise think of ATi fans as sissies that drive pink scooters and cry when they get a hangnail? Just wondering.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: jasonja
...

Because he's Rollo and slandering ATI is what he does. Don't look for reason in his actions, think of him as one of those rednecks driving a big Ford truck with a Calvin taking a whiz on a Chevy symbol sticker on the rear window.


LOL, dude, I'm from the Midwest, and probably get to laugh at more of those than you!

Heh- and I drive a Silverado. ;)
 

crazydingo

Golden Member
May 15, 2005
Originally posted by: 5150Joker
Originally posted by: crazydingo
I thought CatalystMaker is Terry Makedon himself. :confused: You know the guy who broke 5150joker's heart. :brokenheart: :D
Broke my heart? You mean when he brushed me and a bunch of others off when we asked for alpha antialiasing?
No, I was referring to the incident where he donated a card to some guy and you got upset over it and felt neglected. :p
 

jiffylube1024

Diamond Member
Feb 17, 2002
Originally posted by: Rollo
IQ taking a hit DOES make sense, remember the deal with nVidia getting faster fps on some bench recently because a bug in the drivers made their cards not render shadows properly? Same could be happening here.

You just don't get it, do you? This is not Quake/Quack; this is the opposite. Quake/Quack was an optimization that improved performance (the sort of thing both ATI and Nvidia frequently do these days, since it didn't sacrifice image quality and would be considered quite benign by today's standards).

This is a bug whereby changing the EXE name increases performance, a stupid coding error in an optimization that ATI has acknowledged.

Your skill for putting a negative and conspiratorial spin on all things ATI is impressive, I have to give you that. And your penchant for avoiding the mods is even more skillful, given that the content of many of your latest posts is the epitome of FUD.
 

jasonja

Golden Member
Feb 22, 2001
Originally posted by: crazydingo
I thought CatalystMaker is Terry Makedon himself. :confused: You know the guy who broke 5150joker's heart. :brokenheart: :D

He is... Terry is a product manager, but forum posting = marketing
 

5150Joker

Diamond Member
Feb 6, 2002
Originally posted by: crazydingo
Originally posted by: 5150Joker
Originally posted by: crazydingo
I thought CatalystMaker is Terry Makedon himself. :confused: You know the guy who broke 5150joker's heart. :brokenheart: :D
Broke my heart? You mean when he brushed me and a bunch of others off when we asked for alpha antialiasing?
No, I was referring to the incident where he donated a card to some guy and you got upset over it and felt neglected. :p


Oh LOL yeah, damn that DW!
 

crazydingo

Golden Member
May 15, 2005
Originally posted by: jasonja
Originally posted by: crazydingo
I thought CatalystMaker is Terry Makedon himself. :confused: You know the guy who broke 5150joker's heart. :brokenheart: :D
He is... Terry is a product manager, but forum posting = marketing
Aight. He doesn't post much, though; 50 posts over 2 years.
 

jasonja

Golden Member
Feb 22, 2001
Originally posted by: 5150Joker
...


So should we likewise think of ATi fans as sissies that drive pink scooters and cry when they get a hangnail? Just wondering.


Way to miss my point... but hey at least Rollo got it.

p.s. Don't knock my scooter... it gets great MPG, and a hangnail can really hurt!
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
Originally posted by: Rollo
...

Why do you say it isn't when you have no proof?

IQ taking a hit DOES make sense, remember the deal with nVidia getting faster fps on some bench recently because a bug in the drivers made their cards not render shadows properly? Same could be happening here.


So your theory is guilty until proven innocent for ATI, and vice versa for nVidia. You know what else makes sense? There was an optimization that helped the FEAR demo, but then turned out to hurt the full version of the game. This makes a lot more sense than an IQ cheat, considering several people have tested it and said there is no difference in IQ!
 

nRollo

Banned
Jan 11, 2002
Originally posted by: jiffylube1024
Your skill for putting a negative and conspiratorial spin on all things ATI is impressive, I have to give you that. And your penchant for avoiding the mods is even more skillful, given that the content of many of your latest posts are the epitome of FUD.

Thanks Jiffy, I think you're pretty smart as well, and always have. :):beer:

 

Matthias99

Diamond Member
Oct 7, 2003
Originally posted by: Rollo
Originally posted by: Matthias99
In any case, the verdict on this one seems to be that it doesn't change IQ, and it is a driver bug causing a performance hit.

I disagree Mathias.

First off, my username has two 't's in it. It's right there in front of you; please take half a second and spell it right.

I think it's a little too good to be true that ATI missed something as big as this

Have you seen the kind of bugs -- real bugs, not just performance issues -- that make it into both ATI's and NVIDIA's drivers from time to time? Neither company has the resources or time to sit there exhaustively testing every single application and combination of hardware and settings for every driver release.

Also, if you look at the first time this was posted (this thread is a repost), the FIRST THING I suggested was that ATI might be purposefully disabling some general CatalystAI optimization that looks bad in FEAR. But that does not seem to be the case (or if it is affecting IQ, it is in a pretty subtle way).

Don't you think it's fairly LIKELY ATI tested the game without any optimizations at some point?

Yes. However, the explanation that was given was that this was an optimization designed for the FEAR demo (and that it had a positive effect there). I could easily see them not thinking to retest all of the optimizations again for the retail version (and as other posters have pointed out, they may not have even had a specific "demo" build, and the code could have changed under them when the game went to retail).

Don't you think they have retail before it hits the market?

Depends on the developer, I would think. They probably have more access to more code than you or I, but I can't give a blanket statement saying whether or not they have access to final retail code for a game before it comes out (let alone when they would have it relative to the release). They could have easily been testing the drivers (which have a 1-2 month lead time) off of an earlier build of the game, and then something changed in the game engine in the retail release or with a post-retail patch.

Don't you think their supposedly professional staff would have noticed something as hokey as this "fix"?

I would think that sanity-checking that application-specific optimizations actually have a positive effect would be standard practice, but I don't know enough about their testing and this exact situation. ATI's performance didn't seem to change much between the FEAR beta and retail, so they may not have thought to go back and go through any optimizations they had done for it with a fine-toothed comb. Maybe they would have caught it next week and fixed it in next month's driver anyway. I don't know, and neither do you.

Mathias, you're a nice guy, and a smart guy, but if you honestly think this is just some "big mixup" I've got a KILLER V2 SLI rig to sell you for only half what I paid for it.

:disgust:

That's why I'll believe it when I see it, and that's why I want to see some testing of it.

I'd like to see it tested systematically as well (and a better explanation from ATI as to what they were trying to do and why it went awry).

But when people who have tested it say there is no IQ loss that they can see, and an ATI rep posts saying it's a bug due to an optimization for the demo that adversely affects performance in the retail game, that sounds like a pretty reasonable explanation. I don't see anything here to suggest that ATI was 'cheating'; a plain bug is the simpler explanation.

This "fix" is the equivalent of "You mean we should have put the card in the PCIE slot? THAT'S why it wasn't working?!?!?"- way too easy, way too convenient, way unlikely.

No, frankly, it's not. Performance issues in complex 3D drivers and applications are not that easy to diagnose or fix.

I like the part about "Good find! It won't be in our driver release tomorrow, but we'll put it in sometime!"

LOL- how hard is it to change one character in the name of the executable and make the release tomorrow if this is really totally without issues?

You do realize that drivers like this (at least in good software development/QA setups) go through an extensive testing period, don't you? They'd have to hold up the entire 5.11 release just to test this one bugfix. It's easier and safer to put it in the pipeline for 5.12.

I'm saying they needed whatever was in the drivers making it slower to render comparable IQ, and that they knew that and couldn't release it unoptimized for fear of another Quack scandal.

'Quack', as it turned out, was also an ATI driver bug -- IIRC, they were applying an R1XX optimization to the R2XX hardware by mistake. The next version of their drivers fixed the IQ loss and kept the performance gain. Funny how you haven't seemed to mention that.
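
The kind of mistake Matthias99 describes for 'Quack' is easy to picture as a per-chip lookup gone wrong. A minimal sketch, with chip families, tweak names, and values all invented for illustration:

# Hypothetical per-chip optimization table (every value here is made up).
R100_OPTS = {"texture_lod_bias": -1.0}  # tolerable on R1XX, visible on R2XX
R200_OPTS = {"texture_lod_bias": 0.0}

def opts_for_chip(chip_family):
    table = {"R100": R100_OPTS, "R200": R200_OPTS}
    # A 'Quack'-shaped bug: a careless default means any chip the table
    # doesn't recognize silently inherits the R1XX tweaks.
    return table.get(chip_family, R100_OPTS)

print(opts_for_chip("R200"))  # correct entry
print(opts_for_chip("R250"))  # falls through to the R1XX tweaks by mistake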
 

kurt454

Senior member
May 30, 2001
Picked up the performance on mine. I am currently running an X800GTO. I can't tell any differences in IQ. If there are any, I guess you would have to take screenshots to see it. /shrugs
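
kurt454's screenshot idea is easy to make rigorous. A minimal sketch using the Python Imaging Library, assuming two captures of the same frame (the file names here are invented):

from PIL import Image, ImageChops

# Same frame captured once with the original and once with the renamed exe.
before = Image.open("fear_original_exe.png").convert("RGB")
after = Image.open("fear_renamed_exe.png").convert("RGB")

diff = ImageChops.difference(before, after)
bbox = diff.getbbox()  # None means the two frames are pixel-identical
if bbox is None:
    print("No pixel differences: identical IQ for this frame")
else:
    print("Differences found in region", bbox)
    diff.save("iq_diff.png")  # save the delta image for closer inspection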
 

VERTIGGO

Senior member
Apr 29, 2005
I got some small performance increases with the AI disabled (renaming the .exe), but I've found the single greatest factor to be the AA setting. Even temporal AA at 4x/2 didn't help it out. I'm not sure if this is a problem with my particular setup, or if everyone sees the same:

I run 1280x1024 with all settings maxed except soft shadows, FEAR2.exe:

MIN 13
AVG 22
MAX 56

59% below 25
38% between 25 and 40
3% above 40

As soon as I disabled AA, I got this:

MIN 27
AVG 45
MAX 115

0% below 25
37% between 25 and 40
63% above 40

Clearly, until I decide on an SLI/CrossFire setup, my AA-addicted ass will have to settle for jaggies :disgust:
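
For reference, the MIN/AVG/MAX figures and percentage buckets above are the kind FEAR's built-in benchmark derives from per-second fps samples. A quick sketch of that arithmetic (the sample data is invented):

def bucket_report(fps_samples):
    n = len(fps_samples)
    below = sum(1 for f in fps_samples if f < 25)
    mid = sum(1 for f in fps_samples if 25 <= f <= 40)
    above = n - below - mid
    print("MIN", min(fps_samples), "AVG", round(sum(fps_samples) / n),
          "MAX", max(fps_samples))
    print(round(100 * below / n), "% below 25")
    print(round(100 * mid / n), "% between 25 and 40")
    print(round(100 * above / n), "% above 40")

bucket_report([13, 18, 22, 24, 31, 38, 45, 56])  # invented samples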