Originally posted by: Rollo
Originally posted by: Matthias99
In any case, the verdict on this one seems to be that it doesn't change IQ, and it is a driver bug causing a performance hit.
I disagree Mathias.
First off, my username has two 't's in it. It's right there in front of you; please take half a second and spell it right.
I think it's a little too good to be true that ATI missed something as big as this.
Have you seen the kind of bugs -- real bugs, not just performance issues -- that make it into both ATI's and NVIDIA's drivers from time to time? Neither company has the resources or time to sit there exhaustively testing every single application and combination of hardware and settings for every driver release.
Also, if you look at the first time this was posted (this thread is a repost), the FIRST THING I suggested was that ATI might be purposefully disabling some general Catalyst AI optimization that looks bad in FEAR. But that does not seem to be the case (or if it is affecting IQ, it is in a pretty subtle way).
Don't you think it's fairly LIKELY ATI tested the game without any optimizations at some point?
Yes. However, the explanation that was given was that this was an optimization designed for the FEAR demo (and that it had a positive effect there). I could easily see them not thinking to retest all of the optimizations for the retail version (and as other posters have pointed out, they may not have even had a specific "demo" build, and the code could have changed under them when the game went to retail).
Don't you think they have the retail version before it hits the market?
Depends on the developer, I would think. They probably have more access to game code than you or I do, but I can't give a blanket statement saying whether or not they have access to final retail code for a game before it comes out (let alone when they would have it relative to the release). They could easily have been testing the drivers (which have a 1-2 month lead time) against an earlier build of the game, and then something changed in the game engine in the retail release or with a post-retail patch.
Don't you think their supposedly professional staff would have noticed something as hokey as this "fix"?
I would think that sanity-checking that application-specific optimizations actually have a positive effect would be standard practice, but I don't know enough about their testing process or this exact situation. ATI's performance didn't seem to change much between the FEAR beta and retail, so they may not have thought to go back through any optimizations they had done for it with a fine-toothed comb. Maybe they would have caught it next week and fixed it in next month's driver anyway. I don't know, and neither do you.
Mathias, you're a nice guy, and a smart guy, but if you honestly think this is just some "big mixup" I've got a KILLER V2 SLI rig to sell you for only half what I paid for it.
:disgust:
That's why I'll believe it when I see it, and that's why I want to see some testing of it.
I'd like to see it tested systematically as well (and a better explanation from ATI as to what they were trying to do and why it went awry).
But when people who have tested it say there is no IQ loss that they can see, and an ATI rep posts saying it's a bug due to an optimization for the demo that adversely affects performance in the retail game, that sounds like a pretty reasonable explanation. I don't see anything here to suggest that ATI was 'cheating', and the more reasonable explanation is that it is just a bug.
This "fix" is the equivalent of "You mean we should have put the card in the PCIE slot? THAT'S why it wasn't working?!?!?"- way too easy, way too convenient, way unlikely.
No, frankly, it's not. Performance issues in complex 3D drivers and applications are not that easy to diagnose or fix.
I like the part about "Good find! It won't be in our driver release tomorrow, but we'll put it in sometime!"
LOL- how hard is it to change one character in the name of the executable and make the release tomorrow if this is really totally without issues?
You do realize that drivers like this (at least in good software development/QA setups) go through an extensive testing period, don't you? They'd have to hold up the entire 5.11 release just to test this one bugfix. It's easier and safer to put it in the pipeline for 5.12.
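As an aside: per-application tweaks like the Catalyst AI ones are generally keyed off the game's executable file name, which is why both the demo-vs-retail mixup and the "change one character in the executable" test make sense. Here is a minimal sketch of the general idea in C -- the names, table, and flag are all made up for illustration, not ATI's actual code:

#include <stdio.h>
#include <string.h>

struct app_profile {
    const char *exe_name;   /* executable the tweak was tuned for */
    int         demo_tweak; /* made-up optimization flag */
};

/* Made-up profile table: an entry carried over from the demo. */
static const struct app_profile profiles[] = {
    { "fear.exe", 1 },
};

static const struct app_profile *find_profile(const char *exe)
{
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        if (strcmp(profiles[i].exe_name, exe) == 0)
            return &profiles[i];
    return NULL;   /* no match -> generic, untweaked code path */
}

int main(void)
{
    /* Renaming the executable is enough to dodge the profile, which is
       why the one-character rename is a meaningful test. */
    const char *names[] = { "fear.exe", "fear1.exe" };
    for (int i = 0; i < 2; i++)
        printf("%-10s -> %s\n", names[i],
               find_profile(names[i]) ? "demo-era tweak applied"
                                      : "generic path");
    return 0;
}

So the bug report, the rename workaround, and the fix timeline are at least consistent with how this stuff is usually wired up.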
I'm saying they needed whatever was in the drivers that was making it slower in order to render comparable IQ, and that they knew that and couldn't release it unoptimized for fear of another Quack scandal.
'Quack', as it turned out, was also an ATI driver bug -- IIRC, they were applying an R1XX optimization to the R2XX hardware by mistake. The next version of their drivers fixed the IQ loss and kept the performance gain. Funny how you haven't mentioned that.
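The kind of mistake described there is easy to picture, by the way: an application-specific tweak gated on a hardware-family check that matches more chips than it should. A tiny, purely illustrative sketch (made-up names and values, not ATI's actual code):

#include <stdio.h>

enum chip_family { R100 = 100, R200 = 200 };

/* Meant to apply an old R100-era tweak only on R100-family chips; the
   sloppy "<=" comparison also matches R200 parts, which is the sort of
   slip that drags a new architecture into an old optimization. */
static int apply_r100_era_tweak(enum chip_family fam)
{
    return fam <= R200;   /* buggy: should be fam == R100 */
}

int main(void)
{
    printf("R100: %d  R200: %d\n",
           apply_r100_era_tweak(R100), apply_r100_era_tweak(R200));
    return 0;
}

Which would be consistent with the next driver release keeping the performance gain while fixing the IQ.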