It seems NVidia also has low performance in Max Payne 2

Max Payne 2 plays fast on ATI and Nvidia cards. It is a well-coded game. (NOW HEAR THIS, GAME COMPANIES) Learn to code and/or optimize YOUR games as well as Max Payne 2 :light:
 
Any talk of drivers being less than accurate *cough* aside (since we've all heard enough of that lately), I'm just amazed to see nV pull anything in the IQ department over ATI. It's refreshing, really; for ages the basic assumption has been ATI = quality and nV = speed. To see the roles reversed is amazing. I think if there were more color in the screen, ATI might have had a little bump in that direction. Then again, maybe that has changed too?

Now if nV is going to start having driver problems I'll be really upset. 😉



Crap, they've already slowed down releases.... nooooooo 😀


Edit: BTW, I hear that loud and clear, Kingmike; I'm sick of games taking the cutting edge in graphics solutions that don't work right.
You can really do some amazing things in DX8 that nobody really explored, IMHO.
I look at the Nature test in 3DMark2001 and think, OK, where's the game that looks like this?
Ohh no, you can't have that; we have new tech that looks a tiny bit better but runs much slower. 🙁
 
Originally posted by: stardust
Please pay attention to the van's lights right above Max Payne's head. Notice how the ATI drivers rendered less of them? (Those lights are very close to the center of the screen, so they shouldn't be affected by ATI's flower-like AA filtering.)

ATI drivers

Nvidia drivers

Very interesting indeed.

Oh my god, and the rain doesn't match up either! Someone is cheating 😉.

Seriously though, did you actually look at the pictures? The difference is so minute it's laughable. It could be caused by anything, like a slight difference in the POV of the two Maxes.


Originally posted by: gorillaman
It is minute, but that's just from one screenshot to the other. It's not so minute when the cards render billions of pixels and shader ops per second. Every little bit counts when you're dealing with colossal numbers of operations per millisecond.

This is a contradiction. The headlight is a very minute detail, one which, if ATI were "cheating" on it, would make maybe a 0.0001 fps difference in a scene like that.

Perhaps you can write ATI to remove the D3D_MP2_detect_headlight cheat in the next driver, eh 😛.
 
Originally posted by: jiffylube1024
Originally posted by: stardust
Please pay attention to the van's lights right above Max Payne's head. Notice how the ATI drivers rendered less of them? (Those lights are very close to the center of the screen, so they shouldn't be affected by ATI's flower-like AA filtering.)

ATI drivers

Nvidia drivers

Very interesting indeed.

Oh my god, and the rain doesn't match up either! Someone is cheating 😉.

Seriously though, did you actually look at the pictures? The difference is so minute it's laughable. It could be caused by anything, like a slight difference in the POV of the two Maxes.


Originally posted by: gorillaman
It is minute, but that's just from one screenshot to the other. It's not so minute when the cards render billions of pixels and shader ops per second. Every little bit counts when you're dealing with colossal numbers of operations per millisecond.

This is a contradiction. The headlight is a very minute detail, one which, if ATI were "cheating" on it, would make maybe a 0.0001 fps difference in a scene like that.

Perhaps you can write ATI to remove the D3D_MP2_detect_headlight cheat in the next driver, eh 😛.


R E P O S T (read Pete's post)

I did not mean for the two screenshots to justify why ATi had a better score; I just wanted everyone to know that Nvidia's drivers are now at peak IQ, maybe even showing a LITTLE bit more than ATI here and there, in their honest effort to go in the right direction.
 
Originally posted by: jiffylube1024
It could be caused by anything, like a slight difference in the POV of the two Maxes.

POV is the same; it's the beginning of the level, and the camera angles are set.
 
Originally posted by: stardust


I did not mean for the two screenshots to justify why ATi had a better score; I just wanted everyone to know that Nvidia's drivers are now at peak IQ, maybe even showing a LITTLE bit more than ATI here and there, in their honest effort to go in the right direction.

How does this screenshot prove that Nvidia's drivers have better IQ than ATI's?

And the POV isn't identical, because the rain and other effects are not identical. It is at a very slightly different point in time in the timedemo.
 
Originally posted by: stardust
Originally posted by: jiffylube1024
It could be caused by anything, like a slight difference in the POV of the two Maxes.

POV is the same; it's the beginning of the level, and the camera angles are set.

No, it's not the same. First of all, it's a different point in time, as evidenced by the rain, although you can't really help that. BUT someone moved the mouse before the screenshot was taken, because the two Maxes are facing slightly different directions!
 
If you've played the game, you know Max moves autonomously.

Open 2 windows, maximize them, scroll all the way down, and switch between the 2 pics; see for yourself that none of the other world entities have shifted.
 
Guys, you haven't seen character idle animations before? It's not hard to believe those two screens are taken from the same location. You should be able to eyeball-confirm it by switching between the two pics quickly in an app like IrfanView.
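If flipping between the pics by eye isn't conclusive, the difference can be quantified with a few lines of Python. A minimal sketch using the Pillow imaging library; "ati.png" and "nvidia.png" are hypothetical filenames for the two captures:

```python
# Quantify how different two same-size screenshots really are.
# "ati.png" / "nvidia.png" are hypothetical filenames.
from PIL import Image, ImageChops

a = Image.open("ati.png").convert("RGB")
b = Image.open("nvidia.png").convert("RGB")
assert a.size == b.size, "screenshots must share dimensions"

diff = ImageChops.difference(a, b)   # per-channel absolute difference
hist = diff.histogram()              # 768 bins: 256 each for R, G, B

total = a.width * a.height * 3
# Ignore tiny deltas (<= 8/255) so compression noise doesn't
# dominate the count; the rain streaks will still register.
changed = sum(n for i, n in enumerate(hist) if (i % 256) > 8)
print(f"{100 * changed / total:.2f}% of channel samples differ noticeably")
```

A tiny percentage, concentrated in the time-dependent effects like the rain, would back up the idle-animation explanation.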

nV's overall AF quality has usually been regarded as better than ATi's, but the Det 52s have dropped quality in D3D apps (trilinear -> brilinear, and all texture stages beyond the first get only 2xAF); not promising.
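For anyone unclear on the "brilinear" term: full trilinear blends two adjacent mip levels across the entire transition range, while brilinear snaps to the nearest mip except in a narrow band around the crossover, trading filtering quality for fewer texture fetches. A rough sketch of the two weighting curves; the 0.3 band width is an assumed illustration value, not a measured driver number:

```python
# Mip-blend weight as a function of fractional LOD between two mip
# levels. Full trilinear blends linearly across the whole range;
# "brilinear" stays on the nearest mip (plain bilinear) outside a
# narrow blend band. Band width of 0.3 is an assumption.
def trilinear_weight(lod_frac):
    return lod_frac

def brilinear_weight(lod_frac, band=0.3):
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0                      # pure bilinear on the nearer mip
    if lod_frac >= hi:
        return 1.0                      # pure bilinear on the farther mip
    return (lod_frac - lo) / (hi - lo)  # blend only inside the band

for f in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"lod {f:.2f}: trilinear {trilinear_weight(f):.2f}, "
          f"brilinear {brilinear_weight(f):.2f}")
```

Wherever the weight is pinned at 0 or 1, the second mip level never has to be fetched, which is where the fillrate saving comes from.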
 
It's clear that just about any mid-range to upper-class GPU can run this game very well.

Neither ATi nor Nvidia had any time to optimize drivers for this game. Pete, you stated yourself that Remedy kept the graphics engine under wraps.

I could imagine that with such a detailed game they did a lot of tweaking to boost FPS as much as possible, which is why it likely runs so well on every well-known GPU. However, I'm unconvinced that explains why the 5700 falls 20 FPS behind the 9600XT, or the 5950 vs. the 9800XT. This sounds like nothing more than another excuse.

Should we have to wait for Nvidia to make optimized drivers for every new game now? That's what it seems like. We are still waiting for more optimizations for Halo; come this Christmas, Nvidia is going to have to do a lot of housekeeping to be able to keep up.

I cannot in good faith recommend any FX card to any user after seeing, time and again, the FX falling dramatically short on framerate/IQ in these new games. I've tried to keep an unbiased view, but my faith in Nvidia is getting extremely difficult to hold on to.

 
Originally posted by: jiffylube1024

Originally posted by: stardust
Please pay attention to the van's lights right above Max Payne's head. Notice how the ATI drivers rendered less of them? (Those lights are very close to the center of the screen, so they shouldn't be affected by ATI's flower-like AA filtering.)

ATI drivers

Nvidia drivers

Very interesting indeed.



Oh my god, and the rain doesn't match up either! Someone is cheating 😉.

Seriously though, did you actually look at the pictures? The difference is so minute it's laughable. It could be caused by anything, like a slight difference in the POV of the two Maxes.


Actually I see very distinct features, but not ones that come from the back of Max's head.

#1. The Nvidia pic shows rough car edges compared to ATI's.

#2. ATI shows a lot more gloss on the paint and reflection on the side and hood of the car.

This is just from one pic, one frame. Just imagine seeing IQ problems like that time and again in each frame. No thank you. As an Nvidia owner you may not know what you are missing, but as an ATI owner I can easily see what you are missing out on.
 
Didn't say ATI cheated, Jiffy. And I like how you disregarded my last post completely and only saw what you wanted.
Others will agree. If you have a small detail (like van headlights or whatever) that is not completely rendered, at anywhere from 30-150 fps or whatever, it frees up processing power for something else. Like benchmarking. I would like to know why you don't think so... Nvidia was guilty of this a short while ago. They are now on the right track with their drivers and I am glad of that. It's not a matter of "who is cheating" but "whose drivers do a better job at the given task" at this point. It could be one single pixel that is not rendered, and that will save something over a few million operations per second. Read my posts for what they are and not anything else.

Things have changed from the fastest cards being most important to the best image quality being most important. ATI users used this to their advantage when Nvidia was faster in any given game and said speed wasn't the most important thing, but IQ. Now that may come back and bite all ATI users in the butt. Will they all go back to their "faster is better" now that Nvidia has caught up in IQ? I think they will. But people need to do whatever makes them feel better. I understand that. Me? I just like to sit back and watch the show in earnest.

GM

Another thing: don't keep your vision narrowed to just the van's headlights. Chances are that there is more than just the headlights. I haven't looked myself, but broaden your vision to the whole screenshot.
 
But ATi has both the better overall speed and the better overall IQ; that's why everyone recommended ATi so easily over nVidia the past few months. Now that nV is regaining some ground in speed, it'll come down to IQ again, and ATi will (in theory) win in shaders (FP24 vs. FP16) and in AA (jittered & gamma-corrected vs. plain old ordered grid), and lose in AF.

I seriously, seriously doubt applying less AF to a pair of headlights will free "a few million operations per second"; more likely, it'll save a touch of bandwidth. And you really can't say that will affect the framerate by more than a frame or two, at most. 3DCenter showed that nV didn't save much by dropping AF quality on the whole scene (brilinear and max 2xAF), so I don't expect a slightly blurrier headlight and pipe to rock the benchmark charts.
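To put rough numbers on that (every figure below is an assumption for illustration, not a measurement):

```python
# Back-of-envelope: texture fetches saved by dropping AF on a small
# on-screen detail. All numbers are assumptions for illustration.
screen_px    = 1024 * 768      # assumed benchmark resolution
headlight_px = 40 * 40         # rough on-screen area of the van's lights
fetches_full = 8 * 2           # 8x AF + trilinear: ~16 bilinear fetches/pixel
fetches_cut  = 2 * 2           # clamped to 2x AF: ~4 fetches/pixel

saved     = headlight_px * (fetches_full - fetches_cut)
per_frame = screen_px * fetches_full   # crude bound on scene texture work
print(f"fetches saved per frame: {saved}")
print(f"share of the frame's texture work: {100 * saved / per_frame:.3f}%")
```

Even with generous assumptions, the saving is a fraction of a percent of the frame's texture traffic, which is the point: a blurry headlight buys essentially nothing.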
 
Originally posted by: gorillaman


Things have changed from the fastest cards being most important to the best image quality being most important. ATI users used this to their advantage when Nvidia was faster in any given game and said speed wasn't the most important thing, but IQ. Now that may come back and bite all ATI users in the butt. Will they all go back to their "faster is better" now that Nvidia has caught up in IQ? I think they will. But people need to do whatever makes them feel better. I understand that. Me? I just like to sit back and watch the show in earnest.

GM

Another thing: don't keep your vision narrowed to just the van's headlights. Chances are that there is more than just the headlights. I haven't looked myself, but broaden your vision to the whole screenshot.


Chances are there are more things rendered improperly on the ATI card if you assume they are cheating. However, I did not make this assumption, and looking at both screenshots I saw no discernible difference in rendering anything that would make any real dent in framerate.

Things have changed from the fastest cards being most important to the best image quality being most important. ATI users used this to their advantage when Nvidia was faster in any given game and said speed wasn't the most important thing, but IQ. Now that may come back and bite all ATI users in the butt.

I said this before and I'll say it again - how does a slightly blurry headlight, taken from a slightly different POV on the ATI card, all of a sudden translate into Nvidia having the IQ crown?

Didn't say ATI cheated, Jiffy. And I like how you disregarded my last post completely and only saw what you wanted.
Others will agree. If you have a small detail (like van headlights or whatever) that is not completely rendered, at anywhere from 30-150 fps or whatever, it frees up processing power for something else. Like benchmarking. I would like to know why you don't think so... Nvidia was guilty of this a short while ago. They are now on the right track with their drivers and I am glad of that. It's not a matter of "who is cheating" but "whose drivers do a better job at the given task" at this point. It could be one single pixel that is not rendered, and that will save something over a few million operations per second. Read my posts for what they are and not anything else.

How much processing power does rendering a headlight slightly differently free up? So close to nothing it's silly to mention. And for that matter, why do you even assume that the headlight is not being rendered correctly, and think it's ATI cheating?

You can nitpick details like this all you want, but it's comparing apples to oranges if you think a slightly blurry headlight is analogous to Nvidia inserting clip planes or not rendering fog.
 
Originally posted by: gorillaman
Things have changed from the fastest cards being most important to the best image quality being most important. ATI users used this to their advantage when Nvidia was faster in any given game and said speed wasn't the most important thing, but IQ. Now that may come back and bite all ATI users in the butt. Will they all go back to their "faster is better" now that Nvidia has caught up in IQ? I think they will.
Bingo! It all boils down to the debate being shifted to what the Nvidia-haters want it to be. Remember when NV was beating the crap out of EVERY ATI card and the Fanchimps would hoot about how great the 2D quality was? It was like, "Who cares if 3D performance is a fraction of Nvidia's speed and quality? Our 2D IQ RULES!!! Suck it up, Nvidiots!!!"


If Nvidia came out with a card tomorrow that was 50% faster with superior IQ and cost half as much, the Fanchimps would simply dismiss it as proof that ATI was owning Nvidia and thus ATI is superior for forcing Nvidia to come out with a better card.


And WTF is with all this "lost faith"?!? They're friggin' VIDEO CARDS, not Supreme Deities - not that you'd know it from the endless Fanchimp worship.
 
If Nvidia came out with a card tomorrow that was 50% faster with superior IQ and cost half as much, the Fanchimps would simply dismiss it as proof that ATI was owning Nvidia and thus ATI is superior for forcing Nvidia to come out with a better card.

And WTF is with all this "lost faith"?!? They're friggin' VIDEO CARDS, not Supreme Deities - not that you'd know it from the endless Fanchimp worship.


The 9700p was revolutionary in GPU architecture. It did not sacrifice performance for IQ. So all of what you said is hearsay and subjective.

And yes, I lost faith; and you're right, ATi is no deity. So once again I seem to be arguing with someone who actually thought the "Underground Railroad" of the American slave era was an actual underground railroad.

Let's not trade insults. State the facts, not locker room chatter about your favorite video card, please.
 
Originally posted by: DefRef
Originally posted by: gorillaman
Things have changed from the fastest cards being most important to the best image quality being most important. ATI users used this to their advantage when Nvidia was faster in any given game and said speed wasn't the most important thing, but IQ. Now that may come back and bite all ATI users in the butt. Will they all go back to their "faster is better" now that Nvidia has caught up in IQ? I think they will.

Bingo! It all boils down to the debate being shifted to what the Nvidia-haters want it to be. Remember when NV was beating the crap out of EVERY ATI card and the Fanchimps would hoot about how great the 2D quality was? It was like, "Who cares if 3D performance is a fraction of Nvidia's speed and quality? Our 2D IQ RULES!!! Suck it up, Nvidiots!!!"


If Nvidia came out with a card tomorrow that was 50% faster with superior IQ and cost half as much, the Fanchimps would simply dismiss it as proof that ATI was owning Nvidia and thus ATI is superior for forcing Nvidia to come out with a better card.


And WTF is with all this "lost faith"?!? They're friggin' VIDEO CARDS, not Supreme Deities - not that you'd know it from the endless Fanchimp worship.

If you could just make one post without using derogatory labels and focus on the issues, that would be great. Calling everyone who defends ATI a Fanchimp doesn't give your argument very much credibility.

And I think you need to see a shrink about this "Nvidia-hater" nonsense - people don't hate video card companies!!! There is a difference between bias, and becoming disenfranchised with someone/something for past infidelities.
 
I love how you Fanchi....uh...ATI aficionados😛 get all whiny and defensive when called out on your extreme prejudice and outrageous bias against Nvidia. You think that anyone who dares contradict your mob of poo-flinging poor sports is engaging in locker room cheering for their favorite video card, even when NO SUCH CHEERING EXISTS.

Pitiful.

(Of course, if you have enough people blowing smoke up each other's arses, it's likely that they'll all start thinking themselves the righteous ones.)

Also, do you guys have the slightest grasp of the English language and what words REALLY mean?!? Judging from the rampant slaughter of spelling and general misuse of words, I suspect the real problem you guys have with my posts is YOU DON'T UNDERSTAND WHAT I'M SAYING!!! Try this:
There is a difference between bias, and becoming disenfranchised with someone/something for past infidelities.
"Disenfranchised"?!? You mean you lost your right to vote?!?!?😕 Could you possibly mean "disillusioned" or "disenchanted"? And "infidelities"?!?!? WTF is that? Did Nvidia cheat on you like your girlfriend or something?!? If you feel that Nvidia has taken some questionable steps to improve their benches and that your money is best spent elsewhere, that's a legitimate concern, but that's NOT what you're saying. There's a hella difference between "expression" and "communication" and the ATI advocates routinely rush past the boundry between rational and patently insane commentary.

Tell ya what, Sport - when you can show a basic level of literacy and reading comprehension, you can talk at me about credibility and bias.

FACTS: Nvidia slapped ATI's whack ass around for years until the 9700 was introduced. There was nothing revolutionary about the 9700 other than it being an efficient iteration of brute force architecture. Nvidia got screwed over by their fab in the move to the 0.13 micron process, causing delays and general funkiness which they're still digging out from. (Oh well, win some, lose the rest.) One thing lost in all the nattering about Nvidia being slower in PS2.0 functions is that wrangling 32 bits is more calculation-intensive than 24 bits (remember how you could boost frame rates by switching to 16-bit textures?), but the loyal ATI zealots give ATI a pass for taking an easier way. The new Catalyst drivers have all sorts of problems and incompatibilities, as well as a possible return to the CHEATING from the old "Quack" days, yet the ATI herd decry the Nvidia TWIMTBP campaign as a conspiracy, while the $6 million ATI/Valve marriage - resulting in Gabe "When's lunch?" Newell using out-dated drivers and generally talking smack against the company that didn't pay the grift - is proof that ATI is teh r0x0rZ!!!

BTW, where is the HL2 benchmark we were promised on Sept. 30th? Heck, WHERE'S THE FRIGGIN' GAME WE WERE PROMISED?!?!?!?

Sorry, kids, but if there were stickers of Calvin peeing on an Nvidia logo (a la Ford/Chevy/Dodge), you'd have them all over your cases and Trapper Keepers. Wrapping your hatred of Nvidia - and that's what it is, just as it was with the 3dfx Zombies - around a blob of bogus hurts and betrayals is disingenuous at best and flat-out laughable in general. If Nvidia doesn't offer the value you want, then go to ATI and God love ya, but you can't do that, and the intense worship of ATI and the REAL excuse-making for ATI's missteps reveal that the truly biased and uncivilized parties in this debate are firmly in the ATI camp. (You don't really see this sort of crap in the Intel vs. AMD discussions, do you?)

Sometimes you've just gotta call a shovel a shovel.
 
No, this is about deciding which card is better to buy for the type of gaming you want to experience. It is not a flame war between Ati and Nvidia fans.

The only thing you stated that was a known fact was Nvidia getting "screwed over" with the .13 micron process. In case you've already forgotten, this was Nvidia's decision; they made a bad judgment and have been suffering the consequences of falling behind ever since. Maybe their hardware hasn't necessarily suffered, but as you can imagine, Nvidia had to spend a company fortune to remake those failed FX cards. You make it sound like Nvidia is the victim, which is horribly wrong. People who bought those $450 cards with the failed micron process were the victims.

I wouldn't even bother trying to think over the rest of what you wrote, since most of it is the regurgitated monstrosities I hear in every other Ati vs Nvidia thread.

You brought up, I think, driver optimizations and cheats? I too thought that was BS and propaganda used by ATi fans to bash Nvidia. But that still doesn't change the fact that Nvidia's new $400 FX cards fall short in games like Max Payne 2. No, not by a marginal frame or two, but by a margin of at least 20 frames per second. Halo is almost the same thing, trailing behind by a good 10 frames per second. This is on top of the delayed Half-Life 2, which Valve repeatedly stated it had trouble developing for on the FX hardware.

Because ATi fans played their cards about Nvidia cheating, Nvidia fans played their cards about ATi drivers being unstable. How naive to think that there are no Nvidia owners out there who have had a monitor blow up after a driver update, or to think Nvidia's drivers run stable on every chipset made. Since Nvidia has such a high reputation from their Ti4 DX8 drivers, no one would believe any hardware conflict or slowdown might be the fault of Nvidia's drivers.

Bottom line: the 9800 outperforms the FX by a great deal in Max Payne 2 without any driver optimizations from either party. Which is what the topic states, does it not?

Or must you continue to act like Johnnie Cochran, ignore the facts, and bash other people's English "literacy"? Because I don't take very seriously a member who quotes a word like "r0x0rZ" to support their arguments. Hopefully this will bring us back on some sort of topic.
 
Ho hum, where to start.....

There was nothing revolutionary about the 9700 other than it being an efficient iteration of brute force architecture.
Iteration? The R300 was the first of its series, bub. Brute force? Hardly; brute force is the NV30 series, lacking so much in architecture that you have to ramp clock speeds up almost to the point where your manufacturer tells you it will not hurt its bottom line by wasting precious silicon making your product. The R300 was everything Nvidia said was NOT possible: an efficient 8-pipeline DX9 architecture, on 0.15 micron, with a 256-bit memory bus. Nvidia, in multiple conferences and appearances, said ATI was stupid to think they could pull the R300 off, and ATI proved them wrong on all counts.

One thing lost in all the nattering about Nvidia being slower in PS2.0 functions is that wrangling 32 bits is more calculation-intensive than 24 bits (remember how you could boost frame rates by switching to 16-bit textures?), but the loyal ATI zealots give ATI a pass for taking an easier way.

32 bits is more calculation-intensive, and it's actually considered overkill for most shading work; nobody is going to need to sample a 524288x524288 texture for many, many years. And who is taking the easier way here: ATI with 24-bit, perfectly balanced for shading work and accurately able to sample a 2048x2048 texture with 4-5 bits of subpixel precision left, or Nvidia with its 16-bit mode - used 95% of the time - that is only able to accurately sample a 256x256 texture?
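The arithmetic behind those texture-size figures can be checked directly: a floating-point coordinate's mantissa bits get split between addressing whole texels and subpixel position. A quick sketch, assuming mantissa widths of 10 bits for FP16 (s10e5), 16 bits for FP24 (R300-style s16e7), and 23 bits for FP32 (s23e8):

```python
# Subpixel precision left after addressing an NxN texture with a
# normalized coordinate. Mantissa widths assumed: FP16 = 10 bits,
# FP24 = 16 bits, FP32 = 23 bits. Illustration, not vendor docs.
import math

def subpixel_bits_left(mantissa_bits, texture_size):
    """Bits of subpixel precision left after addressing whole texels."""
    return mantissa_bits - int(math.log2(texture_size))

for fmt, mant in (("FP16", 10), ("FP24", 16), ("FP32", 23)):
    for size in (256, 2048, 524288):
        left = subpixel_bits_left(mant, size)
        note = f"{left} subpixel bits" if left >= 0 else "cannot address every texel"
        print(f"{fmt}, {size}x{size} texture: {note}")
```

Under these assumptions, FP24 handles a 2048x2048 texture with 5 subpixel bits to spare, FP16 runs out at 256x256 with only 2, and FP32's headroom is that 524288x524288 figure, matching the numbers quoted above.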

The new Catalyst drivers have all sorts of problems and incompatibilities, as well as a possible return to the CHEATING from the old "Quack" days

Catalysts having problems? You think the Detonators aren't bug-ridden as well? You should go visit some Nvidia sites and look at all of the problems they are having with their cards. I know some people who even refuse to upgrade past drivers in the 30 series or the 44.03, because the other drivers cause so many problems for them, not to mention the forced options you can't turn off. And puh-lease, be quiet with the Quack crap; if you weren't ignorant about the whole situation you wouldn't even be talking about it right now or bringing it up every time somebody finds a new cheat in Nvidia's drivers. The whole thing lasted 3 weeks, was fixed with an actual performance increase, and actually was a driver bug caused by drivers not yet ready for the new 8500 chip.

yet the ATI herd decry the Nvidia TWIMTBP campaign as a conspiracy, while the $6 million ATI/Valve marriage - resulting in Gabe "When's lunch?" Newell using out-dated drivers and generally talking smack against the company that didn't pay the grift - is proof that ATI is teh r0x0rZ!!!

News flash, buddy: Ati wasn't the only one gunning for this thing, you know; Nvidia was an active bidder also, and they didn't exactly get out of the whole situation scot-free. That's like trying to steal something but then stopping because somebody else stole it before you could. How do you know which drivers were out-dated, either? EVERY Det 50 series driver had problems rendering fog in certain games, so how exactly are the problems Gabe and the gang observed the result of "out-dated" drivers, which are out officially now and still contain the same errata they pointed out? What do you think would have happened if Nvidia had outbid ATI? Would you be going on the board talking about how Nvidia fixed the benchmarks intentionally, trying to make ATI look bad? Didn't think so.
 
Originally posted by: DefRef

Also, do you guys have the slightest grasp of the English language and what words REALLY mean?!? Judging from the rampant slaughter of spelling and general misuse of words, I suspect the real problem you guys have with my posts is YOU DON'T UNDERSTAND WHAT I'M SAYING!!! Try this:
There is a difference between bias, and becoming disenfranchised with someone/something for past infidelities.
"Disenfranchised"?!? You mean you lost your right to vote?!?!?😕 Could you possibly mean "disillusioned" or "disenchanted"

Yeah, I did mean to say something along the lines of disenchanted - my bad.


And "infidelities"?!?!? WTF is that? Did Nvidia cheat on you like your girlfriend or something?!?

Infidelity as in their breach of trust or outright deception of buyers re: their drivers and optimizations.

Your discovery of one misused word in my post (which I am sorry for) totally deconstructed my argument. Bravo!

Tell ya what, Sport - when you can show a basic level of literacy and reading comprehension, you can talk at me about credibility and bias.

Tell you what, Sport: after you're done sniffing your own turds and calling them potpourri, you can talk to me about "fanboyism" and bias.

There was nothing revolutionary about the 9700 other than it being an efficient iteration of brute force architecture.

If there were a surer sign of bias, I'd like to see it.


Sometimes you've just gotta call a shovel a shovel.

And sometimes you have to call a fanboy a fanboy, DefRef, and that would be you. Until you can at least consider the other side of the argument (i.e. ATI), this is what you will be in everyone's minds.
 
Originally posted by: Pete
DefRef,

Grow up.

Yes, and don't put silly words in Spanish... If someone here is an "aficionado," that's you. Or are you trying to show us you know a word in Spanish?
 