Originally posted by: DAPUNISHER
Thanks, and props to both of you, for the hard work, and enduring of the monotony that is benchmarking. And partially for our edification no less! :beer:
I have a couple questions, and don't know if it has been covered: Do either of you think the reason each found the opposite card to offer a "smoother" experience, could be because of the cards having been paired with that IHV's chipset? Perhaps the bios and/or drivers are just better optimized or tuned for their own products? Because it sounds like something that isn't quantifiable by data like the min/avg/max FPS, but something almost intangible, that can only be obtained by observation.
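That "intangible" smoothness can actually be approximated with data: frame-time consistency matters more than min/avg/max FPS. A minimal sketch of the idea, with made-up frame times (none of these numbers come from the benches in this thread):

```python
# Rough sketch: quantify "smoothness" from a per-frame time log (milliseconds),
# rather than min/avg/max FPS. All numbers below are invented for illustration.
frame_times_ms = [16.7, 16.9, 17.1, 33.4, 16.8, 16.6, 17.0, 48.2, 16.7, 16.9]

# Average FPS hides hitches entirely.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# 99th-percentile frame time exposes the spikes you *feel* while playing.
idx = min(int(0.99 * len(frame_times_ms)), len(frame_times_ms) - 1)
worst = sorted(frame_times_ms)[idx]

print(f"avg FPS: {avg_fps:.1f}")
print(f"99th-percentile frame time: {worst:.1f} ms")
```

Two runs with identical average FPS can have very different 99th-percentile frame times; the spiky one is the one that feels less "smooth", which would explain a difference that min/avg/max numbers never show.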
While I'm not trying to defend HOCP, it is something they try to impart in their non-apples-to-apples comparisons. i.e. The "playable" settings. Which is my other question: Has this endeavor caused either of you to find validity to that methodology now, if you didn't before?
Originally posted by: keysplayr2003
Originally posted by: quattro1
Sucks you went through all that trouble and did not benchmark Far Cry correctly. On your site, you show pics of the 8800GTS looking weird when you enabled HDR. That is because you have AA turned on. If you are trying to get HDR+AA in Far Cry, you need to use the 1.4 Beta patch, not the released 1.4 patch.
The patch can be found here:
http://downloads.guru3d.com/download.php?det=1293
Once this patch is installed, you must type in the console, in this order:
r_FSAA 2 (now press Enter)
r_HDRRendering 1-11
If you have HDR on first before setting r_fsaa 2, it will still look broken.
Now you are thinking ATI looks fine with the 1.4 released patch with HDR, but is AA on and is AA actually being done? Take a look...
Any chance you will correct this?
Rather than try to get HDR working on a game that the devs never really intended to include HDR in, and install a "BETA" patch (if you've noticed, the word "beta" was taboo in all of my benches. No betas allowed.) to slap on Crytek's version of HDR that NOBODY was ever truly happy with anyway, I felt running with every other bell and whistle, including maxed AA and AF, would reflect a more realistic usage of the game. If I did the beta patch, it just wouldn't feel right. Just scotch-taped on. So there is really nothing to correct. Everyone I talked to about it says they turn off HDR in Far Cry anyway, saying it was too overdone and overpowering and they could not find just the right level. So, forget HDR in Far Cry. There are plenty of other games that properly utilize HDR. Lost Coast, for example, looks really nice with HDR.
Originally posted by: keysplayr2003
Originally posted by: DAPUNISHER
Thanks, and props to both of you, for the hard work, and enduring of the monotony that is benchmarking. And partially for our edification no less! :beer:
I have a couple questions, and don't know if it has been covered: Do either of you think the reason each found the opposite card to offer a "smoother" experience, could be because of the cards having been paired with that IHV's chipset? Perhaps the bios and/or drivers are just better optimized or tuned for their own products? Because it sounds like something that isn't quantifiable by data like the min/avg/max FPS, but something almost intangible, that can only be obtained by observation.
While I'm not trying to defend HOCP, it is something they try to impart in their non-apples-to-apples comparisons. i.e. The "playable" settings. Which is my other question: Has this endeavor caused either of you to find validity to that methodology now, if you didn't before?
That is seriously a good question. I have an nForce chipset, and apoppin has the P35 CrossFire chipset, right? Interesting indeed.
16x12 is about as high as either single card will manage well with full details plus 4xAA/16xAF ... higher res would require turning down some details to make it "playable" ... kinda hard to compare a la HardOCP's style of mixed testing using subjective criteria for "playability"
Wish it had some high rez benchies
Originally posted by: quattro1
Originally posted by: keysplayr2003
Originally posted by: quattro1
Sucks you went through all that trouble and did not benchmark Far Cry correctly. On your site, you show pics of the 8800GTS looking weird when you enabled HDR. That is because you have AA turned on. If you are trying to get HDR+AA in Far Cry, you need to use the 1.4 Beta patch, not the released 1.4 patch.
The patch can be found here:
http://downloads.guru3d.com/download.php?det=1293
Once this patch is installed, you must type in the console, in this order:
r_FSAA 2 (now press Enter)
r_HDRRendering 1-11
If you have HDR on first before setting r_fsaa 2, it will still look broken.
Now you are thinking ATI looks fine with the 1.4 released patch with HDR, but is AA on and is AA actually being done? Take a look...
Any chance you will correct this?
Rather than try to get HDR working on a game that the devs never really intended to include HDR in, and install a "BETA" patch (if you've noticed, the word "beta" was taboo in all of my benches. No betas allowed.) to slap on Crytek's version of HDR that NOBODY was ever truly happy with anyway, I felt running with every other bell and whistle, including maxed AA and AF, would reflect a more realistic usage of the game. If I did the beta patch, it just wouldn't feel right. Just scotch-taped on. So there is really nothing to correct. Everyone I talked to about it says they turn off HDR in Far Cry anyway, saying it was too overdone and overpowering and they could not find just the right level. So, forget HDR in Far Cry. There are plenty of other games that properly utilize HDR. Lost Coast, for example, looks really nice with HDR.
Then why post two images of the 8800 with HDR and AA on at the same time? Your screenshots are exactly what happens when you turn those on together. The 8800 renders HDR perfectly fine in Far Cry if you turn off AA. Or if you want both, use the beta patch.
Also, you need the beta patch for both ATI and NVIDIA to get HDR+AA together. Even though ATI did not show that weirdness, was it actually doing HDR and AA together? Without the beta patch, no.
Far Cry was not built to run AA+HDR together, though it has always had HDR, from day 1. You have it backwards on your site.
Either way nice job on the review, except this part 🙂
Originally posted by: apoppin
you are quite welcome ... i planned to do something for myself ... and then got a little carried away after talking to Keys.
Originally posted by: DAPUNISHER
Thanks, and props to both of you, for the hard work, and enduring of the monotony that is benchmarking. And partially for our edification no less! :beer:
I have a couple questions, and don't know if it has been covered: Do either of you think the reason each found the opposite card to offer a "smoother" experience, could be because of the cards having been paired with that IHV's chipset? Perhaps the bios and/or drivers are just better optimized or tuned for their own products? Because it sounds like something that isn't quantifiable by data like the min/avg/max FPS, but something almost intangible, that can only be obtained by observation.
While I'm not trying to defend HOCP, it is something they try to impart in their non-apples-to-apples comparisons. i.e. The "playable" settings. Which is my other question: Has this endeavor caused either of you to find validity to that methodology now, if you didn't before?
"the smoother" might be caused by bias ... i really don't know. i also had a x850xt and a 7800GS and didn't find the GS "smoother" where FPS were equal ... i would DISREGARD both Keys' and my opinion on the subjective issues [including noise]
Heck, it might be the mouse 😛
[--i have several, no difference]
--or maybe the MBs ... mine IS a crossfire MB, Keys' is SLI - but i doubt it
and as to HardOCP - no, i DETEST the way they do it. i can make EITHER video card "look better" by manipulating the variables ... imo, Kyle's "conclusion" shows he DID manipulate it to make the 320 look much better than the 2900xt
i.e. simply crank up the AA and the GTS wins ... or at a higher resolution; IF i HAD to game at 16x12 - right now -i'd pick the GTS ... if i needed "max AA", i'd also pick the GTS
IF HardOCP's method is to be useful - and it CAN be useful - he needs a MUCH larger group of variables to test. As it stands - right now - i won't visit there - except for 'amusement' - again. And i can't take his results seriously as i experienced a faster card vs the 2900xt for myself
Keys and i have different rigs, yet our final results support each other ... not HardOCP's conclusions ... take it FWIW
is there a free demo?
Any chance we can get some Supreme Commander numbers? It's the only game i really care about.
it doesn't interest me
if anyone cares to send me a legal copy, i will bench it on my 2900xt - the GTS is gone back to Best Buy
[and return it of course after uninstalling it]
it is extraordinarily CPU dependent; runs better on QC than DC and i would need to bench at a minimum of 16x12 to show the real differences; my CRT is fine as long as i do not use the switch to turn it off ... i am enjoying Dual-display now - except there is no room anymore for anything else.
There is a free demo.
Originally posted by: keysplayr2003
Others will disagree and say they "need" Vista 64. Though the reasoning escapes me ATM.
Originally posted by: apoppin
it is extraordinarily CPU dependent; runs better on QC than DC and i would need to bench at a minimum of 16x12 to show the real differences; my CRT is fine as long as i do not use the switch to turn it off ... i am enjoying Dual-display now - except there is no room anymore for anything else.
There is a free demo.
and if i can bench off the demo, i will be glad to ... BUT you have to understand, i am on dial-up and 200MB is pretty much an "overnight" ... and sometimes the installer fails because of the D/L manager. Worst of all, when you DO bench it, you find it is not necessarily indicative of the full game's performance ... i.e. the CoJ demo runs worse than the CoJ full game ... and they rarely patch demos or bother to fine-tune drivers for them
Finally, if you STILL want me to do it, it is after i finish D/L'ing Steam's Lost Coast [so i can save it], the Overlord and Lost Planet updates ... and a slew of other applications and benches ... and of course with no GTS to compare to - Keys or someone else will have to post their benches and you can extrapolate the performance deltas.
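For anyone who does want to extrapolate a delta from benches run on two different rigs, the arithmetic is just a percentage comparison. A rough sketch with made-up FPS numbers (not anyone's actual results from this thread):

```python
# Hypothetical numbers only: comparing two cards benched on different rigs.
# Only meaningful if neither rig is CPU-limited at the tested resolution.
xt_fps = 52.0    # 2900 XT average FPS on rig A (invented)
gts_fps = 57.0   # 8800 GTS average FPS on rig B (invented)

# Percentage delta of the GTS relative to the XT baseline.
delta_pct = 100.0 * (gts_fps - xt_fps) / xt_fps
print(f"GTS over XT: {delta_pct:+.1f}%")
```

The caveat is the same one raised above: in a CPU-dependent game like Supreme Commander, both rigs can hit the same CPU ceiling, which would make any cross-rig GPU delta look smaller than it really is.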
... as i said .. i'm back to benchmarking in about 2 weeks and i think i will have recovered sufficiently by then ...
--damnit, i am a GAMER and i *need* to play some good PC games
... and by that time, i expect we will have Cat 7.7 released
... the 'lucky' Cats 😉
EDIT: seriously, i am glad to revisit my benchmarks with the new drivers and i will be glad to add new ones by [reasonable] request. I expect to add Lost Planet and Overlord as i am playing them next. Of course, ONCE i get the benchmark, it is useful for many months as i just keep updating the games to the latest patch. And - this time - i will not have a new OS - actually 2 new OSes - to wrestle with as well as brand new HW to OC and stabilize.
what i am really looking forward to is seeing how x-fire does with these benches ... will i get the same results as HardOCP?
stay tuned ... i'd say i hope to upgrade with the next sale XT sale after Crysis - IF things go OK for me and mine.
n7 is the man to ask ... i'd suggest you PM him before he gets rid of one of his GPUs ... he can test the ultra vs the 1GB 2900xt 😉
Eh screw it then, since you sent back the GTS anyway there's really nothing to compare it against. It is however a fantastic game and i suggest trying the demo
BUT and it is a Big One ... you are complaining and complaining while i find Vista a JOY to use
Originally posted by: n7
Originally posted by: keysplayr2003
Others will disagree and say they "need" Vista 64. Though the reasoning escapes me ATM.
Reasoning is extremely simple.
32-bit is a dead end, & any gamer/enthusiast looking ahead even a couple years knows it.
4 GB RAM on Vista will be the norm very quickly, as Vista does a very decent job with micro-management of RAM IMO, meaning there's really no such thing as too much RAM for Vista.
Also, considering how cheap high resolution displays are getting, with future games, i know very well 2-3 GB of RAM simply isn't going to suffice.
Heck, IMO, it doesn't already.
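The 4 GB figure above comes straight from the 32-bit address-space ceiling; a quick sanity check of the arithmetic:

```python
# Why 32-bit tops out at 4 GB: a 32-bit pointer can address at most 2**32 bytes.
addressable_bytes = 2 ** 32
print(addressable_bytes // (1024 ** 3), "GiB")  # prints: 4 GiB
# In practice a 32-bit Windows install exposes less than 4 GiB to applications,
# since part of the address space is mapped to video memory and other devices,
# which is exactly why big-VRAM cards plus lots of RAM push people to 64-bit.
```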
Originally posted by: nitromullet
So, the very short version of this is:
-GTS/X overall faster than XT 512MB/1GB in DX9 benchmarks
-Keys prefers the GTS, citing greater smoothness overall
-apoppin prefers the XT 512MB at 1280x1024 and 1400x900 resolutions, but would probably give the GTS the nod at higher resolutions
-n7 prefers the XT 1GB at the 2560x1600, citing better driver support in Vista x64, but overall isn't really impressed with either card given the price of each
Interesting and somewhat inconclusive results. That isn't a bad thing IMO... I think what I take away from all the hard work you guys have put into this is that although ATI was out the door later with R600 than NV was with G80, both products compete against each other quite well.
excuse me ... higher than 16x12?
Originally posted by: n7
I very much appreciated keys & apoppin's detailed benching, but the one thing lacking was higher resolution benching IMHO 😛
I do think that the GTS is certainly a better deal at < 1600x1200, possibly even 1920x1200, but at higher resolutions + AA/AF, i do find the 2900 XT 512 MB to be better overall, at least based on some reviews like X-bits.
I kinda wish i could bench both those cards at 2560x1600, as i feel running cards @ higher resolutions than they are really capable of gives a very good indication of the future.
Basically, it almost seems to come down to which games you play, preference of certain features, etc., etc.
As for my HD 2900 XT 1 GB vs. 8800 GTX, my opinion is still somewhat pending.
I installed the leaked 162.15 Vista x64 drivers (the ones that got pulled) recently, & thus far, i've been impressed with them, though only a couple games.
CoJ DX10 bench no longer crashes/BSODs (like the 158.24s were), & in the little UT2k4 i played, the semi-lockups didn't appear.
So maybe, just maybe, nV is actually working on some of the issues.
I need more time in actual games with them to say for sure, but if two main issues i was having don't surface again, there's no question i'll be keeping the 8800 GTX, as my main concerns with it were drivers, not performance.
Vice versa on the 2900 XT 1 GB...
READ post No. 5 ... it is mine ... and [1], 2, 3, and 4 have all of Keys' benches
I can't be bothered ...
Originally posted by: apoppin
welcome back !
Car crash!
wth happened?
are you OK?
sooo what did I miss
Originally posted by: apoppin
absolutely NOT worth replacing it
-nor is it worth replacing a 2900xt with a GTS 640
it comes down to economics and what you prefer
[or if 'noise' and power are issues, the GTS is the choice]