Wait, what? Why would you play at 1366x768 and use AA to emulate a higher resolution instead of just playing at a higher resolution?
Not a fair comparison.
ATI is next gen; Nvidia doesn't have its next gen out yet.
Once Fermi is out, then you can do Fermi vs the ATI 5890... thanks
Also, how do you show fps in ME2? I'm running at max details @ 1280x1024 w/ 4xSSAA and 8xAA and would be curious to see what kind of fps I'm getting. It feels ~60fps smooth 95% of the time but would be interesting to double check.
Not everyone has a monitor with a native resolution of 1920x1200. AA is very useful for making games look smoother at low resolutions.
How exactly are you running 4xSSAA and 8xAA at the same time? Does nhance allow you to force SSAA on top of forced MSAA?
Or before I dive off the deep end with curiosity.. did you mean 8xAF? 😀
You told ME that upgrading the GPU on MY system would make an impact because of benchmarks you saw on ANOTHER system. Epic fail.

You obviously have comprehension problems. "These benches", as in the Mass Effect 2 benches, would show a bigger difference from the GPU. Guess what? They do! Or did you fail to see the linear scaling with the GPU?
I can't believe I have to spell this out for you. I thought you people were good at maths? Clearly not, or you have no education whatsoever.

Athlon x1: 2000 MHz + 20% overclock = 2400 MHz, 1 core
Athlon x2: 2000 MHz + 20% overclock = 2400 MHz, 2 cores
Why should I? I specifically chose this LCD for gaming. I mentioned this several times. Get over it and move on.

A real option, as in a 1998 resolution. Talk to me when you can use SSAA @ 1080p.
IQ = image quality. Might have been funny in kindergarten, but now it just makes you look like a child.

Your amazing IQ. You made a funny!
I play at 1366x768. And, as above, it's a choice I made.

That's before you told me you play at 1024x768. Come back to the 21st century. We stopped playing at 1024x768 more than 10 years ago.
It's "They got fed up" and "on the other hand". Please learn some basic grammar.

They get fed up because I give them something to think about. You in other hand give 1 big blank. Get an education you.
Something to think about? Regarding dot pitch? You basically thought a higher dot pitch would give you more detail, and everybody kept explaining to you that this isn't the case. And you went on and on and on, like a troll, insisting they must be wrong.
You can use FRAPS. It's a free download. You just start the program before you start the game...
Please note that ME1 had a frame cap of 62fps; ME2 might be similar. You can remove this cap by editing an .ini file if you like. Refer to the TweakGuides Mass Effect article; it has all the details 🙂
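For what it's worth, and purely from memory (so verify against the TweakGuides article before editing anything), the UE3 frame cap in ME1 was controlled by settings along these lines in BIOEngine.ini; the exact file and section names may differ in ME2:

```ini
[Engine.GameEngine]
; the ~62fps frame-smoothing cap; set to FALSE (or raise the max) to uncap
bSmoothFrameRate=TRUE
MaxSmoothedFrameRate=62
```

Back up the file before touching it, given the problem reported above with the game refusing to start.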
For some reason, editing the config file made the game unusable: it wouldn't start, and even restoring the file to its defaults didn't help. Odd.
Also, how do you show fps in ME2? I'm running at max details @ 1280x1024 w/ 4xSSAA and 8xAF and would be curious to see what kind of fps I'm getting. It feels ~60fps smooth 95% of the time but would be interesting to double check.
That is hardly quad-optimized. I have a single-core game that runs across both cores at 60% and 40%; that doesn't mean it's dual-core optimized.
Who gave you the idea that a multicore game must put 100% load on all cores? :\
This is an ARMA 2 bench, one of the most multicore-optimized games I know of:
http://www.pcgameshardware.com/aid,687620/ArmA-2-tested-Benchmarks-with-18-CPUs/Practice/
Unless you want to move the "goalposts" and redefine what multi-threaded means?
Wow, so that's a 20% increase, a 20% increase and a 20% increase from the starting point! I think we all see your point now... you know that percentages are used for relative comparisons and not absolute ones? Let's just change the workload to 1, 2 and 4: wow, now we get only 0.4 more workload done with a dual core instead of 40.. 🙁 Oh, but still 20% more.

Firstly: performance = workload = work getting done.
Secondly, assume a multi-threaded application that scales perfectly:
Single Core: Gets 100% workload done
Dual Core: Gets 200% workload done
Quad Core: Gets 400% workload done
Still with me? Good, keep reading.
Now we are applying a 20% overclock to every CPU:
Single Core: Gets 120% workload done
Dual Core: Gets 240% workload done
Quad Core: Gets 480% workload done
See it now? +20, +40 and +80 percentage points. WOW. I hope you are having an "ahhh" moment.
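To make the arithmetic above concrete, here's a quick throwaway script (it assumes the same idealized perfect scaling; the `workload` helper is just mine, not from any benchmark):

```python
def workload(cores, overclock=0.0):
    """Relative workload done, with 1.0 = one stock-clocked core.

    Assumes an application that scales perfectly with core count,
    which is an idealization; real games fall well short of this.
    """
    return cores * (1.0 + overclock)

for cores in (1, 2, 4):
    stock = workload(cores)
    oc = workload(cores, 0.20)   # same 20% overclock on every CPU
    gain = oc - stock            # absolute gain, in single-core units
    print(f"{cores} core(s): {stock:.0%} -> {oc:.0%} "
          f"(+{gain:.0%} absolute, +20% relative)")
```

The relative gain is 20% in every row; the absolute gain (20, 40, 80 percentage points) grows with core count, which is the whole disagreement above.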
ARMA 2 isn't truly multi-threaded. Actually, there isn't a truly multi-threaded game on PC; if there were, it would double performance in CPU scaling tests, but it does not. You get ~20% better frame rates from a quad at the same clock as a dual core, instead of 100%.
As I thought, you are trying to move the goalposts.
Again, even if a game is running 16 threads, that is NO assurance it will max out 16 cores.
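That point is easy to demonstrate with a toy sketch (the names and numbers are mine, nothing to do with any real engine): sixteen threads exist, but one shared lock serializes the actual work, so observed concurrency never goes above one:

```python
import threading

lock = threading.Lock()
active = 0       # threads currently inside the critical section
max_active = 0   # highest concurrency ever observed

def worker():
    global active, max_active
    for _ in range(100):
        with lock:   # serialized: only one thread at a time gets in
            active += 1
            max_active = max(max_active, active)
            active -= 1

threads = [threading.Thread(target=worker) for _ in range(16)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"threads: {len(threads)}, peak concurrency: {max_active}")
```

Sixteen threads, but the lock caps effective parallelism at one; thread count alone says nothing about how many cores get maxed out.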
The Real Virtuality 3 engine is multicore:
http://www.arma2.com/supply/presskit/download/20-realvirtualityengine-gc2008.html?lang=en
Doesn't matter if "your" definition of "multicore" means it must max out all available cores; your stance is flawed.
And to think games scale linearly with the number of CPU cores is wrong; by that definition, SLI/Crossfire should also scale linearly.
ARMA 2 is multicore, like it or not... it's your definition of "multicore" that is incorrect... as I suspected.
By your definition, anything that spreads its load between the cores automatically becomes multi-threaded, which is false; all it takes is a CPU that isn't pegged at 100%.
In the case of ARMA you are getting 20% better frame rates between dual and quad. That improvement could come from the quad's larger cache over the dual core; it is not necessarily quad-optimized.
Developers claim a lot of things, but that doesn't mean they're true. According to Crytek, Crysis is quad-optimized; Bethesda claimed Oblivion was dual-core optimized; and so on.
SLI and Crossfire do scale linearly, a full 2x, when properly optimized, as do multicore-optimized games. Go look at some benches.
FarCry 2 ships with the most impressive benchmark tool we've ever seen in a PC game. Part of this is due to the fact that Ubisoft actually tapped a number of hardware sites (AnandTech included) from around the world to aid in the planning for the benchmark.
You are still clinging to a flawed notion of multicore; please stop.
I am not the only one who has pointed out that you are wrong.
And please show me these 100%-scaling multi-GPU games. I can't see them here:
http://www.anandtech.com/video/showdoc.aspx?i=3643&p=17
http://www.anandtech.com/video/showdoc.aspx?i=3650&p=4
http://www.anandtech.com/video/showdoc.aspx?i=3658&p=5
And even when we look at FarCry 2:
http://www.anandtech.com/showdoc.aspx?i=3505&p=7
You won't find your "magical" 100% scaling present.
You overlook the fact that threads need to wait for other threads to complete in a game engine.
The only things that scale ~100% are encoding/decoding and similar independent workloads; game engines don't.
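That waiting is exactly what Amdahl's law describes: if only part of each frame's work can run in parallel, adding cores hits a hard ceiling. A quick sketch (the 40% parallel fraction is an arbitrary number for illustration, not measured from any game):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Theoretical speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A fully parallel job (encoding-style) scales linearly:
print(amdahl_speedup(1.0, 4))   # prints 4.0
# A hypothetical engine where 60% of the frame is serial does not:
print(amdahl_speedup(0.4, 4))   # roughly 1.43x on a quad
```

A quad giving ~1.4x over a single core is right in the neighborhood of the modest gains the benches above show, without any appeal to cache sizes.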
But I would love for you to post data in your favour; your word is not enough for me, sorry.