BallaTheFeared
Diamond Member
- Nov 15, 2010
- 8,115
- 0
- 71
I don't know if I'm being daft or something, but PCPR graphs and charts aren't making any sense to me at all. The 690 and 680 SLI should have worse percentile frametimes cause their graphs are a spiky mess most of the time.
Also, about leaving out CrossFire: they say it sometimes performs worse than FRAPS reports and sometimes better. If it's reporting more frames than are drawn, wouldn't the results always be worse? Sounds like some kind of bug to me.
I guess we'll see when they show results of their new testing.
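For what it's worth, a spiky frame-time plot and a percentile chart can both be right at the same time: the percentile is just the sorted frame times read at a given rank, so occasional spikes only show up in the far tail. A minimal sketch with made-up frame-time data (not PCPer's numbers):

```python
def percentile_frametime(frametimes_ms, pct):
    """Frame time (ms) at the given percentile, nearest-rank method."""
    ordered = sorted(frametimes_ms)
    # Index of the requested rank, clamped to the last frame.
    idx = min(len(ordered) - 1, int(round(pct / 100.0 * len(ordered))))
    return ordered[idx]

# Mostly steady ~60 fps frames with a few 30 ms spikes vs. a flat run:
spiky = [16.7] * 95 + [30.0] * 5
smooth = [17.0] * 100

print(percentile_frametime(spiky, 50))   # 16.7 - the spikes don't move the median
print(percentile_frametime(spiky, 95))   # 30.0 - they only appear at the tail
print(percentile_frametime(smooth, 95))  # 17.0
```

So a "spiky mess" of a graph can still post decent 50th/90th percentile numbers if the spikes are rare, which may be why the 690 and 680 SLI charts look better than their plots suggest.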
Here is the same data gathered by our new capture system - the CrossFire configuration looks MUCH worse with many frames hitting near 0ms of screen time. That would be great if they were ALL like that but unfortunately they also scale up to 20ms and higher quite often. Also notice NVIDIA's is actually MORE uniform indicating that there is some kind of smoothing going on after the frame leaves the game engine's hands.
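Those frames with "near 0ms of screen time" are what capture-based testing flags as runts: frames that occupy only a sliver of the display before the next one replaces them, so FRAPS counts them but they add almost nothing visible. A hedged sketch of that classification (the 1 ms cutoff is my assumption, not PCPer's exact threshold):

```python
def classify_frames(screen_times_ms, runt_threshold_ms=1.0):
    """Split per-frame on-screen times into runts and full frames.

    Runts inflate the FRAPS frame count without improving perceived
    smoothness, which is how FRAPS can over-report CrossFire FPS.
    """
    runts = [t for t in screen_times_ms if t < runt_threshold_ms]
    full = [t for t in screen_times_ms if t >= runt_threshold_ms]
    return runts, full

# Alternating long/short frames, like the CrossFire capture described above:
times = [20.0, 0.3, 21.0, 0.2, 19.5, 0.4]
runts, full = classify_frames(times)
print(len(runts), len(full))  # 3 3 - half the "frames" barely reach the screen
```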
Just Pre-ordered one from the Egg.
No, you just don't understand the hardware, unfortunately, but I'm glad everyone can see your ignorance as well as your childish behavior; good to reference in the future. The same bottleneck happens at higher resolutions for other cards for the same reasons (limited by ROPs or any number of other components), and again they lose efficiency.
Dang... not gonna lie I was tempted after I saw this.
http://www.newegg.com/Product/Produc...82E16814121724
For the lazy.
Might be best to wait for the custom version of the card
Wow, they're really coming out of the woodwork on this one. Seriously, there are things to really appreciate about this card, even for the AMD fans. Believe it or not, this card will probably drive AMD to make better cards in the future, even if you only buy their cards. You can certainly be critical about the price, but it's not like another FX5xxx series or something, sheesh.
The rumour also says that DirectX 11 in Crysis 2 will only support the GTX 590.
"average fps is the most important for gaming and MINIMUM fps determines how smooth your gaming experience will be..." -this has been the whole purpose of gaming benchmark until now...
http://forums.overclockers.co.uk/showthread.php?t=18455827
The frame latency test came out recently, just when NVIDIA has $3.4B cash in hand and the company has more cash than ever (kudos to NV marketing mafia)... guess what, NV anarchy going on behind the scenes... we know very well how much of a cheater NVIDIA really is. (Just guessing how many paid shill sites are out there for NVIDIA.) Here are some prime examples:
http://gamingbolt.com/rumour-crytek-was-paid-money-to-delay-support-of-directx-11-for-crysis-2
http://www.aggressivewarriors.com/showthread.php?tid=4146
http://www.androidauthority.com/jen-hsun-huang-nvidia-intel-68612/
http://semiaccurate.com/forums/showthread.php?p=33626
https://forums.geforce.com/default/...rove-that-the-geforce-gtx-590-is-the-fastest/
http://www.semiaccurate.com/forums/archive/index.php/t-839.html
Similar to how every COD gets a 9+ rating from IGN:
http://www.zeldainformer.com/news/f...iew-scores-are-skewed-due-to-public-relations
I can't wait for someone to even attempt to explain this nonsense. No amount of "frame metering" can cause SLI to have better latency than a single GPU. It's impossible.
Oh, but wait. I'm sure the NVIDIA review guide urged all websites to include these results; they just forgot to mention "don't include SLI," because SLI has less microstutter than a single GPU. Whoops. Sorry to anyone who bought a Titan over SLI 680s, in which the 680s ironically have less microstutter.
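On the frame metering point: one common way to quantify microstutter is the mean absolute frame-to-frame delta, and metering that evens out presentation times drives that toward zero even when total latency is unchanged. A rough sketch (the metric choice is mine, not from any review or review guide):

```python
def microstutter_index(frametimes_ms):
    """Mean absolute difference between consecutive frame times (ms).

    0 means perfectly even pacing; larger values mean more visible
    stutter even at the same average FPS.
    """
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    return sum(deltas) / len(deltas)

metered = [16.7, 16.7, 16.7, 16.7]    # evenly paced frames
unmetered = [10.0, 23.0, 10.0, 23.0]  # same average FPS, alternating pacing

print(microstutter_index(metered))    # 0.0
print(microstutter_index(unmetered))  # 13.0 - uneven pacing at the same average
```

By this kind of metric, metered SLI can genuinely score lower stutter than an unmetered setup at the same framerate, which is what the percentile charts in question appear to show.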
