
***Official Reviews Thread*** NVIDIA GeForce GTX Titan - Launched Feb. 21, 2013

I don't know if I'm being daft or something, but PCPer's graphs and charts aren't making any sense to me at all. The 690 and 680 SLI should have worse percentile frame times, because their graphs are a spiky mess most of the time.

Also, about leaving out CrossFire: they say it sometimes performs worse than FRAPS reports and sometimes better. If it's reporting more frames than are actually drawn, wouldn't the results always be worse? Sounds like some kind of bug to me.

I guess we'll see when they show results of their new testing.
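For anyone unsure what "percentile frame times" actually measure, here is a minimal sketch of how such numbers are typically derived from a FRAPS-style frametimes log. The function name and sample data are illustrative, not any review site's actual methodology:

```python
# Sketch: deriving a "99th percentile frame time" from a list of
# per-frame render times in milliseconds (nearest-rank method).

def percentile_frame_time(frame_times_ms, pct):
    """Return the frame time (ms) at the given percentile.

    A spiky graph raises the high percentiles even when the average
    looks fine, which is why uneven SLI frame delivery should score
    worse here, not better.
    """
    ordered = sorted(frame_times_ms)
    # Index of the frame at the requested percentile (nearest rank).
    idx = min(len(ordered) - 1, int(round(pct / 100.0 * len(ordered))))
    return ordered[idx]

smooth = [16.7] * 100        # steady 60 fps
spiky = [8.0, 25.0] * 50     # same average, alternating spikes

print(percentile_frame_time(smooth, 99))  # 16.7
print(percentile_frame_time(spiky, 99))   # 25.0
```

Both runs average the same fps, yet the spiky one reports a 99th percentile of 25 ms, which is exactly why a "spiky mess" of a graph should yield worse percentile numbers.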
 
I don't know if I'm being daft or something, but PCPer's graphs and charts aren't making any sense to me at all. The 690 and 680 SLI should have worse percentile frame times, because their graphs are a spiky mess most of the time.

I can't wait for someone to even attempt to explain this nonsense. No amount of "frame metering" can cause SLI to have better latency than a single GPU. It's impossible.

Oh, but wait. I'm sure the NVIDIA review guide urged all websites to include these results; they just forgot to mention "don't include SLI," because SLI supposedly has less microstutter than a single GPU. Whoops. Sorry to anyone who bought a Titan over SLI 680s -- in which case the 680s ironically have less microstutter.
 
I don't know if I'm being daft or something, but PCPer's graphs and charts aren't making any sense to me at all. The 690 and 680 SLI should have worse percentile frame times, because their graphs are a spiky mess most of the time.

Also, about leaving out CrossFire: they say it sometimes performs worse than FRAPS reports and sometimes better. If it's reporting more frames than are actually drawn, wouldn't the results always be worse? Sounds like some kind of bug to me.

I guess we'll see when they show results of their new testing.

Does this help?

Here is the same data gathered by our new capture system - the CrossFire configuration looks MUCH worse, with many frames getting near 0ms of screen time. That would be great if they were ALL like that, but unfortunately they also scale up to 20ms and higher quite often. Also notice that NVIDIA's is actually MORE uniform, indicating that there is some kind of smoothing going on after the frame leaves the game engine's hands.

[Attached image: fr-3_0.png]
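The "near 0ms of screen time" point can be made concrete with a small sketch: frames displayed for almost no time (often called "runt" frames) inflate the FPS counter without contributing to perceived smoothness. The 2 ms cutoff and the sample data below are assumptions for illustration, not the capture system's real parameters:

```python
# Sketch: flagging "runt" frames in a list of per-frame screen times
# captured at the display. A frame shown for almost no time counts
# toward FPS but adds nothing to perceived smoothness.

RUNT_MS = 2.0  # assumed cutoff; real capture tools pick their own threshold

def runt_stats(screen_times_ms):
    """Return (runt_count, real_count) for a capture run."""
    runts = [t for t in screen_times_ms if t < RUNT_MS]
    return len(runts), len(screen_times_ms) - len(runts)

captured = [0.5, 20.1, 0.8, 19.5, 16.9, 0.3, 21.0]  # made-up CrossFire-style trace
runts, real = runt_stats(captured)
print(runts, real)  # 3 runt frames inflate FPS; only 4 frames were really seen
```

FRAPS would count all seven frames here, while a display-side capture reveals that nearly half of them were barely shown - which is exactly the discrepancy being described.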
 
No, you just don't understand the hardware, unfortunately, but I'm glad everyone can see your ignorance as well as your childish behavior; good to reference in the future. The same bottleneck happens at higher resolutions for other cards for the same reasons (limited by ROPs or any number of other components), and again they lose efficiency.

Lower the resolution and the GPU finishes the frames so fast that you become CPU bound. This is why at lower resolutions we see Intel CPUs score much higher fps than an AMD system. If your CPU is holding back the GPU - and it will be at lower resolutions - then your GPU is sitting and waiting on the CPU. It's really not surprising that most of the cards will clump up in a CPU-bound, low-resolution setting; you're not stressing the GPU at all. Turn up the resolution so that it is loaded, trying to pump out pixels as fast as it can. The higher the resolution and settings, the more you become GPU bound. This is where AMD and Intel CPUs perform close to the same, because the CPU is waiting on the GPU to get done with each frame.

If you want to measure the power efficiency of the GPU, then your best bet is to max out the settings so your GPU is not sitting and waiting on the CPU before it can process a frame.

Man, it's not very complicated.
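The CPU-bound vs. GPU-bound argument above can be sketched as a toy pipeline model: with the CPU and GPU working in a pipeline, throughput is limited by the slower stage, so frame time is roughly the max of the two costs. All the numbers below are made up purely to illustrate the shape of the argument:

```python
# Toy model: in a pipelined renderer, frame time is limited by the
# slower of the two stages, so frame_time ~ max(cpu_ms, gpu_ms).

def frame_time_ms(cpu_ms, gpu_ms):
    return max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 5.0, 9.0           # hypothetical faster vs. slower CPU
gpu_low_res, gpu_high_res = 3.0, 25.0   # GPU cost grows with resolution

# Low resolution: the GPU finishes fast, the CPU is the bottleneck,
# so the faster CPU shows a big fps lead.
print(1000 / frame_time_ms(cpu_fast, gpu_low_res))   # 200.0 fps
print(1000 / frame_time_ms(cpu_slow, gpu_low_res))   # ~111 fps

# High resolution: both CPUs end up waiting on the GPU and tie.
print(1000 / frame_time_ms(cpu_fast, gpu_high_res))  # 40.0 fps
print(1000 / frame_time_ms(cpu_slow, gpu_high_res))  # 40.0 fps
```

At low resolution the CPU difference dominates; at high resolution both systems converge on the GPU's frame time - matching the claim that CPUs only separate in CPU-bound settings.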
 
I don't know if I'm being daft or something, but PCPer's graphs and charts aren't making any sense to me at all. The 690 and 680 SLI should have worse percentile frame times, because their graphs are a spiky mess most of the time.

Also, about leaving out CrossFire: they say it sometimes performs worse than FRAPS reports and sometimes better. If it's reporting more frames than are actually drawn, wouldn't the results always be worse? Sounds like some kind of bug to me.

I guess we'll see when they show results of their new testing.

Not just me then?... The only thing I can think of is that NVIDIA's hardware/software counters this when throwing the results to the screen. It sure looks weird, unless they have them the wrong way around in the graph?
 
According to Tom's, one of the intended targets of this card is those who want to be able to cram a lot of power into something like a Falcon Northwest Tiki. I am that demographic, but a grand is way, way too much for me for this card. I'm hoping to see Haswell plus a single-GPU card that can beat my 580s in tandem for a better price, but I don't see it happening this year.
 
"Average fps is the most important for gaming, and MINIMUM fps determines how smooth your gaming experience will be..." -- this has been the whole purpose of gaming benchmarks until now...

http://forums.overclockers.co.uk/showthread.php?t=18455827

The frame latency tests came out just when NVIDIA has $3.4B of cash in hand and the company has more cash than ever (kudos to the NV marketing mafia)... guess what NV is pulling behind the scenes... we know very well how much of a cheater NVIDIA really is (just guessing how many paid shill sites are out there for NVIDIA). Here are some prime examples:


http://gamingbolt.com/rumour-crytek-was-paid-money-to-delay-support-of-directx-11-for-crysis-2

http://www.aggressivewarriors.com/showthread.php?tid=4146


http://www.androidauthority.com/jen-hsun-huang-nvidia-intel-68612/


http://semiaccurate.com/forums/showthread.php?p=33626



https://forums.geforce.com/default/...rove-that-the-geforce-gtx-590-is-the-fastest/


http://www.semiaccurate.com/forums/archive/index.php/t-839.html

Similar to how every COD gets a 9+ rating from IGN:

http://www.zeldainformer.com/news/f...iew-scores-are-skewed-due-to-public-relations
 
Wow, they're really coming out of the woodwork on this one. Seriously, there are things to really appreciate about this card, even for the AMD fans. Believe it or not, this card will probably drive AMD to make better cards in the future, even if you only buy their cards. You can certainly be critical about the price, but it's not like another FX 5xxx series or something, sheesh.
 
Wow, they're really coming out of the woodwork on this one. Seriously, there are things to really appreciate about this card, even for the AMD fans. Believe it or not, this card will probably drive AMD to make better cards in the future, even if you only buy their cards. You can certainly be critical about the price, but it's not like another FX 5xxx series or something, sheesh.

Yeah I read the first link he posted until I got here...

The rumour also says that DirectX 11 will only support GTX 590 in Crysis 2.

COO COO

COO COO
 
"Average fps is the most important for gaming, and MINIMUM fps determines how smooth your gaming experience will be..." -- this has been the whole purpose of gaming benchmarks until now...

http://forums.overclockers.co.uk/showthread.php?t=18455827

The frame latency tests came out just when NVIDIA has $3.4B of cash in hand and the company has more cash than ever (kudos to the NV marketing mafia)... guess what NV is pulling behind the scenes... we know very well how much of a cheater NVIDIA really is (just guessing how many paid shill sites are out there for NVIDIA). Here are some prime examples:


http://gamingbolt.com/rumour-crytek-was-paid-money-to-delay-support-of-directx-11-for-crysis-2

http://www.aggressivewarriors.com/showthread.php?tid=4146


http://www.androidauthority.com/jen-hsun-huang-nvidia-intel-68612/


http://semiaccurate.com/forums/showthread.php?p=33626



https://forums.geforce.com/default/...rove-that-the-geforce-gtx-590-is-the-fastest/


http://www.semiaccurate.com/forums/archive/index.php/t-839.html

Similar to how every COD gets a 9+ rating from IGN:

http://www.zeldainformer.com/news/f...iew-scores-are-skewed-due-to-public-relations

What the hell is this, I don't even
 
I just found a GTX 690 online for £640 vs. the Titan at £840. I think the 690 is the best buy now, given that it's as fast as 7970 CF but uses less power.
 
20-25% faster than the 7970 GHz Edition when not cold-booted, and it gets creamed in both OpenCL and DirectCompute while pulling anywhere from less than the 680 to 50W more than the 7970 GHz Edition. I have to say, at stock the Titan is horribly underwhelming for something that costs nearly three times as much. How does it scale with clocks? Can overclocking salvage it?
 
I can't wait for someone to even attempt to explain this nonsense. No amount of "frame metering" can cause SLI to have better latency than a single GPU. It's impossible.

Oh, but wait. I'm sure the NVIDIA review guide urged all websites to include these results; they just forgot to mention "don't include SLI," because SLI supposedly has less microstutter than a single GPU. Whoops. Sorry to anyone who bought a Titan over SLI 680s -- in which case the 680s ironically have less microstutter.

It's not only in PCPer's review; TR's is showing this too, even on a 7970 CF setup. TR's 7970 CF posts better frame figures in 4 of 6 game benchmarks. The Sleeping Dogs bench is a hilarious one: they describe it as having micro-stutter, but every single graph fails to picture that, instead showing it delivering better frame times than any other setup in the review.

Anyway, these reviews not only portray different multi-GPU setups as being better than the Titan; for that matter, any GTX 670/680 can be replaced by two lesser Kepler cards.

These guys are going to save our sorry asses from FPS-based reviews. Of course, HardwareCanucks is showing another completely different story for the GTX 690.

http://www.hardwarecanucks.com/foru...orce-gtx-titan-6gb-performance-review-15.html
http://www.hardwarecanucks.com/foru...orce-gtx-titan-6gb-performance-review-16.html

The holy grail of graphics reviewing, sure.
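Since the thread keeps arguing about micro-stutter without defining it, here is one simple way it can be quantified: the average absolute change between consecutive frame times. This is an illustrative metric, not the one PCPer, TR, or HardwareCanucks actually uses:

```python
# Sketch of a simple micro-stutter metric: the average absolute change
# between consecutive frame times. Even with identical average fps, a
# fast/slow alternating pattern (classic AFR micro-stutter) scores far
# higher than steady frame delivery.

def stutter_ms(frame_times_ms):
    """Mean absolute frame-to-frame delta, in milliseconds."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

steady = [16.7] * 60
afr = [10.0, 23.4] * 30  # same average frame time, uneven pacing

print(round(stutter_ms(steady), 1))  # 0.0
print(round(stutter_ms(afr), 1))     # 13.4
```

Both traces render the same number of frames per second, but only one of them feels smooth - which is why percentile and pacing metrics can disagree so sharply with plain fps bars.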
 