Techreport 7950 vs. GTX 660 Ti "Smoothness" videos


jimhsu

Senior member
Mar 22, 2009
Using Windows 8?

bcdedit /set disabledynamictick yes

Enable HPET in the BIOS

bcdedit /set useplatformclock true

Select the High Performance power plan in the power options

Get all current Windows 8 updates

Get all current Windows 8 drivers for your system

Install Skyrim on an SSD

-----

Report back with the results.

In general, Windows 8 has a host of problems with DPC latency (Google it; AnandTech and others have covered it), and the above are suggested ways to fix it.
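For what it's worth, the bcdedit/power-plan steps can be scripted. A minimal sketch (my own wrapper, not an official tool; run it from an elevated prompt, and note the HPET BIOS setting and the updates still have to be done by hand):

    import subprocess

    # Boot-configuration tweaks from the list above (needs an elevated prompt).
    for setting, value in [("disabledynamictick", "yes"),
                           ("useplatformclock", "true")]:
        subprocess.run(["bcdedit", "/set", setting, value], check=True)

    # Activate the High Performance power plan. This GUID is the stock one
    # Windows ships with; a customized install may differ.
    subprocess.run(["powercfg", "/setactive",
                    "8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c"], check=True)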

Hm... I see that Windows 8 was not the cause of their problems. Still, the tweaks I suggested do improve the latency picture.
 

Ferzerp

Diamond Member
Oct 12, 1999
And look. The very people who claim to be unbiased flood the thread with "nuh uh" and countless fps graphs, when the whole point is that fps graphs are misleading. Why do you guys do this in any thread that suggests anything negative about AMD? If you want to refute, refute with trusted measurements of frame times (not "this doesn't happen to me"). As it is, it looks like the same old "circle the wagons, boys, we have to quash (not disprove) this data!" We have far too many people who think anecdotes and unrelated metrics are enough to disprove an unrelated claim. Your fps graphs are meaningless unless you can prove they were done with the exact same settings, driver revision, etc. I would imagine you know that, but you're more interested in PR than truth anyway.
 

SirPauly

Diamond Member
Apr 28, 2009
He retested the old scene and found the same stuttering issues; there was no stutter in previous tests.



I think people are taking this out of context. This isn't AMD admitting there are stuttering issues present; it's more that AMD is aware of the results and will do its own testing to verify whether there is an issue. The good news: if it is an issue with the new drivers, AMD now has an early jump on getting it sorted out. If they can't replicate the issue, well, we'll know soon enough.

Imho,

It's AMD respecting the results from TechReport and simply investigating. They're not ignoring it, burying their heads in the sand, accusing posters, name-calling, getting defensive, or trying to dismiss the findings!
 

Rikard

Senior member
Apr 25, 2012
Sweclockers has added an article about this ( http://www.sweclockers.com/nyhet/16260-amd-radeon-plagas-av-ojamna-renderingstider ), but so far the only new info is this:
Våra egna tester påvisar liknande problem, dock i betydligt mindre omfattning. Det kan tyda på att de ojämna siffrorna inte beror på grafikkortet i sig utan är sidoeffekter av optimeringarna i prestandadrivrutinen Catalyst 12.11.
For those who do not understand Swedish, I'll translate:
Our own tests indicate similar problems, although at much reduced severity. The fluctuating numbers could be due to optimizations in the performance driver Catalyst 12.11 and not the card per se.
They promise to study this in upcoming tests. It will certainly be interesting!

Regarding the video in the OP, I never experience that sort of stuttering with either AMD or Nvidia, so I wonder what they did to make them perform so poorly. Sweclockers seem to echo that observation.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
And look. The very people who claim to be unbiased flood the thread with "nuh uh" and countless fps graphs, when the whole point is that fps graphs are misleading. Why do you guys do this in any thread that suggests anything negative about AMD? If you want to refute, refute with trusted measurements of frame times (not "this doesn't happen to me"). As it is, it looks like the same old "circle the wagons, boys, we have to quash (not disprove) this data!" We have far too many people who think anecdotes and unrelated metrics are enough to disprove an unrelated claim. Your fps graphs are meaningless unless you can prove they were done with the exact same settings, driver revision, etc. I would imagine you know that, but you're more interested in PR than truth anyway.

How can you call the frame times accurate if one card appears to have suddenly lost nearly 50% of its performance compared to past reviews of said card?
 

Final8ty

Golden Member
Jun 13, 2007
I still can't understand how it is possible to get these numbers:
[two attached graphs]

I've been tinkering with Excel to check some random numbers, percentiles and whatnot. If you want to create a real roller coaster and you make something incredibly stupid such as a random chain of numbers that starts at 0 and finishes at 33, you will get the following (i.e., frame times going from 0 to 33 ms... and now imagine how stupid that is :rolleyes: but it's to go for the extreme):
a) Average framerate = 60 fps.
b) Average frame time = 16.6 ms.
c) 99th percentile = 33 ms.
My point being: it's impossible to get a 99th percentile of 40 ms AND an average of 60 ms unless you go with negative numbers, that is. It's simply not possible to say that the 99th percentile of the distribution sits at 40 ms or below yet the average of the distribution sits at 16.6 ms. Impossible. IM-PO-SSI-BLE. Mathematically it makes sense because you can work with negative data... but you can't have negative frame times and, thus, none of this makes any sense.
So, I have no idea how TR did it... but they clearly have no QC whatsoever, because it's entirely possible and very easy to check frame times for inconsistencies, but:
a) You need to work with large sets of frame times (6,000 frames is a meaningless figure, because we are talking about 90 seconds of gameplay, which is close to nothing).
b) You need to dismiss outliers that might be created by whatever creates them... i.e., might be created by external factors.
c) You need to rinse and repeat until you find data that is consistent with your objective and can be obtained easily.
http://www.overclock.net/t/1337206/...d-7950-stumble-in-windows-8/120#post_18817131
 

BrightCandle

Diamond Member
Mar 15, 2007
The problem I have with the straw man in this guy's random sample is that he is choosing a max swing of 33 ms. But the Guild Wars 2 data is swinging far more violently than that; it's swinging all the way up to 80 ms a frame, and fairly often it's going above 40 ms. So while it may well be averaging around 60 fps, the momentary peaks make the 99th percentile possible. I think that guy has misunderstood the distinction between 60 ms and 60 fps at one point, which unfortunately makes the whole thing wrong.
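To put numbers on that, here's a toy sketch (made-up figures, not TR's data) showing that an average near 60 fps and a 99th percentile way above 16.6 ms coexist just fine once a couple of percent of frames spike:

    import numpy as np

    # 98 quick frames at 16.0 ms plus 2 spikes at 60 ms -- a toy distribution.
    times = np.array([16.0] * 98 + [60.0] * 2)

    avg_fps = 1000.0 * len(times) / times.sum()   # ~59 fps average
    p99 = np.percentile(times, 99)                # 60 ms at the 99th percentile
    print(round(avg_fps, 1), p99)                 # 59.2 60.0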

It's this graph that explains how it's possible:

[image: gw2-latency.gif]


So now lets look at the other graph, the trace:

[image: gw2-amd.gif]


Everything else is a summary of this raw data, so when it comes to checking whether the summaries are correct we need to go back to this one.

We can validate the summary figures from the trace. Counting the points above and below 16 ms, I see slightly more lines below 16 ms than above, but the ones above move out further; the main line seems to hover around 50-60 fps, so the average is not a ridiculous figure and looks by eye to be correct. As for the 99th percentile of 37.6 ms: we know we have ~6000 points, so for 37 ms at the 99% point we need to be able to count about 60 points that meet or exceed 37 ms. I actually count more than 60, but I might be slightly misreading the difference between 30 and 35; either way, both summary figures can be validated against the raw frame times.
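That finger counting generalizes into a one-liner; a quick sketch (my own helper, nothing TR published):

    import numpy as np

    def p99_is_plausible(frame_times_ms, claimed_p99_ms):
        """Sanity-check a claimed 99th-percentile figure against a raw trace:
        roughly 1% of the samples should meet or exceed it."""
        t = np.asarray(frame_times_ms)
        return np.sum(t >= claimed_p99_ms) >= len(t) // 100   # ~60 of ~6000 frames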

This data is consistent with itself. So what about the NVidia data?

[image: gw2-nv.gif]


That is the trace, and we need to say whether 61 fps looks reasonable (it looks like it's oscillating around 16 ms to me) and then whether the 99% point is likely to be 24.9 ms. Well, yes, obviously that looks reasonable, because the very highest peak looks to be maybe 37 and the rest are a tight cluster in the early 20s. It's not as easy to count as the Radeon's because it's more consistent and tighter, but it's certainly reasonable.

We could do this on any of the pages; the data is consistent with itself, with some basic maths and finger counting.

Whether the data is right or something is actually broken we can't be sure, although we just got news that Sweclockers are seeing the same thing but somewhat less severely (and I also see similar traces on 7970s, but less severe, which is expected because it's a faster card).
 

Ferzerp

Diamond Member
Oct 12, 1999
It takes less than 1 janky frame time out of 100 frames to make a stuttering mess; at 60 fps, 1 frame in 100 still means 36 hitches every minute. "99%" doesn't really indicate anything when that's the case.
 

Keysplayr

Elite Member
Jan 16, 2003
but every site acknowledges that the 7950 delivers WAAYYYY more frames per second than the 660 Ti, therefore it is contradictory

Oh brother. The WHOLE point of this exercise, it would seem, is to show that in order to get those WAAYYYY more frames per second than the 660 Ti, you get incredible amounts of latency and stuttering. Uneven delivery of frames. So in effect, the 7950 NEEEEEDS WAAYYY more frames per second in order to deliver the same gameplay experience as a 660 Ti with WAAYYY fewer frames per second. At least that is what Brent stated at [H] when testing SLI vs. CF. TR and [H] are the first sites to really acknowledge this, and TR just keeps up the testing. I only hope more sites follow suit.
 

Vesku

Diamond Member
Aug 25, 2005
Something interesting: I was doing more testing with Sleeping Dogs, and I get a smoother recording session with MSI Afterburner's hardware monitoring than I do with FRAPS. Any other programs that record FPS and frame times?

Although after investing this much time into it, I'm kind of wondering about the usefulness of this information in the majority of games. I use VSync in pretty much every game I play. Since I'm not playing competitively in a clan or on a professional team, I prefer a consistent experience with minimal screen tearing regardless of input lag.
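For anyone who wants to crunch their own logs: FRAPS's benchmark mode writes a "frametimes" CSV with a header row and a cumulative millisecond timestamp per frame. A minimal parsing sketch, assuming that format:

    import csv

    def load_fraps_frametimes(path):
        """Turn a FRAPS 'frametimes' CSV (frame number, cumulative ms) into
        per-frame durations in milliseconds."""
        with open(path, newline="") as f:
            rows = csv.reader(f)
            next(rows)                                   # skip the header row
            stamps = [float(time_ms) for _, time_ms in rows]
        # A frame's time is the gap between consecutive timestamps.
        return [b - a for a, b in zip(stamps, stamps[1:])]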
 

thilanliyan

Lifer
Jun 21, 2005
Oh brother. The WHOLE point of this exercise, it would seem, is to show that in order to get those WAAYYYY more frames per second than the 660 Ti, you get incredible amounts of latency and stuttering. Uneven delivery of frames.

That's not what he's saying. He's saying that in other reviews, the fps results are higher for the 7950 than they are in the TR review. THAT contradictory result could call into question their frame time numbers, and so THAT inconsistency should be explained FIRST.

If we're going to be thorough about this investigation, you have to also take considerations like that into account. Why were TR's fps results lower than other sites'? Did they choose a specific location in the game where this happens? Is it something with their system? Etc., etc.

Personally, I think other sites will pick up on this IF it is in fact an issue, and I am all for anything that will ensure a better gaming experience. I myself have not noticed any stutter in the games I play with my 7950, and I think I will take Ryan Smith's advice and just play some games without worrying about frame times! :)
 

cmdrdredd

Lifer
Dec 12, 2001

1) He confuses 60 ms with 60 fps.
2) Post #134 shows how he can be incorrectly interpreting the numbers and is wrong.
3) Most people on OCN did NOT read the article and just spew crap like "oh, it's Nvidia marketing reassurance" and "oh, I don't use Windows 8, so why do I care?" It's clear they have absolutely no clue what was even mentioned. Windows 7 is still a mess... it's not isolated to Windows 8, and that is what was shown. Typical forum lurkers, though; they don't wanna read anything outside their circle of understanding.

That's not what he's saying. He's saying that in other reviews, the fps results are higher for the 7950 than they are in the TR review. Why is that?

If we're going to be thorough about this investigation, you have to also take considerations like that into account. Why were TR's fps results lower than other sites'? Did they choose a specific location in the game where this happens? Is it something with their system? Etc., etc.

They used an i7-3820, not overclocked. Other sites, like TechPowerUp for example, test with a 4.6 GHz 3770K. Clock speed and chosen platform (Z77 vs. X79) can make a difference.
 

blastingcap

Diamond Member
Sep 16, 2010
There comes a point in one's life where you value time more than money. I'm too old to want to keep track of which driver version I need to use for game X or Y to get rid of stuttering; I want it gone regardless of driver version.

NV might not be the nicest company, what with bumpgate and basically banning overvolting on 28nm GPUs, but here's the thing about NV: their driver team will take care of you. Fewer issues like microstuttering, especially for multi-GPU; fewer compatibility issues in general. You even get stuff like CUDA/PhysX, adaptive VSync, Surround without adapters, etc. Even if you don't use them, at least you COULD, whereas you don't even get the option with ATI.

I have always preferred NV over ATI ever since a particularly galling video bug in WC3, but NV kept losing the price/perf ratio in the last few years, so I went ATI. But now, with this new information coming out (no thanks to certain reviewers who can't be bothered to investigate this longstanding phenomenon that HardOCP has commented on for years), Radeon price/perf isn't looking so good once you account for these jerky frame times and driver issues.
 

Blastman

Golden Member
Oct 21, 1999
While I admit that these results are a bit strange, you should keep in mind that different scenarios can yield different results. Unless you use the exact same scene, benchmarks will not necessarily be comparable. Maybe TechReport picked a worst-case scenario for AMD by chance, who knows?

Yup, and there is more to it than that. If you look at this TechReport BF3 review, on one of the BF3 levels, "Fear No Evil," the NV cards had less stuttering than the AMD cards. On another level in the same game, "Rock and a Hard Place," the AMD cards were much superior with regard to stuttering -- according to TR's methods. This goes to show that even the results in a single game can vary significantly depending on what scenes are being rendered.

Sure, the video in question may show bigger stutters for AMD in the particular scene and game they benched, but if a benchmark method produces wildly varying results in the same game, then it is an inherently unreliable benchmark method. Using a widely varying benchmark method with a relatively small sample of games and scenes is a recipe for getting outlier results that are open to skewing by cherry-picking.

Hardware sites using such unreliable methods will need to put something in place to make sure that the benchmarks produced are a reasonable representation of what to expect in games from the various cards; otherwise, why even benchmark with such a method? TR has done nothing that I can see to resolve the inherent unreliability of their benchmark method that was demonstrated in BF3. So their results and methodology will be called into question.
 

Hitman928

Diamond Member
Apr 15, 2012
Kyle over at [H]:

HardOCP said:
There is a reason that we led the industry in moving away from "canned" benchmarks with highest FPS being the crowning factor.

We have known about frame times for a long time and we have not found an efficient and effective way to capture PRECISE data. I would suggest that current methods are flawed and that is why we are not publishing objective numbers. But we have been talking about this subjectively for years. And honestly, I am not sure we ever will publish frame time numbers. But we will continue to make subjective statements about the gaming experience that the hardware provides.

http://hardforum.com/showthread.php?t=1733630&page=3
 

notty22

Diamond Member
Jan 1, 2010
TH has a FC3 article up. For the high detail settings, they point out that SLI is smoother than CrossFire. From the min/max numbers it's not that obvious, but the plotted fps shows some issues.

Far Cry 3 Performance, Benchmarked

[image: High-1920.png]

[image: High-1920-FOT.png]

AMD's Radeon HD 7870 generally hovers under 30 FPS, below our rough target for playability. Meanwhile, the GeForce GTX 660 Ti and Radeon HD 7950 with Boost are only a little bit quicker. Even the powerful Radeon HD 7970 and GeForce GTX 670 are humbled by average frame rates just above 35 FPS. Only the two Radeon HD 7870s in CrossFire and GeForce GTX 660 cards in SLI manage to generate averages in excess of 45 FPS.
Speaking of multi-card solutions, notice that the Radeons achieve higher average results, but suffer lower minimum frame rates. In the frame rate-over-time chart, you can see that the GeForce boards in SLI yield smoother numbers than AMD's cards, which are not as consistent.
 

Ferzerp

Diamond Member
Oct 12, 1999
TH has a FC3 article up. For the high detail settings, they point out that SLI is smoother than CrossFire. From the min/max numbers it's not that obvious, but the plotted fps shows some issues.

Far Cry 3 Performance, Benchmarked


Well, microstutter will not appear on a plotted FPS graph that is averaged per second anyway (that's the whole point), but even then, that looks pretty bad.
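A toy illustration of why (my numbers, just for the shape of it): a second of frames alternating between quick and slow still averages out to a healthy-looking figure.

    # 60 frames alternating 8 ms / 25.3 ms: jerky as can be, yet the
    # per-second average says everything is fine.
    frames_ms = [8.0, 25.3] * 30                  # ~999 ms of "gameplay"
    avg_fps = len(frames_ms) / (sum(frames_ms) / 1000.0)
    print(round(avg_fps, 1))                      # ~60.1 fps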
 
Feb 19, 2009
Yup, and there is more to it than that. If you look at this TechReport BF3 review, on one of the BF3 levels, "Fear No Evil," the NV cards had less stuttering than the AMD cards. On another level in the same game, "Rock and a Hard Place," the AMD cards were much superior with regard to stuttering -- according to TR's methods. This goes to show that even the results in a single game can vary significantly depending on what scenes are being rendered.

This is a good explanation for the recent results: TR's prior Skyrim benches were done in a different scene, and the result was completely reversed when they changed the scene for this recent test. Not just the frame times but the average fps, completely lopsided.

It is interesting to see so much variation in the same game, on the same hardware, though. It suggests room for driver and/or game patches to improve it.
 

jimhsu

Senior member
Mar 22, 2009
Gamebryo *cough* Creation Engine has been buggy for the last decade or so. I don't believe a patch from either side is going to fix it any time soon.

Shadows in Skyrim are still rendered by the CPU, for god's sake (well, not exactly, but they have huge CPU overhead).

Pity, since Beth makes such excellent games.
 

Hitman928

Diamond Member
Apr 15, 2012
This is a good explanation for the recent results: TR's prior Skyrim benches were done in a different scene, and the result was completely reversed when they changed the scene for this recent test. Not just the frame times but the average fps, completely lopsided.

It is interesting to see so much variation in the same game, on the same hardware, though. It suggests room for driver and/or game patches to improve it.

The problem with that theory is that they went back to their original scene and the Radeon scored 6% less fps and had 70% more time spent beyond 16.7 ms. If it is the driver, why did performance (in pure fps) go down when everyone else's went up (or at least stayed the same)?
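For reference, my understanding of TR's "time spent beyond 16.7 ms" figure is the summed overage of every slow frame, roughly this sketch (my reading of the metric, not their code):

    import numpy as np

    def time_beyond(frame_times_ms, threshold_ms=16.7):
        # Sum the milliseconds by which each frame overshoots the threshold.
        t = np.asarray(frame_times_ms)
        return float(np.maximum(t - threshold_ms, 0.0).sum())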
 
Feb 19, 2009
The problem with that theory is that they went back to their original scene and the Radeon scored 6% less fps and had 70% more time spent beyond 16.7 ms. If it is the driver, why did performance (in pure fps) go down when everyone else's went up (or at least stayed the same)?

A buggy card with non-functional Boost? Or fluctuations from 850 to 925 MHz? They won't mention it or test with PowerTune at +20% (which keeps marginal cards from randomly jumping between clocks).

Heck, just put in a 7970 GHz Edition and compare.
 

thilanliyan

Lifer
Jun 21, 2005
but here's the thing about NV: their driver team will take care of you.

Not in my experience... the only showstopper bugs I have had with video cards were with nV drivers. With my 8800 GTS 640, Splinter Cell: Double Agent crashed on startup due to a bug in the G80 drivers for that game (prior nV cards and ATI cards didn't have that problem), Gothic 3 had some shadow problem IIRC, and there were also "driver stopped responding" errors at stock that went away with a driver update... the GTS 640 was my worst experience in terms of driver problems. That didn't stop me from getting an 8800 GT as my next card, however, as it was very good bang/buck.

I have never had any issue like that on the ATI side, and since my G80 card I have not had any driver issues with either camp... point being, BOTH sides have problems with drivers, and I personally never felt nV's drivers were any better, certainly not enough to sway me to their side. I'll stick with whichever card has the best bang/buck and which I can use my waterblock on, for the most part... the only thing that really swayed me towards ATI these past two generations was bitcoin mining, for which nV has no competition.
 