[TECH Report] As the second turns: the web digests our game testing methods

Discussion in 'Video Cards and Graphics' started by Final8ty, Jan 2, 2013.

  1. Hitman928

    Hitman928 Senior member

    Joined:
    Apr 15, 2012
    Messages:
    992
    Likes Received:
    3
    I think 60Hz would be fine for most people. In fact, that would probably make it more relevant for most people. Like I said though, if you're one of the few who has a 120Hz monitor, you might not find it sufficient.

    As far as overhead goes, they briefly mention a few things they have worked out or are still working out, but I can't imagine they don't put the capture card in a separate system from the one being benched, so there's no performance hit for the card/computer being tested. You just have to make sure the other system can keep up, essentially.
     
  2. KingFatty

    KingFatty Diamond Member

    Joined:
    Dec 29, 2010
    Messages:
    3,024
    Likes Received:
    0
    It's possible that the capture card is capable of more than 60 Hz. They mentioned it can do 2560x1600 @ 60 Hz, so perhaps that's a bandwidth ceiling, and at lower resolutions (1440p, 1080p) maybe it can do 120 Hz? It acts as a monitor to capture the video card output that would otherwise be sent to the viewer's eyes.
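    The bandwidth-ceiling guess above can be sanity-checked with back-of-the-envelope arithmetic. This is only a sketch of raw active-pixel throughput; real link limits also include blanking intervals and depend on the actual capture hardware:

    ```python
    # Rough pixel-throughput comparison for the capture-card bandwidth guess.
    # Raw active-pixel rates only; real link bandwidth also covers blanking
    # intervals, so treat this as an illustrative estimate.

    def pixels_per_second(width, height, refresh_hz):
        """Active pixels the capture card must ingest each second."""
        return width * height * refresh_hz

    ceiling   = pixels_per_second(2560, 1600, 60)    # the card's stated maximum
    p1440_120 = pixels_per_second(2560, 1440, 120)
    p1080_120 = pixels_per_second(1920, 1080, 120)

    # 1440p @ 120 Hz needs ~1.8x the stated ceiling, so it likely wouldn't fit;
    # 1080p @ 120 Hz needs ~1.01x, so it would sit right at the limit.
    print(ceiling, p1440_120, p1080_120)
    ```

    By this estimate, 120 Hz capture would only be plausible at 1080p or below, and even there only barely.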
     
  3. 3DVagabond

    3DVagabond Lifer

    Joined:
    Aug 10, 2009
    Messages:
    11,728
    Likes Received:
    2
    It's one thing to see something and then test to try to quantify the effect. It's quite the opposite to take a measurement and try to attach an effect to it that nobody ever saw. With dual GPUs, people complained that the animation wasn't as smooth as on a single GPU. They played games and saw that something was wrong. Then they took measurements to quantify what they saw.
     
  4. Ibra

    Ibra Member

    Joined:
    Oct 17, 2012
    Messages:
    184
    Likes Received:
    0
    AMD cheated in tessellation and will definitely cheat in frame rating.

     
  5. BrightCandle

    BrightCandle Diamond Member

    Joined:
    Mar 15, 2007
    Messages:
    4,763
    Likes Received:
    0
    Have you got a link for reference regarding the assertion that tessellation was cheated, and its impact?
     
  6. Keysplayr

    Keysplayr Elite Member

    Joined:
    Jan 16, 2003
    Messages:
    21,079
    Likes Received:
    5
    I don't believe that I've heard anything about AMD cheating in tessellation.
     
  7. SirPauly

    SirPauly Diamond Member

    Joined:
    Apr 28, 2009
    Messages:
    5,187
    Likes Received:
    0
    Imho,

    And yet many don't perceive micro-stuttering with multi-GPU. I was personally very vocal at Rage3D and one of the first gamers to raise the differences between single-GPU and multi-GPU -- before the term micro-stuttering was even used -- I called it load-balancing issues. I also discovered why, by pestering MFA at Beyond3D, and was told it was frame delays. Just counting frames was never objective to me when comparing multi-GPU to single GPUs.

    This awareness is like music to my ears -- sites, investigations, and discussions that go beyond just frame rate and also try to gauge latency. The awareness may help single- and multi-GPU SKUs from AMD and nVidia -- the bigger picture. More information for the gamer is always welcome!
     
  8. VulgarDisplay

    VulgarDisplay Diamond Member

    Joined:
    Apr 3, 2009
    Messages:
    6,194
    Likes Received:
    1
    They had weak tessellation hardware and fixed it. They do allow the user to cheat at tessellation, but they don't do it themselves.
     
  9. Keysplayr

    Keysplayr Elite Member

    Joined:
    Jan 16, 2003
    Messages:
    21,079
    Likes Received:
    5
    They had weak tessellation hardware in the 5 and 6 series. It was greatly improved in the 7 series. The 5 and 6 series remain un-"fixed".
     
  10. SirPauly

    SirPauly Diamond Member

    Joined:
    Apr 28, 2009
    Messages:
    5,187
    Likes Received:
    0
  11. Ferzerp

    Ferzerp Diamond Member

    Joined:
    Oct 12, 1999
    Messages:
    6,107
    Likes Received:
    1
    He's probably talking about the 3DMark tests where a user could change the tessellation setting so the card didn't do all the work the benchmark requested, thus giving bogus scores.

    I'd not call that systemic "cheating" so much as "allowing users to cheat" in a way the benchmark program couldn't detect.
     
  12. OVerLoRDI

    OVerLoRDI Diamond Member

    Joined:
    Jan 22, 2006
    Messages:
    5,470
    Likes Received:
    0
    Awesome! Looking forward to these updates.

    My dual 7970s never felt right, especially in The Witcher 2. I'm not gaming much at the moment, so I haven't been all over this.
     
  13. BrightCandle

    BrightCandle Diamond Member

    Joined:
    Mar 15, 2007
    Messages:
    4,763
    Likes Received:
    0
    Yes, The Witcher 2 was dreadfully bad, one of the worst of the games I played with the 7970s. I found that if I balanced it perfectly at 60 fps and it never went below that, it didn't reach vomit-inducing levels, but any drop below 60 fps (which is really common in The Witcher 2 regardless of settings) was just awful.

    Playing it on the 680s was a dream in comparison; I could happily run Surround and higher graphics settings without ever feeling any microstutter. My frame time graphs were nominal on the 680s, but showed 25 ms swings on the 7970s.
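    The "25 ms swings" described above are frame-to-frame deltas in a frame time graph. A minimal sketch of how such a swing metric could be computed from captured frame times (the sample numbers are made up for illustration, not actual measurements):

    ```python
    # Sketch: quantify frame-to-frame swings from a list of frame times (ms).
    # The sample data below is invented for illustration.

    def max_swing(frame_times_ms):
        """Largest absolute change between consecutive frame times."""
        return max(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

    smooth  = [16.7, 16.6, 16.8, 16.7, 16.9]   # steady ~60 fps pacing
    stutter = [10.0, 35.0, 11.0, 33.0, 12.0]   # alternating short/long frames

    print(max_swing(smooth))   # small: pacing is even
    print(max_swing(stutter))  # large: the microstutter signature
    ```

    Both lists average roughly the same frame rate, which is exactly why a swing metric shows a problem that an FPS average hides.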
     
  14. Face2Face

    Face2Face Diamond Member

    Joined:
    Jun 6, 2001
    Messages:
    3,677
    Likes Received:
    5
  15. krumme

    krumme Diamond Member

    Joined:
    Oct 9, 2009
    Messages:
    3,479
    Likes Received:
    6
    It's great that we get new ways to measure graphics performance.
    I stopped using FPS assessments years back and just relied on my personal impression.
     
  16. f1sherman

    f1sherman Platinum Member

    Joined:
    Apr 5, 2011
    Messages:
    2,244
    Likes Received:
    0


    "nobody ever saw"

    "attach an effect"

    Where do you get this from?
    That effect is, in fact, TR's performance enumeration. A BENCHMARK!

    They took the measurement according to their methodology and left it to the reader to take what they want from it. They added nothing.
    Apparently AMD took it seriously and are working hard to eliminate what does not exist and... "nobody ever saw".

    "There is no one single thing; it's all over the place - the app, the driver, allocations of memory, CPU thread priorities, etc., etc."
    - Dave Baumann

    You seem to think the FPS average is the bread and butter of gameplay experience.
    Some people disagree, Tech Report too, as they think that the 99th-percentile frame time better captures gameplay experience and, by extension, GPU performance.

    Since there is a need for GPU benchmarks, there has to be a method to it.
    Tech Report's method, the same as conventional FPS measuring, does just that:
    it enumerates GPU performance. Card A measures such-and-such, and card B measures such-and-such.
    Whether you can tell the difference between two cards does not matter, and it's a whole other story altogether, but here it goes:

    I hate to post bench graphs, but this is so obvious it has to be done.
    Cherry picking? Yes, in order to get my point across, not for its own sake.

    [IMG]
    [IMG]
    [IMG]

    If you're going to argue that the above jittering and the 7950's 20 fps -> 100 fps frame jumps are indiscernible from the rather steady 660 Ti frame output,
    but OTOH the difference between, for example, a GPU that averages 40 fps and one giving 50 fps is a HUUUGE and easily observable 20%, then it's my opinion that you are wrong.
    Let me double that:

    Anyone not noticing the difference between the above two GPUs sure as hell won't be able to tell the difference between 40 fps and 60 fps
    (let alone notice some silly 3 fps or 5% difference that's often a matter of debate here).

    To such a blindo, TR's methodology should be an even more important GPU metric than the raw fps average.

    PROOF: NONE! (Just years of gaming experience and a strong hunch coming from a modest amount of grey matter between the ears :D)
     
  17. willomz

    willomz Senior member

    Joined:
    Sep 12, 2012
    Messages:
    337
    Likes Received:
    0
    For me it's these graphs that show the most:
    [IMG]

    You can read off the 99th percentile, or the 95th, or just get a general impression of how spread out the frame latencies are. You can also get a good impression of overall FPS.
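    For anyone curious how those percentile readings fall out of raw frame-time data, here is a minimal sketch. The numbers are toy data, not the review's, and it uses a simple nearest-rank percentile, which may differ slightly from how a given site computes theirs:

    ```python
    # Sketch: the average FPS can look fine while the 99th-percentile frame
    # time exposes stutter. Frame times in milliseconds; toy data.

    def avg_fps(frame_times_ms):
        """Average frames per second over the run."""
        return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

    def percentile_frame_time(frame_times_ms, pct):
        """Frame time below which pct percent of frames fall (nearest-rank)."""
        ordered = sorted(frame_times_ms)
        rank = max(1, round(pct / 100.0 * len(ordered)))
        return ordered[rank - 1]

    steady = [20.0] * 100                # every frame takes 20 ms
    spiky  = [15.0] * 95 + [115.0] * 5   # mostly fast, a few big spikes

    print(avg_fps(steady), percentile_frame_time(steady, 99))
    print(avg_fps(spiky), percentile_frame_time(spiky, 99))
    # Both average exactly 50 fps, but the spiky run's 99th percentile is 115 ms.
    ```

    This is the whole point of the percentile curve: two cards with identical FPS averages can have very different worst-frame behavior.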
     
  18. AdamK47

    AdamK47 Lifer

    Joined:
    Oct 9, 1999
    Messages:
    11,957
    Likes Received:
    24
    Where can I see results of two 690s frame time in quad SLI? Has this been tested?
     
  19. Rikard

    Rikard Senior member

    Joined:
    Apr 25, 2012
    Messages:
    428
    Likes Received:
    0
    Yes, I would like that! It is the closest you can get to the actual experience without buying it. For noise tests I find sound recordings extremely helpful before making a purchase. Graphs are good because they are objective, but what matters for a potential customer is how you subjectively experience it.

    I could not agree more. (I guess being an enthusiast is a bit like buying a Ferrari even though you know you will never drive 200 km/h and that you are not a good enough driver to handle it if you tried. People drive Ferraris for other reasons.)

    Anyway, this is very good news!
     
  20. VulgarDisplay

    VulgarDisplay Diamond Member

    Joined:
    Apr 3, 2009
    Messages:
    6,194
    Likes Received:
    1
    I'm actually really curious about how a 120 Hz monitor affects frame times. I will admit that my understanding of how frame times are calculated is fairly limited, but looking at the PC Perspective article on their new testing methodology makes me wonder if a monitor with a faster refresh rate will display more of the chopped-up frames that poor frame metering exhibits on the 60 Hz monitors most reviewers seem to be using.

    Perhaps the choppy output I am seeing on my 120 Hz monitor at lower framerates is due to non-uniform frame times? Even though there is a problem with my AMD hardware at this time, I find this all very exciting for what future fixes may entail for users. If all of a sudden whatever fixes nVidia and AMD come up with make lower framerates indiscernible from higher framerates in terms of smoothness, perhaps they can just cap framerates at lower levels and then push image quality with whatever resources they are no longer spending chasing the maximum number of rendered frames.
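    One hedged way to think about the refresh-rate question above: with vsync, a finished frame is shown on the next refresh boundary, so a 120 Hz monitor quantizes uneven frame delivery onto finer ticks than a 60 Hz one, and can therefore expose pacing problems the coarser grid hides. This is only a sketch of that quantization, not how either vendor's frame metering actually works:

    ```python
    # Sketch: a frame that becomes ready at time t (ms) is displayed on the
    # next vsync tick. Finer ticks (120 Hz) track uneven delivery more
    # closely, which can also make pacing problems more visible.
    import math

    def display_times(ready_times_ms, refresh_hz):
        """Snap each frame's ready time up to the next refresh boundary."""
        tick = 1000.0 / refresh_hz
        return [math.ceil(t / tick) * tick for t in ready_times_ms]

    ready = [10.0, 45.0, 55.0, 90.0]   # unevenly delivered frames
    print(display_times(ready, 60))    # snapped to ~16.7 ms boundaries
    print(display_times(ready, 120))   # snapped to ~8.3 ms boundaries
    ```

    At 120 Hz the displayed times sit closer to when the frames were actually produced, so irregular delivery shows up more directly on screen.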
     
  21. Haserath

    Haserath Senior member

    Joined:
    Sep 12, 2010
    Messages:
    789
    Likes Received:
    0
    Wonder if they'll wait for the HD 8000 launch to release it, then say "look at the difference!"

    But, of course, it will apply to every 7000-series GPU as well.

    Hopefully the fix moves them up another notch.
     
  22. 3DVagabond

    3DVagabond Lifer

    Joined:
    Aug 10, 2009
    Messages:
    11,728
    Likes Received:
    2
    All of that to dispute two statements in one sentence. :phew:

    I'm talking about a visible effect, not a benchmark. Surely you could understand that without trying to make it out as something I never said. Like the parts in red, which I never said.

    I'm not sure why they changed the benchmark suite almost entirely between their first review and the rematch, other than that the 7950 did much better in the first review. They did all of their special-sauce testing then too. For the most part, the frame latency differences in the first review followed the avg. FPS differences very closely. That's not the case in the rematch review.

    Here's the Skyrim 99% from the first review.
    [IMG]

    In the rematch, the 7950 dropped down to be as bad as the previous-gen Fermi cards. Which, btw, if you apply the frame latency tests to them, were absolutely abysmal in most tests. I also don't recall people complaining about how jittery Fermi was. Especially the people who are most vocal about the AMD numbers.
    [IMG]

    I'll pick a game that we know is not buggy and runs well on both brands: BF3. Will the people who own Fermi cards please tell us how awful and jittery they are in BF3 (and just about every other game in the first review)? I have apparently missed all the threads about it, and the concern that we need testing to bring it to light so it can be fixed. I mean, a lot of people bought and still own those cards. Don't they deserve to have their hardware operate properly? Even though it's never been complained about, we now have the measurements that prove they are jittery as hell, and not just in BF3 either. Look at the review HERE. They must just be unplayable, if these measurements mean anything at all to real-time gameplay.
    [IMG]

    As far as AMD wanting to fix it goes, it would be really stupid of them to ignore it, wouldn't it? I'm sure they want their cards to bench well in these tests.
     
  23. BrightCandle

    BrightCandle Diamond Member

    Joined:
    Mar 15, 2007
    Messages:
    4,763
    Likes Received:
    0
    The original review was done in a town somewhere. The second was done in some scrubland. They did a load of testing of various areas and concluded that the test area needed to change, as more problematic areas for performance existed. It's explained in the podcast. This is why the two tests don't match up.
     
  24. willomz

    willomz Senior member

    Joined:
    Sep 12, 2012
    Messages:
    337
    Likes Received:
    0
    Yes, it is pretty clear that the 560 Ti and 470 are unplayable over that benchmark at those settings. This is exactly why people choose to upgrade old cards, or at least dial the settings back a bit. If you only have a 560 Ti you might want to turn Ultra off, and maybe other options too.

    The hardware does operate properly; it just doesn't have the grunt available. Compared to the 470, the 7950 has almost double the memory bandwidth and almost triple the texel rate; it is supposed to be better.

    If a 2-3 year old card doesn't perform well you shouldn't be surprised; if a high-end current-generation card doesn't, then you should be.
     
  25. Imouto

    Imouto Golden Member

    Joined:
    Jul 6, 2011
    Messages:
    1,243
    Likes Received:
    0
    Fermi was an even greater stutter mess, and no one complained, as 3DVagabond said.

    I'm not saying that this shouldn't be improved, but ffs, make a little sense and acknowledge that this "problem" ain't as big as you're picturing it. The way you talk, you're saying the previous two Fermi gens were utter shit.

    TechReport did a sloppy job with this review, comparing only a single 7950 model to the 660 Ti and changing their whole benchmark suite to fit their desired outcome. I'm waiting for a serious article on this issue from several sites, covering all the cards from the current and last gen.

    Several graphics reviewers have said that this issue is hard or even impossible to tell apart in a blind test, but you're going nuts about it.

    Meanwhile, some ppl should stop looking like bubble boys with brand allergy, seriously.

    @willomz this thread is about smoooooothness. As far as I know, several ppl complained that the options in TR's benchmarks were unplayable, like turning off Vsync making the physics in Skyrim go mad. Unplayable settings ain't in a bencher's dictionary.
     
    #150 Imouto, Jan 5, 2013
    Last edited: Jan 5, 2013