Sandy Bridge vs Haswell vs Coffee Lake [Computerbase.de]

Discussion in 'CPUs and Overclocking' started by Carfax83, Dec 6, 2017.

  1. Carfax83

    Carfax83 Diamond Member

    Joined:
    Nov 1, 2010
    Messages:
    5,533
    Likes Received:
    330
    Interesting generational comparison between SB, HW and CF from Computerbase.de in some productivity apps and games.

    Although people like to complain about the lack of advancement in desktop CPUs, these tests remind us that some of the advances take a while to show up, as developers have to optimize for the new instruction sets.

    And in some of those tests, you can see that. One thing that really popped out to me is the performance increase for Haswell over Sandy Bridge in Forza 7 and Wolfenstein 2 at 720p. In both of those titles, the stock 4770K is able to beat the overclocked 2600K despite the latter's large clock speed advantage.

    These gains don't manifest nearly as profoundly in the DX11 titles. So I'm assuming that the compilers they are using for the DX12 and Vulkan games are able to take advantage of AVX2?

    Am I wrong in assuming this?

    *Edit* I forgot to put the Computerbase.de source tag in the header. Can a mod edit that in for me? Thanks.
     
  3. IEC

    IEC Super Moderator
    Super Moderator

    Joined:
    Jun 10, 2004
    Messages:
    12,962
    Likes Received:
    2,034
    Done. Added tags for you.
     
    Carfax83 likes this.
  4. Arachnotronic

    Joined:
    Mar 10, 2006
    Messages:
    11,475
    Likes Received:
    1,805
    The gain from SNB to HSW clock-for-clock looks like it's about 10%, in line with the expected IPC change between the architectures.
     
  5. Carfax83

    Carfax83 Diamond Member

    Joined:
    Nov 1, 2010
    Messages:
    5,533
    Likes Received:
    330
    Yes, but in certain applications (games included) the gains are much greater than 10%, e.g. Handbrake. The reason for the performance benefit in an application like Handbrake is obvious though: Handbrake uses FMA and AVX2, two instruction set extensions that can increase performance significantly.

    But my question pertained to the games. Forza 7 and Wolfenstein 2 show very large gains with Haswell and Coffee Lake over Sandy Bridge when CPU bound: 44% in Wolfenstein 2 and 35% in Forza 7 when comparing a stock 4770K to a 2600K. These gains are significantly more prominent than in the DX11 titles. This makes me suspect that the compilers they are using for DX12 and Vulkan have AVX2 flags enabled.

    How else could such a large gap be explained?
     
  6. EXCellR8

    EXCellR8 Golden Member

    Joined:
    Sep 1, 2010
    Messages:
    1,636
    Likes Received:
    283
    Prozessorleistung! ["Processor performance!"]
     
  7. dullard

    dullard Elite Member

    Joined:
    May 21, 2001
    Messages:
    21,872
    Likes Received:
    474
    I would assume that it is mostly memory based. The DDR3-1333 CAS 9 memory in the stock Sandy Bridge tests has both 20% lower bandwidth and 20% higher latency than the DDR3-1600 CAS 9 memory in the stock Haswell tests. Any benchmark that is memory bound (these certainly are not GPU bound) will be at least 20% faster on Haswell.

    I didn't read the article closely, but since these are CPU-limited tests, you are likely reaching the point of downthrottling, at least on the higher-powered Sandy Bridge. The 94 W Sandy Bridge could very likely only run at its 3.4 GHz base clock, while the 84 W Haswell can keep on chugging at its 3.9 GHz turbo, assuming they have the same cooling setup. There is another 15% speed difference (until the Haswell heats up enough to downthrottle). The extent of this effect depends on how long the benchmarks were run, which is data that reviews rarely tell you. A short test might not reach the Haswell downthrottling point; a long, intensive test might downthrottle both CPUs. But tests of moderate length are where you will see the differences.

    Toss in a few percent for the 100 MHz clock speed differences between the chips and maybe some very mild IPC improvements, and you are up to 40% gains in certain situations.
     
  8. TheELF

    TheELF Platinum Member

    Joined:
    Dec 22, 2012
    Messages:
    2,262
    Likes Received:
    125
    One would hope that at least in reviews they'd have a decent enough setup for thermal throttling not to occur.
     
  9. dullard

    dullard Elite Member

    Joined:
    May 21, 2001
    Messages:
    21,872
    Likes Received:
    474
    I would hope the opposite. Give us what we are likely to see in real world non-enthusiast situations (dust buildup, computer inside a cabinet, OEM builds with poor cooling, etc).
     
  10. dave_the_nerd

    Joined:
    Feb 25, 2011
    Messages:
    14,438
    Likes Received:
    810
    Are those situations likely to apply to people with overclocked 8700K CPUs?

    Anyway, it looks like a big chunk of the overall performance boost comes from extra cores, not IPC improvements. It's still better performance, but somehow feels less impressive.
     
  11. dullard

    dullard Elite Member

    Joined:
    May 21, 2001
    Messages:
    21,872
    Likes Received:
    474
    Are people with overclocked 8700K CPUs the only ones interested in knowing which processor is right for them? That is, should reviews only cover the optimal situation possible, with great cooling, brand new processors, and dust-free cases (if any case is used at all), starting with cold processors?
     
  12. PhonakV30

    PhonakV30 Senior member

    Joined:
    Oct 26, 2009
    Messages:
    826
    Likes Received:
    261
    I think the heavy CPU usage in Assassin's Creed is because of the DRM.
     
  13. Spjut

    Spjut Senior member

    Joined:
    Apr 9, 2011
    Messages:
    776
    Likes Received:
    31
    I'd want to know whether they used worst-case scenarios for the tests and looked at minimum framerates, but the i7 2600K at stock speed with 1333 MHz memory is still damn good for a six-year-old CPU.
     
  14. spat55

    spat55 Senior member

    Joined:
    Jul 2, 2013
    Messages:
    538
    Likes Received:
    5
    Upgrading from a 2600K to an 8700K would be worth it if the CPU weren't so overpriced because of the lack of production, and if RAM prices weren't ridiculously high on top of that. I was considering getting the R5 1600 on sale for £135, which would have been a bargain, until I saw that the RAM would've cost me at least £180 for 2x8GB of B-die 3200MHz. I keep hearing about PC hardware dying; the reason mainstream adoption is so low is the ludicrous pricing. AMD have done their part, now it's up to the DDR4 manufacturers to do the same.

    The 8700K looks like a great CPU, but we all know that the 9700K will have 8C/16T with another new chipset, so what is the point of Z370 when it'll be obsolete within six months? At least AMD will support AM4 until 2020.
     
    moonbogg likes this.
  15. SPBHM

    SPBHM Diamond Member

    Joined:
    Sep 12, 2012
    Messages:
    4,439
    Likes Received:
    133
    So even at 720p, an overclocked 2600K from basically seven years ago reaches 79% of a new 8700K in their gaming tests...
    (I know you can OC the 8700K, but the gain is not going to be impressive because the default clock is already very high during games.)
     
  16. Carfax83

    Carfax83 Diamond Member

    Joined:
    Nov 1, 2010
    Messages:
    5,533
    Likes Received:
    330
    Nah, there's no way that gap can be accounted for by just memory speed. And when you look at the overclocked scores, with the 2600K @ 4.6 GHz with DDR3-2133 vs the stock 4770K with DDR3-1600 in Wolfenstein 2 and Forza 7, the stock 4770K still manages to outperform it, though just barely. But the 4770K has an almost 1 GHz clock speed deficit against the overclocked 2600K.

    It has to be the instruction sets being used, i.e. AVX2, or perhaps the fact that Haswell is able to issue twice as many SIMD instructions per clock as Sandy Bridge.
     
  17. Carfax83

    Carfax83 Diamond Member

    Joined:
    Nov 1, 2010
    Messages:
    5,533
    Likes Received:
    330
    No, it's not because of the DRM. The CPU usage only spikes in congested areas like Alexandria and other cities, which implies that it's due to things like A.I., detail level, etcetera.
     
  18. Carfax83

    Carfax83 Diamond Member

    Joined:
    Nov 1, 2010
    Messages:
    5,533
    Likes Received:
    330
    But most of those games aren't really CPU intensive, other than AC Origins. And when you look at AC Origins' 99th percentile scores, the overclocked 2600K has 50% higher frametimes compared to the 8700K, which is definitely going to be noticeable.
     
  19. SPBHM

    SPBHM Diamond Member

    Joined:
    Sep 12, 2012
    Messages:
    4,439
    Likes Received:
    133
    I still fail to see that as a big deal; it's clearly a game taking advantage of more cores, and as far as I know the DRM is very inefficient, so maybe they can patch that.
    In some games, sure, you will notice the gain, but the level the overclocked 2600K is achieving is really good for a seven-year-old CPU that was almost $100 cheaper.

    Also, what happens if we compare with an overclocked 3930K?

    And then you look at their 4K results and see how hard GPUs can be pushed today; if you don't have that 1080 Ti, it seems like a big waste of CPU (most of the time for gaming, anyway; in some gaming scenarios it can be justified).

    Basically, yes, the 8700K is a lot faster (IPC + cores), but gaming is kind of being held back by the ultra-slow console CPUs, I think.
    Which makes the past seven years very uninteresting for CPU performance gains in gaming.
     
  20. epsilon84

    epsilon84 Senior member

    Joined:
    Aug 29, 2010
    Messages:
    298
    Likes Received:
    148
    The 8700K is what, around $50 over MSRP at the moment? Even at MSRP it's not a cheap chip; it's not like it suddenly becomes a bargain at $370 instead of $420.

    It's a flagship CPU at flagship prices (inflated at that), and anyone who is looking for pure bang for buck in a gaming chip is better served by an i5 8400 or R5 1600.

    DDR4 prices are insane and have been one of the reasons I haven't upgraded my 2600K yet, that plus the fact it's still good enough to hold 60fps minimums in the games that I play.

    Is the 9700K actually confirmed? Even if it is released, it hardly makes the 8700K obsolete, especially for gaming. How much advantage do you see going from a Ryzen 5 to a Ryzen 7 in gaming? Almost nothing.

    AM4 upgradeability is a nice bonus, but again, like the hypothetical 9700K release, having Ryzen+ or Ryzen 2 doesn't suddenly make the 8700K obsolete. It will still be one of the fastest CPUs for gaming in 2018 and possibly even 2019. I don't see AMD matching the 8700K in gaming performance until Zen 2, and a 9700K will hardly be any faster in that regard either, as 6C/12T is more than enough for gaming now and for a few years yet.

    Anyone who buys a 8700K today will be good to game for many years to come - 5GHz OC + high IPC ensures that. It's expensive, but actually a good long term investment IMO because there is no need to upgrade it for many years, the same way a 2600K was a good long term buy in 2011.
     
    #19 epsilon84, Dec 7, 2017
    Last edited: Dec 8, 2017
    WhoBeDaPlaya likes this.
  21. Carfax83

    Carfax83 Diamond Member

    Joined:
    Nov 1, 2010
    Messages:
    5,533
    Likes Received:
    330
    It's not a big deal, but it goes to show that the gap is potentially much larger in an actual CPU-intensive game. Also, like I said above, the DRM is not an issue. The DRM is just a scapegoat that the pirates are using to stir up controversy and FUD because they can't crack the game.

    The gap would definitely be smaller, no doubt. But Wolfenstein 2 and Forza 7 seem to take advantage of AVX2, or of the increased SIMD throughput of Haswell and later.
    Yeah, I definitely agree here. The console CPUs are just too weak for developers to really make games that take advantage of modern CPUs in a big way. Whenever Star Citizen becomes available, that will be a big game changer, I think, as it will showcase what a modern PC is fully capable of.
     
  22. PingSpike

    PingSpike Lifer

    Joined:
    Feb 25, 2004
    Messages:
    20,789
    Likes Received:
    42
    Good points about the flagship at flagship prices. But I also decided to sit on my hands anyway, because frankly it just seems like buying now is buying high. RAM is expensive, and in my case I've started to be a bigger user of RAM, so it's extra painful. There's no reason to think Intel's CPUs and motherboards will be more expensive the longer I wait; it looks like they'll get cheaper as the supply clears up. If I was going to buy anything overpriced now, I'd probably buy a video card, because it'll work on both an old platform and a new platform.

    I think the reason people are only looking at the flagship, though, is history. I bet people who bought the 2600K mostly thought they'd be tossing that thing for something way better in a couple of years. Well, it's been 5+ years, and we're only now seeing something from Intel that could maybe be seen as way better. So from their perspective they should just buy the flagship again, because then they'll be good for 5-7 years. They don't want to buy the lower-end chip and then, 3-4 years from now, find themselves asking whether they should buy a used 8700K that still costs $300 or a whole new DDR5 setup for even more money. Rather just take the pain all at once. That used to be an insane strategy with PC parts, but times have changed.
     
  23. dullard

    dullard Elite Member

    Joined:
    May 21, 2001
    Messages:
    21,872
    Likes Received:
    474
    $420 is a cheap chip. Especially compared to what top consumer chips cost in years past (usually $600 to $1000).
     
  24. epsilon84

    epsilon84 Senior member

    Joined:
    Aug 29, 2010
    Messages:
    298
    Likes Received:
    148
    I share a lot of your views on this. I agree that most people who got the 2600K at the time wouldn't have expected it to remain relevant for so long. I do see similarities between the 2600K and 8700K, though, as I said previously. I expect the 8700K to be a lot more 'future proof' than the 8600K, the same way the 2600K holds up much better in today's games than a 2500K.

    Progress in IPC and clockspeed has slowed to a trickle in recent years, and I don't see that changing anytime soon unless we see a radical change in CPU design (moving on from silicon, perhaps?).

    Nowadays it's all about more cores but we need game engines that can fully take advantage of huge thread counts, which is probably a lot easier said than done.
     
  25. epsilon84

    epsilon84 Senior member

    Joined:
    Aug 29, 2010
    Messages:
    298
    Likes Received:
    148
    Only Intel's Extreme Edition or HEDT chips cost that much, though.

    All desktop i7 'K' SKUs from the past six or seven years have cost around the $350 mark, give or take. I wouldn't call that cheap, but it's all relative.
     
  26. dullard

    dullard Elite Member

    Joined:
    May 21, 2001
    Messages:
    21,872
    Likes Received:
    474
    Yes, in the last 7 years, out of about 40 years of consumer CPUs.

    Here are top consumer CPUs and top EE CPUs from the years before that. This is just a small selection; sometimes multiple top CPUs were launched, usually at about the same price point.

    2010: i7 880: $583; top EE was the 975 at $999
    2008: Q9650: $530; top EE was the QX9775 at $1499
    2006/2007: E6700: $530, Q6600: $530, Q6700: $851; top EE was the QX6700 at $999
    2004: Pentium 4 3.4 GHz: $415; Pentium 4 3.46 Extreme Edition: $999
    2002: Pentium 4 2.2 GHz: $562; Pentium 4 2.66 GHz: $637
    2000: Pentium 4 1.4 GHz: $819; Pentium III 1000 MHz: $990
    1998: Pentium II 400: $824
    1996: Pentium 200: $599
    1994: Pentium 815: $995
    1992: i486DX2: $550; OverDrive 25 MHz: $699; OverDrive 33 MHz: $799

    These are 1000-unit prices at launch; street pricing may have been higher.

    These prices aren't limited to Intel either.
    2006: AMD 5000+: $696; FX-62: $1030
    2004: 3800+: $720; 2.4 GHz Opterons: $851; FX-55: $827
    2002: 1500+: $525
    2000: AMD's 900 MHz, 950 MHz, and 1 GHz Athlons launched at $899, $999, and $1299

    A $420 street price really is cheap in comparison to the historical prices for top consumer CPUs.
     
    #25 dullard, Dec 8, 2017
    Last edited: Dec 8, 2017