
[coolaler] Devil's Canyon: 4.0 base / 4.4 turbo @ stock

Been thinking very much about that i7. I'm sitting on an i5 2500 non-K, and while it's been great in some games, I could use a bit more CPU. Torn between running a Z97 now with a cheap Celeron and hoping this new i7 is priced like the 4770K, then jumping on it, or just swapping in a 3770...
 
Been thinking very much about that i7. I'm sitting on an i5 2500 non-K, and while it's been great in some games, I could use a bit more CPU. Torn between running a Z97 now with a cheap Celeron and hoping this new i7 is priced like the 4770K, then jumping on it, or just swapping in a 3770...

They've kept similar pricing, haven't they? So I don't think they'll do a massive price hike, especially not one without announcing it first.

I'm no expert on it, but my personal opinion is that if there is a price hike, it'll be next year, when AMD is firmly out of anyone's mind on the high-end desktop side.
 
Regardless of what the clock speeds are, it does look like Devil's Canyon's CPU is going to be faster than Broadwell-K's. Granted, Broadwell-K's GPU and compute power will be far better.
 
Regardless of what the clock speeds are, it does look like Devil's Canyon's CPU is going to be faster than Broadwell-K's. Granted, Broadwell-K's GPU and compute power will be far better.

And it'll still probably be useless to anyone who actually cares about GPU and compute power.
 
Considering my PC just fried itself, I'm eagerly awaiting the 4790K. In the meantime I'm stuck on my MacBook Pro with its Intel Iris 5100 GPU, so I'm not going to be doing much gaming.

Come on Intel, release it already.😡
 
It's good to finally see a real speed bump from Intel instead of the drip-feed we've been getting since SB. I'll probably upgrade my i5 2500K to this.
 
Those charts are all meaningless. You're either looking at integrated graphics or at an equally meaningless 720p for discrete setups.

Looking at those charts to determine upgrade viability is missing the forest for the trees.

[Image: WinRAR 4.2 benchmark chart]


Seems real enough to me?
 
Seems real enough to me?

Tell you what: you get another i5 2500K (@ stock) and pair it with the fastest memory you like. Meanwhile I'll get a 4GHz Devil's Canyon and keep it and my memory at stock.

Then we'll come back here and compare results from a variety of situations, including WinRAR if you like.

Or you can simply look at this: http://www.ocaholic.ch/modules/smartsection/item.php?page=9&itemid=1005

The 4770K/4760K trounces pretty much everything including an i7 2600K 3.40 GHz @ DDR3-2133.

Cliffnotes: memory speed can make a difference in some bandwidth-bound applications/synthetics, but in the vast majority of situations it's not the primary bottleneck and other CPU factors are far more important.
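To put rough numbers on that (my own back-of-the-envelope figures, not from the linked review), peak DDR3 bandwidth scales linearly with transfer rate, but a workload that isn't bandwidth-bound never touches the ceiling either way:

```python
# Back-of-the-envelope DDR3 peak bandwidth: MT/s x 8 bytes per transfer x channels.
def ddr3_bandwidth_gbs(mt_per_s, channels=2):
    """Theoretical peak bandwidth in GB/s for a dual-channel DDR3 config."""
    return mt_per_s * 1e6 * 8 * channels / 1e9

for speed in (1333, 1600, 2133):
    print(f"DDR3-{speed}: {ddr3_bandwidth_gbs(speed):.1f} GB/s peak")

# DDR3-2133 has ~33% more peak bandwidth than DDR3-1600, but a game that
# only streams a few GB/s never hits either ceiling, so the faster kit
# buys essentially nothing there.
```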
 
Regardless of what the clock speeds are, it does look like Devil's Canyon's CPU is going to be faster than Broadwell-K's. Granted, Broadwell-K's GPU and compute power will be far better.

Yeah, if the 4GHz default speed is true then it's almost certain. I'm starting to think Intel is going to sell them as two non-competing products: one with high CPU performance at higher clock speeds (I'm still dubious that 14nm will give better clocks), the other with a much better IGP plus an L4 cache that should help too. The 14nm shrink should make them similar in die size (minus the L4), so maybe they can have similar prices.
 
Regardless of what the clock speeds are, it does look like Devil's Canyon's CPU is going to be faster than Broadwell-K's. Granted, Broadwell-K's GPU and compute power will be far better.

Would be funny if DC actually turns out to be Broadwell-K 😱 :hmm:
 
Well, if DC comes out in June, it certainly won't be Broadwell. Have to admit, though, I'm not sure where Broadwell-K will fit in. I could see a lower-power model with the better IGP for AIOs and Brix-like devices, but I'm not sure where an overclockable LGA model fits in.
 
Tell you what: you get another i5 2500K (@ stock) and pair it with the fastest memory you like. Meanwhile I'll get a 4GHz Devil's Canyon and keep it and my memory at stock.

Then we'll come back here and compare results from a variety of situations, including WinRAR if you like.

Or you can simply look at this: http://www.ocaholic.ch/modules/smartsection/item.php?page=9&itemid=1005

The 4770K/4760K trounces pretty much everything including an i7 2600K 3.40 GHz @ DDR3-2133.

Cliffnotes: memory speed can make a difference in some bandwidth-bound applications/synthetics, but in the vast majority of situations it's not the primary bottleneck and other CPU factors are far more important.

Hmm... I don't think you're getting my point. I'm not saying it's not worth the upgrade, but that faster memory clearly does help in some instances. I never mentioned an i5 2500K or whatever you're going on about, so I'm not sure what point you're trying to make. Of course it's still worth the upgrade, but with 2133MHz RAM you'd eke out a bit more performance.
 
Would be funny if DC actually turns out to be Broadwell-K 😱 :hmm:

Would be hilarious, but no, the Devil's Canyon slide says 4th-gen processor, hence no Broadwell. Besides, it wouldn't change anything for most users, because the only thing that matters at this point is better overclocking. I mean, it's an unlocked chip; who cares about a 4GHz base clock?
Same for the i5: if it's the same chip, even with a 3.5GHz base it should reach the same clocks, maybe better ones without HT (and at 2/3 the price, hopefully!).
 
I mean, it's an unlocked chip; who cares about a 4GHz base clock?
Same for the i5: if it's the same chip, even with a 3.5GHz base it should reach the same clocks, maybe better ones without HT (and at 2/3 the price, hopefully!).

I guess the i7 gets binned. Maybe they actually bin them by the gap between die and IHS, or by thermal performance.
 
Can you explain what you mean by this? Why is it pointless to run CPU benchmarks at low resolutions?

Because no one games at 800x600 or 1024x768 with a GTX 680 or similar. The point of CPU and GPU testing is to give PC builders an assessment of real-world gaming performance. If you fire up a game like Crysis 3, Tomb Raider, or Metro: Last Light with a GTX 680, 780 Ti, etc. at common resolutions like 1080p/1440p/1600p, and you end up with results that show little to no difference in gaming performance between an i7 2600K and an i7 4770K, well, that's the correct conclusion. The point of benchmarks is not to skew results to try and sell products, but to show us whether or not the new part will provide a tangible, real-world gain. For most games now, even at 'budget' 1080p resolution, we are more GPU- than CPU-limited with any modern Intel i5/i7 CPU. The biggest differences between CPUs are in minimum frame rates.

Even then, an upgrade from a Titan/GTX 780 Ti to the next flagship from NV/AMD will net a greater performance improvement in 98% of PC games than a move from an i5 2500K to Devil's Canyon. Even if you run into a CPU-limited situation where you are > 100 fps, you can increase visual fidelity via a variety of AA settings like SSAA, etc. and shift the load to the GPU. One of the common exceptions is Blizzard titles, which tend to be dual-threaded and rely heavily on clock speed and IPC.

Those of us who'll be upgrading from an i5 2500K or similar to some future Intel CPU (DC or Broadwell/Skylake) will do so because we (1) are bored and want something new to play with; (2) want next-gen features like Ultra M.2; or (3) are keeping up with our hobby. I can't see how a 5.0GHz Haswell will provide any tangible performance improvement in games over a 4.4-4.5GHz SB. The money is better spent on a new GPU/SSD upgrade when it comes to real-world performance gains.
 
I guess the i7 gets binned. Maybe they actually bin them by the gap between die and IHS, or by thermal performance.

Oh that's highly possible too.
Also, the new PR will be... "New unlocked chip with just a 10μm gap!" Let's start a new μm/nm race, yeah!
 
I don't subscribe to all that 'real world' notion.
It does remind me, though, of the late-1990s Tom's Hardware benchmarks where games were at one point benchmarked only at 640x480 with no AA "to prove a point" that one CPU was "up to 50% faster" (a difference which promptly dropped to about 5-10% at the resolutions and settings most people actually played at...). Nothing wrong with a couple of low-res benchmarks in addition to normal 1080p benchmarks, etc., but not to the exclusion of them. The last time I played a game at sub-1280x1024 resolution with zero interest in how it ran at higher resolutions must have been 1998...

Edit: By far the most useful benchmarks for showing what you're wanting are the TechSpot-style ones which specifically go into per-game scaling, i.e., 2.5 vs 3.0 vs 3.5 vs 4.0 vs 4.5GHz, i7 vs i5 vs i3, Haswell vs Ivy, 2560x1600 vs 1920x1080 vs 1680x1050, etc.

Edit 2: The same "low res only" benchmarks are also cheesily used in dumb articles "asking the question" of whether iGPUs/APUs are going to "replace" mid-range graphics cards (7790/7850, 260X/265, 750 Ti, etc.) any time soon. LOL.
 
Basically you need to run both low-resolution and real-world tests to assess the potential of a CPU.

In the low-resolution tests you only test how much faster the CPU is in a "no limits" scenario. Then you run real-world tests to see if it actually matters at the settings you'd actually play at.
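As a toy model (all numbers here are hypothetical, just for illustration): the frame rate you observe is roughly whichever of the CPU and GPU ceilings is lower, which is why dropping the resolution inflates the GPU ceiling and exposes the CPU's "no limits" potential:

```python
def observed_fps(cpu_fps_ceiling, gpu_fps_ceiling):
    """Whichever component is slower caps the observed frame rate."""
    return min(cpu_fps_ceiling, gpu_fps_ceiling)

# Hypothetical ceilings: the CPU can prepare 90 fps regardless of resolution;
# the GPU manages 300 fps at 720p but only 60 fps at 1440p.
print(observed_fps(90, 300))  # 720p "no limits" test: 90, shows the CPU's potential
print(observed_fps(90, 60))   # 1440p real-world test: 60, GPU-bound, CPU choice irrelevant
```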
 