[HWUnboxed] "Are Quad-core CPUs dead in 2017?"


Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
What would be the point of Intel releasing a 4C/8T Coffee Lake CPU?

If you look at AMD's lineup, their 4c/8t 1500X fits in nicely. The 1600 has more cores but lower clockspeeds, and if you look at benchmarks, the 1600 is generally a better chip despite the lower clocks.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Ancalagon44 said:
If you look at AMD's lineup, their 4c/8t 1500X fits in nicely. The 1600 has more cores but lower clockspeeds, and if you look at benchmarks, the 1600 is generally a better chip despite the lower clocks.
Yes, but all of the Ryzen CPUs except for the Ryzen 3 1200 have SMT.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
^ This. Every time a new CPU generation comes out, we go through the same recycled "is X now dead?" clickbait from self-sampling enthusiast writers. "Real world mainstream people" are the ones who don't argue about CPUs / GPUs in tech / gamer forums day after day, and are too busy playing games (not even AAAs) to argue over who a "Real Gamer (tm)" is. "Mainstreamers" often have static needs regardless of available hardware (e.g. light office work, email, web, ripping a CD to MP3; needs that haven't changed in 15+ years). Those who need heavy "time is money" parallel workload solutions (e.g. 4K video encoding, rendering, etc, for work) will have their employer buy them one and then be too busy using it to make daily "look what I just bought" posts. Enthusiasts often over-sell exotic use-case scenarios as "every man's needs" via goalpost moving to justify a new upgrade, whilst those with lower-end hardware simply avoid those same scenarios with greater common sense. Examples:-

- Video. A guy has a typical 20min video to encode for a YouTube upload that evening. Let's say encoding times are TR 1950X = 3m / R7 1800X = 5m / i5-7600K = 10m / G4560 = 18m. The guy starts the encode at 6pm, then goes and eats dinner, watches the news, etc, and comes back at 6:30pm. All four tasks are complete, have been sitting idle for 10+ minutes, and none has held him up; that stays true for typical light usage until the amount of video encoded becomes substantial (hours per day). The 16/32 chip may be desirable and benchmark 3-6x faster vs 2/4 & 4/4 chips, but the average user doesn't need one to accomplish the same sub-30min task in any evening, nor will he get 3-6x more done if he only needs that one video encoded (see the quick back-of-the-envelope sketch at the end of this post). It's why, despite funny comments about 4GHz 4C/4T not being good enough, half the people with YouTube channels have 2GHz 15W "U" chip 2C/4T laptops and don't see what all the fuss is about...

^ This is what separates enthusiast from mainstream. The benchmark-obsessed enthusiast will argue how much faster everything will be for all future infinite work. The mainstream user will simply ask "Can it do this finite task?" (e.g. "encode a video whilst I'm making dinner / watching Netflix", "type letters", "play music", etc) and literally not care about number chasing beyond that. Same goes for other stuff:-

- Photo editing : Enthusiast = "Now I can run batch Adobe Photoshop scripts for 80-layer banner-size 50MP prints I may develop a future interest in!" Mainstream = "Can I crop, resize, rotate, remove red eye, adjust color and add funny captions to my 5-20MP single-layer grainy smartphone JPGs in PS Elements / Paint.NET / GIMP on this? It's so simple even my phone has enough horsepower, but I want to use a bigger screen."

- Web browsing. Enthusiast = "OMG, my i5 rig is so slow during 100-tab web browsing sessions with 90 of those tabs all auto-playing video and Flash ads. Therefore a 16/32 ThreadRipper is the new low-end to browse the net". Mainstream = "When I asked how to speed up web browsing, I was universally recommended to install uBlock Origin, and since then my web page render speeds have quintupled, with all the trackers, pop-ups, pop-unders, etc, removed as a bonus. Now my i3 is so fast!!!"

- Office. Enthusiast = "My Excel Monte Carlo benchmarks are too low. I need to upgrade just in case I decide to become a Fortune 500 statistician working from home specialising in uncertainty probability modelling". Mainstream = "My son just swapped out my hard disk for something called an "S-S-D". Now all my normal sized office files open in under 1s on my Pentium laptop!"

- Audio. Enthusiast = "Even though I have zero interest in audio now and don't have the software anyway, at some point I may need to apply multiple filters to 32 tracks simultaneously. Ryzen 1800X or i7?" Mainstream = "Can this laptop rip a CD to MP3? I'm told even the slowest dual-core CPUs available today are bottlenecked by the optical drive speed."

- Gaming. Enthusiast = "I have an i7-7700K @ 5GHz with a GTX 1070 and this is the third f****** stupid sh*tty console port that keeps dropping below 50fps. I DIDN'T SPEND $450 FOR ANYTHING LESS THAN ULTRA!" Mainstream = "Currently rocking 60-70fps in BF1, Doom, GTA V, etc, on a custom High/V.High mix on a 1050 Ti. It's surprising how many heavier games can run at 60fps too by turning off all the silly Pure Ultra sh*t like "Chromatic Aberration", "Lens Dirt", etc. I just don't see the point in putting higher-res textures in only to smudge 80% of them (everything outside the centre of the screen) back out again with 3 layers of blur. By switching off the 'smear your monitor in Vaseline and pretend you have severe Myopia and Glaucoma simulator' in the settings menu, I easily turned the 50fps that the 'proper' benchmarks said I would get into a solid 60fps."

Enthusiast = "PC's are my hobby for life. All other enthusiasts online agree that's what mainstream is"

Mainstream = "I'll get what's good enough for what I need and then run it into the ground. One thing I've learned is that half of performance isn't just what you have, but the way that you use it"

Nice post.

I'll just add my own perspective about enthusiast vs. mainstream.

For me, at least (there are probably a few out there who feel the same way), PC building has never been about pure performance so much as being able to get something customised around my particular needs. This has meant some really oddball systems over the years that you'd simply be unable to get from any manufacturer. Why is that important? Because I'd have to accept sometimes serious compromises to get an off-the-shelf system. This way, I can also go where I get the most value for my cash. I couldn't care less what brand a particular component is, as long as it does its job without hiccups.

As for the F&F segment, there is everything you can think of under the sun running, and likely a few things you can't :). It doesn't have to be the latest and greatest, but it does have to be reliable, for the simple reason that most people will run it into the ground as long as it meets their needs.

I -was- planning on retiring the oldest of those systems when RR comes around: C2D/Qs, first-gen Core i's and various Athlon/Phenom X2s/X4s. They're not bad systems as such, but they are getting to the end of where I'm confident their components can last. Take that concern away, though, and most people would likely be comfortable on them for the next 5 years. And we are talking 7-10 year old systems... ;)
 

Triloby

Senior member
Mar 18, 2016
585
273
136
As long as game developers have to cater to the lowest common denominator in terms of PC users, I don't see quad cores going away anytime soon. The fact that people can still recommend hyper-threaded dual cores like the Pentium G4560 or G4600 as decent budget CPUs for light tasks and light gaming shows that quad cores have more than enough life left in the CPU arena. I wouldn't recommend dual cores nowadays, but if you're truly stuck on a limited budget, then it's understandable.

And for most people playing games, as long as your CPU can reach 60 FPS or more in games at 1080p (along with whatever GPU you have), there's very little to no need to spend extra on something like an i7 or higher.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
I think that you're sadly mistaken. Sure, the i5 may show "max" slightly higher than the Ryzen 7 CPUs, but the important metric to look at is the 1% and 0.1% lows. Those are MARKEDLY BETTER / HIGHER than on the i5-6600K. In short, the Ryzen CPUs are BETTER CPUs than the Intel i5 CPUs for BF1, hands-down.

Edit: Unless you want to sit in-game, staring at a wall all match, and just gaze at your "higher" FPS counter on Intel.
It's 95.5 for the 1800X against 88.8 for the i5-7600K, both at stock clocks, and I specifically said Kaby Lake i5.
Also, did you look at the stock-clock SMT=1 (enabled) numbers? That's how the average user will run anything, and for that the 1% and 0.1% lows are MARKEDLY BETTER / HIGHER on the i5-6600K; the 0.1% lows are on par with the i5-2500K...
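For anyone wondering what those "1% / 0.1% low" figures actually are: below is a minimal Python sketch of one common way to derive them from a frame-time log (average FPS over the slowest 1% or 0.1% of frames). The exact method varies between reviewers, and the sample frame times are made up purely for illustration:

Code:
# One common way to compute "1% low" / "0.1% low" FPS from frame times in milliseconds.
# Reviewers differ in the exact method; this version averages the slowest X% of frames.

def low_fps(frame_times_ms, percent):
    """Average FPS over the slowest `percent` of frames."""
    worst = sorted(frame_times_ms, reverse=True)              # slowest frames first
    count = max(1, int(len(frame_times_ms) * percent / 100))
    avg_ms = sum(worst[:count]) / count
    return 1000.0 / avg_ms

# Made-up log: ~60fps most of the time, with a handful of stutters.
frame_times = [16.7] * 5000 + [25.0] * 40 + [50.0] * 5

avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))
print(f"average FPS : {avg_fps:5.1f}")                    # ~59.5 - looks fine
print(f"1% low FPS  : {low_fps(frame_times, 1):5.1f}")    # ~37.5 - the stutters show up here
print(f"0.1% low FPS: {low_fps(frame_times, 0.1):5.1f}")  # ~20.0 - the worst hitches

The gap between the average and the lows is exactly why the argument above is about 1% / 0.1% figures rather than the plain FPS counter.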
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Every time these threads pop up, the answer is always the same. Quad cores are alive and well, but dual cores are on life support. Straight dual cores that is, and not dual cores with SMT.

These questions always fail to take into consideration that ultimately, it's a user's computing habits which dictate the lifespan and rate of obsolescence of their hardware. For someone like me, dual cores, dual cores with HT, straight quad cores, and quad cores with HT are definitely dead.

But there are plenty of dudes out there still rolling with single-core CPUs, as they don't use their PC for anything other than checking email and Facebook. If you're a gamer though, straight dual cores are basically completely dead. A lot of these new games won't even boot up on a dual core, and if they do, the experience is so slow and painful that you will think about shooting yourself.