[HWUnboxed] "Are Quad-core CPUs dead in 2017?"

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
https://www.youtube.com/watch?v=9VypDQw5sbI

Interesting video. He recommends a Ryzen 5 1600 (overclocked using the stock cooler) over an i5-7600K, purely on value, but then comments near the end that if the Ryzen 5 1400 and the Intel i5-7600K were around the same price, he would choose the i5-7600K, even though it is only 4C/4T and the 1400 is 4C/8T.

He mentions StarCraft (2?) gaming early on, something AMD CPUs have always traditionally lagged a bit in, so I wonder if that shaped his recommendations.

Certainly, with BF1 64-player MP or WD2, a Ryzen 5 1600 or better would be preferable to a 4C/4T Intel CPU. (My personal opinion.)

Edit: Also, can you watch THIS video on your PC @ 4K60? (Requires hardware decoding and a 4K UHD display.)
https://www.youtube.com/watch?v=yyw7coOhN78

My G4600 in my DeskMini CAN, but my A8-9600 @ 3.9GHz seemingly CANNOT, even though it's not maxed at 100% CPU usage, more like 67%.

I looked around in about:config, but I'm running Firefox Nightly 58a1, so most of the relevant settings were already enabled. Not sure if this is an AMD driver limitation or what.

The codec is VP9, 3840x2160@60. I have gigabit internet, so bandwidth shouldn't be the issue.
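
For anyone wanting to check this objectively rather than by eye, YouTube's "Stats for nerds" overlay shows dropped frames, or you can poll the standard getVideoPlaybackQuality() API while the video plays. A rough sketch of the idea (the 5% threshold is just my arbitrary cutoff, and in a plain DevTools console you'd drop the TypeScript type argument):

```ts
// Poll dropped-frame stats while the 4K60 video is playing.
// getVideoPlaybackQuality() is a standard HTML5 API supported by
// Firefox and Chromium-based browsers.
const video = document.querySelector<HTMLVideoElement>('video');
if (video) {
  setInterval(() => {
    const q = video.getVideoPlaybackQuality();
    const droppedPct = q.totalVideoFrames > 0
      ? (100 * q.droppedVideoFrames) / q.totalVideoFrames
      : 0;
    // At 60fps, more than a few percent dropped shows as visible stutter;
    // the 5% cutoff here is arbitrary, for illustration only.
    console.log(
      `dropped ${q.droppedVideoFrames}/${q.totalVideoFrames} frames ` +
      `(${droppedPct.toFixed(1)}%)` +
      (droppedPct > 5 ? ' - decode is not keeping up' : ' - OK')
    );
  }, 5000);
}
```

If the A8-9600 sits at ~67% CPU and still drops frames, that would point to a few saturated threads or missing VP9 hardware decode rather than bandwidth.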
 

Indus

Diamond Member
May 11, 2002
9,935
6,516
136
The bottom line is that if you have one of the two CPUs, you can be happy. If you have an older CPU (like the i5-3470 I had, which would only do 3.8GHz max), there's a decent upgrade to be had, and if you're upgrading, why go with just a quad-core in 2017?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
Good point. I bought an i5-3470 Dell tower PC off of eBay and souped it up to sell; I ended up hooking up a friend with it, who had a Core2Quad w/ GTX 460.

I put in a GTX 1050, an SSD, a 1TB HDD, and 8GB of DDR3; along with the i5-3470, it wasn't too bad, certainly a bit better than what he had.

But compared to modern architectures, yeah, it could be slightly slower. And then there's the core-count issue.

I expect his next upgrade to be to an 8C/16T or perhaps even a 12C/24T PC (Zen 2).

I'm set for now, with some Ryzen 5 1600 rigs (6C/12T) with 16GB of DDR4-3000 (which can only run at 2667; I bought cheap RAM).
 

cfenton

Senior member
Jul 27, 2015
277
99
101
I thought it was a really silly question, but then I watched the video and realized he was responding to a YouTube comment (I get the desire to engage with viewers, but if there's anywhere to ignore the comments, it's YouTube). I think he's correct that 4c/4t is still plenty for most people, who will pair it with a 580 or 1060 at best and play at 1080p. I wouldn't buy a 4c/4t CPU in 2017; in fact, I bought an 8c/16t CPU in 2017, but that was mostly for fun. My 3570K @ 4.2GHz was still fine for almost everything I was playing with my 1070.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
I think that a 4c/4t CPU is still fine for most users' use cases. I have a Haswell i5 and I have no plans on replacing my current rig anytime soon. However, if I were, it would be at least a 6-core CPU.
 

bbhaag

Diamond Member
Jul 2, 2011
6,655
2,041
146
Four-core CPUs are not going away anytime soon, and the fact is they are more than enough for most people. The sheer amount of disconnect between enthusiasts and the average user always amazes me. Enthusiasts will tell you that the minimum needed for a good experience is an 8-core monster with a 200MHz overclock and 16GB of 3200MHz RAM, coupled with a 1080 GPU. The truth is far from that for most people.

Cores this and cores that. I've been hearing this since I joined the forums six years ago. Always more cores; the more cores the better. The reality is, unless something dramatic happens with software development, all the cores in the world aren't going to make for a better experience for your average user.
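
There's even a textbook formula for this: Amdahl's law says that if only a fraction p of a program's work can run in parallel, n cores give a speedup of 1 / ((1 - p) + p / n). A quick sketch (the 50%-parallel figure is purely an illustrative assumption, not a measurement of any real application):

```ts
// Amdahl's law: the serial fraction (1 - p) caps the speedup no matter
// how many cores you throw at the problem.
function amdahlSpeedup(p: number, cores: number): number {
  return 1 / ((1 - p) + p / cores);
}

// Hypothetical everyday app where only half the work parallelizes (p = 0.5):
for (const n of [2, 4, 8, 16, 64]) {
  console.log(`${n} cores -> ${amdahlSpeedup(0.5, n).toFixed(2)}x`);
}
// 2 -> 1.33x, 4 -> 1.60x, 8 -> 1.78x, 16 -> 1.88x, 64 -> 1.97x
```

Even 64 cores never quite double the speed of that app, which is exactly why piling on cores doesn't transform the average user's experience.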
 

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
Cores this and cores that. I've been hearing this since I joined the forums six years ago. Always more cores; the more cores the better. The reality is, unless something dramatic happens with software development, all the cores in the world aren't going to make for a better experience for your average user.
Maybe I'm in the minority then. I can always use "MOAR CORES!!1" for Distributed Computing projects, although many of them score better with GPUs than CPUs.

I agree with you for "normal" users, people who just want to browse the web. Especially if they're a Firefox user, since its web-content handling is so poorly threaded (the media-decoding engines do appear to be multi-threaded, though). If someone uses Chrome, or Chromium, or Vivaldi, or Brave, then they can see an advantage with more cores and more RAM.

As a minimum baseline for anyone buying a PC these days, I would recommend a 4C/4T CPU (or, if the budget doesn't stretch that far, a 2C/4T Kaby Lake Pentium G4560 or G4600), with at least 8GB of RAM, preferably dual-channel.

Ideally, they would get a 6C/12T CPU (Ryzen 5 1600), with 16GB or even 32GB of RAM. (I realize that RAM is fairly expensive right now, so that's one of the first areas to cut down to fit into a budget.)
 

bbhaag

Diamond Member
Jul 2, 2011
6,655
2,041
146
You are part of the minority, VirtualLarry. The fact that you are a regular here is testament to that, and the fact that you run DC software is further proof that you are not an average user.

Look guys, I know everyone got excited when AMD brought competitive CPUs to the market for the first time in years. It's great for everyone, but the question we should be asking ourselves is how we are going to use them. All the cores in the world aren't going to make a damn bit of difference unless someone figures out a way to make them useful.
Honestly, video encoding, DC, and other productivity software ain't going to cut it. We need a huge leap forward in software development for everyday use to really make the 'more cores are better' argument viable.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
You are part of the minority, VirtualLarry. The fact that you are a regular here is testament to that, and the fact that you run DC software is further proof that you are not an average user.

Look guys, I know everyone got excited when AMD brought competitive CPUs to the market for the first time in years. It's great for everyone, but the question we should be asking ourselves is how we are going to use them. All the cores in the world aren't going to make a damn bit of difference unless someone figures out a way to make them useful.
Honestly, video encoding, DC, and other productivity software ain't going to cut it. We need a huge leap forward in software development for everyday use to really make the 'more cores are better' argument viable.
I agree, as most software can't really take advantage of more than four cores at the moment. Now this may change over time as CPUs with more cores become cheaper and more commonplace, but that is going to take a while.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,795
3,626
136
All this arguing over when software optimized for multiple cores will really take off is short-sighted. If people use something that can take advantage of many cores, then they're gonna use it, regardless of how much time regular stuff like media players, email clients, or word processors takes to catch up. In fact, things like these will never catch up, because they aren't demanding enough of the single-threaded performance of today's x86 CPUs to need multi-threading.

If you want these things to use more cores, you're going to see them do so in things like Android tablets.

Having said that, a CPU with four physical cores is barely adequate in any serious development environment with VMs running on top.

If all you're going to do is browse and write emails, a dual-core is enough, except for DIY projects like file/Plex servers. For everything else, a quad-core is the minimum.
 

bbhaag

Diamond Member
Jul 2, 2011
6,655
2,041
146
How many years will it take, then? It has already been six, and I was late to the party. Eight, ten, maybe twelve years: is that enough time? How long are we going to call lackluster software development short-sighted?
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
How many years will it take, then? It has already been six, and I was late to the party. Eight, ten, maybe twelve years: is that enough time? How long are we going to call lackluster software development short-sighted?
Hard to say. It depends on how fast users adopt CPUs with more than four cores now that reasonably priced 6- and 8-core ones are out. But it will still take a while, as folks like me are not going to be upgrading anytime soon.
 

NTMBK

Lifer
Nov 14, 2011
10,237
5,019
136
What a silly title. Classic clickbait! We still have dual cores; quad cores will be here for a long time to come. Hell, Intel were still selling single-core CPUs until Sandy Bridge.
 

imported_bman

Senior member
Jul 29, 2007
262
54
101
Dual cores, at least, will be dead in the consumer space by 2018 for everything but the 5W Ultrabook/Compute Card segment.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
OK, I watched the video, and no, quad-core CPUs are not dead in 2017 or even next year. Even in 2020, 4c/4t processors will still have plenty of life in them, even for gaming.

I for one am going to wait until 2020 to build another rig.
 

kwalkingcraze

Senior member
Jan 2, 2017
278
25
51
Now this may change over time as CPUs with more cores become cheaper and more commonplace, but that is going to take a while.
Sorry, but I don't see this happening in my lifetime. Inflation will always factor in and keep rising. Coffee Lake prices are at a record high because inflation is higher in 2018 than it was in 2011 with Sandy Bridge. A Core i7 starting at $500 on the upcoming LGA1152 socket is the new norm. Good thing I'm going to sit this buzz out.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Of course quad cores are not dead, at least from a gaming POV. Game developers have to accommodate the average user, so if they make a game that only runs well on 6C or even 8C CPUs, they will be alienating 99% of the market, and that game will flop no matter how good it is.

We as enthusiasts tend to have higher expectations of gaming performance and think too much about the 'bleeding edge', which I think is in contrast to what the average PC gamer is happy with. Case in point: my housemate happily games on his one-year-old Dell gaming laptop with a GTX 960M, which is about equivalent to a desktop GTX 950. He often runs at 720p medium details in order to run newer games at playable framerates, but that doesn't seem to bother him one bit.

Whereas I have a much faster 1050 Ti based laptop, and I'm still not happy with how it games at 1080p in the latest titles, as I often have to turn settings down to medium, and I refuse to game at 720p because of how blurry it looks. Lately I've been gaming more on my desktop as a result, which sports a much faster R9 Fury, though my overclocked 2500K is starting to bottleneck that too in certain games. That bothers me, because I see framerates dipping below 60fps, whereas the 'average Joe' gamer would most likely be a lot more forgiving of momentary frame rate drops than I am.

Also, the guy at HardwareUnboxed has a very good point about the vast disparity in performance between an older, slower quad-core CPU (AMD FX etc.) and a modern 4C/4T CPU like the i5-7600K, which, when overclocked to 5GHz, is still one of the fastest gaming CPUs currently available, short of an i7 or perhaps an overclocked Ryzen 5/7 @ ~4GHz in heavily threaded titles.

TL;DR: a modern, highly clocked quad core such as a Skylake/Kaby Lake based i5 will remain a viable gaming CPU for a long time. Sure, it may not get as many fps as higher core/thread-count CPUs in highly threaded games, but it's a long way from being obsolete.
 

eddman

Senior member
Dec 28, 2010
239
87
101
AMD released 6-8 core CPUs and suddenly 4 cores are not enough?! Quad cores will be more than good enough for the huge majority of people, a.k.a. non-prosumers, for probably up to three years.

I plan on getting a CFL i3 for games and occasional light video encoding. Whenever 4 cores become a limitation down the line, I'd upgrade to a used CFL i5 or i7.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Sorry, but I don't see this happening in my lifetime. Inflation will always factor in and keep rising. Coffee Lake prices are at a record high because inflation is higher in 2018 than it was in 2011 with Sandy Bridge. A Core i7 starting at $500 on the upcoming LGA1152 socket is the new norm. Good thing I'm going to sit this buzz out.
Well, AMD released decent-performing 6- and 8-core CPUs this year, and they are cheaper than Intel's HEDT platform.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
Sorry, but I don't see this happening in my lifetime. Inflation will always factor in and keep rising. Coffee Lake prices are at a record high because inflation is higher in 2018 than it was in 2011 with Sandy Bridge. A Core i7 starting at $500 on the upcoming LGA1152 socket is the new norm. Good thing I'm going to sit this buzz out.
I'm sorry, but are you like 95 years old, to say something like that?
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Of course quad cores are not dead, at least from a gaming POV. Game developers have to accommodate the average user, so if they make a game that only runs well on 6C or even 8C CPUs, they will be alienating 99% of the market, and that game will flop no matter how good it is.

We as enthusiasts tend to have higher expectations of gaming performance and think too much about the 'bleeding edge', which I think is in contrast to what the average PC gamer is happy with. Case in point: my housemate happily games on his one-year-old Dell gaming laptop with a GTX 960M, which is about equivalent to a desktop GTX 950. He often runs at 720p medium details in order to run newer games at playable framerates, but that doesn't seem to bother him one bit.

Whereas I have a much faster 1050 Ti based laptop, and I'm still not happy with how it games at 1080p in the latest titles, as I often have to turn settings down to medium, and I refuse to game at 720p because of how blurry it looks. Lately I've been gaming more on my desktop as a result, which sports a much faster R9 Fury, though my overclocked 2500K is starting to bottleneck that too in certain games. That bothers me, because I see framerates dipping below 60fps, whereas the 'average Joe' gamer would most likely be a lot more forgiving of momentary frame rate drops than I am.

Also, the guy at HardwareUnboxed has a very good point about the vast disparity in performance between an older, slower quad-core CPU (AMD FX etc.) and a modern 4C/4T CPU like the i5-7600K, which, when overclocked to 5GHz, is still one of the fastest gaming CPUs currently available, short of an i7 or perhaps an overclocked Ryzen 5/7 @ ~4GHz in heavily threaded titles.

TL;DR: a modern, highly clocked quad core such as a Skylake/Kaby Lake based i5 will remain a viable gaming CPU for a long time. Sure, it may not get as many fps as higher core/thread-count CPUs in highly threaded games, but it's a long way from being obsolete.

The enthusiast circle is so obsessed with the latest tech of the month that they forget it took almost forever for dual cores to become obsolete.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
The enthusiast circle is so obsessed with the latest tech of the month that they forget it took almost forever for dual cores to become obsolete.

And they still aren't quite obsolete, at least the latest 2C/4T Pentiums / i3s aren't. Sure, they will struggle in heavily threaded apps, but for the average budget gamer who pairs one with a GTX 1050 class GPU, it will still provide more than playable framerates (the GTX 1050 will be the bottleneck). In fact, they can handle GPUs up to a GTX 1060 without serious bottlenecking; just check the scaling between the GTX 1050 and 1060. Only from a GTX 1070 and up do they struggle to keep up.
https://www.youtube.com/watch?v=DL7YcPlJ83c
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Most of the time I would still be fine with a decent dual core.
Quad cores are about to gain, not lose, relevance, I think, with the i3 going quad-core and lots of options for laptops.
Even for high-end gaming, more often than not anything over 4C is being wasted, especially if people go for 4K on current GPUs (low framerates).
 

kwalkingcraze

Senior member
Jan 2, 2017
278
25
51
I'm sorry, but are you like 95 years old, to say something like that?
Quoting user LightiningZ71:

"R&D on newer and smaller process nodes is becoming increasingly expensive over time. Foundaries are starting to have to use tech that was originally developed for niche products to achieve generational improvements. While smaller processes can yield higher circuit densities, which could result in more chips per water, chips are getting bigger and bigger (circuit count wise) due to larger caches, more cores, integrated chipset features and additional memory controllers. All of this results in fewer perfect dies per wafer, more complex wafer production processes, and generally more production difficulties. Now, spread those much higher costs over lower sales volumes and you get higher sales prices to cover costs.

Throw all of that on top of an industry that is trying to regain sanity after nearly a decade of cut throat competition to stay in business over the long term. Value will be a scarce thing going forward."

Fair enough... In addition, inflation will always factor in and keep rising.
 

Ranulf

Platinum Member
Jul 18, 2001
2,349
1,172
136
Bah, anyone who uses StarCraft 2 as a serious benchmark is nuts. It's notoriously poorly coded. My Q8400 (x4), Phenom 955 (x4), and 8350 (x8) could all be brought to their knees in that game just playing against 4+ AIs. And I doubt my Conroe E6600 (dual-core 2.4GHz) would have been any worse than the Q8400 (2.6GHz).