8-Core CPUs: Are they replacing 4 cores as the standard? (Poll Inside)


Have we entered the era of the common 8 core CPU?

  • Yes, 8 cores are the new standard and Intel will catch up very soon.
    Votes: 88 (50.6%)

  • Nope! No way 8 cores are the new standard!
    Votes: 86 (49.4%)

  • Total voters: 174

Asterox

Golden Member
May 15, 2012
1,058
1,864
136
Intel had no reason to continue improving, so they released the same quad core with ~10% improvements for seven years. Now that Ryzen came and brought the heat, Intel was forced to respond.

Yes, Intel responded like "Rambo from Finland", and that response is "very impressive", no doubt. :cool:

- The i7 8700K offers the same multithreading performance as the R7 1700, with lower power consumption, a $100 lower price, and a good "RGB LED stock CPU cooler".
 

slashy16

Member
Mar 24, 2017
151
59
71
8-core Ryzen processors have been out for 8 months and still most gamers are choosing the 7600K/7700K. There has to be a balance between IPC and core count, and Ryzen isn't competitive at all with Intel in gaming or any single-threaded application. Maybe this will change next year if Ryzen can manage higher clocks, but for now Intel has that balance.

I have a 3770K @ 4.6 and there is only one task I run that might benefit from an 8-core CPU, and that's converting videos for my iPad. I'm not going to buy Ryzen to get a boost on a single task. I am, however, willing to upgrade to an 8700K because I will see a boost in every single task I run, no exception. 4C/4T is far from dead. I have built two 8350K machines and one 8100 machine in the past week (yay Fry's) for friends moving from i7 920s.

P.S. Fry's has a boatload of i5 8400s currently in stock at their Concord, CA store :D
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I’m going to say something shocking and controversial. I believe anyone with a 2600k or above will likely be fine for the next 3-4 years, assuming the system has 16+ GB of RAM and a 980 Ti or above. I’ll tell my upgrade plans a little later in this post, but I’m still skeptical that 6+ cores will be a hard requirement in this timeframe.

It all depends on your expectations for gaming. If 60Hz gaming is fine for you, then yeah a 2600K still has some life left in it, though I don't think it's as much as 3-4 years. However gaming has been evolving, especially on the PC side of things. 60Hz may be the standard, but we now have monitors capable of 240Hz refresh rates, which not only requires a powerful GPU, but a powerful CPU as well. I used to be in the "60Hz is all you need" crowd, until I got a new Gsync capable 1440p 165Hz monitor. The difference in clarity, smoothness and response is unbelievable. Not in all games mind you, but in many there is a huge difference. Doom in particular felt like a completely new game compared to when I played it at 60Hz.
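To put rough numbers on that (back-of-the-envelope only): the CPU's per-frame time budget shrinks in direct proportion to the refresh rate you are trying to feed. A quick sketch in Python:

    # Per-frame time budget at various refresh rates (simple arithmetic).
    for hz in (60, 144, 165, 240):
        budget_ms = 1000.0 / hz  # milliseconds available to prepare each frame
        print(f"{hz:>3} Hz -> {budget_ms:5.2f} ms per frame")

At 60Hz the CPU has roughly 16.7 ms per frame for game logic and draw-call submission; at 165Hz that drops to about 6.1 ms, and at 240Hz to about 4.2 ms, which is why high refresh rates lean on the CPU again.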

But let’s look at the facts: 1) PC sales are still declining; Gartner said we may see a single-digit overall increase in sales this year in the “PC” category, but they attribute that to corporate upgrades and mobile products. 2) Big, AAA games for the PC seem to be getting more rare. 3) We’ve had hex core CPUs since what, 2009 or 2010, and games taking advantage of the extra cores are still relatively rare. In terms of productivity, there are obviously more packages using more cores, but the big driver of what will be “standard” will be business refreshes in the next few years.

PC sales have been declining for years, but GAMING PC sales (and accessories) have been increasing yearly for a long time now. And when I say gaming PCs, I mean high end gaming. PC gaming is more accessible and popular than ever, and there are lots of people out there that are willing to plop down some serious dough to have a high end gaming experience that consoles simply cannot match.



In fact, the PS4 Pro and Xbox One X are Sony and Microsoft's attempt to stem the tide of hardcore console gamers that switch over to PC, especially during the twilight years of the console cycle.
 
  • Like
Reactions: moonbogg

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Here is a recent 2600K to 8700K comparison. For games, if you have a 1070 or lesser card, the difference from upgrading a 2600K to an 8700K would be pretty much impossible to detect without running benchmarks.
https://youtu.be/gMFd0aVhVKU?t=6m53s

That comparison is invalid. I like HardwareCanucks, but the guy that did the testing obviously doesn't know much about benchmarking. He used BF1 DX12 and Deus Ex MD DX12, and both have substantially poorer performance in DX12 mode compared to DX11 on NVidia hardware. When I downloaded BF1 to test it, the framerate in DX11 was about 30 FPS higher than it was in DX12.

The DX12 renderers in both BF1 and Deus Ex MD are far less developed than their DX11 renderers.
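For anyone who wants to sanity-check that kind of DX11 vs DX12 comparison themselves, average FPS and 1% lows are easy to compute from a frame-time capture. A minimal sketch, assuming frame times in milliseconds have been logged to CSV with a tool such as PresentMon (the column name below is the one PresentMon uses; adjust it for other capture tools, and the file names are just placeholders):

    # Summarize a frame-time capture: average FPS and 1% low.
    import csv

    def summarize(path):
        with open(path, newline="") as f:
            times_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
        avg_fps = 1000.0 * len(times_ms) / sum(times_ms)
        p99_ms = sorted(times_ms)[int(len(times_ms) * 0.99)]  # 99th-percentile frame time
        return avg_fps, 1000.0 / p99_ms

    for capture in ("bf1_dx11.csv", "bf1_dx12.csv"):  # hypothetical capture files
        avg, low1 = summarize(capture)
        print(f"{capture}: {avg:.1f} avg FPS, {low1:.1f} FPS 1% low")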

I am finally looking to upgrade my 2008-vintage PC with a 3.2GHz overclocked C2Q, and when I get very logical about it, I could probably keep it running for years with just a GPU upgrade. It's my only PC, used for gaming, internet, productivity, and video encoding.

If you are still using a C2Q CPU, then you're likely not playing contemporary games. That CPU would just about get murdered in any big online multiplayer game, and in quite a few AAA single-player games as well, as evidenced in this video where a C2Q at 3.6GHz drops into the 30s and 20s in a BF1 64-player match:


I will be more controversial and say that, when I really think about it, the ~10-year-old CPU is not really holding me back. I record OTA TV and encode it with Handbrake, but other than video encoding and gaming, nothing really pushes my ancient CPU.

Every time these threads pop up, I always state that everyone's computing needs and tolerances are different. Some people can probably still get by on a P4 machine if all they do is browse the net and send emails. Some people can play a game at 30 FPS and still be happy, while others are annoyed by any framerate drops below 100 FPS.

Different strokes for different folks. Personally, I like everything to be fast and snappy, so that is why I upgrade often.
 
  • Like
Reactions: moonbogg

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
If you are still using a C2Q CPU, then you're likely not playing contemporary games. That CPU would just about get murdered in any big online multiplayer game, and in quite a few AAA single-player games as well, as evidenced in this video where a C2Q at 3.6GHz drops into the 30s and 20s in a BF1 64-player match:

I have zero interest in FPS games. I mostly play RPGs, and these days I only buy from GOG.

My GPU (8800GT) is much more of a detriment in gaming than my CPU. The game I want to play but don't have the hardware for is The Witcher 3. But I figure with a GTX 1060 upgrade and my 3.2GHz C2Q, I should be able to tweak game settings to a reasonable frame rate.

Everything else is quite good. I can use Handbrake to compress video, watch another video, and surf the internet at the same time without noticeable hiccups. SSDs really were the game changer for my system, and the big thing I am looking forward to with my new system is a fast M.2 drive.
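For what it's worth, if an encode ever does start making the rest of the box feel sluggish, it can be pinned to a couple of cores so everything else stays responsive. A rough sketch, assuming HandBrakeCLI is on the PATH and the psutil package is installed; the file names are placeholders and the preset name varies by HandBrake version:

    # Start a HandBrake encode, then restrict it to cores 0 and 1.
    import subprocess
    import psutil

    proc = subprocess.Popen([
        "HandBrakeCLI",
        "-i", "recording.ts",        # placeholder: an OTA capture
        "-o", "recording.mp4",       # placeholder output file
        "--preset", "Fast 1080p30",  # preset name depends on HandBrake version
    ])
    psutil.Process(proc.pid).cpu_affinity([0, 1])  # leave the other cores free
    proc.wait()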
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
I have built two 8350K machines and one 8100 machine in the past week (yay Fry's) for friends moving from i7 920s.

May I ask if the extra 2MB of cache and the extra frequency on the 8350K are worth the premium?

I'm keeping a close eye on both, for the same reason.

My GPU (8800GT) is much more of a detriment in gaming than my CPU. The game I want to play but don't have the hardware for is The Witcher 3. But I figure with a GTX 1060 upgrade and my 3.2GHz C2Q, I should be able to tweak game settings to a reasonable frame rate.

A GT1030 would be a serious upgrade from an 8800GT, would be inexpensive, and would use a lot less power too. If that feels too "small", you don't really need more than a 1050 Ti for The Witcher 3 at 1080p.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Yes, Intel responded like "Rambo from Finland", and that response is "very impressive", no doubt. :cool:

- The i7 8700K offers the same multithreading performance as the R7 1700, with lower power consumption, a $100 lower price, and a good "RGB LED stock CPU cooler".

Clearly inferior chip is cheaper, more obvious news at 11.
 

IndyColtsFan

Lifer
Sep 22, 2007
33,655
688
126
It all depends on your expectations for gaming. If 60Hz gaming is fine for you, then yeah a 2600K still has some life left in it, though I don't think it's as much as 3-4 years. However gaming has been evolving, especially on the PC side of things. 60Hz may be the standard, but we now have monitors capable of 240Hz refresh rates, which not only requires a powerful GPU, but a powerful CPU as well. I used to be in the "60Hz is all you need" crowd, until I got a new Gsync capable 1440p 165Hz monitor. The difference in clarity, smoothness and response is unbelievable. Not in all games mind you, but in many there is a huge difference. Doom in particular felt like a completely new game compared to when I played it at 60Hz.



PC sales have been declining for years, but GAMING PC sales (and accessories) have been increasing yearly for a long time now. And when I say gaming PCs, I mean high end gaming. PC gaming is more accessible and popular than ever, and there are lots of people out there that are willing to plop down some serious dough to have a high end gaming experience that consoles simply cannot match.



In fact, the PS4 Pro and Xbox One X are Sony and Microsoft's attempt to stem the tide of hardcore console gamers that switch over to PC, especially during the twilight years of the console cycle.

Gaming PCs are increasing in sales, especially gaming laptops. However, they’re still a tiny fraction of the overall PC market.

You’re missing that the 2600k isn’t that much slower in games than the 7700k. Sure, there are a few where the 7700k shows a good advantage, but I’d wager any game which performs poorly on a 2600k will not play well on a 7700k. It remains to be seen how much 8+ core CPUs will be used in games in the future. I’d wager adding a 1080Ti or a Volta to my 2600k would make it viable for years to come. Of course, I’m a geek and want to upgrade so that won’t happen.

You do have a good point about high refresh rate monitors, but like 8-core CPUs, I suspect it will be years before 144Hz monitors are standard.

Regarding consoles, IMO, the console gaming market looks very dreary right now. It seems they keep rehashing the same garbage and there are very few original ideas.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
A GT1030 would be a serious upgrade from an 8800GT, would be inexpensive, and would use a lot less power too. If that feels too "small", you don't really need more than a 1050 Ti for The Witcher 3 at 1080p.

The next GPU I buy, I plan to use for 5+ years like my 8800GT, so I want something a little better than the minimum I need to play Witcher 3 today.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
The next GPU I buy, I plan to use for 5+ years like my 8800GT, so I want something a little better than the minimum I need to play Witcher 3 today.

What resolution and refresh rate? If you're planning on using it for 5+ years, you may as well go big with a 1080 Ti, especially since it's the only GPU on the market that comes close to adequately playing at 4K or 1440p 144Hz. If you're willing to upgrade every 2-3 years, a 1070, 1070 Ti, 1080, or even the Vega cards would be sufficient.
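Rough pixel-rate arithmetic helps put those targets in perspective (nothing more than multiplication):

    # Pixels per second the GPU must deliver at each resolution/refresh target.
    targets = {
        "1080p @ 60Hz":  (1920, 1080, 60),
        "1440p @ 144Hz": (2560, 1440, 144),
        "4K @ 144Hz":    (3840, 2160, 144),
    }
    base = 1920 * 1080 * 60
    for name, (w, h, hz) in targets.items():
        rate = w * h * hz
        print(f"{name}: {rate / 1e6:,.0f} M pixels/s ({rate / base:.1f}x 1080p60)")

4K at 144Hz is nearly ten times the pixel throughput of 1080p at 60Hz, and 1440p 144Hz is a bit over four times, which is why only the very top cards get close today.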
 

TigerMonsoonDragon

Senior member
Feb 11, 2008
589
39
91
If more and more games go the route of Assassin's Creed Origins' DRM, then yes, 8 and more threads.

Looking at 100% on 4 cores and 60 to 80% on 12 and 16 threads.
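For anyone who wants to check numbers like that on their own machine, per-core utilization is easy to log while the game runs. A minimal sketch, assuming the psutil package is installed:

    # Print per-core CPU utilization once a second until interrupted.
    import psutil

    try:
        while True:
            per_core = psutil.cpu_percent(interval=1, percpu=True)  # blocks ~1 second
            busy = sum(1 for p in per_core if p > 90)
            print(f"{busy}/{len(per_core)} cores above 90% | " +
                  " ".join(f"{p:4.0f}" for p in per_core))
    except KeyboardInterrupt:
        pass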
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Intel had no reason to continue improving, so they released the same quad core with ~10% improvements for seven years. Now that Ryzen came and brought the heat, Intel was forced to respond.
Intel was improving... they were improving 10% a year, as you indicated in your OWN post.
They didn't give you more cores for no price increase because they didn't have to. That's what you're complaining about.

Do NOT mix the two up. Intel has been improving; saying otherwise is just a lie. I'm not sure how the narrative that Intel didn't improve/had no reason to improve is allowed to be pushed on here. It's a provable lie.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
It all depends on your expectations for gaming. If 60Hz gaming is fine for you, then yeah a 2600K still has some life left in it, though I don't think it's as much as 3-4 years. However gaming has been evolving, especially on the PC side of things. 60Hz may be the standard, but we now have monitors capable of 240Hz refresh rates, which not only requires a powerful GPU, but a powerful CPU as well. I used to be in the "60Hz is all you need" crowd, until I got a new Gsync capable 1440p 165Hz monitor. The difference in clarity, smoothness and response is unbelievable. Not in all games mind you, but in many there is a huge difference. Doom in particular felt like a completely new game compared to when I played it at 60Hz.



PC sales have been declining for years, but GAMING PC sales (and accessories) have been increasing yearly for a long time now. And when I say gaming PCs, I mean high end gaming. PC gaming is more accessible and popular than ever, and there are lots of people out there that are willing to plop down some serious dough to have a high end gaming experience that consoles simply cannot match.



In fact, the PS4 Pro and Xbox One X are Sony and Microsoft's attempt to stem the tide of hardcore console gamers that switch over to PC, especially during the twilight years of the console cycle.

It was a nice try, but the PS4 Pro was a joke for PC gamers. The Xbox One X has a better chance, though, with more GPU power.
4K 144Hz is my next upgrade.
I just don't think everyone has the imagination to understand just how big these resolution/refresh rate upgrades are.

It boggles my mind that people are satisfied with any resolution/refresh rate currently out... with the exception of maybe 240Hz.

We're going to see insane graphics and refresh rates in our lifetime... old hardware won't power that.
 

stockwiz

Senior member
Sep 8, 2013
403
15
81
I used to be one of those who thought 720p or 1080p was "good enough"... now my sky is only limited by how much I'm willing to spend. :) There is a law of diminishing returns, though: much beyond 8K you're not going to gain anything, even amongst large TV sets. By then we need to be getting into holograms and virtual reality.

I'm one of those few weirdos who actually likes 3D TV, and my LG OLED set supports 3D. Yet when they made the 4K Blu-ray standard, they didn't include 3D, which means all content done in 3D will be done at 1080p, making it look pretty blurry when you cut that resolution in half. Kind of sucks. :\

I don't think we're going to hit 8+ cores at a good price point as quickly as people think; they're going to milk it unless AMD steps up to the plate yet again. The Ryzen 1700 only has about the same IPC as Sandy Bridge, and I wasn't about to sidegrade just for more cores. Finally, though, we get some excitement on the desktop front after 6 years since Sandy Bridge... is dual core finally going away for good? No, they've still got their Pentium line to throw into the bargain-basement Black Friday doorbuster crap.
 
Last edited:
  • Like
Reactions: moonbogg

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,259
16,117
136
Yes, Intel responded like "Rambo from Finland", and that response is "very impressive", no doubt. :cool:

- The i7 8700K offers the same multithreading performance as the R7 1700, with lower power consumption, a $100 lower price, and a good "RGB LED stock CPU cooler".

Ah... no. The 1700X (higher TDP than a 1700) used one watt less than the 8700K, and it's not cheaper. Wrong on both counts.

Here: https://www.anandtech.com/show/1185...-lake-review-8700k-and-8400-initial-numbers/5

The 8700K is supposed to be $360, but it's $420 now. The 1700 is $319 and the 1700X is $399 at the moment on Newegg. Prices do vary.
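Spelling out the arithmetic on those street prices:

    # Street prices quoted above (USD, Newegg at the time of posting).
    prices = {"i7-8700K": 420, "R7 1700": 319, "R7 1700X": 399}
    delta = prices["i7-8700K"] - prices["R7 1700"]
    print(f"At street prices the 8700K costs ${delta} more than the 1700")  # $101 more

So at current street prices the "$100 cheaper" claim is inverted: the 8700K is about $100 more expensive than the 1700.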
 
  • Like
Reactions: moonbogg

TigerMonsoonDragon

Senior member
Feb 11, 2008
589
39
91
Intel was improving... they were improving 10% a year, as you indicated in your OWN post.
They didn't give you more cores for no price increase because they didn't have to. That's what you're complaining about.

Do NOT mix the two up. Intel has been improving; saying otherwise is just a lie. I'm not sure how the narrative that Intel didn't improve/had no reason to improve is allowed to be pushed on here. It's a provable lie.

I should have said: improving at maybe 10% at most is not improving, it's stagnating. When my Ivy Bridge and Haswell CPUs essentially matched the many *Lake processors, that's not improving. Sure, there were outliers where they were slightly faster. They didn't bring out the good stuff until Ryzen. Coffee Lake vs. Kaby Lake... now that's a real improvement.
 
Last edited:

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
If more and more games go the route of Assassin's Creed Origins' DRM, then yes, 8 and more threads.

Looking at 100% on 4 cores and 60 to 80% on 12 and 16 threads.

Yeah, even 6/6 barely cuts it for this garbage DRM. 6/12 is needed for some comfortable breathing room for that crazy DRM overhead. Insane.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136

Yeah, look at this... 60 to 90%.

OMG 90% on 12 threads? You see! We really have entered the era of the 8 core CPU! I'm such a prophet. That's profit, baby.

No, jk. This is absolutely disgusting. But, I must say, DRM or not, I'm really glad I don't have a quad.
 

traderjay

Senior member
Sep 24, 2015
221
167
116
On my system, I sometimes see 100% utilization on all 22 cores when playing Assassin's Creed Origins, and then the CPU usage spills over to the second CPU. That's also when my FPS tanks from 80+ to the low 60s or 50s.
 
  • Like
Reactions: rvborgh

wilds

Platinum Member
Oct 26, 2012
2,059
674
136
Thanks, guys. This thread really makes my quad feel insecure about its future. And I'll be staying far away from the new Assassin's Creed!! WTF UBI