TigerMonsoonDragon
Senior member
Intel had no reason to continue improving, so they released the same quad core with ~10% improvements for seven years. Now that Ryzen came along and brought the heat, Intel was forced to respond.
I'm going to say something shocking and controversial: I believe anyone with a 2600K or above will likely be fine for the next 3-4 years, assuming the system has 16+ GB of RAM and a 980 Ti or better. I'll share my upgrade plans a little later in this post, but I'm still skeptical that 6+ cores will be a hard requirement in that timeframe.
But let's look at the facts: 1) PC sales are still declining; Gartner said we may see a single-digit overall increase in sales this year in the "PC" category, but they attribute that to corporate upgrades and mobile products. 2) Big AAA games for the PC seem to be getting rarer. 3) We've had hex-core CPUs since what, 2009 or 2010, and games taking advantage of the extra cores are still relatively rare. In terms of productivity, there are obviously more packages using more cores, but the big driver of what becomes "standard" will be business refreshes over the next few years.
Here is a recent 2600K to 8700K comparison. For games, if you have a 1070 or lesser card, the difference between a 2600K and an 8700K would be pretty much impossible to detect without running benchmarks.
https://youtu.be/gMFd0aVhVKU?t=6m53s
I am finally looking to upgrade my 2008-vintage PC with its 3.2GHz overclocked C2Q, and when I get very logical about it, I could probably keep it running for years with just a GPU upgrade. It's my only PC, used for gaming, internet, productivity, and video encoding.
I will be more controversial and say that when I really think about it, the ~10-year-old CPU is not really holding me back. I record OTA TV and encode it with HandBrake, but other than video encoding and gaming, nothing really pushes my ancient CPU.
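For reference, that kind of OTA .ts to .mp4 encode can also be scripted; here's a rough sketch (assuming HandBrakeCLI is installed and on PATH — the folder name, RF value, and encoder choice are just placeholders, not a recommendation):

```python
# Rough sketch: batch-encode recorded OTA .ts files with HandBrakeCLI.
# Folder name, RF quality, and encoder are placeholders; adjust to taste.
import subprocess
from pathlib import Path

for src in Path("recordings").glob("*.ts"):
    dst = src.with_suffix(".mp4")
    subprocess.run([
        "HandBrakeCLI",
        "-i", str(src),   # input transport stream
        "-o", str(dst),   # output container
        "-e", "x264",     # x264 will use all available cores/threads
        "-q", "22",       # constant-quality RF value
    ], check=True)
```

This is exactly the kind of workload where more cores help, but it's batch work, so a slow CPU mostly just means waiting longer, not a broken workflow.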
If you are still using a C2Q CPU, then you're likely not playing contemporary games. That CPU would just about get murdered in any big online multiplayer game, and quite a few AAA single-player games as well, as evidenced in this video where a C2Q at 3.6GHz drops into the 30s and 20s in a BF1 64-player match:
I have built two 8350K machines and one 8100 machine in the past week (yay Fry's) for friends moving from i7 920s.
My GPU (8800GT) is much more of a detriment in gaming than my CPU. The game I want to play but don't have the hardware for is The Witcher 3, but I figure with a GTX 1060 upgrade and my 3.2GHz C2Q I should be able to tweak game settings to a reasonable frame rate.
Yes, Intel responded like "Rambo from Finland", and that response is "very impressive", no doubt.
- The i7 8700K offers the same multithreading performance as the R7 1700, with lower power consumption + $100 cheaper + a good "RGB LED stock CPU cooler".
It all depends on your expectations for gaming. If 60Hz gaming is fine for you, then yeah a 2600K still has some life left in it, though I don't think it's as much as 3-4 years. However gaming has been evolving, especially on the PC side of things. 60Hz may be the standard, but we now have monitors capable of 240Hz refresh rates, which not only requires a powerful GPU, but a powerful CPU as well. I used to be in the "60Hz is all you need" crowd, until I got a new Gsync capable 1440p 165Hz monitor. The difference in clarity, smoothness and response is unbelievable. Not in all games mind you, but in many there is a huge difference. Doom in particular felt like a completely new game compared to when I played it at 60Hz.
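To put that refresh-rate point in numbers, the per-frame budget shrinks fast as the target rate climbs; a quick back-of-the-envelope (plain arithmetic, nothing vendor-specific):

```python
# Frame-time budget in milliseconds at common refresh rates.
for hz in (60, 120, 144, 165, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz leaves ~16.7 ms per frame; 240 Hz leaves only ~4.2 ms,
# so game logic and draw-call submission on the CPU get far less headroom.
```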
PC sales have been declining for years, but GAMING PC sales (and accessories) have been increasing yearly for a long time now. And when I say gaming PCs, I mean high-end gaming. PC gaming is more accessible and popular than ever, and there are lots of people out there willing to plop down some serious dough for a high-end gaming experience that consoles simply cannot match.
In fact, the PS4 Pro and Xbox One X are Sony and Microsoft's attempt to stem the tide of hardcore console gamers that switch over to PC, especially during the twilight years of the console cycle.
A GT 1030 would be a serious upgrade from an 8800GT, it's inexpensive, and it uses a lot less power too. If it feels too "small", you don't really need more than a 1050 Ti for The Witcher 3 at 1080p.
The next GPU I buy, I plan to use for 5+ years like my 8800GT, so I want something a little better than the minimum I need to play Witcher 3 today.
What GPU and what res?
If more and more games go the route of Assassin's Creed Origins' DRM, then yes, 8 and more threads.
Looking at 100% on 4 cores and 60 to 80% on 12 and 16 threads.
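If anyone wants to sanity-check those per-thread numbers on their own machine, a rough sketch like this will log them (Python with psutil; the interval and sample count are arbitrary, and you'd run it while the game is loaded):

```python
# Log per-logical-core CPU utilization while a game is running.
# Requires: pip install psutil. Interval and sample count are arbitrary.
import psutil

for _ in range(30):  # roughly 30 seconds of samples
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busy = sum(1 for p in per_core if p > 80)
    print(f"avg {sum(per_core) / len(per_core):5.1f}%  "
          f"cores >80%: {busy}/{len(per_core)}  {per_core}")
```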
Ah... no. The 1700X (higher TDP than a 1700) used one watt less than the 8700K, and it's not cheaper. Wrong on both counts.
Intel was improving... they were improving 10% a year as you indicated in your OWN post.
They didn't give you more cores for no price increase because they didn't have to. That's what you're complaining about.
Do NOT mix the two up. Intel has been improving; saying otherwise is just a lie. I'm not sure how the narrative that Intel didn't improve, or had no reason to improve, is allowed to be pushed on here. It's a provable lie.
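For what it's worth, "only 10% a year" still compounds. A back-of-the-envelope over the Sandy Bridge to Coffee Lake span (assuming a flat 10% per generation, roughly six steps):

```python
# Back-of-the-envelope: ~10% per generation compounded from
# Sandy Bridge (2011) to Coffee Lake (2017), roughly six steps.
gain_per_gen = 1.10
generations = 6
print(f"{gain_per_gen ** generations:.2f}x")  # ~1.77x cumulative uplift
```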
Yeah, even 6/6 barely cuts it for this garbage DRM. 6/12 is needed for some comfortable breathing room for that crazy DRM overhead. Insane.
Yeah, look at this... 60 to 90%.