Speculation: i9-9900K is Intel's last hurrah in gaming


Will Intel lose its gaming CPU lead in 2019?


  • Total voters
    184
  • Poll closed.

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
209
106
The way I see this:
1st-tier gaming CPUs: Intel 9900K, 9700K, 8700K and 9600K; AMD Ryzen 3950X, 3900X, 3800X and 3700X.
Simply because if you can afford those CPUs, you can also afford a decent 1440p monitor, or a 4K one.

But to answer the thread title: no, the 9900K won't be the last best gaming CPU in the world manufactured by Intel; the KS will increase the tiny gap for sure. The good thing is that we now have competition, and that will increase the pressure on Intel to lower prices further.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
3,107
1,735
136

TheELF

Diamond Member
Dec 22, 2012
3,025
305
126
Please read the AnandTech review for the latest on clock speeds and performance.
https://www.anandtech.com/show/14605/the-and-ryzen-3700x-3900x-review-raising-the-bar/5

Happy to be of service and please feel free to request help in the future. ;)
From the article you linked:
We've updated the article benchmark numbers on the Ryzen 9 3900X. We've seen 3-9% improvements in exclusive ST workloads. MT workloads have remained unchanged.
Der8auer showed what would be needed to get MT up.
There is not one single game that uses only one single thread in a vacuum, so getting a higher ST result will not change anything for gaming; you want higher ST performance while all, or at least a lot of, threads are active.
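That point can be put in a toy model (every number below is illustrative, not a measurement): per-frame work is spread across several threads and the frame only ships when the slowest one finishes, so only a clock gain that is sustained under all-core load shortens the frame.

```python
# Toy model, illustrative numbers only: the frame is done when the
# slowest of the active threads finishes its share of the work.
def frame_time_ms(work_ms_per_thread, sustained_scale):
    # sustained_scale: per-thread throughput multiplier held while ALL threads are busy
    return max(w / sustained_scale for w in work_ms_per_thread)

work = [10.0, 6.0, 5.0, 4.0]  # hypothetical per-frame work; main thread dominates

# A higher 1T benchmark score that evaporates once every core is loaded:
print(frame_time_ms(work, 1.00))  # 10.0 ms, nothing changes in-game
# A +5% clock gain that is actually sustained with all threads active:
print(frame_time_ms(work, 1.05))  # ~9.52 ms, a real fps gain
```

The model is crude, but it shows why an ST uplift measured in isolation can fail to move gaming numbers at all.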
 

maddie

Diamond Member
Jul 18, 2010
3,107
1,735
136
TheELF said: "There is not one single game that uses only one single thread in a vacuum so getting a higher ST result will not change anything for gaming …"
And yet, there appears to be some gain in MT for some of the benchmarks. I don't think it's as straightforward as you claim.
 

PotatoWithEarsOnSide

Senior member
Feb 23, 2017
664
700
106
TheELF said: "There is not one single game that uses only one single thread in a vacuum so getting a higher ST result will not change anything for gaming …"
I'm waiting for the updated power consumption figures, because the figures they've listed adhere strictly to the PPT limits that PB2 and XFR2 operate within.
PBO would override these limits, with the consequent increase in power draw across the board.
Unless these CPUs are severely thermally constrained even within stock settings, we've not truly seen PBO in action. If they are so severely thermally constrained then you do have to wonder why AMD even bothered with a warranty-busting feature that does diddly squat.
 

TheGiant

Senior member
Jun 12, 2017
659
262
106
PotatoWithEarsOnSide said: "I'm waiting for the updated power consumption figures … we've not truly seen PBO in action …"
I must say I don't hold out much hope for extra performance from BIOS/driver optimisation.
Manual overclocking and tuning doesn't give much more above what we have ATM.
Still waiting for my 3900X :(
As I see it, TSMC 7nm is too dense, and the resulting heat flux density is too high.
That is why, IMO, Intel builds its Coffee Lake cores at much lower density than the 14nm theoretical maximum: they trade that density for MHz.
 

coercitiv

Diamond Member
Jan 24, 2014
3,731
3,516
136
Unless these CPUs are severely thermally constrained even within stock settings, we've not truly seen PBO in action. If they are so severely thermally constrained then you do have to wonder why AMD even bothered with a warranty-busting feature that does diddly squat.
They don't look thermally constrained to me.



My bet is power/current constraints. Reminds me of the first BIOS revision on my Z370 board, where AVX loads plunged clocks way below the all-core turbo due to a hidden current limit. The limit was removed in a later BIOS revision.
 

PotatoWithEarsOnSide

Senior member
Feb 23, 2017
664
700
106
Having just seen some screen shots on Reddit in relation to other stuff, I'd also like to see some Ryzen Master screenshots of folk that think that PBO is actually overriding current and power limits. Ryzen Master clearly displays what those limits are, so if they are being exceeded via PBO then folk should easily be able to prove it.
 

mopardude87

Platinum Member
Oct 22, 2018
2,784
1,090
96
I believe Anandtech still tests gaming with a GTX 1080 (!), not even a 1080 Ti. That's basically an upper mid range card these days, so their gaming tests aren't really useful IMO because they are so GPU limited.
Even with that low of a GPU choice, their 1080p GTX 1080 results still show CPU bottlenecking, especially in GTA V at high settings. I find those numbers extremely helpful as a basis, and as an avid player of that game. The 4K numbers are extremely useless given that I turn down the most extreme settings and was pulling an easy 60+ average on a 1070 Ti. The 1080p numbers are nice for at least the 144Hz gamers, and given that some Ultra settings like grass absolutely tank fps, putting those settings on High suddenly buys you an easy 50% more frames while still looking nearly as good.
 

Vattila

Senior member
Oct 22, 2004
488
444
136
So, Intel Core has clung to the title "world's best gaming processor", despite AMD's launch of Ryzen 3000, and reinforced the title with the launch of the 10000-series Comet Lake refresh.

It must irk AMD's marketing executives that, despite giving glowing recommendations for the Ryzen chips, especially on value and productivity, reviewers often add the caveat that "Intel is still the choice for absolute best high frame-rate gaming", implying that Intel is still the premium brand for gamers — the majority of PC enthusiasts.

Now, with the Ryzen 3000 refresh, AMD looks to challenge Intel on single-threaded performance yet again, even overtaking in benchmarks such as Cinebench Single-Threaded. But nothing in the rumours so far points to Ryzen taking a decisive lead in gaming.

That said, where is the 16-core in this? Only 6, 8 and 12-core chips are included in the rumoured refresh info, so far. Could AMD be preparing a new 16-core gaming champion in the corner?

They would just need a couple of CCXs binning at very high clocks, I guess. How high do they need to get to push the FPS needle beyond Intel's 10900K in a majority of games?
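As a back-of-envelope for that closing question (every figure here is a hypothetical assumption, not a measurement): if gaming performance scales roughly with effective clock times gaming IPC, the clock such a part would need is just a ratio.

```python
# Back-of-envelope only; both inputs below are assumptions, not benchmarks.
def clock_needed_ghz(rival_clock_ghz, ipc_ratio, margin=1.0):
    """Clock needed to reach `margin` x a rival's gaming performance,
    given ipc_ratio = our gaming IPC / rival's gaming IPC."""
    return rival_clock_ghz * margin / ipc_ratio

# e.g. a 10900K gaming near 5.0 GHz, and Zen 2 assumed ~8% ahead in gaming IPC:
print(round(clock_needed_ghz(5.0, 1.08), 2))        # ~4.63 GHz just to tie
print(round(clock_needed_ghz(5.0, 1.08, 1.05), 2))  # ~4.86 GHz for a 5% lead
```

Any real answer also depends on memory/fabric latency, which this one-line model deliberately ignores.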
 
Last edited:

amrnuke

Senior member
Apr 24, 2019
838
1,015
96
Vattila said: "… They would just need a couple of CCXs binning at very high clocks, I guess. How high do they need to get to push the FPS needle beyond Intel's 10900K in a majority of games?"
Two CCXs at very high clocks won't really help the situation, will they? Due to inter-chiplet latency, and the fact that most games stop seeing improvement in performance beyond about 8-12 threads?

If that's the case, maybe the best option is just a hyper-binned 3800X?
 

Topweasel

Diamond Member
Oct 19, 2000
5,326
1,521
136
That said, where is the 16-core in this? Only 6, 8 and 12-core chips are included in the rumoured refresh info, so far. Could AMD be preparing a new 16-core gaming champion in the corner?
Probably because it would show the smallest increase in performance. 2-4 core usage would see some kind of increase, but how much, since it already has the highest ST clock of all the Zen 2 CPUs? This really comes off as AMD having a surplus of 3950-capable dies, and implies a dip in sales. I'm guessing that, unlike the much more manageable 3900 with its consistent demand, the 3950's sales were all pent-up demand with no increase since. So even if they could boost the 3950 a little more, it's probably not selling enough to make it worth it, on top of the small increase.

If that's the case, maybe best option is just a hyper-binned 3800X?
If I was looking at any of these refreshes (to a degree, any of them might be a bonus on top of the original chips), only the 3800X looks like a real improvement. This is probably more of a value reset against Comet Lake than a real refresh: extra boost here and there to increase value and keep them from dropping prices while we wait for Zen 3. That might also be why we aren't seeing a 3950 refresh. Getting dies to give that a boost has to be incredibly hard, it would show the lowest value increase, and the 3950 still doesn't have a real competitor.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
13,886
3,402
136
I'm optimistic that Zen 3 will level the playing field for gaming. Performance doesn't seem to scale much beyond 4.4GHz on Zen 2 anyways, requiring FCLK OC in order to scale further. If we could see reliable 2000 FCLK with 4.5GHz or so gaming clocks on the CPU that could potentially close the gap. We might see that with a rumored Zen 2+ product.

Personally my gaming experience would not benefit with any current CPUs AMD or Intel, so whichever camp delivers the next improvement in performance, efficiency, and value will lead me to upgrade. That could be Zen 3... or it may not be until late 2021-mid 2022 before I see something worth upgrading to.
 

Thunder 57

Golden Member
Aug 19, 2007
1,409
1,184
136
IEC said: "I'm optimistic that Zen 3 will level the playing field for gaming. … We might see that with a rumored Zen 2+ product."
I think the single L3 cache rather than split will help a lot in gaming. Check out the 3100 vs 3300X.
 

chrisjames61

Senior member
Dec 31, 2013
579
292
136
Vattila said: "… It must irk AMD's marketing executives that … Intel is still the premium brand for gamers — the majority of PC enthusiasts."
"Irk AMD"? I don't think so. AMD outsells Intel at a 9 to 1 ratio in retail cpu sales at Amazon and Newegg etc... The majority of enthusiasts at least on the retail cpu segment are overwhelmingly voting with their wallets for AMD.
 

mopardude87

Platinum Member
Oct 22, 2018
2,784
1,090
96
"Irk AMD"? I don't think so. AMD outsells Intel at a 9 to 1 ratio in retail cpu sales at Amazon and Newegg etc... The majority of enthusiasts at least on the retail cpu segment are overwhelmingly voting with their wallets for AMD.

There is a reason, on top of the prices, that got me interested in my 3900X; I don't call my build a tribute for nothing. I love the gaming performance of a 9900K, but my heart was set on something with more threads so it could do more with folding. It's not all about gaming for me; I focused my build on what it would mostly be doing, which is folding. I believe in balance in all things, and I accepted more threads for a loss in FPS. The 4000 series may let me have my cake and eat it too; we'll have to see.

I love gaming, but when I am getting 70+ fps in everything I play, with zero stutter or lag, why the hell should I care at this point if I get more frames? I care more that I can load up BOINC to 76% and still play BF4 at a constant 100+, and I can, with ease. I don't believe in wasting any computer resources, and since I sit idle often enough, I'd rather see 23 threads working vs 15. That is my pleasure: seeing this thing work.
 
Last edited:

Arkaign

Lifer
Oct 27, 2006
20,481
937
126
So Intel is for toy computers. AMD gets everything else?
I think a nicer way of putting it is that there are outstanding options available from both Intel and AMD now, and AMD's rise back to excellence has paid dividends for all of us, regardless of our personal priorities.

Realizing objectivity vs subjectivity is also very important when trying to maintain respectful and honest dialogue with others on these things. I have no personal allegiance towards or against any of these companies. I own both Intel and AMD CPUs, and Nvidia and AMD GPUs; I even have a cache of stuff going back to 3dfx, Matrox, ATi, S3, Cirrus Logic, Tseng Labs, Commodore PET, TRS-80, Sinclair, etc. I can tell you the pain of setting up RLL and MFM interfaces, of manually tracing out mobo designs because I got a used pile of parts at a Saturday sale and had no idea what any of the 59 jumpers did until careful examination and testing. DIPP RAM expansions 🤣

People getting in their feelings over these companies just need some perspective, and let others enjoy their favorite tech even if it's "someone is wrong on the internet!" Haha.

Love all my AT crew
 

amrnuke

Senior member
Apr 24, 2019
838
1,015
96
A couple more fps in a game vs getting clobbered in every other metric is called "grasping at straws".
When you compare 10600K ($450 all-in with mobo, chip, tower cooler) to 3700X ($445 all-in with mobo, chip) there are non-gaming tasks that it beats the 3700X in, such as Octane, WebXPRT (I'm being facetious).

In seriousness, looking even at multithreaded apps it's not all that far behind the 3700X at stock, except in rendering, given its 2 core / 4 thread deficit.

My point not being that the 10600K is better, or an overall excellent value, or efficient, but that it is, indeed, remotely competitive overall at stock on average in CPU tests (cf TPU review 10600K) when compared to the 3700X in non-gaming, non-rendering tasks.

But if you overclock and spend another $75 to get an AIO instead of a tower cooler... it pulls ahead in Tensorflow, Euler, DigiCortex, Adobe, Zephyr, VMWare, VeraCrypt, Visual Studio -- and comes within 2% of the 3700X overall in CPU tests.

So it's not getting "clobbered" in every other metric. Overall, it is behind, yes, but there are plenty of non-gaming use cases where the 10600K might be worth the cost, especially if you don't want to jump all the way to a 3900X ($600 mobo + chip) / 10900K ($750-800 mobo + chip + cooler) but do want the best processor for the price and want something between the 3700X/3800X and the 3900X. Yes, a 10600K overclocked will run hotter than a stock 10900K in heavy tasks, and yes it might draw 35W more power than a 10900K during stress tests... but if you're using apps that parallel performance in any of the tests the 10600K leads in when overclocked, the 3700X might not make as much sense as spending $50-75 more for a 10600K with AIO, if you don't want to jump all the way to a 3900X or 10900K.

So at $525, it might just be at the right sweet spot for some people even in non-gaming tasks, if your workload leverages the 10600K better than a 3700X.

Intel gave a decent answer for now, for some people. Now it's up to AMD to show if they can exploit their process lead.
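The value argument above can be put in rough numbers. A sketch, using the ballpark build prices quoted in the post plus an assumed application-performance index (the "within 2%" figure); all of it is illustrative, not measured:

```python
# Illustrative only: prices are the ballpark figures from the post above,
# and "perf" is an assumed relative application-performance index.
builds = {
    "3700X (boxed cooler)":  {"price": 445, "perf": 100},
    "10600K + tower cooler": {"price": 450, "perf": 95},   # stock, assumed
    "10600K + AIO, OC'd":    {"price": 525, "perf": 98},   # "within 2%"
}

for name, b in builds.items():
    # Dollars per relative performance point; lower is better value.
    print(f'{name}: ${b["price"] / b["perf"]:.2f} per perf point')
```

On these assumptions the 3700X still wins on pure value, which is consistent with the post's point: the overclocked 10600K only makes sense if your workloads sit in the subset it actually leads.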
 

Thunder 57

Golden Member
Aug 19, 2007
1,409
1,184
136
amrnuke said: "When you compare 10600K ($450 all-in with mobo, chip, tower cooler) to 3700X ($445 all-in with mobo, chip) … Now it's up to AMD to show if they can exploit their process lead."
While I largely agree, I would be concerned about motherboard compatibility going forward with LGA1200. Even if they keep the socket, will they require a new chipset, considering Intel is supposedly on its last Lake revision and moving to its Cove cores? Or am I wrong on that? I know it's called Rocket Lake, but I thought it was a backport of some Cove core.
 

Arkaign

Lifer
Oct 27, 2006
20,481
937
126
You know, something else that came to mind regarding the descriptions above of +5-20% in this vs that, etc.: people should perhaps take more measure of how they actually use a PC.

Person A owns : Xeon E5 1660 v4, 64GB Quad Channel ECC Ram, SAS 6TB Array + 2TB Enterprise PCIe SSD etc.

Person B owns : Core i7 4790 non K, 16GB DDR3-2133, HD 7870 2GB, 240GB SSD.

Say person A is an independent media professional, who encodes dozens of hours of 1080p and 4k media per month, perhaps on contract, or perhaps for a YouTube channel etc. They want to build a single PC, and they make income based on volume of content produced. This person 'uses heavy multithreading' for sure. They also game from time to time when they have the urge, but maybe only a few hours a month at most, and they can't be bothered to buy a cutting edge GPU, and are happiest with 4k 60hz large format displays, and run a pair of them off a last gen Pascal or Vega card. Because of the volume of their primary heavy use case, +20% or more improvement in time efficiency is HUGE. They could conceivably take on more work, or be able to use their time more effectively, spend more time with family, hobbies, getting outside, whatever.

Now take person B. They encode a couple of times a year as the family member called on to help make videos of family reunions, quinceaneras, bar or bat mitzvahs, gatherings of the Hell's Angels, or whatever. Basically something they do one weekend two or three times a year, plus the odd Photoshop job a cousin messages them about over Facebook. They also "use heavy multithreading", but say you were even able to double their effective encoding performance: sure, it might cut 20 hours a year down to 10, but it's not going to affect their day-to-day life in any real way. And if their PC is already good enough for their purposes, upgrading might be a bigger risk than just riding it out a while longer, or upgrading an area where they might see more benefit (say, going from HDD to SSD, or getting a new monitor with better size and quality). They game a few hours a week with friends on common MP titles like Apex Legends and COD Warzone.

Person A might see value even in investing in a new Threadripper 64C/128T rig, or multiple rigs, if the economic results magnify their revenue and profit vs expenses (to say nothing about potential equipment write-offs). Even 5% advantage when you're talking big numbers and actual improvement in quality of life is a real consideration.

Person B, ehh. They might get wrapped up in reviews and benchmarks, and buy something that really doesn't make all that much sense. Particularly if they have a limited budget and could have made a much more QOL-focused purchase that made their experience nicer. Maybe their KB/M was trash, or they had an old 1080p 23" TN panel with trash color accuracy, or their 240GB SATA SSD is pushing past 85% utilization so it's starting to really chug and lag. Going from a 4790 to a 10600 or 3600 rig would do basically nothing for them unless they also spent at LEAST $400+ on a GPU. And at that point they'd also need new DDR4 RAM, probably a new SSD, etc. It could turn into an $800-$1k+ upgrade in a hurry. Or they could spend a fraction of that on a 1660 Super, a 480GB SSD, and a 27" 144Hz 1080p AOC FreeSync IPS, and immediately their gaming time would massively improve, at half the cost: arguably a better experience than spending double the money on an entirely new platform while still using a 23" trash monitor.
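Person B's two paths can be totted up. Every price below is an assumed ballpark for the period, purely to illustrate the "half the cost" point, not a quote:

```python
# Assumed ballpark prices, only to illustrate the comparison above.
platform_rebuild = {
    "CPU + mobo + DDR4 (10600/3600 class)": 550,
    "mid-range GPU":                        400,
    "replacement SSD":                       60,
}
targeted_refresh = {
    "GTX 1660 Super":             230,
    "480GB SSD":                   55,
    '27" 144Hz 1080p IPS panel':  180,
}

print(sum(platform_rebuild.values()))  # 1010: the full-platform route
print(sum(targeted_refresh.values()))  # 465: roughly half, felt immediately
```

The exact figures don't matter; the shape of the comparison is what person B should be looking at.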

Just goes to show that the answers will be different for everyone, and just because they vary doesn't mean they're wrong. I might have the opinion that people can more effectively hit their goals within limited budgets by doing more intelligent research and analysis, but at the end of the day: are they happy with what they chose? If so, what can we really say, even if they chose something ludicrous, haha.
 

RasCas99

Junior Member
May 18, 2020
12
14
36
Arkaign said: "You know, something else that came to mind regarding the descriptions above of +5-20% in this vs that etc … Just goes to show that the answers for everyone will be different …"
Agreed. For me, a life cycle of an upgrade (for the last 18 years or so) is usually:
1) New rig.
2) Upgrade GPU after 2-3 years.
3) Upgrade GPU after 2-3 years.
4) New rig.
..
..
I am probably not doing this optimally, but as I play non-shooters exclusively, even this is overkill. I am about to "new rig" this year from an OC'd i7-2600K + RX 480, and to be honest I just finished Gears Tactics and XCOM 2: WOTC (again), and they both looked and played extremely well for my old eyes.
So for Cyberpunk (hope it won't be too shootery :)) it's going to be a new rig (including monitor) with an AMD. But as others have mentioned, the CPUs feel similar to me as a non-professional; if it were impacting my day-to-day job I would 100% pick the better one, even if only by a few %.
My first AMD rig, and I started having PCs 25 years ago. The only thing that bothers me is that I don't know the reliability of the mobos; assuming AMD is back in the mainstream they should be good quality, so I'll just take the plunge and go for it.
 

rbk123

Senior member
Aug 22, 2006
695
254
136
When you compare 10600K ($450 all-in with mobo, chip, tower cooler) to 3700X ($445 all-in with mobo, chip) there are non-gaming tasks that it beats the 3700X in, such as Octane, WebXPRT (I'm being facetious).
In seriousness, looking even at multithreaded apps it's not all that far behind the 3700X at stock, except in rendering, given its 2 core / 4 thread deficit.....

Intel gave a decent answer for now, for some people. Now it's up to AMD to show if they can exploit their process lead.
The real problem, though, is that consumers have their own memories and, for the most part, aren't stupid. They know Intel could have offered something like this during the prior 10 years of dominance. They also know the only reason "Intel gave a decent answer" is because they were forced to. Consumers here aren't quick to reward getting screwed for all those years by buying even competitive Intel products. They'd do the same with Nvidia if they had an alternative.
When, or if, AMD starts trying to gouge the consumer (which is typically inevitable, because of the bean counters and shareholders), consumers here will be a little more even-handed with Intel.
 
