Question Next CPU/Platform Upgrade Timing?


dsc106

Senior member
May 31, 2012
320
10
81
Specs are in my sig: essentially a 2012 Sandy Bridge-E 3.2GHz 6-core build (2nd-gen Core i7), 64GB RAM, upgraded with a GTX 1080 Ti. Still kicking along well, but I'm a video editor, and a gamer on the side.

This is nothing urgent, but my timeline is beginning to feel like "within the next 18 months, two years tops" for a core system upgrade. Here are my questions:

1. How noticeable would a CPU upgrade today be? For 4K video with HEVC, I'm guessing it would be noticeable. For everything else, e.g. games, I'm guessing kinda-sorta but not really worth it?

2. Thoughts on timing? Ice Lake 10nm is around the corner, and that seems like a good time to move on, so sometime in early 2020, or by the end of year 2020. Thoughts?

3. I am not familiar with Intel's usual product plans: should we expect an Ice Lake 10nm Extreme Chip within the next 18 months? Or will I find myself preferring a 14nm Extreme Chip with more cores for video editing?

My thoughts for an upgrade are the following:

- Move from 64GB of RAM to 128GB.
- Move from a 6-core 3.2GHz CPU to as many cores, as fast as I can get (12-core 4.5GHz?).
- Acquire new chip tech that better supports HEVC/H.265, 4K, HDR, 10-bit color, etc.
- A motherboard with all the new features for easy use of PCIe M.2 SSDs, lots of SATA ports, etc.

Will next year be good timing? Or is this year just as good as any, or is my system strong enough and upgrade benefits minimal enough that I should press through longer for a big breakthrough in 2021?

It seems DDR5 isn't going to be worth waiting for and the CPU upgrade cycles have really slowed down.

I'm also wondering if my Corsair 850W PSU will be good to keep chugging on a new rig, or whether I need to replace it, for either age (it will be eight years old in 2020) or power (is 850W enough for a high-powered rig these days / next year?).

If this seems premature, it's because I need to start budgeting and planning now to project my upgrade next year, and I also want a rough idea of what to target for timing. It's nice to be in a place where there's no pressing need, but as 4K H.265/HEVC, HDR, and other things keep moving forward, the time is drawing near when a new, strong platform may serve me well for another 7-10 years. (I can't believe that's how long PC upgrade cycles have become! I remember having to upgrade every two years "back in the day"...)

Thank you!
 

Mopetar

Diamond Member
Jan 31, 2011
7,826
5,971
136
Here's how it looks when you don't benchmark averages of averages:

https://www.techspot.com/review/1754-battlefield-5-cpu-multiplayer-bench/

https://www.computerbase.de/thema/prozessor/rangliste/#diagramm-performancerating-frametimes - select the gaming chart and frametimes (99th percentile)

The CPU still matters most in the heaviest combat/boss-fight scenes, while the GPU defines overall performance at the selected resolution.

IMO we won't see much improvement in this area (even with Ryzen 3000) unless Sunny Cove architecture derivatives come to market...

The AnandTech bench link that I used also contains 95th percentile frame rates, but at the highest resolutions, they weren't that much different compared to the averages so I didn't bother to include them. The overall results were slightly more favorable here to the 9900k, but not so much as you might expect. There were still several games where once you get to 1440p or 4K, the results fall within the margin of error.

We can argue over whether the 95th percentile is good enough, or whether it should really be the 99th, or even the 99.9th, percentile, but if we're going to get to that level of analysis you'd want to look at a frame-time graph to see what's going on anyway. And once again, those results tend to get washed out more and more as you move to higher resolutions.
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
The AnandTech bench link that I used also contains 95th percentile frame rates, but at the highest resolutions, they weren't that much different compared to the averages so I didn't bother to include them. The overall results were slightly more favorable here to the 9900k, but not so much as you might expect. There were still several games where once you get to 1440p or 4K, the results fall within the margin of error.

We can argue over whether the 95th percentile is good enough, or whether it should really be the 99th, or even the 99.9th, percentile, but if we're going to get to that level of analysis you'd want to look at a frame-time graph to see what's going on anyway. And once again, those results tend to get washed out more and more as you move to higher resolutions.

As resolution increases, high-percentile frame times become even more important.
At 1080p, current games are a mix of GPU-bound and CPU-bound gameplay, so the 95th percentile is usually CPU-bound.
At 4K, most scenes are GPU-bound, so you need to look at the 99th, or better still the 99.9th, percentile frame times (and those usually come from the same scenes as at 1080p: boss fights, lots of objects, complex movement, scenes where fast aiming is needed, etc.).
At the moment those scenes are still CPU-bound. Look at the TechSpot BF5 multiplayer test at 4K (https://www.techspot.com/review/1754-battlefield-5-cpu-multiplayer-bench/): the 1080 Ti/2080 (non-Ti) bottlenecks at roughly a Ryzen 2600X, and the Pentium G does minimums of 24fps (and that's 1% lows, not 0.1%), so it's not as though you can pick just any CPU for 4K.
The CPU also usually stays in a build longer than the GPU: the next RTX 3080 Ti-class card at 4K will be bottlenecked by a 2600X/8400 or slower.
Ever since the first GeForce came out, I've been reading that GPUs will take over the job; it hasn't happened in nearly 20 years, and the same CPU-vs-GPU split still holds: CPU = critical scenes, GPU = general average performance.
And the future isn't 8K or 16K (4K is "enough" for now); it's also smooth, high-refresh 144fps gaming, where the CPU becomes more important than ever, not less.
 
Last edited:

Thala

Golden Member
Nov 12, 2014
1,355
653
136
We can argue over whether 95th percentile is good enough, or whether it should really be 99th, or even 99.9th percentile, but if we're going to get to that level of analysis you'd want to look at a frame time graph to see what's going anyways. And once again, those results tend to get washed out more and more when you move to higher resolutions.

There is no argument: looking at the 95th percentile is senseless. It still means that 5 out of every 100 frames miss the frame-time deadline for the target framerate. So assuming 60fps, 5 out of 100 frames have a frame time higher than 16.6ms and produce micro-stutter. Either look directly at the maximum frame time, or at the very least the 99.9th percentile.
Then of course the examples that show similar performance are heavily GPU-limited, especially with a GTX 1080 driving 4K resolution; that's absurd.
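To make that concrete, here's a minimal Python sketch with synthetic frame times (made-up numbers, not real benchmark data), showing how a 95th-percentile figure can look fine while the 99.9th percentile and the 60fps deadline expose the stutter:

```python
import math

# Synthetic frame times in milliseconds: 1000 frames, 10 stuttery ones.
# Illustrative numbers only, not measurements from any real benchmark.
frame_times = [14.0] * 940 + [15.5] * 50 + [25.0] * 10

def percentile(samples, p):
    """Nearest-rank percentile: smallest sample >= p percent of all samples."""
    ordered = sorted(samples)
    rank = math.ceil(len(ordered) * p / 100)  # 1-based rank
    return ordered[rank - 1]

deadline_ms = 1000 / 60  # ~16.7 ms frame budget at 60 fps
missed = sum(t > deadline_ms for t in frame_times)

print(percentile(frame_times, 95))    # 15.5 ms: under budget, looks fine
print(percentile(frame_times, 99.9))  # 25.0 ms: exposes the micro-stutter
print(f"{missed} of {len(frame_times)} frames missed the 60fps deadline")
```

An average over all 1000 frames would be dominated by the 990 good ones; only the high percentile (or the raw frame-time graph) shows the 1% of frames that stutter.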
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Good timing starts this summer/autumn.

- memory prices are coming down to reasonable levels
- SSD prices are at an all-time low and expected to keep slowly falling throughout 2019, with the fast NVMe price premium mostly gone
- summer will bring a strong release from AMD, and Intel is bound to respond (just pick whichever suits you)

The second half of this year will be a very good time for all-round desktop upgrades.
OP, I came here to say basically this. Wait for Ryzen 3000 to drop; I expect we'll see a couple of Ryzen's niggling issues, like lower top RAM speeds and lower top clocks, get improved, while it keeps its current strengths (lots of cores for cheap).
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Unless you care about bragging rights and benchmarks at resolutions you won't game at, it really doesn't matter which CPU you go with if you're running at the highest resolutions.

I'd just suggest getting whatever CPU has the best performance for the work-related tasks you need, and realize that it will probably be fine for any gaming you do; you won't notice any differences in most titles unless there's some bug, which will probably get fixed within a few weeks of being discovered.

This is all true, and I'm not disagreeing with you. But the caveat is that while the CPU doesn't matter much for high-resolution gaming, it matters greatly for high-frame-rate gaming (90+ FPS). Those are the two most strenuous things you can do with a graphics card.
 

Mopetar

Diamond Member
Jan 31, 2011
7,826
5,971
136
Not to necro this thread too much, but Hardware Unboxed put out a video earlier today that covers Ryzen core scaling in recent games with a 2080 Ti:

There it makes a much bigger difference, even at 4K in some titles. Steve did point out in the comments that there might be an issue with the 2600X results that makes them come in a bit lower.

I'm also not sure to what extent clock-speed differences affect the results, and they don't indicate that clock speed was controlled for. However, the 2-core 200GE performs so much worse that the bottleneck is clearly not just clock speed.
 

dsc106

Senior member
May 31, 2012
320
10
81
Hi everyone, I'm back, with a few questions to piggyback off the speculation in this thread from February.

It looks like what I am seeing is Intel 10nm DESKTOP may not even surface until 2022 now?? With "Comet Lake" as a 14nm update?

And we have the Ryzen 3900 12 core top end CPU for $499 launching on 7/7, matching or surpassing Intel's current top performer.

Is there any reason at all to stay with / wait for Intel to see what happens with a desktop Ice Lake or the 14nm Comet Lake?

It looks like AMD may now be first to market by a matter of years, with better single- and multi-core performance, and a highly competitive price? Any missing AMD features for video (H.265/HEVC, or other chipset instructions)? Any unknown drawbacks?

Would love to hear thoughts. Otherwise this summer may mark upgrade time to AMD's $500 cpu...
 
  • Like
Reactions: scannall

scannall

Golden Member
Jan 1, 2012
1,946
1,638
136
Hi everyone, I'm back, with a few questions to piggy back off the speculation in this thread from Feb.

It looks like what I am seeing is Intel 10nm DESKTOP may not even surface until 2022 now?? With "Comet Lake" as a 14nm update?

And we have the Ryzen 3900 12 core top end CPU for $499 launching on 7/7, matching or surpassing Intel's current top performer.

Is there any reason at all to stay with / wait for Intel to see what happens with a desktop Ice Lake or the 14nm Comet Lake?

It looks like AMD may now be first to market by a matter of years, with better single- and multi-core performance, and a highly competitive price? Any missing AMD features for video (H.265/HEVC, or other chipset instructions)? Any unknown drawbacks?

Would love to hear thoughts. Otherwise this summer may mark upgrade time to AMD's $500 cpu...
That would be a solid choice. I may get one as well, though I don't really *need* it per se. But, new, shiny and fast...
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Is there any reason at all to stay with / wait for Intel to see what happens with a desktop Ice Lake or the 14nm Comet Lake?

If you need something now, nope. Unless you need an IGP.

There is always something better down the road. If you keep waiting for that, you're never going to buy new kit.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,540
14,494
136
Hi everyone, I'm back, with a few questions to piggy back off the speculation in this thread from Feb.

It looks like what I am seeing is Intel 10nm DESKTOP may not even surface until 2022 now?? With "Comet Lake" as a 14nm update?

And we have the Ryzen 3900 12 core top end CPU for $499 launching on 7/7, matching or surpassing Intel's current top performer.

Is there any reason at all to stay with / wait for Intel to see what happens with a desktop Ice Lake or the 14nm Comet Lake?

It looks like AMD may now be first to market by a matter of years, with better single- and multi-core performance, and a highly competitive price? Any missing AMD features for video (H.265/HEVC, or other chipset instructions)? Any unknown drawbacks?

Would love to hear thoughts. Otherwise this summer may mark upgrade time to AMD's $500 cpu...
It's going to be a while before Intel has anything decent enough to beat AMD. How long is that? 2020? 2021? 2022?

What we DO know is that the new 3000 series from AMD is going to be very good and power efficient. I would wait for reviews before ordering, but I think that will happen anyway. The next 36 days will be interesting!
 

dsc106

Senior member
May 31, 2012
320
10
81
I don't need an iGPU; I have a 1080 Ti. This is for video editing.

Timing isn't a huge rush since my system is solid, so I've mostly been looking for the optimal upgrade window. It's getting long in the tooth for editing 4K and dealing with H.265, though, so if a Ryzen 3900X is going to be a very strong bet for the next 2-3 years, it's time for me.

No *big* reason to wait for a Threadripper? Or for an Intel response (from here it sounds like no)? If there's something big and obvious around the corner in the next 12 months (timing isn't urgent, since I could wait), then I'll wait. If not, I think it's time to upgrade.
 

dsc106

Senior member
May 31, 2012
320
10
81
Will the new X570 Ryzen 3000 platform support 128GB of RAM? It seems it will, but most boards I see have only 4 DIMM slots, and 32GB modules seem expensive and hard to come by (a.k.a. almost nonexistent)?

And if not, I don't suppose I can reuse my 64GB of DDR3 from my 2012 Sandy Bridge-E s2011 platform, eh?
 
Last edited:

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,540
14,494
136
Will the new Ryzen 3000 platform support 128GB of RAM?

And if not, I don't suppose I can reuse my 64GB of DDR3 from my 2012 Sandy Bridge-E s2011 platform, eh?
Not sure, but it sure sounds like Threadripper is for you. If you can't wait for that, the 2970WX is a very nice chip.
And here is one for $849 or less (with make-an-offer): https://www.ebay.com/itm/AMD-Threadripper-2970WX-CPU-and-carrier-only-Runs-link-new/123787099042?hash=item1cd2493ba2:g:Sg0AAOSwIMlc3hEz&LH_BIN=1

I don't think you can get 128GB support on mainstream Ryzen (AM4), but it's easy on TR4 (Threadripper). At the moment, 128GB looks easiest as 8x 16GB DIMMs, but if you can find bigger modules, I think it supports them.

Ryzen and Threadripper are DDR4, so no, you could not reuse your DDR3 memory.
 

dsc106

Senior member
May 31, 2012
320
10
81
Are there updated Threadrippers coming after the July launch? Or is that a long way off?

As I understand it, Threadripper is only about higher core counts, and maybe it's easier to find an 8-DIMM motherboard?
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,540
14,494
136
Are there updated Threadrippers coming after the July launch? Or is that a long way off?

As I understand it, Threadripper is only about higher core counts, and maybe it's easier to find an 8-DIMM motherboard?
The 2970WX is Zen+; we don't have a date for a Zen 2 (3000 series) Threadripper. Yes, virtually all TR motherboards have 8 slots and are quad-channel, like yours. The reason AM4 has only 4 DIMM slots is that it's dual-channel. Who needs more than 64GB of RAM for 12 cores? The 2970WX is 24 cores / 48 threads, and it's a monster.
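The capacity math above is just slot count times module size; here's a quick illustrative sketch (16GB modules assumed, since larger UDIMMs are rare):

```python
# Max RAM = DIMM slots x module size. AM4 is dual-channel with 4 slots;
# TR4 is quad-channel with 8 slots. 16GB modules assumed for illustration.
platforms = {
    "AM4 (dual channel)": {"slots": 4, "channels": 2},
    "TR4 (quad channel)": {"slots": 8, "channels": 4},
}
module_gb = 16

for name, p in platforms.items():
    capacity = p["slots"] * module_gb
    dimms_per_channel = p["slots"] // p["channels"]
    print(f"{name}: {capacity}GB max, {dimms_per_channel} DIMMs per channel")
```

So with 16GB modules, AM4 tops out at 64GB while TR4 reaches 128GB, both at 2 DIMMs per channel.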
 

DrMrLordX

Lifer
Apr 27, 2000
21,609
10,803
136
@dsc106

To address a few of your questions or concerns:

1). We don't know exactly how much RAM Matisse (Zen2) will address. Safe bet is 64GB at any kind of reasonable speed. Summit Ridge and Pinnacle Ridge had serious problems with 4x16GB setups - you had speeds as low as DDR4-2133 or DDR4-1866 in those configurations. Hopefully Matisse will do better there. If they can support DDR4-3200, 4x16GB, that will be a big plus for 3900x owners looking to do "serious work" on AM4. It still won't be a Threadripper, though.

2). Intel's date for introducing 10nm anything to the desktop is . . . unknown. Ditto for HEDT. They are pushing IceLake-SP aggressively, though the maximum core count will be 26c with unknown clockspeed. So maybe Intel's HEDT will get IceLake or TigerLake-X at some point. Expect to pay through the nose for that. In the meantime, Cascade Lake-X should come eventually, at significant cost without any major performance boost over Skylake-X refresh. Yay?

3). We haven't seen any video encoding benchmarks for Matisse yet. It has proper AVX2 support thanks to its 256-bit FMACs, so all encoding performance should go up in your favorite AVX2-enabled encoding software. Intel may still have an edge where AVX512 helps them, but that's Skylake-X/Skylake-X refresh only.

4). Comet Lake is missing in action on the desktop. Instead we're getting something called the 9900KS in Q4 2019. I think Intel may have pulled it when they realized that AMD was really going to sell 12c Matisse chips (with 16c Matisse in reserve).

5). Threadripper 3 is still a big unknown. AMD recently confirmed that it is in the works. It's just a matter of when.
 

NTMBK

Lifer
Nov 14, 2011
10,230
5,007
136
Hi everyone, I'm back, with a few questions to piggy back off the speculation in this thread from Feb.

It looks like what I am seeing is Intel 10nm DESKTOP may not even surface until 2022 now?? With "Comet Lake" as a 14nm update?

And we have the Ryzen 3900 12 core top end CPU for $499 launching on 7/7, matching or surpassing Intel's current top performer.

Is there any reason at all to stay with / wait for Intel to see what happens with a desktop Ice Lake or the 14nm Comet Lake?

It looks like AMD may now be first to market by a matter of years, with better single- and multi-core performance, and a highly competitive price? Any missing AMD features for video (H.265/HEVC, or other chipset instructions)? Any unknown drawbacks?

Would love to hear thoughts. Otherwise this summer may mark upgrade time to AMD's $500 cpu...

Have you considered upgrading the CPU in your current system? Your motherboard can take up to a 12-core Xeon, and those chips are now available on eBay for about $300. That way you can keep using your 64GB of DDR3 RAM and give yourself a big boost for video work.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,540
14,494
136
Have you considered upgrading the CPU in your current system? Your motherboard can take up to a 12-core Xeon, and those chips are now available on eBay for about $300. That way you can keep using your 64GB of DDR3 RAM and give yourself a big boost for video work.
While that lets him reuse his memory, he would be putting $300 into a dead platform. Not a good use of funds IMO. Also, Threadripper may be his answer anyway.
 

NTMBK

Lifer
Nov 14, 2011
10,230
5,007
136
While that lets him reuse his memory, he would be putting $300 into a dead platform. Not a good use of funds IMO. Also, Threadripper may be his answer anyway.

What is the point of getting a "not dead" platform if you're always going to chase the next platform? ;) If it almost doubles his performance at a fraction of the price (and with much less hassle!), I don't see it as a waste.
 

Timmyotule

Member
Aug 31, 2002
64
3
71
Have you considered upgrading the CPU in your current system? Your motherboard can take up to a 12-core Xeon, and those chips are now available on eBay for about $300. That way you can keep using your 64GB of DDR3 RAM and give yourself a big boost for video work.

This is an interesting idea. Here is the PassMark comparison to your current chip: https://www.cpubenchmark.net/compare/Intel-Xeon-E5-2697-v2-vs-Intel-i7-3930K/2009vs902

You'll get about 1.4x the multithreaded performance with the Xeon, but your single-threaded performance (important for some games) will be about 10% lower.

The Ryzen 3900X will likely score close to 30k on the PassMark CPU test, around 2.5x faster than your current CPU. That's probably the direction I would go, but if you really need/want 128GB of RAM, Threadripper may be the better option since it has four memory channels.
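As a back-of-the-envelope check of those ratios (the scores below are rough figures back-solved from the multipliers above, not authoritative PassMark numbers):

```python
# Approximate multithreaded scores implied by the ratios in this post
# (~1.4x for the Xeon, ~2.5x for the 3900X, vs. the i7-3930K baseline).
scores = {
    "i7-3930K (current)": 12000,
    "Xeon E5-2697 v2": 16800,
    "Ryzen 9 3900X (est.)": 30000,
}

baseline = scores["i7-3930K (current)"]
for cpu, score in scores.items():
    print(f"{cpu}: {score / baseline:.1f}x the i7-3930K multithreaded")
```

The Xeon swap buys a modest bump for $300, while a new platform roughly 2.5x's the multithreaded score, which is the trade-off being weighed here.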
 
  • Like
Reactions: scannall

dsc106

Senior member
May 31, 2012
320
10
81
Thanks for all the great thoughts and options! The Xeon idea is interesting: a minor con, and a good boost, for cheap. One downside here, though: it would lack the newer instruction-set support for H.265/HEVC, yes?

Then there's the AM4 3900X, which looks very compelling and is available immediately...

And then there's the next Threadripper, which was unfortunately pushed out of a 2019 release, but it sounds like it would offer all the single-threaded performance of the 3900X, plus more cores and 128GB of quad-channel memory (albeit at a cost).

Hmm...

Would the Threadripper be just as good in games and single-threaded apps as the Ryzen 3000?
 

DrMrLordX

Lifer
Apr 27, 2000
21,609
10,803
136
@dsc106

That Xeon is an Ivy Bridge, so it supports AVX but not AVX2.

As for TR3 in games versus Ryzen 3xxx . . . we don't know. The old difference in gaming performance between Ryzen and Threadripper was due to the multiple-die layout of Threadripper CPUs versus the single-die layout of Ryzen. Now that AMD is on a chiplet design, we have no idea what chiplet-chiplet latency will look like, or how it will affect gaming performance. Even the 3900x has two chiplets. Presumably, TR3 will have anywhere from 2-4 chiplets once it finally hits the market. It might be a late 2019 release, but we don't know that either.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,540
14,494
136
@dsc106

That Xeon is an Ivy Bridge, so it supports AVX but not AVX2.

As for TR3 in games versus Ryzen 3xxx . . . we don't know. The old difference in gaming performance between Ryzen and Threadripper was due to the multiple-die layout of Threadripper CPUs versus the single-die layout of Ryzen. Now that AMD is on a chiplet design, we have no idea what chiplet-chiplet latency will look like, or how it will affect gaming performance. Even the 3900x has two chiplets. Presumably, TR3 will have anywhere from 2-4 chiplets once it finally hits the market. It might be a late 2019 release, but we don't know that either.
Yes, and any comments I had on Threadripper were about the current Zen+ 2970WX, whose multithreaded performance would knock it out of the park on MT score, but whose gaming performance would suffer a little compared to Intel, and probably more compared to Ryzen 3000. But if gaming is not that important and MT performance is, you may be very happy with a 2970WX or 2990WX.
 

eek2121

Platinum Member
Aug 2, 2005
2,929
4,000
136
AFAIK the Adobe suite uses the iGPU on Intel (in addition to the dGPU) for certain tasks.

See https://www.gamersnexus.net/guides/3310-adobe-premiere-benchmarks-rendering-8700k-gpu-vs-ryzen

For 144Hz gaming, Intel is still king, at least for now, but only with their consumer chips. This might change with the Ryzen 3000 series, but we don't know that yet, so all you can do is wait. Either way, you'll need to decide what is more important: video encoding or gaming. A high-core-count CPU will always have lower clocks and hence worse gaming performance (unless AMD releases some form of gaming mode on their Threadrippers that disables some cores and increases clocks).

Regarding clocks: not true. Threadripper chips have traditionally clocked higher than their Ryzen counterparts, and the Ryzen 9 3900X clocks highest of all the Ryzen 3000 series products. It's just a matter of binning.
 

dsc106

Senior member
May 31, 2012
320
10
81
Thank you all! I'm going to keep my eye on this. I think you've answered all you feasibly can (much appreciated), so let me just ask in summary, to make sure I'm tracking everything I'm reading here and online correctly:

1. TR3, while an unknown quantity, MAY not see the same decline in gaming/single-threaded performance as the current 2000 generation, thanks to the new chiplet design? This is TBD, but there are some hopeful indicators?

2. Threadripper in general is a beast for core count, plus some high-end features such as quad-channel memory up to 128GB. However, the performance gains are questionable in most of today's benchmarks, including Adobe Premiere and DaVinci Resolve, due to sub-optimal multicore optimization at high core counts? And they may be made even less compelling by Ryzen 3000 having a 12-core CPU?

If #2 is the case, it's a tough one. All it would take is a software update to these programs to really unlock Threadripper's potential, especially if #1 ends up meaning single-threaded performance stays high on TR3. But then, even a 12-core Ryzen 3000 is pretty compelling, apart from the 64GB memory cap (well, I guess you could technically get around that with 128GB dual-channel... technically). It's just a shame I can't get an AM4 board and a Ryzen 3000 now and simply swap in a TR3 when it launches, if it's worth it.

I'll play the waiting game a bit longer and watch for developments. I wouldn't spring before August, and if TR3 is worth it and around the corner, it wouldn't be too much of an issue for me to wait until Q1 2020.