
Question 5600 (not X) or 3600 for gaming at 1440p?

aleader

Senior member
Oct 28, 2013
487
145
116
I've read that the 5600 when it releases in January will be equivalent to the 10600K. From this review, the 10600K is only about 3% faster than the 3600 at 1440p (the only resolution I care about):

https://www.techspot.com/review/2031-intel-core-i5-10600k/

I don't use my desktop for anything other than gaming. I don't really want to pay $450 CAD for a 5600X if the difference in gaming is so marginal; I'd rather wait for the 5600. I have a 3600 now (paid $165 USD new), but I just bought a B550 MSI Gaming Plus MB because it was so cheap ($95 USD with the discount/Steam code). I want to give my B450 MSI Gaming Pro Carbon AC board to my son, but either he or I will need a new CPU. Just get another 3600 if it gets really cheap in the next few weeks/month? Thoughts? Are there any other legitimate reasons to pay so much for a 5000 series chip?

EDIT: Actually from this review that has 1440p data for the 5600X, there's only a 3% difference between it and the 3600 too:

https://www.techpowerup.com/review/amd-ryzen-5-5600x/17.html

I maybe just answered my own question ;)
 
Last edited:

ZGR

Golden Member
Oct 26, 2012
1,865
282
126
I am CPU limited in most games I play at 1440p 144 Hz. I would absolutely go for the 5600 if you're patient. If you're at 60-75 Hz, a 3600 would be a better fit if it's below $200.
 

Gideon

Golden Member
Nov 27, 2007
1,248
2,350
136
I'd say go for a 3600 and save the extra cash for a future GPU upgrade. You can always upgrade to 5xxx series down the line, when you're more CPU limited (say with Hopper or RDNA3). Around the time 6xxx series launches there should also be very good deals for 5xxx processors, should you indeed need a CPU upgrade.

EDIT:
Overall it really depends on the GPU you have. If you're running anything below an RTX 3080 or 6800XT, then the 3600 will be totally fine and probably an optimal choice (your linked TechPowerUp benches were done with a 2080 Ti).

If you're running 3080 or up, you might also consider the 5xxx series, as the difference can be bigger:

(unfortunately they didn't test the 3xxx series there with the RTX 3090, but even comparing the 10900K and 5950X it's easy to see that the gap between CPUs widens a bit more than with the 2080 Ti)
 
Last edited:

damian101

Senior member
Aug 11, 2020
265
98
61
CPU gaming benchmarks should be done at the lowest resolution possible. It's supposed to be a CPU benchmark, not a system benchmark.
 

aleader

Senior member
Oct 28, 2013
487
145
116
CPU gaming benchmarks should be done at the lowest resolution possible. It's supposed to be a CPU benchmark, not a system benchmark.
I get that, but I don't care how it performs at 1080p. I want to know what it's like at my resolution so I can see what the real-world performance is.
 

aleader

Senior member
Oct 28, 2013
487
145
116
I'd say go for a 3600 and save the extra cash for a future GPU upgrade. You can always upgrade to 5xxx series down the line, when you're more CPU limited (say with Hopper or RDNA3). Around the time 6xxx series launches there should also be very good deals for 5xxx processors, should you indeed need a CPU upgrade.

EDIT:
Overall it really depends on the GPU you have. If you're running anything below an RTX 3080 or 6800XT, then the 3600 will be totally fine and probably an optimal choice (your linked TechPowerUp benches were done with a 2080 Ti).

If you're running 3080 or up, you might also consider the 5xxx series, as the difference can be bigger:

(unfortunately they didn't test the 3xxx series there with the RTX 3090, but even comparing the 10900K and 5950X it's easy to see that the gap between CPUs widens a bit more than with the 2080 Ti)
I have a Sapphire Pulse 5700 XT in the mail (paid $380 USD new). Should get it this Friday. I was going to wait and get a 3060 Ti/6700 XT, but I am going to wait a year or so and get one around this time next year. By then all this chaos will be over, and all the variants should be out too. Or maybe the 5700 XT will be enough and I'll wait for the next gen, who knows. I can always pass this stuff down to my two sons, or sell it I guess.

I have a LG 32" 1440p IPS 75Hz Freesync monitor ($210 from Costco - I really like it), and the only game I play a lot that gives me any issues at that resolution with my 1070 is DCS World. I should say 'gave'...I sold the 1070 a while ago and have my 1060 3GB in right now. DCS is unplayable at 1440p with that card.

Thanks everyone. From all the advice here I will likely just get a 3600 if I can get one sub-$200 again. I'd wait for the 5600, but I'm guessing availability is going to be non-existent until spring, and my son is already on me about upgrading his system. He bought RAM the other day and I'm giving him the MB (for his bday), but it isn't much good without a CPU ;)
 
Last edited:
  • Like
Reactions: Gideon

VirtualLarry

No Lifer
Aug 25, 2001
50,897
6,313
126
I have a Sapphire Pulse 5700 XT in the mail (paid $380 USD new).
I have a LG 32" 1440p IPS 75Hz Freesync monitor ($210 from Costco
I will likely just get a 3600 if I can get one sub-$200 again.
I think that sounds just great! I think that you'll like your RX 5700XT Pulse card. It should go well with that monitor.

I'm hoping for some $150-165 Ryzen R5 3600 CPUs again too, if they're ever available at that price again.

Otherwise, I'm looking at a 3700X for sub-$270, or a 5900X at probably MSRP, whenever they are available in qty. But I might stick with Zen 2, just so that I don't have to upgrade my mainboards.
 

aleader

Senior member
Oct 28, 2013
487
145
116
I think that sounds just great! I think that you'll like your RX 5700XT Pulse card. It should go well with that monitor.

I'm hoping for some $150-165 Ryzen R5 3600 CPUs again too, if they're ever available at that price again.

Otherwise, I'm looking at a 3700X for sub-$270, or a 5900X at probably MSRP, whenever they are available in qty. But I might stick with Zen 2, just so that I don't have to upgrade my mainboards.
Fingers crossed I don't have any driver issues. I'd be lying if I said I'm not nervous switching back to AMD after all these years. The software looks like Greek, but it's probably just because I'm used to Nvidia's control panel. Any advice on what not to do when installing the drivers? What should I NOT install? It will be a clean format, new MB, so old drivers won't be an issue.

I'm actually starting to see the odd 3700X show up used on Kijiji here now. That would be the first Zen 2 I've ever seen for sale used on there. I wouldn't be against a used one if the price is right. I've bought a few used CPUs over the years and never had an issue.
 

VirtualLarry

No Lifer
Aug 25, 2001
50,897
6,313
126
Yeah, I could go for a used 3700X (actually, three of them), if the price is right for them.

As far as RX 5700 XT cards go, I think that one is one of the "good" ones as far as stability and cooling go.

Their driver package is quite full-featured (speaking about their Adrenalin 2020 package here), so it can be a bit intimidating.

After getting them installed, right-click on the desktop, select "Radeon Settings", then when it pops up, click the "Gear icon" in the upper-right corner cluster, and then on the top-left, click "Display", and make sure that "HDMI Link Assurance" is ENABLED. I find that to often be a necessary first step, to prevent "black screen" issues.

Secondarily, you may want to enable "GPU Scaling", as well as set the Color Depth to "4:2:0", depending on your monitor and HDMI cable. If you have a capable monitor and HDMI 2.0-rated cable, you should probably leave that at "4:4:4".

Also, change the driver update settings from "Recommended" to "Recommend+Optional".

If you want to "Tweak" your card, go to "Performance", "Tuning", click "Manual", and then you can adjust the core clock/voltage, fan speed/curve, Power Limit, VRAM clock, etc.

If you're mining ETH like I am, click the "fine-tune settings" text under the core clock/voltage graph, to open up three dials. For the far-right set of dials, set the top one to "1350" (MHz), and the bottom one to "860" (mV). Then click outside the boxes, then click "Apply" in the upper-right to save the settings. Then enable Power Limit Settings, click on the Power Limit slider, and set that to -20%. Then again click "Apply". That should lower your wattage for mining ETH to 85W in software, which is probably more like 100-110W at the wall.

Edit: And IMHO, you should mine ETH with your card when you're not gaming. It will pay for the electricity cost, and for itself, over a year or so. (At $0.15 USD per kWh)

Check out Nicehash (or CudoMiner), which also requires signing up with an Exchange site like Coinbase, to actually sell your BTC that you earn, and transfer it into USD, then into your Paypal or bank account. (I prefer Nicehash, it has fee-free transfers of BTC to CB. CB then has fees to sell the BTC, using the simplified interface.)

Edit: Also note that mining puts a decent amount of stress on cards, especially the fans which have to run harder, but if you use my recommended "tweak" settings for the RX 5700(XT) (and RX 5600XT, same Navi10 GPU chip), you'll reduce the thermal load on the card, and extend its lifespan.

You should expect to make $1-2 per day (if using your Ryzen CPU to mine Monero as well), as of this writing. (*Earnings WILL fluctuate, quite a bit. But right now, crypto is heating up, and earnings should be pretty good, and profitable after electric costs. And since you bought the card anyway... might as well make a few bucks on the side to make it pay for itself. At least, that's my philosophy.)
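For anyone who wants to sanity-check the electricity math above, here's a minimal sketch. The wattage, rate, and earnings figures are the rough estimates from this post (roughly 105W at the wall, $0.15/kWh, $1-2/day gross), not measurements:

```python
def daily_power_cost(watts: float, usd_per_kwh: float) -> float:
    """Electricity cost in USD for running 24h at a constant draw."""
    kwh_per_day = watts * 24 / 1000
    return kwh_per_day * usd_per_kwh


def net_daily_profit(gross_usd: float, watts: float, usd_per_kwh: float) -> float:
    """Gross mining earnings minus electricity cost, per day."""
    return gross_usd - daily_power_cost(watts, usd_per_kwh)


if __name__ == "__main__":
    # ~105W at the wall, $0.15/kWh -> about $0.38/day in power
    cost = daily_power_cost(105, 0.15)
    # $1.50/day gross -> a bit over $1/day net
    profit = net_daily_profit(1.50, 105, 0.15)
    print(f"power cost: ${cost:.2f}/day, net: ${profit:.2f}/day")
```

At those assumed numbers the card nets on the order of a dollar a day, which is where the "pays for itself in a year or so" estimate comes from.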
 
Last edited:

aleader

Senior member
Oct 28, 2013
487
145
116
Excellent, thanks for all that. I had the Gigabyte Ver. 2.0 card in my cart, but waffled, and it sold out. It was $20 cheaper, but the reviews are quite a bit worse, so I think I'll be happy with the Sapphire card. I use a quality (I think) DP cable, not HDMI; I had to in order to get Freesync to work with my Nvidia card. Should I switch back to HDMI? Install the full Adrenalin package? Nothing to leave out?

I've never considered mining, but you may have just convinced me to try it. Power is pretty cheap here ($0.105 USD per KwH, with the carbon tax) so I'll have to look into it.
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
50,897
6,313
126
Yeah, install the full Adrenalin 2020 package. As far as DP goes, I don't use that, but I think that LCD monitors that are only "FreeSync", and not "FreeSync 2", have to use DisplayPort. Also, to use "G-Sync Compatible" (aka FreeSync) with Nvidia cards, I think you likewise need to use DP.
 

aleader

Senior member
Oct 28, 2013
487
145
116
Yeah, install the full Adrenalin 2020 package. As far as DP goes, I don't use that, but I think that LCD monitors that are only "FreeSync", and not "FreeSync 2", have to use DisplayPort. Also, to use "G-Sync Compatible" (aka FreeSync) with Nvidia cards, I think you likewise need to use DP.
Ok, thanks. I'll have to check that on my monitor. It's got to be the older Freesync for that price. But it did come with an HDMI cable, I had to buy the DP.
 

VirtualLarry

No Lifer
Aug 25, 2001
50,897
6,313
126
Yes, the monitor may well be a "FreeSync" monitor, with both HDMI and DP, but AFAIK, it was "FreeSync 2" from AMD that allowed use of FreeSync (VESA VRR) over HDMI. So in order to actually make use of FreeSync features on those monitors, you had to connect via their DisplayPort input.
 

jpiniero

Diamond Member
Oct 1, 2010
9,167
1,798
126
I am still of the opinion that there won't be a 5600 (i.e., a cheaper 6-core) available to DIY. For OEM, perhaps.
 

aleader

Senior member
Oct 28, 2013
487
145
116
Yes, the monitor may well be a "FreeSync" monitor, with both HDMI and DP, but AFAIK, it was "FreeSync 2" from AMD that allowed use of FreeSync (VESA VRR) over HDMI. So in order to actually make use of FreeSync features on those monitors, you had to connect via their DisplayPort input.
I don't want to turn this into a monitor thread, but this is my monitor (I actually only paid $205 for it when I checked):

https://www.lg.com/us/monitors/lg-32QK500-W-led-monitor

It calls it 'Freesync', not Freesync 2, or Premium, so I'm assuming that's what it is. This excerpt is from AMD's Freesync page, specifically the 'Freesync' heading (the others are Freesync Premium and Premium Pro):

Every AMD FreeSync™ monitor goes through a rigorous certification process to ensure a tear free, low latency experience. Pair your Radeon™ graphics card with an AMD FreeSync monitor over HDMI® or DisplayPort™ for effortlessly smooth gameplay.

That implies that either cable works, does it not?

Also this, although this seems kind of ambiguous to me:

Does AMD FreeSync™ technology work over HDMI®?
Yes, FreeSync technology has supported HDMI since its inception. Many FreeSync certified displays have supported variable refresh rate technologies over HDMI since long before HDMI 2.1 was released and HDMI VRR was adopted. Buying a FreeSync certified display that supports FreeSync over HDMI, provides the immediate benefit of variable refresh rate, even if the display does not support HDMI 2.1.
 

aleader

Senior member
Oct 28, 2013
487
145
116
I am still of the opinion that there won't be a 5600 (i.e., a cheaper 6-core) available to DIY. For OEM, perhaps.
So you don't believe the leaks of a January release to coincide with B450 MB support? That would seem to make sense to me. They soak up all the people willing to pay for a 5600X, and then release the 5600 for everyone else.
 

jpiniero

Diamond Member
Oct 1, 2010
9,167
1,798
126
So you don't believe the leaks of a January release to coincide with B450 MB support? That would seem to make sense to me. They soak up all the people willing to pay for a 5600X, and then release the 5600 for everyone else.
There really isn't a point, since it's effectively a price cut, and that doesn't make sense when everything is selling out so quickly and the odds of a die failing the 5600X's specs are minuscule. You also have to factor in that Milan is eating up a ton of dies and will only grow with the public release.
 

ZGR

Golden Member
Oct 26, 2012
1,865
282
126
Aleader, you mention 75 Hz 1440p. There aren't a lot of games where you will be CPU bottlenecked at 75 Hz.

I would only upgrade to Ryzen 5000 if you are annoyed at not maintaining 75 fps in single-thread-bound games like Kerbal Space Program.
 

aleader

Senior member
Oct 28, 2013
487
145
116
Aleader, you mention 75 Hz 1440p. There aren't a lot of games where you will be CPU bottlenecked at 75 Hz.

I would only upgrade to Ryzen 5000 if you are annoyed at not maintaining 75 fps in single-thread-bound games like Kerbal Space Program.
As I said, the only game I play a lot that has an issue is DCS World...and good luck getting 75 fps with everything turned up, even on new cards. 75 fps is plenty for me as I don't play FPS games, other than Squad and Insurgency. At some point I'll upgrade my monitor to a 3440 ultrawide, but that's a few years off at least.
 

Mopetar

Diamond Member
Jan 31, 2011
5,747
2,509
136
As I said, the only game I play a lot that has an issue is DCS World...and good luck getting 75 fps with everything turned up, even on new cards. 75 fps is plenty for me as I don't play FPS games, other than Squad and Insurgency. At some point I'll upgrade my monitor to a 3440 ultrawide, but that's a few years off at least.
CPUs are even less important if you're trying to run at higher resolutions. In many games you can use a cheap Celeron and still hit the same frame rate as a top of the line i7 because everything becomes GPU bound at those resolutions.

I'd just stick with the 3600 at this point. It could be months before a 5600 comes out and who knows what the availability will be like. It's unlikely that there will be any performance difference for you unless you're pairing it with a top-end GPU and even then it's going to be limited to a small number of games that are more CPU intensive.
 
  • Like
Reactions: guidryp and Tlh97

aleader

Senior member
Oct 28, 2013
487
145
116
CPUs are even less important if you're trying to run at higher resolutions. In many games you can use a cheap Celeron and still hit the same frame rate as a top of the line i7 because everything becomes GPU bound at those resolutions.

I'd just stick with the 3600 at this point. It could be months before a 5600 comes out and who knows what the availability will be like. It's unlikely that there will be any performance difference for you unless you're pairing it with a top-end GPU and even then it's going to be limited to a small number of games that are more CPU intensive.
Yah, it's nice that CPUs are really showing limited gains these days, as they will last a lot longer, but it would also be nice to have an 'exciting' CPU upgrade once in a while. I remember the good old days going from a 386SX to a 486DX...big gains...for $1,200...that still looked like crap :D
 

aleader

Senior member
Oct 28, 2013
487
145
116
Is there a run on 3600s right now or something? The price keeps going up, and it doesn't even matter because they are completely sold out everywhere. Is this an AMD bait-and-switch to try to get people to buy the 3700X and up so they can get rid of that stock?
 
