Question AMD 5700G vs Intel 11600k (8-core vs 6-core)


13Gigatons

Diamond Member
Apr 19, 2005
7,461
500
126
Non-gaming machine.

6-cores, better motherboard, lower price.
or
8-cores, mediocre motherboard, slightly higher price.
 
Jul 27, 2020
16,282
10,320
106
Non-gaming machine.

6-cores, better motherboard, lower price.
or
8-cores, mediocre motherboard, slightly higher price.
Why are you even considering the Intel 11600K? Unless someone is giving you only these two options and it doesn't involve you spending any money, sure, OK; otherwise there is no reason to even consider the 11600K. It was a mistake Intel made to cover up the other big mistake (being late to 10nm).
 

mmaenpaa

Member
Aug 4, 2009
78
138
106
I will say this: Polaris GPUs don't decode VP9 in hardware either, but they DO work, because AMD implemented a hybrid decoder that uses the CPU and the shader processors. By now, I would assume that AMD has implemented the same scheme for Vega-based APUs.

The fact that some knuckleheads are saying this is somehow a bad thing, and that video decode isn't "truly supported" unless it's contained in a full hardware pipeline (even though Intel has implemented hybrid decoding in the past as well), is truly mind-boggling.
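A quick way to see what a given machine's decode stack actually offers is to ask ffmpeg. A minimal sketch, assuming ffmpeg is installed and on PATH; the sample file name is hypothetical:

```python
# Sketch: list the hardware-acceleration backends this ffmpeg build
# exposes, then time a pure software decode of a local sample clip.
# Assumes ffmpeg is on PATH; "sample_av1.mkv" is a hypothetical file.
import subprocess
import time

# e.g. prints dxva2, d3d11va, vaapi, cuda, qsv, depending on the build
hwaccels = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccels"],
    capture_output=True, text=True, check=True,
).stdout
print(hwaccels)

# Decode to the null muxer with no hwaccel: if this finishes faster
# than the clip's duration, software (or hybrid) playback can keep up.
start = time.perf_counter()
subprocess.run(
    ["ffmpeg", "-hide_banner", "-i", "sample_av1.mkv", "-f", "null", "-"],
    capture_output=True, check=True,
)
print(f"software decode took {time.perf_counter() - start:.1f} s")
```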

For reference regarding AV1 playback, I tested my passive 5600G build (set at 45 W) remotely (via TeamViewer).

The highest CPU% I saw was 29%, and GPU usage was usually around 20%.
(CPU and GPU may use a few percent extra due to the TeamViewer session itself.)

With a 5700G, CPU usage could be about 22% (at 45 W).

Test clip from the Elecard website (videos section):
Summer Nature
3840x2160 AV1, 22.7 Mbps
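For anyone wanting to reproduce that kind of reading without eyeballing Task Manager over a remote session, a minimal sketch, assuming the third-party psutil package is installed:

```python
# Sketch: sample whole-system CPU utilization once per second while a
# video plays, then report peak and mean. Requires: pip install psutil
import psutil

samples = []
for _ in range(60):  # watch ~60 seconds of playback
    # cpu_percent(interval=1) blocks for one second and returns the
    # average utilization across all cores over that second
    samples.append(psutil.cpu_percent(interval=1))

print(f"peak CPU: {max(samples):.0f}%, mean CPU: {sum(samples) / len(samples):.0f}%")
```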

5600G_45W_AV1.jpg



(Netflix is rolling out AV1 streams to supported devices, which seem to be select 2020-or-newer Samsung TVs.)

"Netflix said that all of its AV1 streams are encoded in 10-bit and in the highest available resolution and frame rate including HFR – but not yet HDR. Netflix did not specify bitrates other than saying that "AV1 delivers videos with improved visual quality at the same bitrate" compared to MPEG4 and HEVC and that "some streams have a peak bitrate close to the upper limit allowed by the spec", which probably refers to AV1 level 5.0 (30 Mb/s bitrate for Main) or level 5.1 (40 Mb/s bitrate for Main)."
 

maddie

Diamond Member
Jul 18, 2010
4,740
4,674
136
Running an $800 CPU on a $125 board? You hear these ridiculous suggestions on forums, but no sane person in the real world does this unless they don't mind losing some performance, which begs the question: why go for a monster CPU and then seriously cheap out on the board in the first place? Anyway, running a 5950X on a B550 board is not comparable to a 5700G running on a B550 board and being limited to PCIe 3. It's no longer a budget build; it's ridiculous cheaping out, with performance loss and other hazards down the road, and outside the scope of the OP's considerations.
Did you once post that you saw your "job" as counteracting the "AMD bias" that you saw here? I think it was last year. Please correct me if I'm mistaken.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,064
136
www.teamjuchems.com
Because for some odd reason AMD chose to run the memory at full speed while decoding. GDDR6 is a real power hog when running at full speed.

Or if you use multiple monitors. Or a high refresh monitor. Or or or. Yeah, it’s frustrating :/

The use cases where either has a clear superiority are so niche that making a definitive recommendation is hard. The 11600K would make a superior Plex Premium (i.e. hardware-decode-supported) server down the road. Firing up some free-to-play games ever? The 5700G is a clear favorite.

Both CPUs are likely to hold their value reasonably well.

5600G is still the real 11600k killer imo.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,482
20,570
146
The 11600K would make a superior Plex Premium (i.e. hardware-decode-supported) server down the road.
If we were talking about the potential for the lack of HW support resulting in bad performance, I would agree. But the 5700G has the CPU power to easily decode it. So what makes the HW support a must-have? What am I missing?
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,064
136
www.teamjuchems.com
If we were talking about the potential for the lack of HW support resulting in bad performance, I would agree. But the 5700G has the CPU power to easily decode it. So what makes the HW support a must-have? What am I missing?

Hardware transcoding is just considered the holy grail for these types of appliances, especially for people hosting a server that might serve multiple streams.

Intel and Nvidia support transcoding much better than AMD does.

Transcoding can be crazy intensive vs just decoding. The dedicated hardware is faster (both in raw speed and in letting a stream become watchable) and a lot more efficient 🤷‍♂️

Niche use is niche.

Like when you compare handbrake benches with hardware acceleration - it’s cool but if you never handbrake then what’s the point?
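To make the decode-vs-transcode distinction concrete, here is a sketch of what a Plex-style transcode amounts to under the hood, driven from Python. Assumptions: ffmpeg is on PATH and built with Quick Sync (QSV) support, and the file names are hypothetical:

```python
# Sketch: the same transcode done fully in software vs. on Intel's
# Quick Sync fixed-function hardware. Assumes ffmpeg is on PATH and
# built with QSV support; file names are hypothetical.
import subprocess

SRC = "movie_hevc_4k.mkv"

# CPU path: decode and re-encode in software (libx264); this is the
# "crazy intensive" case and will load every core.
cpu_cmd = ["ffmpeg", "-i", SRC,
           "-c:v", "libx264", "-preset", "fast", "-crf", "20",
           "-c:a", "copy", "out_cpu.mp4"]

# Hardware path: decode and encode on the iGPU's media block, leaving
# the CPU cores nearly idle.
qsv_cmd = ["ffmpeg", "-hwaccel", "qsv", "-i", SRC,
           "-c:v", "h264_qsv", "-global_quality", "23",
           "-c:a", "copy", "out_qsv.mp4"]

for cmd in (cpu_cmd, qsv_cmd):
    subprocess.run(cmd, check=True)
```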
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Or if you use multiple monitors. Or a high refresh monitor. Or or or. Yeah, it’s frustrating :/

Didn't they get it to work right eventually on some cards? Power management never has worked on my 5600XT... :(

Not that it matters too much, that system is always running full-bore gaming anyway.
 

Ajay

Lifer
Jan 8, 2001
15,451
7,861
136
So, this is a mid-range system with no dGPU for run-of-the-mill computing? I'd say it doesn't really matter. Sounds like it's overkill for anything the OP uses - so it's a bit of future-proofing. That's all. All these graphs, charts and comparisons are beside the point here. @13Gigatons, buy whichever one tickles your fancy.

Carry on you tech fiends :p.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
Did you once post that you saw your "job" as counteracting the "AMD bias" that you saw here? I think it was last year. Please correct me if I'm mistaken.
Busted! No, I only go after silly posts like the ones below, and the blatantly biased ones. Why do you ask?

With the higher heat of the 11600k, I would go with the 5700g.
:D


What TPU certainly does is mix ST with MT benches in their average; this way, for example, you can get a 6C very close to an 8C by averaging, say, Cinebench MT with CB ST, even if the latter is perfectly irrelevant, since no one in his right mind would ever use a single thread for rendering.

I guess that with comparisons like these you are completely unaware of the CPUs' real performance.
Yea, the budget RENDERING MONSTER!
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,482
20,570
146
Hardware transcoding is just considered the holy grail for these types of appliances, especially for people hosting a server that might serve multiple streams.

Intel and Nvidia support transcoding much better than AMD does.

Transcoding can be crazy intensive vs just decoding. The dedicated hardware is faster (both in raw speed and in letting a stream become watchable) and a lot more efficient 🤷‍♂️

Niche use is niche.

Like when you compare handbrake benches with hardware acceleration - it’s cool but if you never handbrake then what’s the point?
Last time I looked, CPU was still the best quality in HandBrake. The 5700G should be better for that, yes?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,482
20,570
146
So, this is a mid-range system with no dGPU for run-of-the-mill computing? I'd say it doesn't really matter. Sounds like it's overkill for anything the OP uses - so it's a bit of future-proofing. That's all. All these graphs, charts and comparisons are beside the point here. @13Gigatons, buy whichever one tickles your fancy.

Carry on you tech fiends :p.
Just gonna suck all of the fun right out of it huh Ajay? ;)
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,064
136
www.teamjuchems.com
Last time I looked, CPU was still the best quality in HandBrake. The 5700G should be better for that, yes?

Yup, but that is distinctly different from Plex Premium functionality.

Having actually used my Quadro to encode some videos via HandBrake, I'd say that if you are just trying to resize for mobile/iPad screens or put DVD dumps into recent encoding formats, the time saved is easily worth the small loss of quality.
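For reference, that CPU-vs-GPU trade maps onto HandBrake's command-line interface roughly like this. A sketch, assuming HandBrakeCLI is installed with NVENC support; file names and quality values are illustrative:

```python
# Sketch: software x264 encode vs. NVENC hardware encode with
# HandBrakeCLI. Assumes HandBrakeCLI is on PATH and the build has
# NVENC support; file names are hypothetical.
import subprocess

SRC = "dvd_dump.mkv"

# CPU encode: slow, but the best quality per bit.
subprocess.run(["HandBrakeCLI", "-i", SRC, "-o", "out_x264.mp4",
                "--encoder", "x264", "--quality", "20"], check=True)

# NVENC encode: much faster, with a small quality loss at similar size.
subprocess.run(["HandBrakeCLI", "-i", SRC, "-o", "out_nvenc.mp4",
                "--encoder", "nvenc_h264", "--quality", "23"], check=True)
```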
 

Asterox

Golden Member
May 15, 2012
1,026
1,775
136
Are you using Firefox? That's the reason your CPU utilization is high even with the iGPU being partially utilized. Turing and Vega don't have AV1 decoding, so you're most likely being served VP9 videos. That's why I asked: what does Stats for Nerds say?

For YouTube 4K/60fps AV1, if the CPU is capable enough you don't need an AV1 hardware decoder.

On the AMD side, if you have at least a 6-core/12-thread Zen 2 desktop CPU or APU, it will play YouTube 4K/60 AV1 with no problems and very acceptable CPU power consumption.

Yes, CPU and 3D GPU usage is high (Firefox), and CPU package power is around 45 W. This is an example on my Renoir R5 4650G.

My Internet connection speed is roughly on the verge of pleasant viewing for 4K/60 VP9 YouTube video. But with 4K/60 AV1 videos I have no problem, because AV1 uses less bandwidth.

In the GPU tab, "Video Codec 0" is the AMD VCN hardware decoder = no support for AV1 = lying down and sleeping. :mask:

2021-11-26_000242.jpg

This is the same 4K/60 video, but in standard YouTube VP9 format. If we compare CPU package power consumption for 4K/60 AV1 vs 4K/60 VP9, we see that AV1 playback package power is no doubt more than very acceptable.

2021-11-26_005832.jpg


I use a 4K TV as a PC/HTPC monitor.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,064
136
www.teamjuchems.com
Didn't they get it to work right eventually on some cards? Power management never has worked on my 5600XT... :(

Not that it matters too much, that system is always running full-bore gaming anyway.


I just checked, and my 6800 with driver 21.8.2 (I see it wants me to install newer drivers) is finally idling the memory (60-140 MHz) with my monitor set to 164 Hz. About six to eight months ago that was certainly not the case, and it was running the memory at full tilt with any refresh rate over 60 Hz.

So... I actually feel better about that. Thanks for prompting me to check :D
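On Windows the memory clock is easy to watch in Radeon Software or HWiNFO; on Linux with the amdgpu driver it is exposed through sysfs. A minimal sketch (the card index may differ per system):

```python
# Sketch: print the amdgpu VRAM clock states; the currently active
# state is marked with '*'. Linux/amdgpu only; card0 may be card1
# on systems with more than one GPU.
from pathlib import Path

mclk = Path("/sys/class/drm/card0/device/pp_dpm_mclk")
print(mclk.read_text())  # e.g. "0: 96Mhz *" when the VRAM is idling
```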
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,551
14,510
136
Here it uses 70 watts more and peaks at 104°C. That's a lot hotter than any Ryzen, before it shuts down. No bias, and not silly. Want to try again?

tamz_msc

Diamond Member
Jan 5, 2017
3,795
3,626
136
What TPU certainly does is mix ST with MT benches in their average; this way, for example, you can get a 6C very close to an 8C by averaging, say, Cinebench MT with CB ST, even if the latter is perfectly irrelevant, since no one in his right mind would ever use a single thread for rendering.
You use ComputerBase, where 6 out of 10 benchmarks are rendering (which has very little relevance for the average user), and then conclude that the 5700G is 34% faster.

Nobody who does rendering seriously uses a 5700G.

Your comment implies that mixing ST with MT benches is somehow wrong, which is absolute BS.
 

Hotrod2go

Senior member
Nov 17, 2021
298
168
86
Wow, what a thread. The OP has not indicated anywhere which Rocket Lake they'd be interested in; one post suggests maybe a 5700G, and that is all.
Until the OP communicates which CPU in the Intel line they are interested in, it's all speculation. FTR, there are a few different i5s from Rocket Lake anyway...
To cut down on power consumption they could even get the base-model i5, which has a 65 W TDP.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,795
3,626
136
Here it uses 70 watts more and peaks at 104°C. That's a lot hotter than any Ryzen, before it shuts down. No bias, and not silly. Want to try again?

At least read whatever you're linking so that your obvious bias isn't as obvious.

The Core i5-11600K, with fewer cores, gets a respite here. Our peak power numbers are around the 206 W range, with the workload not doing an initial spike and staying around 4.6 GHz. Peak temperatures were at the 82ºC mark, which is very manageable. During AVX2, the i5-11600K was only at 150 W.
 