Question AMD 5700G vs Intel 11600k (8-core vs 6-core)


13Gigatons

Diamond Member
Apr 19, 2005
7,461
500
126
Non-gaming machine.

6-cores, better motherboard, lower price.
or
8-cores, mediocre motherboard, slightly higher price.
 

maddie

Diamond Member
Jul 18, 2010
4,744
4,684
136
LOL you are hopeless. They are both APUs, what difference does it make which part of the chip does the decode? That is a rhetorical question, because again, you are beyond help.
I agree. The funny thing is that case 2 probably uses the most energy.

1) 5700G - avg. low to mid 20s percent usage Vega 8 low to mid 20s - Dropped a frame here and there
2) 5700G avg. 1-3 percent 2060 Super 60-70 percent - Zero dropped frames
3) 5600G avg. low to mid 30s Vega 7 pretty much stayed at 15 percent usage - Dropped 2 frames at a time here and there.
 

Abwx

Lifer
Apr 2, 2011
10,953
3,474
136
There isn't much difference between the two when it comes to temperatures.

AMD Ryzen 7 5700G Review - Temperatures | TechPowerUp

cpu-temperature.png
So about the same temps, but with the 5700G performing 28% better; this means that in 45W eco mode the 5700G would still perform substantially better (90% of its default 65W performance) than the 11600K, at way lower power and temps...

 

arandomguy

Senior member
Sep 3, 2013
556
183
116
I just finished testing AV1 4K60 on the following -

5700G Vega 8
5700G 2060 Super
5600G Vega 7

5700G Vega 8 - avg. CPU usage low to mid 20s percent, Vega 8 low to mid 20s - dropped a frame here and there
5700G 2060 Super - avg. CPU usage 1-3 percent, 2060 Super 60-70 percent - zero dropped frames
5600G Vega 7 - avg. CPU usage low to mid 30s, Vega 7 pretty much stayed at 15 percent - dropped 2 frames at a time here and there.


You need to double check if it's actually using AV1. That video is showing as still using VP9 for me at 2160p60 (4k60) playback. It's only showing as AV1 for 2880p60 (5k60).

That's why the other poster is saying to check with "stats for nerds."
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,330
4,918
136
The wonderful thing is that we finally have real competition, so outside of edge cases like needing AV1 hardware-accelerated playback it would be hard to go wrong with either choice.

However, if the cost of building a 12600K + DDR4 system doesn't break the budget, I think that would be my go-to recommendation if needing that much CPU horsepower. DDR5 ain't it, yet.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,615
146
You need to double check if it's actually using AV1. That video is showing as still using VP9 for me at 2160p60 (4k60) playback. It's only showing as AV1 for 2880p60 (5k60).

That's why the other poster is saying to check with "stats for nerds."
I thought the dropped-frames info was indicative to readers that I was using stats for nerds. I will be explicit: I used stats for nerds. It displays AV01.0.13M.08 as the codec. I specified "prefer AV1 for all content" in the settings; did you remember to do that as well?
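As an aside, that AV01.0.13M.08 string can be decoded by hand: it follows the standard `av01.<profile>.<level+tier>.<bitdepth>` codec-string format from the AV1 ISOBMFF binding. A minimal sketch (the helper name `parse_av1_codec` is mine, not any player API):

```python
# Sketch: decode an "av01" codec string, e.g. as shown in YouTube's stats
# for nerds, per the AV1 ISOBMFF binding: av01.<profile>.<level+tier>.<bitdepth>
def parse_av1_codec(s: str) -> dict:
    parts = s.lower().split(".")
    assert parts[0] == "av01", "not an AV1 codec string"
    profile = int(parts[1])                 # 0 = Main, 1 = High, 2 = Professional
    level_idx = int(parts[2][:-1])          # seq_level_idx
    tier = "Main" if parts[2][-1] == "m" else "High"
    bit_depth = int(parts[3])
    # seq_level_idx maps to level X.Y as X = 2 + idx // 4, Y = idx % 4
    level = f"{2 + level_idx // 4}.{level_idx % 4}"
    return {"profile": profile, "level": level, "tier": tier, "bit_depth": bit_depth}

print(parse_av1_codec("AV01.0.13M.08"))
# {'profile': 0, 'level': '5.1', 'tier': 'Main', 'bit_depth': 8}
```

So seq_level_idx 13 works out to Main profile, level 5.1, Main tier, 8-bit — i.e. genuine AV1 playback rather than a VP9 fallback.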
 

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,642
136
So about the same temps, but with the 5700G performing 28% better; this means that in 45W eco mode the 5700G would still perform substantially better (90% of its default 65W performance) than the 11600K, at way lower power and temps...


Where do you find it performing 28% better? It's only 9% better as per TPU:

relative-performance-cpu.png


I thought the dropped-frames info was indicative to readers that I was using stats for nerds. I will be explicit: I used stats for nerds. It displays AV01.0.13M.08 as the codec. I specified "prefer AV1 for all content" in the settings; did you remember to do that as well?
Yes, you're forcing AV1 to use the CPU and then saying that it works, disregarding the fact that the video decoder isn't doing much. There's no getting around the fact that Gen 12 video encoding is better than integrated Vega's, and it will save the OP tons of CPU cycles IF they're into streaming high-res videos.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,564
14,520
136
Where do you find it performing 28% better? It's only 9% better as per TPU:

relative-performance-cpu.png



Yes, you're forcing AV1 to use the CPU and then saying that it works, disregarding the fact that the video decoder isn't doing much. There's no getting around the fact that Gen 12 video encoding is better than integrated Vega's, and it will save the OP tons of CPU cycles IF they're into streaming high-res videos.
Why do you keep pushing the 11600K? You even admit it's slower, runs hotter, and takes more power while having two fewer cores. It's a no-brainer: 5700G.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
A $125 B550 has no trouble running a 5950X. And it will support the yet-to-be-released 3D V-Cache lineup. Z590 is limited to a 10-core CPU from an older gen.
Running an $800 CPU on a $125 board. You hear these ridiculous suggestions on forums, but no sane person in the real world does this unless they don't mind losing some performance, which raises the question: why go for a monster CPU and then seriously cheap out on the board in the first place? Anyway, running a 5950X on a B550 board is no answer to a 5700G running on a B550 board and being limited to PCIe 3. It's no longer a budget build; it's ridiculous cheaping out, with performance loss and other hazards down the road, and outside the scope of the OP's considerations.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,615
146
Running an $800 CPU on a $125 board. You hear these ridiculous suggestions on forums, but no sane person in the real world does this unless they don't mind losing some performance, which raises the question: why go for a monster CPU and then seriously cheap out on the board in the first place?
Are you day drinking? We were discussing "hedging against obsolescence," quote unquote, remember? What even is this drivel? No one suggested a 5950X now; the point was that it, or the next lineup that isn't even out yet, would be a drop-in upgrade years down the road and smokes anything Z590 offers.
 

mmaenpaa

Member
Aug 4, 2009
78
138
106
Non-gaming machine.

6-cores, better motherboard, lower price.
or
8-cores, mediocre motherboard, slightly higher price.

I would go with AMD.


5700G wins in the following categories (and while this is not a gaming PC, the 5700G's Vega seems to be about 2x faster than Intel UHD):

office
programming (developing)
browsing
encoding
power usage / efficiency

11600K wins Adobe CC (Photoshop & Premiere practically a tie, After Effects clear win for Intel)

The system board should not make a big difference, as the OP is not doing any extremely heavy lifting with those chosen components (of course I could be wrong). While *good* PCIe 4 storage is faster than *good* PCIe 3, the difference is visible mainly in benchmarks. If a 1G NIC is not enough when (if) he has a >1G network, 2.5G cards should be quite inexpensive by that time. Currently 2.5G (or 5G/10G) switches are quite expensive.

Both platforms have different upgrade paths if he chooses. Intel gets PCIe 4 storage from the start & can use a PCIe 4 GPU. The CPU can be upgraded to an 11900K at most.

AMD has the 5950X currently available and possibly a new & faster V-Cache version on the CPU side (and of course PCIe 4 for storage & GPU with non-APU 5X00 processors, assuming a B550/X570 system board).

(I used numbers from TPU 5700G review)

Office use
5700G > 11600K (2% - 23% better, AMD wins 3/3 tests)

Adobe CC
5700G < 11600K (-12% - +1% worse, Intel wins 2/3 tests)

Visual Studio C++
5700G > 11600K (5% better, AMD wins 1/1 test)

Browsing
5700G > 11600K (-2% - +11% better, AMD wins 2/3 tests)

Encoding
5700G > 11600K (2% - 29% better, AMD wins 4/4 tests)

Power usage from whole system

Power usage IDLE
5700G > 11600K (9% better, 52W/57W )

Power usage ST
5700G > 11600K (20% better, 74W/89W )

Power usage MT
5700G > 11600K (56%-89% better, 150W/235W CB, 107W/203W Prime95)

Efficiency ST (kJ, less is better)
5700G = 11600K (1% better, 14.1/14.3)
Efficiency MT (kJ, less is better)
5700G > 11600K (92% better, 9.8/18.9)
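For the MT power and efficiency lines, the percentages can be cross-checked directly from the paired figures. A quick sketch using the numbers quoted above (the post appears to truncate rather than round):

```python
# Sanity-check the power/efficiency deltas quoted above.
# "X% better" here means the 11600K figure divided by the 5700G figure,
# minus 1 (lower watts / fewer kilojoules is better).
def pct_better(amd: float, intel: float) -> float:
    return (intel / amd - 1) * 100

print(round(pct_better(150, 235)))   # MT power, Cinebench: 57
print(round(pct_better(107, 203)))   # MT power, Prime95:   90
print(round(pct_better(9.8, 18.9)))  # MT efficiency (kJ):  93
```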
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
Are you day drinking? We were discussing "hedging against obsolescence," quote unquote, remember? What even is this drivel? No one suggested a 5950X now; the point was that it, or the next lineup that isn't even out yet, would be a drop-in upgrade years down the road and smokes anything Z590 offers.
The drivel was about the PCIe 3 limitation on the 5700G.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,330
4,918
136
Running an $800 CPU on a $125 board. You hear these ridiculous suggestions on forums, but no sane person in the real world does this unless they don't mind losing some performance, which raises the question: why go for a monster CPU and then seriously cheap out on the board in the first place? Anyway, running a 5950X on a B550 board is no answer to a 5700G running on a B550 board and being limited to PCIe 3. It's no longer a budget build; it's ridiculous cheaping out, with performance loss and other hazards down the road, and outside the scope of the OP's considerations.

I've been running a 5950X on my MSI B550 Gaming Edge WiFi for exactly 1 year. It was $149.99 and included an additional $30(?) Steam gift card on top of it. All my benchmarks were within margin of error versus top-tier X570 boards, so I would have to disagree with your assessment. Drop-in upgrades to top SKUs are possible even on budget and mid-range boards with the 5000 series. It's easy to research VRMs if that's a concern.

I really wouldn't even have entertained upgrading to the B550 Unify except I wanted more m.2 NVMe slots, I needed another AM4 board, and the price was right at $219 before cashback/discounts.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,642
136
Why do you keep pushing the 11600K? You even admit it's slower, runs hotter, and takes more power while having two fewer cores. It's a no-brainer: 5700G.
Because we don't know what the OP is going to use it for? There might very well be use cases where the 11600K is better - lightly threaded tasks and streaming videos are some of them. It consumes more power, yes, but it does not run hotter.

Plus there is the fact that the OP is getting a better motherboard with better features for an overall lower price than what they can get the 5700G+motherboard for.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,349
10,049
126
I will say this: Polaris GPUs "don't" decode AV1 in hardware either, but they DO work, because AMD implemented a hybrid decoder that uses the CPU and the shader processors. By now, I would assume that AMD implemented the same scheme for Vega-based APUs. (Edit: Talking about AV1 decode, here.)

The fact that some knuckle-heads are saying this is a bad thing, somehow, and that video-decode support isn't "truly supported" unless it's contained in a full hardware pipeline (even though Intel has implemented hybrid decoding in the past as well), is truly mind-boggling.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,564
14,520
136
Because we don't know what the OP is going to use it for? There might very well be use cases where the 11600K is better - lightly threaded tasks and streaming videos are some of them. It consumes more power, yes, but it does not run hotter.

Plus there is the fact that the OP is getting a better motherboard with better features for an overall lower price than what they can get the 5700G+motherboard for.
So the fact that it only wins ONE of the benchmarks above, and the motherboard is not a problem given the use cases and CPU, makes you think it's still best? It's NOT Alder Lake. And that motherboard has almost NO upgrade option, while the AMD one does.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,615
146
I will say this: Polaris GPUs "don't" decode AV1 in hardware either, but they DO work, because AMD implemented a hybrid decoder that uses the CPU and the shader processors. By now, I would assume that AMD implemented the same scheme for Vega-based APUs.

The fact that some knuckle-heads are saying this is a bad thing, somehow, and that video-decode support isn't "truly supported" unless it's contained in a full hardware pipeline (even though Intel has implemented hybrid decoding in the past as well), is truly mind-boggling.
Good info, I was not aware AMD did that. Maybe that explains why the Vega was doing some work? Instead of sitting there basically idle.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,642
136
So the fact that it only wins ONE of the benchmarks above, and the motherboard is not a problem given the use cases and CPU, makes you think it's still best? It's NOT Alder Lake. And that motherboard has almost NO upgrade option, while the AMD one does.
Do you even know what use case the OP is going to have? I only mentioned two of the most common use cases where the 11600K might have an advantage. And nobody said a word about upgrades - how frequently the OP upgrades their CPU is not mentioned anywhere. So it is irrelevant to this discussion.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,642
136
The fact that some knuckle-heads are saying this is a bad thing, somehow, and that video-decode support isn't "truly supported" unless it's contained in a full hardware pipeline (even though Intel has implemented hybrid decoding in the past as well), is truly mind-boggling.
Yeah it is truly mind-boggling that AMD is behind on video decode support (at least until Rembrandt is launched) and has been for years. Why do you think that discrete GPUs prior to Navi 2x were such power hogs when it came to video playback?
 

UsandThem

Elite Member
May 4, 2000
16,068
7,380
146
Who would have known that a pretty simple thread about which CPU to buy (both of which are almost identical in performance) would get 45+ responses.

I'm shocked as most AMD vs. Intel threads are usually pretty short and to the point, with both sides usually coming to the same recommendation.

Nope.gif
 

Abwx

Lifer
Apr 2, 2011
10,953
3,474
136
Where do you find it performing 28% better? It's only 9% better as per TPU:

relative-performance-cpu.png



Yes, you're forcing AV1 to use the CPU and then saying that it works, disregarding the fact that the video decoder isn't doing much. There's no getting around the fact that Gen 12 video encoding is better than integrated Vega's, and it will save the OP tons of CPU cycles IF they're into streaming high-res videos.
Fortunately you are here to help me correct the thing: it's not 28% but 34%, so at 45W eco mode it would perform roughly 20% better. FTR, the 11600K uses 115 W.


Click on the graph button to display more CPUs.

What TPU certainly does is mix ST with MT benchmarks in their average; this way, for example, you can get a 6C very close to an 8C by averaging, say, Cinebench MT with CB ST, even though the latter is perfectly irrelevant, since no one in their right mind would ever use a single thread for rendering.

I guess that with comparisons like these you are completely unaware of the CPUs' real performance.
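The eco-mode arithmetic is easy to check. A quick sketch using the figures in this post (a 34% lead at 65W, and assuming eco mode retains ~90% of default performance):

```python
# Back-of-the-envelope check of the 45W eco-mode claim:
# if the 5700G leads by 34% at its default 65W, and eco mode keeps ~90%
# of default performance, the remaining lead over the 11600K is:
lead_at_65w = 1.34    # 5700G relative to 11600K (TPU application average)
eco_retained = 0.90   # fraction of default performance kept at 45W

lead_at_45w = lead_at_65w * eco_retained
print(f"{(lead_at_45w - 1) * 100:.1f}%")  # prints 20.6%
```

So "roughly 20% better" at 45W vs the 11600K's stated 115 W draw.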
 