Question: Spiderman has entered the chat.


tamz_msc

Diamond Member
Jan 5, 2017
3,708
3,554
136
As I recall, I said it wouldn't be wise to buy one today if you plan on keeping it a while. Also, it depends on the game. Try playing Battlefield 1 on a quad core. I did, and it was playable, but a jump to a hex core made a nice improvement. That game is six years old.
You probably played it on a Skylake or one of its derivative quad cores, a 7-year-old architecture. Play it on a 12100, which has >40% higher ST performance than Skylake. It's faster in gaming than most 6C CPUs that aren't Zen 3 parts with the full 32 MB of L3 cache.
They've moved on to claiming 6c/12t is the bare minimum for an office machine and 8c/16t is the gaming minimum. At least the notion that quad-core or 6c/6t chips were good value has died now that the 4c/8t i3 chips have proven to be better.
Anyone who says that 6c/12t is the bare minimum for an office PC is delusional.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,647
3,706
136
You probably played it on a Skylake or one of its derivative quad cores, a 7-year-old architecture. Play it on a 12100, which has >40% higher ST performance than Skylake. It's faster in gaming than most 6C CPUs that aren't Zen 3 parts with the full 32 MB of L3 cache.

That's all fine and good now, but the 2022 i3-12100 quad core wasn't available in 2016. Battlefield is a CPU-intensive game, and back then the only option was more cores. I still wouldn't recommend a quad-core CPU if you intend to keep it for 3-5 years.

Anyone who says that 6c/12t is the bare minimum for an office PC is delusional.

Now that I would agree with. An office PC will do just fine with four cores.
 

Ranulf

Platinum Member
Jul 18, 2001
2,331
1,138
136
Anyone who says that 6c/12t is the bare minimum for an office PC is delusional.

I've seen tech YouTubers recommend that in the last year or so; that's why it sticks in my mind. I think most of those people, though, are looking at it from a five-year future-proofing mindset.
 

psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
Hello.

I read some comments saying the PC doesn't have a hardware decompressor, and that's why Spider-Man and Stray are having some trouble at various points, plus high CPU usage.

So what is RTX IO then?


This page is two years old and .....


 

DiogoDX

Senior member
Oct 11, 2012
746
277
136
Hello.

I read some comments saying the PC doesn't have a hardware decompressor, and that's why Spider-Man and Stray are having some trouble at various points, plus high CPU usage.

So what is RTX IO then?


This page is two years old and .....


I think Forspoken will be the first game to use DirectStorage on PC.
 

psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
Moreover, although I already posted these in the GPU Spider-Man thread (I'm posting here in a CPU-related capacity), my old 8600K did manage to run the game quite well, paired with my GTX 1070, and that at Very High, which has settings that not even the PS5 uses. See DF's video review for the comparison.


(There are stutters in the video due to ShadowPlay recording; the gameplay was absolutely smooth despite the jaggies in the frametimes. Non-monetized.)

To put things in perspective, this is a 5-year-old CPU with a 6-year-old GPU that runs the game at 60 fps, with BETTER settings than the PS5. So where's the problem for the PC? I don't see it.

==============

And also, how about a Sandy Bridge running Spider-Man? This CPU existed for two years within the PS3's lifespan. How well does the PS3 run Spider-Man today? Hmmmm?


OK, this is at Medium and has drops to 45 fps, BUT I'm pretty sure the 970 is not helping here, especially the 3.5 GB of VRAM. Still much better than the Gen 8 consoles. Testing the 2700K with a better GPU is another to-do at some point. We'll see....
 

moinmoin

Diamond Member
Jun 1, 2017
4,933
7,619
136
I read some comments saying the PC doesn't have a hardware decompressor, and that's why Spider-Man and Stray are having some trouble at various points, plus high CPU usage.

So what is RTX IO then?
RTX IO is something completely different. PS5 games can use Sony's Kraken compression. It's proprietary, so the chance it will ever be decompressed in hardware on PC is pretty much nil.

Past press e.g.
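
To picture why software decompression pushes CPU usage up, here's a rough sketch of the kind of streaming loop a PC port ends up running: worker threads pull compressed asset chunks off storage and inflate them on CPU cores before the data can go to the GPU. This is purely illustrative; Kraken/Oodle isn't publicly available, so zlib stands in as the codec, and none of the names here come from the actual game.

```python
# Rough sketch (illustrative only) of CPU-side asset decompression, the work
# a PS5 offloads to dedicated hardware. zlib stands in for a proprietary
# codec like Kraken; the structure, not the codec, is the point.
import zlib
from concurrent.futures import ThreadPoolExecutor

def make_fake_chunks(count=64, size=256 * 1024):
    """Build some compressible dummy 'asset' chunks for the demo."""
    payload = b"spider-man asset data " * (size // 22)
    return [zlib.compress(payload, 6) for _ in range(count)]

def decompress_chunk(chunk):
    # On PC this work lands on the game's worker threads, i.e. CPU cores.
    return zlib.decompress(chunk)

if __name__ == "__main__":
    chunks = make_fake_chunks()
    # More workers = faster streaming, but also higher overall CPU usage,
    # roughly what the usage graphs posted in this thread show.
    with ThreadPoolExecutor(max_workers=4) as pool:
        decompressed = list(pool.map(decompress_chunk, chunks))
    total_mib = sum(len(d) for d in decompressed) // (1024 * 1024)
    print(f"decompressed {len(decompressed)} chunks, ~{total_mib} MiB total")
```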
 
  • Wow
Reactions: igor_kavinski

Zepp

Member
May 18, 2019
159
158
116
In the 1080p High chart, why is the 5500, which is just a 5600G with the iGPU disabled, so much higher than the 5600G?
 

Attachment: 1080p-hgh.jpg
Jul 27, 2020
15,738
9,806
106
In the 1080p High chart, why is the 5500, which is just a 5600G with the iGPU disabled, so much higher than the 5600G?
Possibly the iGPU on the 5600G, despite not being used, is hindering the CPU. There is likely some adverse impact on the bandwidth available to the CPU cores, and since Spider-Man looks to be very bandwidth sensitive, the 5600G suffers.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
I suspect that the 5600G is having some sort of configuration issue and is underperforming in that benchmark. There aren't many cases where it should underperform the 3600X by any appreciable amount, and in Geekbench the 5600G outperforms the 5500 by a good 5%+ when both are using the same platform.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,271
19,908
146
@psolord

Thanks for sharing the clips. Good stuff.

I noted that your 12-thread CPU was basically maxed out at times, going as high as 97%. The game also grabbed 20 GB+ of RAM. There were a few frame dips into the 50s just in that short clip. Those are some crazy demands on the system for what we are seeing on screen, don't you think? I don't use ShadowPlay, but is stuttering like that typical? Or was it due to game-related bandwidth limitations?

That is all without ray tracing coming into play. Using an RTX card, you would likely see the same issues the Ryzen 2600X did.

You should test that same opening sequence with 16GB and see if anything changes. With everything it is swapping in and out, will 4GB less introduce performance issues?
 
  • Like
Reactions: Tlh97 and psolord

Schmide

Diamond Member
Mar 7, 2002
5,581
712
126
^^^^

DDR5-6400 makes a big difference. PCIe 4.0 x16 vs. 3.0 x16 is also a notable improvement at 1080p.

If compiling shaders at this extreme level is the new order, it makes you wonder what PCIe 5.0 cards/storage could do for this metric.

A top-end AM5 build will have DDR5, NVMe 5.0 x4, and PCIe 5.0 x16, which should set the high-water mark for this situation.
 

psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
@psolord

Thanks for sharing the clips. Good stuff.

I noted that your 12-thread CPU was basically maxed out at times, going as high as 97%. The game also grabbed 20 GB+ of RAM. There were a few frame dips into the 50s just in that short clip. Those are some crazy demands on the system for what we are seeing on screen, don't you think? I don't use ShadowPlay, but is stuttering like that typical? Or was it due to game-related bandwidth limitations?

That is all without ray tracing coming into play. Using an RTX card, you would likely see the same issues the Ryzen 2600X did.

You should test that same opening sequence with 16GB and see if anything changes. With everything it is swapping in and out, will 4GB less introduce performance issues?


Thanks, mate. Personally, I love all PCs, old and new. I especially like to see what old systems can do when facing newer hurdles. That's what being a PC enthusiast means to me. I mean, I could pool all my spending over time into one PC, selling the old stuff along the way, and have something like a 12c/24t CPU with a high-end RTX whatever card, but where's the fun in that? xD

Anyhoo, if you are referring specifically to these two clips I posted in this thread, don't forget that the 8600K is a 6c/6t CPU, not a 6c/12t CPU. Also, I only use the core enhancement setting and that's it. This CPU can go to 5 GHz easily, so there's more juice left in it. Truth be told, even at 4.3 GHz I have rarely seen it come close to 100% CPU usage. Yes, Spidey is heavy. He is spreading his threads around town, and so does the engine that runs his simulated world. xD Also, the 8600K+1070 recording is capped at 60 fps; unlocked it would run higher than 60 fps overall, and that's at Very High.

You are talking about dips into the 50s on the system that used 20 GB. That system is the 8600K+1070. Can you pinpoint the timestamp, please? Indeed, that system showed increased RAM usage. However, all my other systems only have 16 GB and the game runs fine (except on the 7950 one :( ).

Here are the other two clips I posted in the GPU section, on the newer 12400F+3060 Ti (I repeat: non-monetized, just a hobbyist channel for fun). This system has only 16 GB. Also, these clips are recorded with an external recorder, so there is no performance hit, and they are uncapped. And yes, the 3060 Ti is clearly CPU-limited by the 12400F at 1080p. Also, the RAM usage is only at 10 GB here, and it beats me why. It seems that if the game finds more RAM, it allocates it. Whether and how it uses it is beyond me.


 

psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
===========

Regarding ShadowPlay recordings, there seems to be a conflict between ShadowPlay and MSI Afterburner's OSD, at least with how I have it set up. The same thing happens even with software recording on the 7950.

I asked on Guru3D and the dev said he is not developing ShadowPlay, so he doesn't know.

Here is a clip that shows how bad it can be.


I will be getting an external recorder for the other two systems by Christmas too (my systems are in different houses).
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,271
19,908
146
Ah yes, my bad. I'm over here thinking it is an i7 like a big dummy. The 6 should have clued me in.

Can't agree more about PCs. I still play on an FX-8350 and an FX-6100 sometimes. I don't need a powerful system for the vast majority of my game libraries. Heck, a 5700G handles probably close to 100 games really well with no dGPU.

Thanks for the 3060 Ti footage too. I plan to play Spidey on it with a 5800X and 32 GB of 3600. It's crazy seeing the CPU usage in the high 70s on a 12400F; this game is serious business when you turn things up.

Your capture clip will pick up a few seconds before it hits 96%, then it hits 97% shortly after during the fight. I just noticed it hits 98% a few seconds after that.

 
  • Like
Reactions: psolord

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,271
19,908
146
On the high RAM allocation: this happens in most of my games with Win11 Pro. Even older games like The Witcher 3 and Fallout 4 easily exceed the standard 8 GB many gamers spec'd back then. They may not need it, but 11 does the "It's free real estate" meme and grabs it.
 
  • Like
Reactions: psolord

psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
On the high RAM allocation: this happens in most of my games with Win11 Pro. Even older games like The Witcher 3 and Fallout 4 easily exceed the standard 8 GB many gamers spec'd back then. They may not need it, but 11 does the "It's free real estate" meme and grabs it.

Oh wow, really?

I am still on 10/64 and will stay here for a while!
 
  • Like
Reactions: DAPUNISHER

Bigos

Member
Jun 2, 2019
127
281
136
Any competent OS will use all of your RAM if it can, for caches and whatnot. It should however free the caches if needed to serve a memory allocation.

I guess similar can be said about VRAM, though this is in fact in the hands of the game engine and not the OS (at least in vulkan/d3d12). Having more memory can reduce pipeline stalls and increase overall framerate.

For system memory, the game engine could technically implement its own adaptive caching, based on how much RAM the system has. This might be especially relevant in the case of Spider-Man, which, if I understood correctly, streams a lot of data. While the OS can cache data read from asset files, if they need to be decompressed on the CPU before reaching the GPU, the game engine probably contains its own decompressed-asset cache.

This could mean that the CPU usage of SpiderMan could be reduced if given enough RAM, as the cache can have a better hitrate if allowed to grow more.
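
As a purely illustrative sketch of that idea (my own example, not anything from the actual engine): pick a cache budget from whatever RAM is free at startup, keep decompressed assets keyed by ID, and evict least-recently-used entries when over budget. A bigger budget means a better hit rate and less repeated CPU decompression.

```python
# Minimal sketch of an adaptive decompressed-asset cache: the budget scales
# with free RAM and least-recently-used entries get evicted when over it.
# Purely illustrative; names and numbers are made up, not from the game.
import zlib
from collections import OrderedDict

def pick_budget_bytes(free_ram_bytes, fraction=0.25):
    """Spend a fraction of whatever RAM is currently free on the cache."""
    return int(free_ram_bytes * fraction)

class DecompressedAssetCache:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.entries = OrderedDict()  # asset_id -> decompressed bytes

    def get(self, asset_id, compressed_blob):
        if asset_id in self.entries:
            self.entries.move_to_end(asset_id)   # mark as recently used
            return self.entries[asset_id]        # hit: no CPU decompression
        data = zlib.decompress(compressed_blob)  # miss: pay the CPU cost
        self.entries[asset_id] = data
        self.used += len(data)
        while self.used > self.budget and len(self.entries) > 1:
            _, evicted = self.entries.popitem(last=False)  # drop LRU entry
            self.used -= len(evicted)
        return data

# With 20 GB free the budget would be ~5 GB; with 10 GB free, ~2.5 GB, which
# would line up with the game simply allocating more when more RAM exists.
cache = DecompressedAssetCache(pick_budget_bytes(20 * 1024**3))
blob = zlib.compress(b"texture tile" * 1000)
tile = cache.get("tile_0", blob)   # first call: decompressed on the CPU
tile = cache.get("tile_0", blob)   # second call: served straight from RAM
```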
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,271
19,908
146
Any competent OS will use all of your RAM if it can, for caches and whatnot. It should however free the caches if needed to serve a memory allocation.

I guess similar can be said about VRAM, though this is in fact in the hands of the game engine and not the OS (at least in vulkan/d3d12). Having more memory can reduce pipeline stalls and increase overall framerate.

For system memory, the game engine could technically implement its own adaptive caching, based on how much RAM the system has. This might be especially relevant in the case of Spider-Man, which, if I understood correctly, streams a lot of data. While the OS can cache data read from asset files, if they need to be decompressed on the CPU before reaching the GPU, the game engine probably contains its own decompressed-asset cache.

This could mean that the CPU usage of SpiderMan could be reduced if given enough RAM, as the cache can have a better hitrate if allowed to grow more.
Interesting and informative. Thanks for taking the time to post that.
 
Jul 27, 2020
15,738
9,806
106

(attached benchmark chart)

3090 delivering better minimum FPS than 3090 Ti? Hmm...does this stink of a cost cutting substitution of some component by Nvidia?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,271
19,908
146
3090 delivering better minimum FPS than 3090 Ti? Hmm...does this stink of a cost cutting substitution of some component by Nvidia?
I don't think so. I ignore 0.1% lows; they can be erratic from run to run and almost never reflect anything relevant to the gaming experience.

Thanks for linking the review though. I hit their site about once a month to see what's new there. They're another of the old-schoolers, so I turn off adblock for them and click through on ads once in a while to help them out.