Question: Spider-Man has entered the chat.

nicalandia

Platinum Member

jpiniero

Lifer
CPU intensive. Especially with ray tracing. Perhaps a sign of things to come with console ports? I am guessing this is the PS5 version?

I know it is 720p, but the 2600x and 10600k being on the struggle bus is mildly shocking.

https://www.computerbase.de/2022-08/spider-man-remastered-pc-benchmark-test/4/
I watched the Digital Foundry video, and the claim was that it is very heavy on the CPU doing asset decompression. And it's a bigger problem with RT enabled. Not sure what to make of it.

 
The real question should be, if you have a large enough and fast enough SSD, why don't they have it decompress while downloading, and then stream it in when playing? This would alleviate the load times, and downloads take a long time, so it could be happening during that process.

It being a bigger issue with raytracing is weird. Is this another BS situation where they also load in higher quality textures to claim how much better ray-tracing looks?

I really wish game developers/companies would be more transparent about this stuff. Wasn't shipping multiple languages with uncompressed audio (because consoles were bad at decompression, or didn't have the resources to spare to do it on the fly; hence my earlier comment about decompressing during download) supposedly one of the claimed reasons for game sizes ballooning? Tell people and give them the option.
 

NTMBK

Diamond Member
The real question should be, if you have a large enough and fast enough SSD, why don't they have it decompress while downloading, and then stream it in when playing? This would alleviate the load times, and downloads take a long time, so it could be happening during that process.
That would reduce your "effective bandwidth". You would be able to stream fewer assets per second.
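NTMBK's point can be sketched with simple arithmetic: storing assets pre-decompressed means reading more bytes from disk per asset, so the usable asset throughput drops by the compression ratio. A back-of-envelope sketch in Python (all numbers are illustrative assumptions, not measurements):

```python
# Back-of-envelope: why storing assets decompressed reduces effective
# streaming bandwidth. Numbers below are illustrative assumptions.

ssd_read_gbps = 5.0      # assumed raw sequential read speed of a fast NVMe SSD (GB/s)
compression_ratio = 2.0  # assumed average asset compression ratio (2:1)

# Streaming compressed assets: every 1 GB read from disk expands into
# 2 GB of usable asset data (the decompression cost is paid elsewhere).
effective_compressed = ssd_read_gbps * compression_ratio

# Streaming pre-decompressed assets: 1 GB read is just 1 GB of asset data.
effective_decompressed = ssd_read_gbps

print(f"compressed on disk:   {effective_compressed:.1f} GB/s of assets")
print(f"decompressed on disk: {effective_decompressed:.1f} GB/s of assets")
# Storing assets decompressed halves the assets you can stream per second,
# and also balloons the install size.
```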
 

igor_kavinski

Diamond Member
[attached screenshot]

I have the same question. Why does the game need to compile shaders continuously? I thought this was the graphics driver's job. So even if AMD/Nvidia/Intel pre-compile the shaders in their drivers, the game will still waste CPU cycles recompiling them?
 

Bigos

Member
Take a look at this article:


Long story short: the whole shader pipeline is mostly monolithic and there might be many combinations to compile.

And drivers have to compile each pipeline mostly separately.

So another console port which has not been optimized on Windows properly - what a surprise.
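The combinatorial blow-up described above can be shown with a toy count: if a monolithic pipeline has to be compiled for each combination of render states and material features, the number of variants multiplies across every axis. A hypothetical sketch (the feature lists are made up for illustration, not taken from any real engine):

```python
from itertools import product

# Toy illustration of shader-pipeline permutation growth: with monolithic
# pipelines, each combination of states may need its own compile.
# Every list below is a hypothetical example axis.
vertex_formats = ["static", "skinned", "instanced"]
material_flags = ["base", "normal_map", "emissive", "alpha_test"]
lighting_modes = ["forward", "deferred"]
rt_enabled     = [False, True]

variants = list(product(vertex_formats, material_flags, lighting_modes, rt_enabled))
print(len(variants))  # 3 * 4 * 2 * 2 = 48 pipelines from just four small axes
```

Add a few more axes (blend modes, MSAA states, bone counts) and the count reaches thousands, which is why drivers and games end up compiling at runtime.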
 

maddie

Diamond Member
Take a look at this article:


Long story short: the whole shader pipeline is mostly monolithic and there might be many combinations to compile.

And drivers have to compile each pipeline mostly separately.

So another console port which has not been optimized on Windows properly - what a surprise.
What I see is gaming and compute GPUs splitting further apart in the future.

I know this is off topic, but this is why the argument about Intel dropping gaming GPUs and keeping compute GPUs funded might make sense. It appears that at a deep level the requirements are very different, leading to different designs in the future. All the major companies must know this.
 

KompuKare

Senior member
Strange that the CPU rankings change between non-RT and RT, with the 5800X3D winning without RT but dropping to 3rd place when RT is turned on:

Doesn't seem to be a thread-count thing, as the 5800X vs 5950X relative positions stay similar, and the 6C/12T i5-12500 comes in 2nd with RT.

Still, I'd single out the 5800X3D and the 12500 as doing well here, as the 12900K is just too brute force for me (though there's no indication of its power consumption running this game), whereas the 12500 is usually pretty frugal.
 


KompuKare

Senior member
Okay, all the more reason for reviewers to include power numbers when reviewing this. And while I love to see CPU-only figures, in this case the PCIe data transfers mean complete system power draws are really needed as well.
 

Tuna-Fish

Golden Member
I watched the Digital Foundry video, and the claim was that it is very heavy on the CPU doing asset decompression. And it's a bigger problem with RT enabled. Not sure what to make of it.
This is what console ports are now until there are appropriate accelerators on GPUs. Both the major consoles have dedicated decompression HW.
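The CPU cost of software decompression is easy to demonstrate in miniature: inflating data in software burns cycles that a console's dedicated decompression block absorbs off the CPU. A minimal sketch using Python's stdlib zlib (the buffer size and contents are arbitrary illustrations):

```python
import time
import zlib

# Minimal demo of software decompression cost: compress a buffer, then
# time decompressing it on the CPU. Sizes and data are arbitrary examples.
payload = b"asset data " * 1_000_000   # ~11 MB of compressible dummy data
blob = zlib.compress(payload, level=6)

start = time.perf_counter()
restored = zlib.decompress(blob)
elapsed = time.perf_counter() - start

assert restored == payload
print(f"decompressed {len(payload) / 1e6:.0f} MB in {elapsed * 1000:.1f} ms on the CPU")
# On a console, a dedicated decompression unit does this work entirely
# off the CPU; on PC it currently lands on CPU cores.
```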
 

Ranulf

Golden Member
Take a look at this article:


Long story short: the whole shader pipeline is mostly monolithic and there might be many combinations to compile.

And drivers have to compile each pipeline mostly separately.

So another console port which has not been optimized on Windows properly - what a surprise.
Yup, another subpar port, for $60 no less. I paid $50 not long after it came out on PS4, four or so years ago. But hey, it's got ray tracing!
 

DAPUNISHER

Super Moderator and Elite Member
Yup, another subpar port, for $60 no less. I paid $50 not long after it came out on PS4, four or so years ago. But hey, it's got ray tracing!
Yeah, I played it on PS4 Pro years ago, so I'm not double dipping until it's half price.

I feel like the bad-port rhetoric is somewhat misplaced. So far it seems to be a rare case where the consoles have a hardware capability the PC doesn't, is all. It scales well and can be played on a modest PC for a PS4-like experience. Not what I'd call a terrible port, personally. There are some bugs, but a patch should fix most of those shortly.

On another note: AMD GPUs not nosediving versus RTX with ray tracing on is what I'd call interesting.
 

KompuKare

Senior member
Jul 28, 2009
853
557
136
Had an ask around on the CB forums, hoping to get their reviewer, Wolfgang Andermahr, to see if he has the CPU power usage figures too. Someone on the forum there did say that while their CPU normally pulls 80-110W in games, in Spider-Man it's more like 170-180W. Another poster said someone had measured closer to 200W for their 12900K. Or as they put it: double the power draw of the 5800X3D to gain 2% extra performance. Is the P4 Prescott back?
 

KompuKare

Senior member
Jul 28, 2009
853
557
136
So I got to thinking it would be nice to get an idea of what threads are running with and without RT; I don't think any reviewers do those tests.

However, CB's CPU tests don't cover this:
Nor do PurePC's:
Nor PCGH's:

The only place that at least looks at core utilization is that Russian site:
But then they forgot to mention what settings they are running in their core test :(
 

igor_kavinski

Diamond Member
Jul 27, 2020
6,074
3,754
106
[attached VRAM consumption chart]

That VRAM consumption chart is weird. At 4K the 8GB cards are going way over the limit: the 6600 XT is bogged down with almost 13GB, and using system RAM to compensate slows it down. The 6900 XT has it real easy with 10GB, while the 6700 XT is up to 11GB. It's as if the slower the card, the more VRAM gets used.
 

poke01

Senior member
Take a look at this article:


Long story short: the whole shader pipeline is mostly monolithic and there might be many combinations to compile.

And drivers have to compile each pipeline mostly separately.

So another console port which has not been optimized on Windows properly - what a surprise.
It happens in Stray too. Basically every console port these days.
 
