Question: Spiderman has entered the chat.


Ranulf

Platinum Member
Jul 18, 2001
2,349
1,172
136
Yeah, I played it on a PS4 Pro years ago, so I'm not double dipping until it's half price.

I feel like the bad-port rhetoric is somewhat misplaced. So far it seems to be a rare case where the consoles have hardware capability the PC doesn't, is all. It scales well and can be played on a modest PC for a PS4-like experience. Not what I'd call a terrible port, personally. There are some bugs, but a patch should fix most of those shortly.

On another note: AMD GPUs not nosediving versus RTX with ray tracing on is what I'd call interesting.

It's just not worth $60 for the non-ray-tracing version. Well, I hear the controls on KB/M are actually good, so if you want to play it that way maybe it's worth it to you. The RT seems to have enough issues that I doubt I'd even play it in 1-2 years when it hits $30 or less. Maybe if they put it on GOG like they did Horizon Zero Dawn I'd think about it for a DRM-free version.
 
  • Like
Reactions: Leeea

KompuKare

Golden Member
Jul 28, 2009
1,015
930
136

At stock, going by the updated PCGH data, the first review looks like it was pre-patch.

A bit strange that PCGH added a non-stock CPU into the mix, though. I would have thought that if they did that, they should at least provide power numbers.

PCGH, despite saying they'd have figures on Monday, did take a bit of time to update everything after the patch. I'm still unsure whether more GPUs are due. Well, it looks like they've kept adding more, as I'm sure there were cards missing for 4K + upscaling earlier.

Wolfgang over at CB already said that he didn't save the CPU power figures for these runs.

Still, ADL does very well there, but showing the power cost of that performance is important: for gaming, people keep saying they are not worried about ADL's power usage because *most* games don't load the full CPU. This one gets a lot closer to full load.

Can't say I envy the work the reviewers had to put in to get these reviews out with a moving target.
 

Panino Manino

Senior member
Jan 28, 2017
821
1,022
136
About this "it's decompressing assets on the CPU, that's why it's demanding" thing: Zen 3 always aced decompression scores in benchmarks, so why doesn't that strength help here? Was that only valid for zip?
Yes, I'm a very ignorant person who browses here to learn something sometimes.
 
  • Like
Reactions: Leeea

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
If hybrid architectures are going to be a thing in desktop-grade CPUs from now into the future, developers are going to have to come up with new programming methods that utilize the efficiency cores to increase gaming performance.

Hitman 3 is a great example of this but we need to add more to the list. Intel has plans for 32 efficiency cores in the future, which is a lot of processing power that skilled developers can tap.

Shader compilation can be offloaded asynchronously to the efficiency cores, and so can asset decompression.
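For what it's worth, here's a minimal sketch of what that kind of offload could look like, assuming a Windows build. decompress_asset() is a hypothetical stand-in for whatever codec the engine actually uses, and the E-core affinity mask is an assumption; a real engine would discover the core layout at runtime (e.g. via GetLogicalProcessorInformationEx) rather than hard-code it.

```cpp
#include <windows.h>
#include <cstddef>
#include <future>
#include <utility>
#include <vector>

// Hypothetical stand-in for whatever decompression codec the engine actually uses.
std::vector<std::byte> decompress_asset(const std::vector<std::byte>& packed)
{
    return packed; // placeholder: a real implementation would inflate the data here
}

// Pin the calling worker thread to the efficiency cores.
// The mask is an assumption (logical CPUs 8-15); a real engine would query the
// CPU topology at runtime instead of hard-coding it.
void pin_to_e_cores()
{
    constexpr DWORD_PTR kECoreMask = 0xFF00;
    SetThreadAffinityMask(GetCurrentThread(), kECoreMask);
}

// Kick decompression onto a background thread so the game/render thread never
// blocks on it; the caller collects the result whenever the asset is needed.
std::future<std::vector<std::byte>> decompress_async(std::vector<std::byte> packed)
{
    return std::async(std::launch::async, [packed = std::move(packed)]() {
        pin_to_e_cores(); // keep this work off the performance cores
        return decompress_asset(packed);
    });
}

int main()
{
    std::vector<std::byte> packed(1024);               // placeholder packed blob
    auto pending = decompress_async(std::move(packed));
    // ... main thread keeps simulating/rendering here ...
    auto unpacked = pending.get();                      // only block when the asset is due
    return unpacked.empty() ? 1 : 0;
}
```

The same fire-and-forget pattern would apply to shader compilation jobs: the main thread only queues the work, and affinity (or the OS scheduler) steers it onto the E-cores.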
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,482
20,569
146
Where's the "quad cores are useless for gaming" gang at?
Finally some common ground between us.

When I was ranting about the Pentium being called a budget gaming CPU, and pimping the 10100f and 12100f because they were closely priced, I was getting some pushback. I knew a 10100f could game well because I tested it myself.

Have to give props to Steve, this is him at his best. All due to his being required to play the game to test it. ;) He obviously got dragged by many besides myself for his Spiderman-the-walking-simulator review, and corrected his error this time. Context: he does some swinging at the end of the run now. I also greatly enjoyed that he could talk about how frame pacing felt for a change, instead of relying on the inadequate data-log-based analysis he and most of the big reviewers lame out with.

Slightly hyperbolic to make the biggest gap he observed, 9% between PCIe 3.0 and 4.0, seem important right now. He, among others, always uses the "We are only talking about single-digit differences here, so not a big deal" line. Nor did it make any difference to how Spiderman would feel; both settings were well beyond what's needed for Spiderman to play perfectly. It may hint at problems in the future, but we are not there yet. Of course, it could end up being an extreme outlier too, which is much more likely. I don't anticipate most engines will need one strong thread the way this does.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,482
20,569
146
As I recall, I said it wouldn't be wise to buy one today if you plan on keeping it a while. Also, it depends on the game. Try playing Battlefield 1 on a quad core. I did, and it was playable, but a jump to a hex core made a nice improvement. That game is six years old.
Right on. I can't think of another game at the moment where 12-thread Zen or Zen+ struggles to provide a smooth experience at any settings.

I only took exception to the idea of calling 2/4 or 4/4 chips ultra-budget gaming. No one should be building a modern gaming rig around one. Greatest Hits or retro? Go for it. But a 10100f for the $65 NIB I've seen it go for recently? That is a good buy even for modern builds.

This game appears to be singular in CPU performance, hence not a good baseline for judging 4/8 to begin with. For example: when was the last time you can think of where a Ryzen 3600 was pushing 64 fps 1% lows while a stock 10900K was almost 10% slower, and tied with an 11400 no less?

I personally think that, despite the 12th gen i3 being as impressive as it is, it will age out faster than the i5 because of core count. We keep hearing cores are not the most important thing, but that is only true on a relatively short timeline; we have seen this historically be the case. Reviewers always talk about now, but when there is an "on the bubble" zone, the higher core count has aged better ever since the 2000s. I don't think that is changing anytime soon based on where Intel P+E and Ryzen are at now. New consoles have 16 threads, most of which are available to games, so 12 feels like a better bet going forward. Windows and apps aren't getting any lighter either, and many gamers have more than the game running nowadays. It will matter more as time goes by. Yeah, not a problem for those of us who upgrade frequently. But how many members post about still being on Ivy Bridge or Haswell era parts? That is the norm; a CPU being used for 5 years or more.
 
  • Like
Reactions: Thunder 57

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,482
20,569
146
They've moved on to 6c/12t being the bare minimum for an office machine and 8c/16t being the gaming minimum. At least the whole "quad core or 6c/6t are good values" line has died now that the i3 4c/8t chips prove to be better.
THAT, that is what people need to chew on and digest. Reviewers recommend/sell you, even if it is inadvertent, the CPU du jour. As I wrote in my last post, historically if there is a bubble, when it pops there is a core count that makes it, and anything below that starts aging out for new titles.

The reason I started that "Big reviewers grind my gears" thread months back was that kind of crap. They are so afraid to extrapolate performance, because anything that references futureproofing is bad, very very bad!1!!! Yet we see them pull out their magic 8-ball all the time. Steve from GN is currently breaking his arm patting himself on the back because Intel references GN in their "We are fixing the issues" statement. Yet I don't recall him eating crow for telling everyone back in early 2020 that it was cool not to build, that doing so was panicky, and that everything was fine. Aged like warm milk.
 

Ranulf

Platinum Member
Jul 18, 2001
2,349
1,172
136

I don't remember that from GN. He said not to build during the start of the pandemic stuff?


I have to eat crow for favoring 8c/16t, in that the 6c/12t chips have done better than I thought they would. Given the rate of CPU progress we have now, and likely for the next 5 years, the mid-range at 6c/12t for about $200 is a good price/perf option if you want to upgrade every 2-3 years again, especially if you are just gaming.

It's the same for the i3 4c/8t chips. For the price, I bet you get good use out of them for 2-4 years easy, especially at 1080p 60Hz. This isn't the last gasp of the 20th anniversary 2-core overclockable Pentium G3258 from 4th gen, or the 2c/4t i3 7350K.

The low and mid range are running with the big dogs, real close performance-wise, and with the mid-to-high end flirting with $300-600+ prices and 20% IPC jumps each generation, it really doesn't make sense for most people to buy into that high end.
 
  • Like
Reactions: DAPUNISHER

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
Agreed, kind of. I ran 6C/12T for a long time (1600, then 3600), then when I jumped to 5000-series, I wanted something beefier, to hold me over longer for DC purposes, so I went with a 12C/24T 5900X. Ok, I admit it, I just wanted to see more pretty graphs in Task Manager.