
Question: RX6800 or RTX3070 for new build? (Hogwarts Legacy discussion)


In2Photos

Platinum Member
Looking to build my daughter a new computer for gaming and possibly some streaming. I found that I can get either an RX6800 or RTX3070 for roughly the same price in our budget. I'll be pairing this with a 7600X and 32GB of DDR5. Currently she has a GTX 1660 and plays at 1080p on 144Hz monitors. Looks like the 6800 has better raster performance in most titles, especially at higher resolution due to the 16GB of VRAM. The 3070 is better at RT, but she's never used RT so I don't know if that's really a pro for the 3070. Any other thoughts?
 
So it seems Hogwarts is buggy on Ryzen 7000 series. Surprise surprise!


Really? That's bizarre. Maybe they didn't test it well since it's "new", but man, is their optimization really so specific that a new platform like that is left out in the cold? Obviously Zen 4 is good on raw performance, and Zen 2 had to be an optimization target because of all the consoles. That boggles my mind a bit.
 
I feel these recent posts should probably be split off into a new thread? Why was a dead thread necro'ed to talk about an entirely different subject?

@DAPUNISHER is that something you can do? It makes total sense to have a thread on this game, but odd to have it tied to this older thread that had nothing to do with it.

With that now said, I have been decently excited about this game. A streamer I watch has been playing it with a 3090Ti, and it has not been the best experience. But I don't know any streamers playing it on AMD hardware; it may be better there. For now, I am going to hold off a bit before buying it for PC.
 
It's all good, man. OP also had a thread in the computer building forum when he was deciding on parts, in which Hogwarts Legacy was the primary motivator for doing his daughter's build. That makes the discussion of the game relevant to which card he ended up choosing. He has also provided an update here concerning it, and it's his BBQ after all. :mask:

Picking a card with twice the VRAM for similar money is already aging well. Seeing the 4070 Ti run out of VRAM in this title is a good example of why VRAM matters.

The vid @SteveGrabowski linked showed the 6700XT performing better than the 3060 Ti. The 6700XT did allocate and use more VRAM than the 3060 Ti has.

Anyways, that is why Hogwarts Legacy ended up in a thread about deciding between vid cards. I am personally glad it did, because it is my considered opinion that VRAM is more important than ray tracing performance. The icing on the cake for me is that RT in this game is horrendously bad.
 
Yeah, a 3070 bought for its RT performance over a 6800/6800XT is really going to run into VRAM issues going forward, to the point where you won't be able to flex that muscle. It'd be like buying a brand new F150 with all the max towing and max payload packages to haul your camper, and then shafting it with the smallest V6 engine.
 
I saw on the Steam forums yesterday or the day before that some were saying not to have it do any upscaling. Running the game at the desktop/monitor's native res gave them less frame stutter. I saw one streamer having problems with a 2070, with cutscenes just stuttering and freezing. It would shut down his streaming software, OBS I believe.
 
AM5 has an incredibly small install base. If I were being pressured to hit a deadline, or facing any other constraint on time or workforce hours, that'd be the last platform I'd optimize for too.

If Daniel-San's video is any indication, all AMD performs better in this title than AMD+Nvidia. Wonder if Aussie Steve has tested AM5 with the 7900XTX in Hogshardware Legacy yet? 😛
 

This I concede, no problem.

I guess what surprised me is that their optimization is so specific that AM5 CPUs get penalized this badly; they should be fine by virtue of their raw performance. If Hogwarts is doing some thread scheduling manually when it detects a Zen CPU, or just spacing out when it doesn't see that CPU model in some whitelist, that's terrible (see the sketch below).

I get this could happen to AM5, but how we get to that point mystifies me.

Looking for that game-sized patch to drop any day now so we get the real gold version.
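
A minimal C++ sketch of that whitelist failure mode, purely hypothetical (the family/model cutoff included, and nothing here is the game's actual code): a list frozen before the newest CPUs shipped routes an unrecognized part onto the generic fallback path.

```cpp
// Hypothetical sketch only -- not the game's actual code. Shows how a
// stale CPU whitelist can drop a brand-new part onto a slow generic path.
#include <cpuid.h>
#include <cstdio>

// Effective AMD family/model from CPUID leaf 1 (EAX).
static void cpu_ids(unsigned &family, unsigned &model) {
    unsigned eax = 0, ebx, ecx, edx;
    __get_cpuid(1, &eax, &ebx, &ecx, &edx);
    family = (eax >> 8) & 0xF;
    model  = (eax >> 4) & 0xF;
    if (family == 0xF) {                    // AMD extends family/model here
        family += (eax >> 20) & 0xFF;
        model  |= ((eax >> 16) & 0xF) << 4;
    }
}

int main() {
    unsigned family, model;
    cpu_ids(family, model);
    // Hypothetical whitelist frozen before the newest CPUs shipped: known
    // models get the tuned thread-scheduling path, anything the list has
    // never seen falls through to the generic one.
    bool tuned = (family == 0x17)                   // Zen/Zen+/Zen 2
              || (family == 0x19 && model < 0x60);  // early family 19h only
    std::puts(tuned ? "tuned scheduling path" : "generic fallback path");
}
```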
 
Folks, I'm going to sound like a real dummy here.

Why would x86 performance be so different between AMD & Intel? Graphics I understand, but CPUs? I'm baffled. Is the game using some unique Intel instructions that don't exist in Zen 4?

Tester fail?
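
Almost certainly not: Zen 4 supports the same mainstream SIMD extensions games actually compile for (SSE4.2, AVX2, even AVX-512), and well-behaved code dispatches on feature flags rather than vendor or model strings. A minimal sketch of feature-flag dispatch using GCC/Clang builtins, just to show the pattern:

```cpp
// Minimal sketch of feature-flag dispatch -- the robust way to pick SIMD
// code paths. A game doing this could not single out Zen 4, since Zen 4
// reports the same mainstream feature bits Intel parts do.
#include <cstdio>

int main() {
    __builtin_cpu_init();  // populate the feature cache for the builtins below
    if (__builtin_cpu_supports("avx2"))
        std::puts("AVX2 path");
    else if (__builtin_cpu_supports("sse4.2"))
        std::puts("SSE4.2 path");
    else
        std::puts("scalar fallback path");
}
```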
 
Some are blaming UE4 since recent titles have been nerfed on AMD, especially with RT turned on.

That Twitter discussion has feedback that is all over the place. Someone is having issues on a 9900K and says turning off HT helped. Others say they are fine on a 5800X. Others say they get horrible FPS drops with Zen 3. On and on. Whatever is going on, it seems like some of these next-gen console ports are nerfed on AMD PCs for some reason. Spider-Man and Callisto Protocol on PC have issues with AMD and RT, right?
 
Well, I have 2 AM5 systems in the house now, so I could do some testing. They're both pretty similar hardware-wise though, so it might not make sense to test both.
 

I’ve run across bugs like this before, where game options only apply if they're set in a certain sequence. It’s very frustrating when you run into it. It’s another reason I don’t trust tests crowdsourced from multiple users over the internet: without any kind of quality control or verification, you never know if each system is really using the same settings, even if a screenshot or video shows them set correctly.
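
A contrived sketch of that kind of order-dependence bug (entirely hypothetical, not from any real game): two settings that each look applied in the UI, but where the order they were set in decides whether one silently clobbers the other.

```cpp
// Hypothetical sketch of an order-dependent settings bug: entering
// fullscreen re-initializes the mode list and resets resolution to the
// display default, discarding an earlier user choice without any warning.
#include <cstdio>

struct Settings {
    int width = 1920, height = 1080;
    bool fullscreen = false;

    void set_resolution(int w, int h) { width = w; height = h; }

    void set_fullscreen(bool on) {
        fullscreen = on;
        if (on) {
            // Bug: the fullscreen switch rebuilds the mode list and
            // silently reverts resolution to the default.
            width = 1920; height = 1080;
        }
    }
};

int main() {
    Settings a, b;
    a.set_fullscreen(true);        // order 1: fullscreen, then resolution
    a.set_resolution(2560, 1440);  //          -> resolution sticks
    b.set_resolution(2560, 1440);  // order 2: resolution, then fullscreen
    b.set_fullscreen(true);        //          -> silently back to 1920x1080
    std::printf("order 1: %dx%d, order 2: %dx%d\n",
                a.width, a.height, b.width, b.height);
}
```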
 
Aussie Steve's testing is more like it. All AMD plays nice together the way I was anticipating.

TPU's data is clearly nerfed. Steve says he tested with the 13900K and saw similar scaling but different CPU usage, with Intel having a couple of cores maxed in Hogsmeade.

LULZ at the 3060 spanking its big bros when they run out of VRAM. 😀

And yeah, 6800XT was the right choice for this game.

I should have linked his review -

 



LOL

[charts: 1440p Ultra / RT Ultra results, plus the Radeon-only breakdown]
 
TPU is the only site with such variance so far. HUB got very different results in the areas they tested.



I noticed this is the 1080p screenshot. I'll go see what the 4K one looked like.
 
TPU must not have known about the frame generation bug. HUB still has the 7900XT just below the XTX at 4K.


I mentioned it in the other thread, but TPU's issue isn't the frame generation bug; the RTX 3000 and especially ARC card results are way too good relative to the RTX 4000 series for that to be the case. TPU's numbers also don't make sense when you look at how much VRAM this game takes with RT enabled versus how Nvidia SKUs with less VRAM perform. Between the issue with their AMD card results, ARC's insanely good performance, and some cards seeming impervious to VRAM limits, I don't know what to make of TPU's results, tbh. I've seen multiple users with AMD cards report frame rates far more in line with HWUB's results, though. I haven't seen any report the pitifully low frame rates TPU got with their AMD cards.
 
Interesting.

Someone suggested TPU found a spot (in the Forbidden Forest?) where RT shadows were tanking the AMD cards. Turn them off and performance jumped way up. As good a take as any for why their numbers don't agree with other sites'. W1z knows his stuff, so it's highly unlikely he nerfed it that badly?

EDIT: I will probably look even sillier than usual when all this shakes out. I like speculation though, so I'll take my lumps later, smiling.
 

Doubt it, but maybe. Computerbase has released some initial results, and they also fall mostly in line with HWUB's (1080p is the exception, but even then they are much closer to HWUB than to TPU).

[Computerbase benchmark charts]
 
Just a side note: it's nice to see that AMD again provides a solution (FSR) for older systems to be able to play current-gen games.


[chart: AMD FSR performance on older cards]
 
The textures used are pretty good in this game, even at low quality. Kudos to them for caring about the less privileged gamers. Of course, there's a financial incentive for that as it will mean more sales.

Pretty cool that the Ryzen 1600X with 5700 XT is getting 60+ fps at High settings with FSR.
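
For anyone wondering why FSR helps older cards this much, the arithmetic is simple: each quality mode renders internally at a fixed fraction of the output resolution and upscales from there. A quick back-of-the-envelope sketch using FSR 1.0's published per-axis scale factors:

```cpp
// Back-of-the-envelope: internal render resolution and pixel savings for
// FSR 1.0 quality modes (per-axis scale factors as published by AMD).
#include <cstdio>

int main() {
    const int out_w = 1920, out_h = 1080;  // 1080p output, as in the chart above
    const struct { const char *name; double scale; } modes[] = {
        {"Ultra Quality", 1.3}, {"Quality", 1.5},
        {"Balanced", 1.7}, {"Performance", 2.0},
    };
    for (const auto &m : modes) {
        int w = static_cast<int>(out_w / m.scale);
        int h = static_cast<int>(out_h / m.scale);
        double savings = 1.0 - double(w) * h / (double(out_w) * out_h);
        std::printf("%-13s -> %4dx%-4d (%.0f%% fewer pixels shaded)\n",
                    m.name, w, h, savings * 100);
    }
}
```

Quality mode at 1080p output, for instance, renders internally at 1280x720, which is roughly 56% fewer pixels to shade each frame; that is where the headroom for a 1600X + 5700 XT comes from.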
 