Question RX6800 or RTX3070 for new build (Hogwarts Legacy discussion)


In2Photos

Golden Member
Mar 21, 2007
1,628
1,651
136
Looking to build my daughter a new computer for gaming and possibly some streaming. I found that I can get either an RX 6800 or an RTX 3070 for roughly the same price in our budget. I'll be pairing this with a 7600X and 32GB of DDR5. Currently she has a GTX 1660 and plays at 1080p with 144Hz monitors. Looks like the 6800 has better raster performance in most titles, especially at higher resolutions, due to the 16GB of VRAM. The 3070 is better at RT, but she's never used RT, so I don't know if that's really a pro for the 3070. Any other thoughts?
 
  • Like
Reactions: Tlh97 and scineram

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,486
20,572
146
Just because the YT personalities spout their spiel doesn't make it bad hardware if someone isn't trying to use it for the things they had issues with during testing. Not everyone is going to be using it for the titles that have issues. More time passing and more driver refinements will improve things.
Shot the messenger as expected. You'd better have a bunch of rounds; there isn't a respected reviewer that doesn't share the sentiment that ARC isn't ready for prime time, i.e. not for the vast majority of gamers.

The rest I didn't quote because it is immaterial to his daughter's build. I agree with a lot of what you wrote, but it won't help when she is having issues with the card. ARC right now is like the old days: plug and pray. Or to borrow a great quote: My momma says ARC is like a box of chocolates. You never know what you're gonna get.

As someone who did client-facing I.T. as a side hustle for a couple of decades, there is no way I would spec ARC for someone else's build. Both for my sanity and theirs. ;)

People can buy whatever they want. My advice here is free, and I am only trying to be helpful. That advice is: stay away from ARC unless you are an experienced user who enjoys troubleshooting and experimenting with hardware. Because if not -

south-park.gif
 

In2Photos

Golden Member
Mar 21, 2007
1,628
1,651
136
I'm not going to be the guinea pig for a $100+ billion company. And I also don't want to spend my time constantly fixing my daughter's PC; that would mean two of us who can't be playing games. I'll pay the "premium" for a working card.
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
my tip:
Buy a card with at least 12 GB.


TechPowerUp had the 3090 8% ahead of the 6900 XT at 4K at the 6900 XT's launch; for the 6950 XT's launch they had the 3090 up 1% on the reference 6950 XT and 7% on the 6900 XT at 4K. At 1440p the respective numbers are: the 3090 by 6% over the 6900 XT at the 6900 XT's launch, vs. the 3090 by 1% over the 6900 XT and down 4% to the 6950 XT at the 6950 XT's launch.
thing is,

he is comparing the RTX 3070 to the RX 6800,

which is a lot different from an RTX 3090 vs. an RX 6950.

For one thing, that RTX 3090 has 24 GB of VRAM, and the RTX 3070 has 8 GB.


That is going to matter come ray tracing time. And it is going to matter in rasterization too.

All the RT in the world is irrelevant when the card runs out of VRAM.
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
so if she wants to run heavy RT games like Control or Cyberpunk with RT the 3070 will do it much better
maybe, maybe not

the RTX 3070 flatlines when it runs out of VRAM, and flipping ray tracing on is asking to run out of VRAM.


Even in older games like Control or Cyberpunk, which favor RTX over DXR 1.1, when it runs out of VRAM it is game over.
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,126
3,066
136
www.teamjuchems.com
My kiddos' PCs both have 6700 XTs. I think it's a great sweet spot in terms of price/perf. I am less impressed by perf/watt, but they also have 75Hz FreeSync monitors, so it's all good. I had my dad pick up a 6700 XT as well, a bit too early in the pricing correction, but it's all worked out.

As @DAPUNISHER pointed out, if you are going to step up from there I wouldn't stop at the 6800 unless you are considering PSU and case thermals, as the 6800 is a very sweet spot for perf/watt. But the price is so close to the 6800 XT's that if you are chasing 144 fps primarily, I think it's worth it.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,887
5,824
136
My kiddos' PCs both have 6700 XTs. I think it's a great sweet spot in terms of price/perf. I am less impressed by perf/watt, but they also have 75Hz FreeSync monitors, so it's all good. I had my dad pick up a 6700 XT as well, a bit too early in the pricing correction, but it's all worked out.

As @DAPUNISHER pointed out, if you are going to step up from there I wouldn't stop at the 6800 unless you are considering PSU and case thermals, as the 6800 is a very sweet spot for perf/watt. But the price is so close to the 6800 XT's that if you are chasing 144 fps primarily, I think it's worth it.

Gotta say I'm really enjoying my 6700 XT for 1440p gaming in hard-to-run stuff like Cyberpunk and 1800p gaming for easier stuff like Elden Ring. Well, 1440p upscaled to 4K and 1800p upscaled to 4K through FSR and RSR. The 6800 XT must really be something.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,340
10,044
126
Gotta say I'm really enjoying my 6700 XT for 1440p gaming in hard-to-run stuff like Cyberpunk and 1800p gaming for easier stuff like Elden Ring. Well, 1440p upscaled to 4K and 1800p upscaled to 4K through FSR and RSR. The 6800 XT must really be something.
I'm now (sort of, if I can afford it) "in the market" for an upscale GPU.

As of today, it appears that the lower-cost 6800 XT cards are sold out and may not restock, so at the current moment I'm stuck between a 6700 XT @ $359 or a 6800 non-XT @ $509.

I have a 4K60 HDTV as monitor. Would like to play some games. Was under the impression that I needed 6800-and-up for 4K.

But if a 6700XT will do 4K60 with upscaling decently, I may just go that route.

BTW, I already own several 5700 XT cards (mining rigs). I've heard that those are decent for 1440p as well - do they support the same sort of upscaling tech/trick as the 6700 XT? Would I be better off just using the cards that I have and waiting for RDNA3 (or its refresh)?
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,887
5,824
136
I'm now (sort of, if I can afford it) "in the market" for an upscale GPU.

As of today, it appears that the lower-cost 6800 XT cards are sold out and may not restock, so at the current moment I'm stuck between a 6700 XT @ $359 or a 6800 non-XT @ $509.

I have a 4K60 HDTV as monitor. Would like to play some games. Was under the impression that I needed 6800-and-up for 4K.

But if a 6700XT will do 4K60 with upscaling decently, I may just go that route.

BTW, I already own several 5700 XT cards (mining rigs). I've heard that those are decent for 1440p as well - do they support the same sort of upscaling tech/trick as the 6700 XT? Would I be better off just using the cards that I have and waiting for RDNA3 (or its refresh)?

5000-series AMD GPUs also support RSR, which basically forces a postprocess FSR pass for games that don't have native FSR support, such as Elden Ring. I'm really interested to see if RSR will be enough to run Control at, say, 1260p with RT reflections and RT glass reflections at 45-55 fps (so glad I bought a monitor with a FreeSync range down to 40 Hz), since that is one game where Nvidia's RT really kills AMD's, and it's about the only game I care much about RT for.
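For what it's worth, the render-resolution arithmetic behind this is simple. Here is a minimal Python sketch using AMD's published FSR quality-mode scale factors; note RSR itself accepts whatever input resolution you set in-game (like the 1260p above), so the fixed modes here are illustrative only:

```python
# Per-axis render-scale divisors, as published for AMD FSR quality modes.
FSR_SCALE = {
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
    "ultra_performance": 3.0,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution the GPU actually renders for a given output."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

# e.g. a 1440p output in Quality mode renders internally at 1707x960:
print(render_resolution(2560, 1440, "quality"))  # (1707, 960)
```

The pixel-count savings are the whole point for RT: Quality mode renders only (1/1.5)^2, roughly 44%, of the output pixels.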

If you're using a TV, though, I'd imagine you'll see the difference between 1440p / 1800p / 2160p way more than I do, as I'm running a 4K monitor but it's only 28 inches. I don't see a huge difference between 1440p and 4K at that screen size. I mean, 1800p and 2160p definitely look better, but not night-and-day better like 1440p is over 1080p at that size. But if you're playing on a 32-inch or 40-inch HDTV as a monitor, I think 1440p upscaling would probably be way more noticeable against native 4K.

The 6700 XT is about 30% faster than the 5700 XT in 1440p gaming, while the 6800 is about 55% faster than the 5700 XT at 1440p, per TechPowerUp. At 4K the 6700 XT vs. 5700 XT margin is the same, but the 6800 pulls away to nearly 65% faster than the 5700 XT. Personally, in your situation I think I'd just wait for next gen if you can't get the sub-$600 6800 XT we were seeing a couple of weeks ago, as the 5700 XT is still enough to run 1440p in most games.
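One footnote on combining those margins: percentages over a common baseline multiply rather than add, so those figures put the 6800 only about 19% ahead of the 6700 XT at 1440p, not 25%. A quick sketch (the 1.30x/1.55x inputs are the TechPowerUp figures quoted above):

```python
def relative(a_over_base: float, b_over_base: float) -> float:
    """Speedup of card A over card B, given each one's speedup
    over a common baseline card (here the 5700 XT)."""
    return a_over_base / b_over_base

# 1440p: 6700 XT ~ 1.30x a 5700 XT, 6800 ~ 1.55x a 5700 XT
uplift = relative(1.55, 1.30)
print(f"6800 over 6700 XT: {(uplift - 1) * 100:.0f}% faster")  # ~19%
```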
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,887
5,824
136
I'm now (sort of, if I can afford it) "in the market" for an upscale GPU.

Also, what games are you interested in, and does your TV have HDMI 2.1 VRR support? If you don't have VRR support like a FreeSync monitor would, you'll probably want to hit 60 fps minimums, which is a significantly higher performance target than mine of 60 fps averages with 45+ fps minimums (since FreeSync still makes those dips look smooth). Without VRR I wouldn't recommend the 6700 XT even for 1440p. For instance, Cyberpunk at high settings will still drop into the 50s for me rendering at 1440p.
 
  • Like
Reactions: igor_kavinski

SteveGrabowski

Diamond Member
Oct 20, 2014
6,887
5,824
136
no, just your "standard" dumb 4K60 UHD HDMI 2.0 TV, from 3-4 years ago.

I would definitely aim higher than the 6700 XT then. While the 6700 XT is generally considered a 1440p60 GPU, I think most people say that with the expectation that it's being used on a FreeSync panel these days. I'd personally be looking at the 6800 XT and up for 1440p with 60 fps minimums, and without a sub-$600 6800 XT I would probably wait for maybe a $750 7800 XT and just hold out with the 5700 XT for a while.
 

coercitiv

Diamond Member
Jan 24, 2014
6,199
11,895
136
@VirtualLarry stay with the TL;DR of this thread: only buy a 6800 XT if it returns to $500-550 in the next 1-2 months. For the time being there's nothing coming from Nvidia or AMD that can compete at this price point. In the meantime you can play around with the 5700 XT, because the 6700 XT isn't enough of an upgrade for you. If it takes too long (like months), then ditch the 6800 XT route and wait for the 7800 XT or whatever Nvidia decides we're allowed to touch in the $500-700 price bracket.
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,134
1,089
136
GPU sales are down big time.

Both Nvidia and AMD are playing a dangerous game, reducing supply to counter reduced demand. The problem is that Nvidia has tons of non-refundable TSMC silicon. As Nvidia's CEO put it, the days of cheap graphics cards are over.

My guess is the days of cheap graphics cards will be back much sooner than people think.

Minimum 6800 or 3070 moving forward. I still think 6 months out is when prices will be in a good place. By then all the new cards will have been on the market for 3 or 4 months.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,887
5,824
136
Wow, Nvidia's DX12 driver overhead problems are showing hard in Hogwarts Legacy, where the 3060 Ti is getting wrecked by the 6700 XT with a 7700X + 32GB DDR5-6000.


I remember when the shoe was on the other foot with DirectX 11 and AMD's drivers had massive overhead; Digital Foundry would scream it from the rooftops in their videos. But I don't see much discussion about it now that Nvidia is the one with major problems.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,486
20,572
146
Hogwarts is a mess on PC so far. It eats system RAM and VRAM. The 4070 Ti runs out of VRAM trying to play at 4K ultra.


Check out the graphical corruption in this scene. May be a driver issue.


32GB of system ram and 12GB or more of VRAM are looking best for this title. I read some saying Denuvo is part of the issue. And it is UE4 so some stuttering on PC is par for the course. But what is being seen with some setups goes beyond that, as Daniel-San shows in that testing.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,887
5,824
136
32GB of system ram and 12GB or more of VRAM are looking best for this title. I read some saying Denuvo is part of the issue. And it is UE4 so some stuttering on PC is par for the course. But what is being seen with some setups goes beyond that, as Daniel-San shows in that testing.

Will be interesting to see since a cracker has claimed she'll have this cracked within ten days of release.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,486
20,572
146
Will be interesting to see since a cracker has claimed she'll have this cracked within ten days of release.
It is certainly ironic that many have to use a cracked version of a game they legitimately bought to get proper performance. The Assassin's Creed games are the most infamous example I can think of.

The console version seems solid. Even the Series S is smoother than some of the PC setups I have seen.


Okay, I guess that vid isn't the best example. I saw these notes for it -

"All versions have stuttering problems inside the castle while advancing through the rooms. There will even be times when we get stuck in a doorway until the other room has finished loading.
- Xbox Series S has a lower NPC density.
"
 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
6,199
11,895
136
Wow, Nvidia's DX12 driver overhead problems are showing hard in Hogwarts Legacy, where the 3060 Ti is getting wrecked by the 6700 XT with a 7700X + 32GB DDR5-6000.
The same experience can be had with a "slower" CPU in multiple other games, as HUB showed in their 13400 vs. 5700X gaming & GPU scaling benchmark.


I would be curious to see what happens with the AMD 7000 series, though; AFAIK they also changed to software scheduling, although I have no good source for this other than forum talk. Maybe HUB will run a couple more tests.
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,134
1,089
136
I would like to know how the 6800/6800 XT perform on the latest drivers compared to their early release drivers. AMD had a lot of upside potential since their early drivers were pretty bad.

The next thing to consider is the 16GB of VRAM. When you run multiple monitors, the extra memory comes in very handy. It seems most serious desktop computer users today have more than one monitor.

The wild card in the next 3-4 months could be the ARC A770 with 16GB of VRAM. I would say $250 or lower would be the strike price for the A770, mostly for the 16GB of VRAM and the greatly improved drivers.

I think the 6800 XT is the card you want vs. a 3070. Then consider the 7700 XT, arriving shortly with 12GB of VRAM.

I personally see myself jumping back on the AMD GPU wagon in the next 6 months. If I go for the RDNA2 stuff, I will undervolt the card. The 6800 XT is probably my best fit because I have a 750W gold power supply.
 
  • Like
Reactions: scineram

In2Photos

Golden Member
Mar 21, 2007
1,628
1,651
136
I pre-ordered the game but haven't had a chance to play it yet. My daughter has been playing it on her PC, though. She let it auto-configure the settings, and I'm not so sure the person who came up with the system requirements knew what they were doing. Granted, she is still playing at 1080p, but the auto settings set it to ultra and capped her at 60 fps, which was fine honestly. It enabled FSR2 and upscaled from 960p, I think (I only saw it for a moment). The GPU was at like 30% utilization while in Hogwarts. The fans weren't even spinning on the GPU! She probably could have used the old PC (i7-4790 and GTX 1660). I might download it to my old PC and see if it will run (i7-920 with her old GTX 1660). Hoping to get to play around with it this weekend. She is enjoying the game, though!