Review RX 6600XT Reviews Thread


Stuka87

Diamond Member
Dec 10, 2010
Rather than bury these in some 1000-page thread, may as well make them easy to find.

The performance is better than I expected. A fair bit faster than a 5700XT, but at much lower power consumption. Ray tracing is poor, but that should not be a surprise. I do find it sad that TPU had to change their `Performance per Dollar` chart to include the obscene street prices :(
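For what it's worth, the metric itself is trivial to reproduce: average FPS divided by price. A quick sketch of that calculation (the FPS figure and the $550 street price here are hypothetical stand-ins; only the $379 MSRP is from the thread) shows how street pricing deflates the number:

```python
def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Value metric in the TPU style: frames per second per dollar spent."""
    return avg_fps / price_usd

avg_fps = 100.0              # hypothetical 1080p average
msrp, street = 379.0, 550.0  # real MSRP vs assumed street price

print(f"at MSRP:   {perf_per_dollar(avg_fps, msrp):.3f} fps/$")    # ~0.264
print(f"at street: {perf_per_dollar(avg_fps, street):.3f} fps/$")  # ~0.182
```

Same card, same frames; a ~45% price hike knocks roughly a third off the value metric, which is why the chart had to change.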



 

VirtualLarry

No Lifer
Aug 25, 2001
Yep, AMD's lowest chip having 8 lanes only isn't a new thing either. What's new is that they are charging $379+ for it.
For a moment, I thought that you were talking about AMD's 5700G APU.

Which, if you think about it, only has PCI-E x8 to the GPU... and only at the 3.0 spec. Hmm, seems like a missed opportunity here for OEM rigs: a B550A mobo (PCI-E 4.0 to the GPU slot only), a 5700G 8C/16T APU, and an RX 6600XT GPU, which is also PCI-E x8 4.0. Would have been a sweet config.
 

Ranulf

Platinum Member
Jul 18, 2001
Which is more than enough for this card. It does seem like a way to get more people to upgrade their older AM4 chipsets though. I still have an X370 + Ryzen 3600 combo in one of my systems. While the performance hit is rare in PCIe 3.0 mode, it may not be with unreleased future titles. I suppose you can always upgrade when you have to.

In theory it's enough. I think HWUB's review numbers had it at a 5% difference on average, and with Doom Eternal it spiked to 25% as an outlier. Still, say it's 5-10% on average depending on the game, why not just go buy a 3060 if it's $50+ cheaper with more RAM? Not that the 12GB of RAM will really matter much at 1080p. Then again, why not just buy a 3060 Ti over the 6600XT if it's a mere $20-30 more? The only reason to buy a 6600XT right now is you need a card at that price point and you can't get a 3060 Ti or 3060 for near MSRP.
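The value argument here boils down to comparing the percentage price premium against the percentage performance gain of the step-up card. A rough sketch of that comparison; the prices and FPS numbers below are hypothetical illustrations, not figures from any review:

```python
def premium_vs_gain(base_price: float, base_fps: float,
                    alt_price: float, alt_fps: float) -> tuple[float, float]:
    """Return (% price premium, % performance gain) of the alternative card."""
    premium = (alt_price / base_price - 1) * 100
    gain = (alt_fps / base_fps - 1) * 100
    return premium, gain

# Hypothetical: 6600 XT at $379 vs a 3060 Ti at $409 that is ~12% faster
premium, gain = premium_vs_gain(379, 100, 409, 112)
print(f"premium {premium:.1f}%, gain {gain:.1f}%")  # gain outpaces the premium
```

When the gain exceeds the premium (as in this made-up example), the pricier card is the better per-dollar buy, which is the point being made about the 3060 Ti at anywhere near MSRP.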
 

blckgrffn

Diamond Member
May 1, 2003
Ranulf said:
In theory it's enough. I think HWUB's review numbers had it at a 5% difference on average, and with Doom Eternal it spiked to 25% as an outlier. Still, say it's 5-10% on average depending on the game, why not just go buy a 3060 if it's $50+ cheaper with more RAM? Not that the 12GB of RAM will really matter much at 1080p. Then again, why not just buy a 3060 Ti over the 6600XT if it's a mere $20-30 more? The only reason to buy a 6600XT right now is you need a card at that price point and you can't get a 3060 Ti or 3060 for near MSRP.

That's the situation in a nutshell. Even with the PCIe "hit" it will likely be faster than a 3060 @ 1080p.

The 3060ti on the street regularly pushes $850-$1k. ¯\_(ツ)_/¯

It's a different tier of card and gets priced like it. The number (actual quantities) of 3060 Tis actually priced at, say, sub-$420 at retail has to be laughably small. It's a semi-fictional price point that exists almost solely to be referred to by reviewers.

When it comes down to it, it's nice that the perf/watt of AMD's 3060 "equivalent" card is as good or better, and that in terms of absolute performance it's noticeably/measurably better. It's been a bit since that could really be said.
 

Timorous

Golden Member
Oct 27, 2008
So got my power cable, gave it a quick test in Star Wars: The Old Republic.

Previously my 2200G on low everything apart from textures and AF @4k was averaging around 24fps with some pretty big dips.

Now with everything on max @4k I get 60 fps with a dip into the low 50s when I use one specific ability.

Seems good enough to me. Might actually get the MS game pass now and give FH 4 and some other games a try.

EDIT: As for games where it can't handle 4k, 1080p + integer scaling is good enough for me, and if a game supports it I might try FSR.

EDIT 2: By 60 fps I mean locked, because I have set that as a frame rate limit; will need to try 120 and see how it does uncapped. From the metrics the card didn't hit max clocks and it peaked around 100W, so it probably had a bit of headroom to go higher.
 

blckgrffn

Diamond Member
May 1, 2003
Timorous said:
So got my power cable, gave it a quick test in Star Wars: The Old Republic.

Previously my 2200G on low everything apart from textures and AF @4k was averaging around 24fps with some pretty big dips.

Now with everything on max @4k I get 60 fps with a dip into the low 50s when I use one specific ability.

Seems good enough to me. Might actually get the MS game pass now and give FH 4 and some other games a try.

EDIT: As for games where it can't handle 4k, 1080p + integer scaling is good enough for me, and if a game supports it I might try FSR.

EDIT 2: By 60 fps I mean locked, because I have set that as a frame rate limit; will need to try 120 and see how it does uncapped. From the metrics the card didn't hit max clocks and it peaked around 100W, so it probably had a bit of headroom to go higher.

Hopefully it looks like a new game with that change in settings :)

It's hard (not impossible with some research I suppose) to know if your frame dips are completely GPU related now or if you are hitting some CPU performance ceiling - which is a nice place to be if so :)
 

Stuka87

Diamond Member
Dec 10, 2010
Timorous said:
So got my power cable, gave it a quick test in Star Wars: The Old Republic.

Previously my 2200G on low everything apart from textures and AF @4k was averaging around 24fps with some pretty big dips.

Now with everything on max @4k I get 60 fps with a dip into the low 50s when I use one specific ability.

Seems good enough to me. Might actually get the MS game pass now and give FH 4 and some other games a try.

EDIT: As for games where it can't handle 4k, 1080p + integer scaling is good enough for me, and if a game supports it I might try FSR.

EDIT 2: By 60 fps I mean locked, because I have set that as a frame rate limit; will need to try 120 and see how it does uncapped. From the metrics the card didn't hit max clocks and it peaked around 100W, so it probably had a bit of headroom to go higher.

You can probably tweak the settings just a bit to get rid of that dip. I tend to disable things like motion blur and often setting shadows from Ultra to High in most games results in a big boost in performance with a very small IQ change.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
Stuka87 said:
You can probably tweak the settings just a bit to get rid of that dip. I tend to disable things like motion blur and often setting shadows from Ultra to High in most games results in a big boost in performance with a very small IQ change.
Exactly. A few settings, like turning down shadows, grass, and depth of field, in many games make for buttery smooth gaming without any real visual difference. I mean, you could zoom and analyse like Digital Foundry or something, but no thanks.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
Ranulf said:
why not just go buy a 3060 if it's $50+ cheaper with more RAM? Not that the 12GB of RAM will really matter much at 1080p.
I did buy the 3060. ;) And 12GB does not matter now? Fair enough. But let me add this: I love old and budget hardware. And it keeps becoming relevant again because of shortages and scalping. Some of the tech tubers I watch specialize in it, and that old GPU that came in 2 and 4 GB versions gets tested. When the cards launched, the data said the GPU was not powerful enough to make the extra RAM worthwhile. But now, for the thousands that pick one of the old 4GB models up to get by until things shake out, that extra 2GB makes a difference in frames and texture quality. It comes back to: better to have it and not need it than need it and not have it.

If we were in the normal upgrade cycle and pricing, the above would sound silly. But normal is not a word I would use to describe 2020-2021.
 

Panino Manino

Senior member
Jan 28, 2017
By that same token, Nvidia should not have been roasted for Turing's pricing.

High demand, monopoly conditions (AMD would not release anything for months), and increased costs (new memory, larger dies, increased R&D expenditure) basically meant Turing had just as much right to a price increase over Pascal.

But no, we roasted Nvidia like a toaster oven.

If Nvidia had priced Turing lower than or the same as Pascal, they would have had no incentive to even release it, because the chips were larger and the memory more expensive than what they were already selling, on top of the R&D spent on them.

The thing people don't realize is that AMD is what keeps Nvidia's prices in check. If Nvidia value-priced their cards while being the more prestigious brand, they would have ground AMD into dust. That is, Nvidia lowering their prices, and thus their margins, would force AMD to accept even smaller margins; so low they could not afford to pay the bills or fund R&D. Same thing with Intel: if Intel had just made better products, increasing core count while keeping die sizes and prices the same, AMD would have died years ago.

The problem with AMD's current pricing is that it is basically equal to Nvidia's, which just puts pressure on Nvidia to raise their pricing when you add in mining performance.

That is, AMD is the value brand, which basically sets the floor pricing on video cards. Consumers will pay a premium for Nvidia cards vs AMD. So when AMD prices their cards high, it gives Nvidia a licence to raise their prices.

The circumstances are different from the Turing days.
There are so many things OUT OF AMD'S CONTROL that influence the price; AMD can't lower prices too much even if it wants to (same for the others). And in this market these companies need to profit as much as they can to stay in the game. AMD endured years of non-existent profits; it needs to recover it all back, because each day the game gets more expensive and competitive. This "crisis" is a blessing for AMD: it gives it the opportunity to profit as much as it wants, no matter how far ahead Nvidia may be on tech and performance.

I understand the frustration, but people, let's stop seeing AMD as this bastion of consumer ethics and savior of grown men who want to waste their money on toys, please.
It's ridiculous to talk as if AMD has the responsibility to control Nvidia's prices.
 

beginner99

Diamond Member
Jun 2, 2009
Because of crappy gaming drivers it's likely to have much better raw compute/mining performance, relative to its gaming performance, than comparable AMD/NV cards.

I'm a naive optimist. Intel will get it right this time in terms of drivers, at least for modern games. The issue will be older games (which nobody benchmarks anymore).
 

Magic Carpet

Diamond Member
Oct 2, 2011
DAPUNISHER said:
I did buy the 3060. ;) And 12GB does not matter now?
In a few edge cases the 12GB 3060 already 0wn3z the 8GB 3070. IMO, the 3060 is the best card for the money Nvidia currently has. The future is bright.

And what about the OP's card, the 6600XT? Looks like it will be a hit with the miners 😁

 

Mopetar

Diamond Member
Jan 31, 2011
according to reviews... the card downgrades to 8x pci-e 4.0

So the card can't even run @ 16x PCI-E 4.0 even if you have the physical capacity to do so

You can't tell me this is a great card.... not even by a longshot..

Does it need 16 lanes though?

When GN tested the 3080 running PCIe 3 vs. PCIe 4 they found pretty negligible difference in most of the tests.

8x PCIe 4 is the equivalent of 16x PCIe 3 so we could draw some comparisons. If a vastly more powerful card like the 3080 isn't being hurt by that then it's hard to imagine the 6600XT would be.
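That equivalence is easy to sanity-check: the per-lane transfer rate doubles from Gen3 (8 GT/s) to Gen4 (16 GT/s), so halving the lane count nets out. A rough per-direction bandwidth sketch, assuming 128b/130b encoding for both generations:

```python
# Per-lane transfer rates in GT/s for PCIe generations 3 and 4
GT_PER_LANE = {3: 8.0, 4: 16.0}

def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Approximate usable per-direction bandwidth in GB/s.

    Gen3 and Gen4 both use 128b/130b encoding, so usable bits are
    128/130 of the raw rate; divide by 8 to convert bits to bytes.
    """
    return GT_PER_LANE[gen] * lanes * (128 / 130) / 8

print(f"Gen4 x8:  {pcie_bandwidth_gbs(4, 8):.1f} GB/s")   # ~15.8 GB/s
print(f"Gen3 x16: {pcie_bandwidth_gbs(3, 16):.1f} GB/s")  # ~15.8 GB/s, identical
```

So a 6600XT on a PCIe 4.0 board has the same link bandwidth as a full x16 Gen3 slot; the measured hit only shows up when the card drops to x8 on a Gen3 system, which halves that figure.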
 

Magic Carpet

Diamond Member
Oct 2, 2011
Performance reduction in Doom Eternal in PCI-E 3.0 mode is ~25%; whether this can be fixed with a driver update remains to be seen. This lane issue is about the only major con of an otherwise very good graphics card, especially in a performance-per-watt respect. But not as good in V-Sync 60 Hz locked gaming, sadly.

It's a funny chart, otherwise. Notice how the 1650 Super consumes even more energy than the 3080. It's all about the clocks and power profiles; sometimes it makes more sense to buy a higher-TDP card to save power, ironic as that may sound. Especially true for older games; for example, Borderlands 2 won't even start the fans on my 3090 with PhysX on Low (V-Sync 60hz, 1080p).

[Attached chart: power consumption comparison]
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
Magic Carpet said:
3090 with PhysX on Low (V-Sync 60hz, 1080p).

I would commit seppuku if I had to use my 3090 @ 1080p 60Hz.
That's probably not even a G-Sync monitor on top.. lol...

That's like owning a supercar and only using it as a weekend driver in Los Angeles rush hour traffic.
My poor sanity would only last so long under those conditions.
 

Magic Carpet

Diamond Member
Oct 2, 2011
aigomorla said:
I would commit seppuku if I had to use my 3090 @ 1080p 60Hz.
Well, the benefit is it sips power at lower loads, so I'm pleased to have AC wall power consumption under 200 watts (that is of course only true with older games, but I've been playing a lot of them lately). Cyberpunk @ 4K max settings drew about 350-450W of AC power depending on location, but the game sucked, so I didn't play much of it. I prefer good physics, reasonable realism, and interactivity.

aigomorla said:
That's probably not even a G-Sync monitor on top.. lol...
I need a VGA output on my monitor (I like to play with older hardware a lot), so it's a standard 16:10 1920x1200. But I did play with a 4K monitor for a couple of weeks just to see what it feels like. The main problems were its audible fan and power consumption; because of the latter it attracted too much dust, and I hate cleaning monitors too often. Plus, the HDR wasn't that impressive. Besides, I do believe that 27" is a bit too small for the 4K resolution.

aigomorla said:
That's like owning a supercar and only using it as a weekend driver in Los Angeles rush hour traffic.
The speed/pleasure of driving fast is totally different, imo. I prefer go-karts, though. The feeling of speed is magical in a kart, as is the acceleration through corners; it's an excellent workout for your body too. All you need is 20-30 mins.

aigomorla said:
My poor sanity would only last so long under those conditions.
LOL. Well, in computing I value efficiency the most; if I could run everything at zero watts of heat output, I totally would. Every piece of my hardware is like that; my 3090 is an exception, because it was the card with the least mark-up at the time of purchase. Otherwise I may have gone with the 3060, but it wasn't available at the time.
 

Mopetar

Diamond Member
Jan 31, 2011
Magic Carpet said:
Every piece of my hardware is like that; my 3090 is an exception, because it was the card with the least mark-up at the time of purchase. Otherwise I may have gone with the 3060, but it wasn't available at the time.

You could probably find someone to sell it to, possibly even getting a better price than you paid. I'd probably get a 3060 Ti over a 3060 though. It probably costs about the same with markup and even at MSRP it's enough of an upgrade to warrant the extra $100 in my opinion.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
Mopetar said:
Does it need 16 lanes though?

When GN tested the 3080 running PCIe 3 vs. PCIe 4 they found pretty negligible difference in most of the tests.

That's not the point though.... I can say the same thing... did it have to cost $379.99 then? For something which doesn't give us the full performance?
lol...

You do get a performance reduction, correct?
You're paying way higher than what would be considered "normal" MSRP for a product which can not run at full boat.

:T

This is bad marketing any way I see it.... ah, I take that back... this is Apple's marketing...

Even worse than Jensen's "buy more RTX cards, and then some more" comment during the Turing launch, when no one wanted to buy a 2000-series card.
 

Abwx

Lifer
Apr 2, 2011
Magic Carpet said:
Performance reduction in Doom Eternal in PCI-E 3.0 mode is ~25%; whether this can be fixed with a driver update remains to be seen. This lane issue is about the only major con of an otherwise very good graphics card, especially in a performance-per-watt respect. But not as good in V-Sync 60 Hz locked gaming, sadly. It's a funny chart, otherwise. Notice how the 1650 Super consumes even more energy than the 3080. It's all about the clocks and power profiles; sometimes it makes more sense to buy a higher-TDP card to save power, ironic as that may sound. Especially true for older games; for example, Borderlands 2 won't even start the fans on my 3090 with PhysX on Low (V-Sync 60hz, 1080p).


Doom Eternal was tested by Computerbase with a PCIe comparison.

0% difference for the average and 2% for the 0.2% minimum framerate; the driver is Adrenalin 21.7.1. Dunno what the other sites used, but I don't think the Computerbase reviewer is less competent than Hardware Unboxed and other youtubers.
 

Mopetar

Diamond Member
Jan 31, 2011
aigomorla said:
That's not the point though.... I can say the same thing... did it have to cost $379.99 then? For something which doesn't give us the full performance?
lol...

Does it matter what they set the MSRP at when the AIB cards priced at $550 are sold out just as fast?

From what others posted it looks like one game is impacted. I suppose that's a big deal if that's the particular game you care about, but saying that it doesn't have "full" performance from a single title is a bit disingenuous.

If I wanted to find fault with the card, the 8 GB of VRAM would concern me far more than the lack of PCIe lanes. We already know that there are a few titles pushing up against that wall already.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
aigomorla said:
That's not the point though.... I can say the same thing... did it have to cost $379.99 then? For something which doesn't give us the full performance?
lol...

You do get a performance reduction, correct?
You're paying way higher than what would be considered "normal" MSRP for a product which can not run at full boat.

:T

This is bad marketing any way I see it.... ah, I take that back... this is Apple's marketing...

Even worse than Jensen's "buy more RTX cards, and then some more" comment during the Turing launch, when no one wanted to buy a 2000-series card.
Agreed, with the exception of thinking anything is worse than Nvidia's marketing; "It just works". When I hear marketese like that, my reaction is GTFudgeO!

On the 6600xt, the fact that the card does not run PCIE 3.0 x16 reeks of forced obsolescence.

TL;DR incoming:


And I am definitely one of the people that did not want to buy the RTX 20 series. Pre-ordered a reference 5700XT. What a hot mess that card was, literally and figuratively. It was the last straw for my taking big reviewers seriously anymore. Not one felt it necessary to test quality-of-life stuff, like media decoding? I guess they thought "It just works!". I had to turn off hardware acceleration for every player and YouTube to get anything working properly. Won't go into the rest. But when the multitude of others with issues could no longer be ignored, the first reply from reviewers like benchmark Steve was essentially PICNIC (problem in chair, not in computer). Only later was any acknowledgement of reality conveyed, and even then they waffled with the "Well, we have not had any issues." That's because you don't use the card; you run your test suite and you're done. That they don't put these cards in their daily driver, live with them, and then add that info is unacceptable IMO.

But I digress. I returned it and ended up with a 2060 Super, and a bit later added a 2070 Super. I could sell either for much more than I paid for them. Not to mention 2 yrs of service from the 2060 Super and 1.5 yrs from the 2070 Super already. Heck, my son and his crew have spent this entire college break playing Foxhole; it runs on a potato. Most of the games they play are like that. I don't think they play a single AAA. I am tempted to swap it out for a 1650 Super when he is not looking. I doubt he would even notice. :D

And to end this rambling, semi-coherent reply: many of those hordes of nose-in-the-air "I only buy Nvidia" gamers that sat out the 20 series, because the rules of acquisition said to, now wish they had bought one. Being stuck on a 9 series or mid-tier 10 series, with no way of knowing when they can get a new, faster card without getting bent over = feels bad.
 

beginner99

Diamond Member
Jun 2, 2009
Mopetar said:
If I wanted to find fault with the card, the 8 GB of VRAM would concern me far more than the lack of PCIe lanes. We already know that there are a few titles pushing up against that wall already.

This is a 1080p card. More than 8 GB at 1080p?

I do agree that the PCIe lanes issue is overblown. I remember that even a 1080 Ti didn't suffer from 8 lanes of PCIe 3.0, so the 6600 XT won't either. Maybe it even saves power, besides costs?