Let me paint you a picture.

The A580 shows up with a $179 MSRP. Reviewers pit it directly against the RX 6400 and 6500 XT, and the GTX 1650 and 1630, instead of the RX 6600. In every one of the games below you would want to call the authorities to report a murder. Flex Tape couldn't fix the damage.
The Callisto Protocol
Dead Space
Dying Light 2
The Spider-Man series
Atomic Heart
Guardians of the Galaxy
Watch Dogs: Legion
A Plague Tale: Requiem
Baldur's Gate 3
Hitman 3
Cyberpunk 2077
Gotham Knights
Hogwarts Legacy
Jedi Survivor
God of War
Ratchet & Clank: Rift Apart
The above are just the games I am aware of where Arc outperforms the competition. Add some light ray tracing in the titles that support it, and Arc walks away from similarly priced AMD GPUs.
That's the good. The bad and the ugly are The Last of Us and Starfield. I don't own The Last of Us yet, so I can't test whether the latest driver brings a nice uplift. Starfield is getting there: on the A750 it is fully playable now, at least for me, though I see other owners saying it is still nerfed on their systems. It was also quite the boneheaded move not to have the A580 drivers sorted for these games at launch. The Arc crew has to execute better than that. This was a big opportunity spoiled by driver issues and the MSRP.
GN's A580 review was on point; he hit all the same points we did here. TPU's review shows the card in its best light, though I don't agree with w1z listing the used market as a negative. Ragged-out mining cards aren't just apples to oranges. They're rotten apples to oranges.