Question will RT be possible (maybe at 720P) on RX 570/580/590 (Polaris) cards? Using shaders?

VirtualLarry

No Lifer
Aug 25, 2001
56,348
10,048
126
I was reading somewhere that AMD's RT "engine" in the drivers uses the shaders, and then another thing that I read, said that they DO have some sort of "RT Core" per CU in their newer RX 6000-series hardware.

Depending on what is needed, is it possible that eventually, some form (perhaps not perfect, but hopefully "playable") of RT might be possible on Polaris? Or is that just out of the question?
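For what it's worth, the core of ray tracing is just ordinary arithmetic, which is why it can run on general-purpose shaders at all. A minimal sketch (in Python rather than shader code, purely for illustration) of the ray-sphere intersection test that a tracer has to evaluate millions of times per frame:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic in t).
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None

# A ray fired straight down -z at a unit sphere centered 5 units away
# hits the near surface at t = 4 (center distance 5 minus radius 1).
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # -> 4.0
```

Nothing here needs dedicated RT hardware; the catch is doing this (plus BVH traversal) per pixel, per bounce, per frame, which is where shader-only implementations fall over.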

Wondering how viable these older Polaris cards are going to be for gaming going forward, seeing as how both NVidia's and AMD's newest GPUs, released in late 2020/early 2021, support RT and DXR, if "most" gamers want to experience their games with "RT (DXR) ON", whether they have AMD or NV GPUs.

Then again, the cards that I was looking to purchase were only 4GB models, so even if RT is possible using shaders at a way-reduced resolution (480p, maybe?), they probably don't have enough VRAM for it anyways. (The RTX 2060, the lowest NV card that supports "RTX ON", has 6GB of VRAM. And even that's not that pleasant for gaming with RT enabled, so I hear.)

Edit: I guess my thinking is, the thing is, will all of these GTX 1660-series, GTX 1650-series, and GTX 1060-series cards, as well as Polaris, just become useless pieces of hardware, now that "RT is here"?
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,819
7,180
136
I believe DXR functionality is open to all DX12 cards (NV enabled RT on shaders for the Pascal 10xx-series cards, I believe to make Turing look better), so I don't see any reason why it could not be done.

That being said, RT performance is terrible when run on shaders (AMD's newest RX 6800-series cards feature fixed-function hardware specifically to accelerate raytracing; there seems to be some confusion around this fact), and AMD is likely more keen on making sure their newest and best cards get the spit shine and polish than on enabling, then inevitably troubleshooting, a functionally useless feature on older hardware.

For laughs and giggles, I would be interested to see RTRT run across Fiji/Vega/Vega 2's vast shader array and see how that affects performance vs the leaner shaders + RT core 6800 series.

I doubt older cards will become "trash" by any stretch of the imagination: it's doubtful many newer games will use RTRT as anything more than an "ultra"-level feature and fall back to rasterization, given that even the most powerful hardware today starts to chug with more and more RT effects enabled.
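A back-of-envelope calculation gives a feel for the ray budget involved (all of these figures are assumptions for illustration, not benchmarks):

```python
# Rough ray budget at modest settings; every number here is an assumption.
width, height = 1280, 720   # 720p, as floated in the opening post
rays_per_pixel = 2          # e.g. 1 primary + 1 shadow ray (a low-end guess)
target_fps = 30             # a "barely playable" target

rays_per_frame = width * height * rays_per_pixel
rays_per_second = rays_per_frame * target_fps
print(f"{rays_per_frame:,} rays/frame, {rays_per_second / 1e6:.1f} M rays/s")
# -> 1,843,200 rays/frame, 55.3 M rays/s
```

And the raw ray count is only half the story: each ray is a divergent, memory-bound BVH walk, which general shader cores execute far below their peak throughput. That traversal/intersection step is exactly what the fixed-function RT units accelerate.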
 

Mopetar

Diamond Member
Jan 31, 2011
7,842
5,995
136
Even if you could do it on Polaris, the performance would be beyond utter garbage. Look at how much cards like the 2060 and 2070 SUPER struggled with RT even though they have dedicated hardware!

Just get one of those in an RT rig, show people how awful the experience is, and ask them if they want to buy a ~$700 card so that it isn't. I think once people realize the actual cost, they'll realize that they don't care that much.

Those people are just going to buy consoles anyway if they're that easily pulled in by marketing, since you're not going to be able to build a comparable PC for less than the starting price points of the new consoles.
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
Depending on what is needed, is it possible that eventually, some form (perhaps not perfect, but hopefully "playable") of RT might be possible on Polaris?
Yes, but not playable. It was like <15 fps, if I remember right.

Or is that just out of the question?
It has been done:
https://www.reddit.com/r/Amd/comments/cv6dlp/minecraft_ray_tracing_running_on_rx_580_seus_ptgi/

Edit: I guess my thinking is, the thing is, will all of these GTX 1660-series, GTX 1650-series, and GTX 1060-series cards, as well as Polaris, just become useless pieces of hardware, now that "RT is here"?

I think Raytracing is a gimmick, and will be for at least 2+ years.

Look at Godfall. The developers put ray tracing in, and it looks worse than rasterization in the same game.
Look at Battlefield, everyone turns raytracing off. Nobody can really tell the difference.
Look at Minecraft, raytracing requires a special map with no multiplayer (or did they fix that?)

If you take the list of games:
https://en.wikipedia.org/wiki/List_of_games_with_ray_tracing_support
take your pick of the already-released ones, and punch "game name is ray tracing worth it" into Google.

Outside of Control, the answer is almost always no. I have yet to find an exception.

I thought Watch Dogs might be the exception, but that game needs DLSS to run its ray tracing, and Watch Dogs' DLSS implementation is notoriously bad ( link ).


Unless a developer is in Nvidia's pocket, the raytracing and DLSS (if any) implementations tend to be inferior. Hopefully this will improve with the console releases and "light" raytracing.
 