Question: Ray Tracing is in all next-gen consoles


Muhammed

Senior member
Jul 8, 2009
453
199
116
PS5 is now confirmed to have hardware RT, meaning ray tracing will now be the next standard in making games. RTX will spread into even more games. I am interested to know what those who thought RT would never be mainstream think now?



The straw man in your question is trolling.

AT Moderator ElFenix
 
Last edited by a moderator:
  • Haha
Reactions: JPB

Spjut

Senior member
Apr 9, 2011
928
149
106
Sony will most likely use Vulkan RT, which is similar to DXR.

He said the PS5 uses a completely different solution; he is wrong.

He said the Xbox will have an upgraded pipeline (DXR 1.1?); Turing got the upgraded pipeline as well. He said current hardware won't get the upgrade. All incorrect.

He also said current hardware will be a paperweight next to the new consoles, which is incorrect as well. The Xbox Series X performs worse than Turing in Minecraft.

I don't want to put words in his mouth, but it's always been known that consoles aren't as restricted by their APIs as PCs are. If a studio really wants to, they should be able to write their own raytracing solution.

Sony doesn't use Vulkan either; they use their own APIs, LibGNM (lower-level) and LibGNMX (higher-level).
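
For the PC side of that comparison (LibGNM/LibGNMX being proprietary and undocumented), here's a minimal sketch of how a Vulkan engine would check for the DXR-like ray tracing extension. Names are from the public Vulkan headers; this is purely illustrative, not anything Sony ships:

Code:
// Minimal sketch: query whether a device exposes VK_KHR_ray_tracing_pipeline,
// the Vulkan extension that mirrors DXR's raytracing pipeline model.
// Illustrative only; nothing to do with Sony's LibGNM/LibGNMX.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool HasRayTracingPipeline(VkPhysicalDevice gpu)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());
    for (const VkExtensionProperties& e : exts)
        if (std::strcmp(e.extensionName,
                        VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME) == 0)
            return true;
    return false;
}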
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
It's the same Minecraft.
Same game, but not doing the same work. From the link:

On a more positive note, to my eye, at least, Microsoft's new tech demo looks even better than the RTX video of Minecraft from last year. The shots of dark caves with beams of light coming in from cracks in the ceiling—my god, that makes me want to go exploring.
You can see several minutes of the demo in the video above from Digital Foundry, which also includes a crash course in the differences between simple ray tracing and the far more demanding path tracing, which is what's going on here.
 
  • Like
Reactions: Tlh97 and Krteq

DXDiag

Member
Nov 12, 2017
165
121
116
On a more positive note, to my eye, at least, Microsoft's new tech demo looks even better than the RTX video of Minecraft from last year. The shots of dark caves with beams of light coming in from cracks in the ceiling—my god, that makes me want to go exploring.
The difference is due to the different testing areas. This is subjective; different people won't agree on which scene looks better.

FWIW, Digital Foundry confirmed that this is the same RTX code converted to DXR to run on AMD hardware.

You can see several minutes of the demo in the video above from Digital Foundry, which also includes a crash course in the differences between simple ray tracing and the far more demanding path tracing, which is what's going on here.
The NVIDIA RTX demo also uses path tracing.
 

Gideon

Golden Member
Nov 27, 2007
1,646
3,709
136
The NVIDIA RTX demo also uses path tracing.
Yes it is, but:

1. I highly doubt they managed to recode all the demo shaders in proper DXR 1.1 in the two weeks, using an optimal 1SRT implementation. It probably ran on DXR 1.0 (which is a more straightforward port) and therefore ran non-optimally on RDNA hardware. And yes, I'm aware that Turing supports DXR 1.1, but I've got a hunch it doesn't benefit from the new programming model as much as the new consoles do (while I'm fairly certain that Ampere does). There is a big difference:

[image: DXR 1.0 vs DXR 1.1 programming model comparison]
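
To make the 1.0 vs 1.1 distinction concrete, here's a minimal sketch (names from the public D3D12 headers) of the capability check an engine does. DXR 1.0 only has the dynamic-shader path, while 1.1 adds the inline model:

Code:
// Minimal sketch: detect which DXR tier the driver exposes.
// TIER_1_0 = dynamic-shader-based raytracing only (separate raytracing
// pipeline state, TraceRay from raygen/hit/miss shaders).
// TIER_1_1 = adds inline raytracing (RayQuery usable from any shader
// stage) plus ExecuteIndirect ray dispatch, i.e. the "new programming
// model" discussed above.
#include <d3d12.h>

bool SupportsDxr11(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))))
        return false;
    return opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1;
}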


2. The Xbox Series X is overall surprisingly competitive with high-end Turing (compare Gears of War, for instance). It may turn out that it can't beat the 2080 Ti in ray tracing (the best Turing has to offer this generation), but we certainly don't know yet from this apples-to-oranges comparison. Optimizing code for a certain GPU architecture takes time (just consider how long RT Minecraft has been in development); it's too early to throw out 100% certainties. Furthermore, if you want to extrapolate AMD desktop RDNA performance from the Xbox, bear in mind that it won't be a 52 CU fixed-clock GPU. Big Navi is rumored to have 80 CUs and certainly has more bandwidth (which is a big necessity for RT).

3. Compare to "your own size". The Nvidia RTX 2080 Ti is a 754 mm² GPU. Even when shrunk to 7nm, that monster would be larger than the entire Xbox chip (360.45 mm²), let alone its GPU part. When going from the 980 Ti (28nm, 601 mm²) to the 1080 (16nm, 314 mm²), which was a similar shrink, Nvidia managed to almost halve the die size, but it also cut the memory bus from 384-bit to 256-bit and removed 512 shader cores from the die in the process (the 980 Ti has 2816 enabled, but the chip itself has 3072).
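
Back-of-envelope on that shrink, assuming the 980 Ti to 1080 area ratio roughly carries over: 314 / 601 ≈ 0.52, and 754 mm² × 0.52 ≈ 394 mm², still bigger than the whole 360.45 mm² Series X SoC, CPU cores included.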

My point being: considering the cost and power constraints, AMD did really well here (especially considering where they are coming from). Poo-pooing this arch as the worst BS ever simply because it doesn't trounce the highest-end GPUs is ridiculous.

I have no doubt Nvidia could achieve similar (probably slightly better) results from a ~300 mm² 7nm Ampere, but that doesn't lessen AMD's achievement.

TL;DR:
I guess the glass is always half-empty to some. "Oh noes, the new console doesn't (possibly) beat the socks off the current >$1100 GPU in absolutely everything, merely coming close. What a heap of utter garbage!" I wonder how you'd rate the PS4/Xbox One at launch, then? Those were miles worse compared to the desktop GPUs of their time.
 
Last edited:

DXDiag

Member
Nov 12, 2017
165
121
116
It probably ran on DXR 1.0 (which is a more straightforward port) and therefore ran non-optimally on RDNA hardware. And yes, I'm aware that Turing supports DXR 1.1, but I've got a hunch it doesn't benefit from the new programming model as much as the new consoles do
Your hunch is wrong; DXR 1.1 benefits all hardware.

3. Compare to "your own size". The Nvidia RTX 2080 Ti is a 754 mm² GPU. Even when shrunk to 7nm, that monster would be larger than the entire Xbox chip (360.45 mm²),
The Series X is made on an enhanced 7nm process; you can't compare it to the regular 7nm found in Ryzen 3000, for example. Besides, die size is irrelevant here.

The Xbox Series X is overall surprisingly competitive with high-end Turing (compare Gears of War, for instance).
Yes, Digital Foundry found it performing at the level of a regular 2080; however, the DXR Minecraft demo performed well below that.

Speculation?
There is literally zero detailed performance data on the XBoxSX,
No, Digital Foundry tested a demo provided by Microsoft of Minecraft path tracing (ported from Minecraft RTX); Series X performance was 30 to 60fps at 1080p, well below Turing.
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
Your hunch is wrong; DXR 1.1 benefits all hardware.
There is an interview* with a Microsoft engineer where he said it's 100% compatible with all existing products, which is not the same thing; in a later part he said it's important for DXR 1.1 to be hardware accelerated.

*Sorry, can't find it now; maybe it was on YouTube.

On these DX12 feature levels we see that AMD is behind Nvidia, and even so it performs at the same level as, or even a tier above, Nvidia cards in DX12 games.

Edit: On second thought, maybe he was referring to cards like Pascal being 100% compatible, and the hardware-accelerated part to cards like Turing, but since he never mentioned companies or parts it's difficult to say...
 
Last edited:
  • Like
Reactions: Gideon

DXDiag

Member
Nov 12, 2017
165
121
116
Nah... The PS5 has a very different RT solution compared to what is implemented in DXR. Even the Xbox has an upgraded pipeline. So, hybrid RT will be the way to go in the future, but current hardware will be a paperweight next to the new consoles. It doesn't support the new pipeline, because the hardware is designed around a fixed-function unit and needs to be redesigned to support the changes.

I think we need to wait two more generations to get a clean pipeline, and that might allow good upgradeability. Until then, every new DXR version will require new hardware.

There are three problems with the actual implementation.
- It doesn't support the 32-bit snorm format, because the fixed-function hardware is not designed around it. This is a huge limitation; it can cost too much performance and memory, and the supported 32-bit float format doesn't really give better results.
- The acceleration structures used are not public, which can result in extremely wild performance variations depending on the scene. This needs to be solved.
- The ray traversal stage is extremely limited. It should be programmable.

There are other, smaller problems, so I think the consoles will give some good basics, but on the PC these can't be fixed without hardware changes.

I personally think that hybrid RT will be mainstream about 5-6 years from now, but not sooner.
Well, this post aged horribly: the PS5 and Series X are slower than a 2060 Super in RT workloads, and Turing/Ampere cards rule the field in ray tracing.
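
For reference, the vertex-format limitation the quoted post complains about sits in the DXR geometry description. A minimal sketch (structure and format names from the public D3D12 headers; the buffers and counts are hypothetical placeholders): positions must use one of a short list of DXGI formats, 32-bit float or 16-bit snorm among them, with no 32-bit snorm option.

Code:
// Minimal sketch of a DXR bottom-level geometry description. VertexFormat
// only accepts a short list of DXGI formats (e.g. R32G32B32_FLOAT or
// R16G16B16A16_SNORM); a 32-bit snorm format simply isn't offered, which
// is the limitation the quoted post objects to. 'vb', 'ib', 'vertexCount'
// and 'indexCount' are hypothetical resources from the calling code.
#include <d3d12.h>

D3D12_RAYTRACING_GEOMETRY_DESC DescribeTriangles(
    ID3D12Resource* vb, UINT vertexCount,
    ID3D12Resource* ib, UINT indexCount)
{
    D3D12_RAYTRACING_GEOMETRY_DESC geom = {};
    geom.Type  = D3D12_RAYTRACING_GEOMETRY_TYPE_TRIANGLES;
    geom.Flags = D3D12_RAYTRACING_GEOMETRY_FLAG_OPAQUE;
    geom.Triangles.VertexFormat = DXGI_FORMAT_R32G32B32_FLOAT; // no 32-bit snorm exists
    geom.Triangles.VertexCount  = vertexCount;
    geom.Triangles.VertexBuffer.StartAddress  = vb->GetGPUVirtualAddress();
    geom.Triangles.VertexBuffer.StrideInBytes = 3 * sizeof(float);
    geom.Triangles.IndexFormat  = DXGI_FORMAT_R16_UINT;
    geom.Triangles.IndexCount   = indexCount;
    geom.Triangles.IndexBuffer  = ib->GetGPUVirtualAddress();
    return geom;
}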
 

psolord

Golden Member
Sep 16, 2009
1,920
1,194
136
Yes, yes, I remember these as well.

And not only that. It dawned on me that even if you have a next-gen console, in many, many games you will be getting the old version of the game.

I saw VG Tech's Lichdom Battlemage video on the PS5 the other day and I jumped off my chair. The guy specifically tested this because it was a redacted on the old-gen consoles. Now it runs fine, BUT with old-gen graphics.

And this really had me thinking, so I re-tested it on my freakin' 10-year-old Radeon 7950, and here are the results at the same location.



I mean, screw the fact that even my 11-year-old 2500K + 10-year-old 7950 can run this game just fine at 60fps at 1080p very high; it also looks much, much better.

I mean, imagine having a PS6 or a PS7 and still getting the same version, lol.

And no, I am not saying the 7950 is better than the PS5, ffs. I am saying the console ecosystem is for the redacted.




No profanity allowed in the tech forums.


esquared
Anandtech forum Director
 
Last edited by a moderator:

psolord

Golden Member
Sep 16, 2009
1,920
1,194
136
What is a "tier of backwards compatibility", 'cause I am too PC to understand it. Jesus.

Also, where did I mention anything about stutters on the PS5? I said it runs fine. All I said is that next-gen consoles are stuck at previous-gen quality settings (see above) for all games that will not receive a re-release. Which are what, 99% of games?
 

Mopetar

Diamond Member
Jan 31, 2011
7,846
6,000
136
Consoles get what ultimately ends up being a mid-range card for that generation of GPU hardware. This has always been the case, and even though there's a lot of marketing fluff about next-gen performance and the like, it winds up being similar to a mid-range PC unless a developer really tries to optimize the hell out of the game for that specific hardware.
 

Panino Manino

Senior member
Jan 28, 2017
821
1,022
136
What is a "tier of backwards compatibility", 'cause I am too PC to understand it. Jesus.

Also where did I mention anything about stutters on the PS5? I said it runs fine. All I said is that next gen consoles, are stuck at previous gen quality settings (see above) for all games that will not receive a re-release. Which are what? 99% of the games?

That game is a PS4 game; it runs on the PS5 via compatibility mode.
The PS5 has slightly different compatibility modes, and its behavior changes between them. From what I understand, that game runs just like on a PS4, meaning the console disables half the GPU and lowers its clock to 800MHz.

Games that support the PS4 Pro have access to a second mode that uses the full GPU but limits the clock to 911MHz.

And I think, but I'm not certain, that it also uses the Polaris ISA.
 
Last edited: