Discussion AMD raytracing - will it ever improve or even beat RTX?

AMD Announces Radeon Raytracing Analyzer - Phoronix

The optimist in me says that AMD has released this to increase the value of their 6000 series cards, so that developers can make them perform better in raytraced scenes by using hybrid raytracing instead of full-on raytracing.

My inner pessimist, however, tells me that AMD's 7000 series won't improve raytracing power all that much. And since hybrid raytracing is what the consoles use anyway, AMD isn't too worried: it just expects that developers porting console games with hybrid raytracing over to the PC will appreciate not having to do much extra work to support Radeons.
 

poke01

Diamond Member
Mar 8, 2022
AMD prefers open standards. That makes them better than Nvidia.
Agreed, open source is good. But the difference between Nvidia and Apple, which is just as proprietary, is that Nvidia has REALLY good products.

GPUs are powerful, with better ray tracing - check
CUDA is the standard for a reason - it's a very good framework (see the sketch after this list)
DLSS 2.1 - Again, great.
Nvidia's Windows drivers for gaming and pro work are also good compared to AMD's.
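
To illustrate the point, here is a toy sketch of my own (not from any particular project; all names and sizes are arbitrary): a complete CUDA program for scaled vector addition is just a kernel plus a launch.

Code:
#include <cstdio>
#include <cuda_runtime.h>

// One thread per element: y[i] = a * x[i] + y[i].
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x = nullptr, *y = nullptr;
    // Unified memory: the same pointer works on both CPU and GPU.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // 256 threads per block, enough blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %.1f\n", y[0]);  // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}

That low barrier to entry - nvcc and a few lines of C++ - is a big part of why the framework stuck.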

Nvidia has a large market share in gaming, rendering, VFX and streaming because its products, even though they are closed source, are great. When your closed-source IP works great, people use it.

Both Jensen and Lisa Su are great engineers. Jensen is still the CEO of Nvidia and has been for 25 years. What other tech company can say that?
Jensen made Nvidia the company it is today, and unlike Intel, Nvidia is not making stupid claims and making fun of AMD.

They do pull some shady moves, yes, but they have consistently performed very well.

I am not blindly defending Nvidia; there are many things they can improve on, like driver support for Linux.

PS: not all of AMD's driver stack is open. Their Pro workstation cards are closed source.
 

Stuka87

Diamond Member
Dec 10, 2010
I am a lot more positive about Raja.

The RX 6000 series was his project, after all.

I think the situation at Intel was hopeless before Raja showed up. If anything, I am very impressed he managed to turn out any product at all over at Intel.

Look at Intel's failure on the CPU side. The problems with Intel do not start or end with Raja.

Yeah, I think he was a bit hamstrung once he got there. ARC was already well under development by the time he joined. There was probably time to tweak a bit, but the base architecture was already in place.
 

Leeea

Diamond Member
Apr 3, 2020
Agreed, open source is good. But the difference between Nvidia and Apple, which is just as proprietary, is that Nvidia has REALLY good products.

GPUs are powerful, with better ray tracing - check
Is that actually true? Software implementing DXR 1.1 seems to be a crapshoot between the two. Nvidia's advantage from sponsored titles that only implement DXR 1.0 is disappearing fast.

Even worse, in most games the average user is incapable of telling the difference between DXR 1.0 and no raytracing at all.

CUDA is the standard for a reason - it's a very good framework
Is it?

If it is so superior to AMD's stack, why do most exascale supercomputers run AMD GPUs?

If CUDA is so great, why did Tesla abandon it as a technology for its self-driving solutions?

DLSS 2.1 - Again, great.
For the vast majority of Nvidia GPUs in use* today, AMD's FSR is a vastly superior choice. It is not even close. AMD's FSR simply destroys** DLSS in the real world.

*https://store.steampowered.com/hwsurvey/videocard/
**As the above link indicates, the vast majority of Nvidia cards in use are not RTX cards and are not capable of using DLSS. It's FSR or nothing.

Nvidia's Windows drivers for gaming and pro work are also good compared to AMD's.
Is that actually true? Nvidia certainly claims so, but is it actually true?

If we look at recent releases, it certainly appears Nvidia has more than its share of issues.

If one also considers the much higher CPU overhead associated with Nvidia's drivers, and the lower image quality associated with Nvidia products, it would appear Nvidia has been making compromises for a very long time.

Nvidia has a large market share in gaming, rendering, VFX and streaming because its products, even though they are closed source, are great. When your closed-source IP works great, people use it.
If Nvidia products are so great, why does Nvidia have a long history of playing dirty and undermining the competition? Why does Nvidia feel the need to do that?

Both Jensen and Lisa Su are great engineers. Jensen is still the CEO of Nvidia and has been for 25 years. What other tech company can say that?
Jensen made Nvidia the company it is today, and unlike Intel, Nvidia is not making stupid claims and making fun of AMD.
In April of this year, Nvidia was making stupid claims about AMD's drivers again, so that statement is clearly false.

They do pull some shady moves, yes, but they have consistently performed very well.
Are you aware that on older systems a five-year-old RX 580 can outperform an RTX 3090? That the CPU overhead associated with Nvidia's drivers cripples performance on non-optimal systems?


I am not blindly defending Nvidia; there are many things they can improve on, like driver support for Linux.

PS: not all of AMD's driver stack is open. Their Pro workstation cards are closed source.
AMD's workstation cards also have open-source drivers:
 

poke01

Diamond Member
Mar 8, 2022
You seem to be an AMD fanboy/stockholder and probably think Nvidia is not good at anything.

Can't you say something good about any company other than AMD?


Personal insults, such as "fanboy", or implying someone is a shill or has a financial incentive, are not allowed in the tech forums.

AT Mod Usandthem
 

Leeea

Diamond Member
Apr 3, 2020
You seem to be an AMD fanboy/stockholder and probably think Nvidia is not good at anything.

Can't you say something good about any company other than AMD?
I do not directly own any AMD stock. I do own a general index fund weighted by market capitalization, but by its very nature that index fund likely holds more Intel and Nvidia stock than AMD.

I also am not paid, and do not receive any money, to promote AMD. Although, I do agree, I am an AMD fanboy.

---------------------------------

More to the point, rather than smearing me as a poster, which of my points was incorrect? I will give you some tips to start with:

What about non-exascale supercomputers? Ease of implementation for non-elite-tier users?

Anyone running eight cores has enough spare CPU resources to handle Nvidia's software scheduling overhead without a performance hit?
Nvidia's software scheduling results in better DirectX 9, 10, and 11 performance?

Image quality is subjective, and some people prefer the more blended look?

Claim Nvidia is the company pushing technology forward, even if Nvidia's approach is proprietary and anti-consumer?
- Some RTX raytracing extensions may be proprietary, but they pushed everyone to support raytracing on GPUs.
- DLSS is mostly useless for nearly everyone, but it pushed AMD into making FSR, which benefits everyone.
- G-Sync was expensive, proprietary, and poorly designed, but it pushed AMD into making FreeSync, which then benefited everyone, including consoles.

Your post was a one-sided endorsement of Nvidia's proprietary technology. If you're going to make bold, sweeping claims, you need to be prepared for pushback.
 

poke01

Diamond Member
Mar 8, 2022
So Nvidia creates new software features and AMD copies Nvidia and opens up the tech?
 

Leeea

Diamond Member
Apr 3, 2020
So Nvidia creates new software features and AMD copies Nvidia and opens up the tech?
While this is true, it is also true that AMD has created a number of things that have been copied:

ATI, Stanford University, and Microsoft created GPU compute in 2006. AMD purchased ATI and rebranded this as AMD FireStream. Nvidia would copy this to create CUDA.

AMD created the Mantle API, which Microsoft would then use as the starting point to create DirectX 12.

Mantle would also be used by the Khronos Group to create the Vulkan API, which is widely viewed as the spiritual successor to OpenGL.

AMD also worked with Microsoft to develop DXR 1.1, which implements pruning of unneeded rays and greatly reduces the overhead associated with raytracing.

AMD worked with Apple to develop the OpenCL standard, which is Apple's/AMD's alternative to CUDA. This seems to be having a revival with AMD's Instinct series accelerators and Apple's M1 series of iGPUs.

AMD created resizable BAR for GPUs, which was directly copied by Nvidia and Intel. If you think about it, ReBAR is a miracle technology: +5-10% performance without increasing power consumption or transistor count. If AMD had kept this proprietary, Nvidia and Intel would have been struggling to make up the difference for the next 20 years. (A rough way to see ReBAR for yourself is sketched after this list.)

AMD and Microsoft worked together to create the DirectStorage API, and AMD released the first implementation, called SmartAccess Storage. Nvidia also released its own implementation, called RTX IO.
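
You can even see ReBAR for yourself on Linux: the BAR sizes sit right in sysfs. A rough sketch of my own (the PCI address is a placeholder; look your card's up with lspci). With ReBAR enabled, one BAR typically spans the whole VRAM instead of the classic 256 MiB window.

Code:
// Print the size of each PCI BAR of a GPU on Linux (plain C++, no GPU toolkit).
// The address 0000:03:00.0 is a placeholder - find your card's with lspci.
#include <cstdio>
#include <fstream>
#include <string>

int main() {
    std::ifstream res("/sys/bus/pci/devices/0000:03:00.0/resource");
    std::string line;
    for (int bar = 0; std::getline(res, line); ++bar) {
        unsigned long long start = 0, end = 0, flags = 0;
        // Each line is "start end flags" in hex; unused slots are all zeros.
        if (sscanf(line.c_str(), "%llx %llx %llx", &start, &end, &flags) != 3)
            continue;
        if (end > start)
            printf("BAR%d: %llu MiB\n", bar, (end - start + 1) >> 20);
    }
    return 0;
}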

------------------------------

Both companies have created IP that others have copied.

However, AMD clearly trends toward partnerships and open standards. Nvidia tends to go it alone and embrace closed standards.
 

cmdrdredd

Lifer
Dec 12, 2001
This is regarding Guardians of the Galaxy, and yes, it looks and performs wonderfully with raytracing on, even on the 6700 XT I am running it on. I use 4K with FSR, so the internal resolution is around 2560x1440.

The game itself is also pretty good. Highly recommended! It's available on PC Gamepass as well.
Just started playing it. Got past the initial introduction level/prologue.

Really like it so far and it looks great. Playing at maxed settings in 4K with DLSS.
 

cmdrdredd

Lifer
Dec 12, 2001
For the vast majority of Nvidia GPUs in use* today, AMD's FSR is a vastly superior choice. It is not even close. AMD's FSR simply destroys** DLSS in the real world.

*https://store.steampowered.com/hwsurvey/videocard/
**As the above link indicates, the vast majority of Nvidia cards in use are not RTX cards and are not capable of using DLSS. It's FSR or nothing.

That doesn't even matter regarding what was said. DLSS is superior. It produces superior image quality. The post was about DLSS being a good product, not the most supported.
 

coercitiv

Diamond Member
Jan 24, 2014
That doesn't even matter regarding what was said. DLSS is superior. It produces superior image quality. The post was about DLSS being a good product, not the most supported.
Image quality is necessary but arguably not enough to make a good product in this case. @Leeea argues that widespread support is just as important, and support can be split into hardware base and (software) adoption rate, each with their own discussion. One could argue that an additional version of DLSS that did not require tensor cores could have presented game developers with a much wider user base, incentivizing them to support the new technology and improving adoption rate accordingly. This would have benefited all Nvidia customers in general, but also RTX buyers in particular, by enabling key strengths in their newly purchased hardware.

Alas, in the beginning Nvidia used DLSS as a mere sidekick to RTX, relying on the novelty of raytracing to push for developer support. Adoption suffered as a result, opening the way for the competition to exploit this weakness. A "good product" shouldn't have such an Achilles' heel.
 

poke01

Diamond Member
Mar 8, 2022
I learned something today. Thank you. I always wondered why Apple kicked Nvidia out of their Mac line and kept AMD. Looks like Nvidia wanted to push CUDA and was probably unwilling to support Metal the way AMD does.

Looks like AMD has a great history of creating new tech/software, or being the first to do so.
 

Leeea

Diamond Member
Apr 3, 2020
DLSS is superior. It produces superior image quality.
They both have their artifacts:
FSR 2.0 is superior at avoiding ghosting in motion; DLSS is superior at preserving fine detail in motion.

However, with FSR 2.0 getting 98% of DLSS's quality, any "quality advantage" becomes irrelevant in comparison to the previously mentioned devastating flaws of DLSS.



Dedicated hardware is obviously not required for temporal upscaling, and Nvidia's decision to restrict it to its RTX branding is anti-consumer and harmful to both the majority of Nvidia's customers and gaming as a whole.
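
To make that concrete, here is a toy CUDA sketch of my own of the accumulation step at the heart of temporal upscaling. Real FSR 2.0 and DLSS add motion-vector reprojection and rejection heuristics (and, for DLSS, a neural network), but the core is ordinary shader-style math:

Code:
#include <cstdio>
#include <cuda_runtime.h>

// Blend the current frame into a persistent history buffer.
// With a jittered camera, the history converges toward a supersampled image.
__global__ void temporal_accumulate(const float3 *current, float3 *history,
                                    int w, int h, float alpha) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;
    int i = y * w + x;
    // Exponential moving average per channel.
    history[i].x = alpha * current[i].x + (1.0f - alpha) * history[i].x;
    history[i].y = alpha * current[i].y + (1.0f - alpha) * history[i].y;
    history[i].z = alpha * current[i].z + (1.0f - alpha) * history[i].z;
}

int main() {
    const int w = 1920, h = 1080;
    float3 *cur, *hist;
    cudaMallocManaged(&cur, w * h * sizeof(float3));
    cudaMallocManaged(&hist, w * h * sizeof(float3));
    for (int i = 0; i < w * h; ++i) {
        cur[i] = make_float3(1.f, 1.f, 1.f);   // this frame's color
        hist[i] = make_float3(0.f, 0.f, 0.f);  // accumulated history
    }
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    temporal_accumulate<<<grid, block>>>(cur, hist, w, h, 0.1f);
    cudaDeviceSynchronize();
    printf("history[0].x after one frame: %.2f\n", hist[0].x);  // 0.10
    cudaFree(cur);
    cudaFree(hist);
    return 0;
}

None of this demands tensor cores or any other dedicated hardware.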
 

DeathReborn

Platinum Member
Oct 11, 2005

GPU compute actually dates back to 2001. One of the projects was Brook (BrookGPU) at Stanford, and it could target OpenGL, DirectX, or CTM (AMD).

Mantle involved DICE/EA, I believe; they should be recognised for that. Spock should be very grateful that AMD helped create his home planet ;) .

DXR 1.1 also involved a lot more than AMD, and it is an update to an already existing technology that neither AMD/ATI nor Nvidia created.

OpenCL... remind me how that is going? You seem to forget that Intel, Qualcomm, IBM & Nvidia were also involved. AMD dropped Close To Metal to support OpenCL.

ReBAR was proposed in 2008, implemented in 2010, and only started being used in 2020 by AMD. Can you provide the information showing where AMD proposed it as part of the (optional) PCIe spec? Like Adaptive-Sync, they take an underlying, already existing technology and rebrand it to claim a "first" (see eDP Panel Self Refresh).

DirectStorage is another thing that had been floating around but was not implemented until Sony & Microsoft did it for the current generation of consoles.


I am sure there are people with far better knowledge than me, or better google-fu, but it wasn't hard to at least water down the overstated claims about AMD "creating" things.
 

KompuKare

Golden Member
Jul 28, 2009
I learned something today. Thank you. I always wondered why Apple kicked Nvidia out of their Mac line and kept AMD. Looks like Nvidia wanted to push CUDA and was probably unwilling to support Metal the way AMD does.

Looks like AMD has a great history of creating new tech/software, or being the first to do so.
There was a second reason too: Nvidia sold millions of parts with bad solder bumps, and Apple was about the only OEM who stood by their customers (somewhat: they offered replacement motherboards, but since the flaw was intrinsic, some customers went through 2-3 boards; still, most other OEMs did nothing).
 

KompuKare

Golden Member
Jul 28, 2009
Dedicated hardware is obviously not required for temporal upscaling, and Nvidia's decision to restrict it to its RTX branding is anti-consumer and harmful to both the majority of Nvidia's customers and gaming as a whole.
Yes, but they finally found something their tensor cores were good at; before that, they were a solution looking for a problem in the consumer space.
 

Stuka87

Diamond Member
Dec 10, 2010
There was a second reason too: Nvidia sold millions of parts with bad solder bumps, and Apple was about the only OEM who stood by their customers (somewhat: they offered replacement motherboards, but since the flaw was intrinsic, some customers went through 2-3 boards; still, most other OEMs did nothing).

Yeah, solder-gate was bad. It cost Apple a big chunk of cash to replace motherboards every year for many of us, and Nvidia refused to stand by their work. That really soured things between them, as Apple is one of the few companies large enough not to let itself be pushed around by Nvidia. Then there was Apple embedding OpenCL into Mac OS X, and Nvidia refusing to support current versions of it. Nvidia has been used to having everything their way and always being the bully. But Apple was having none of that.
 

Dribble

Platinum Member
Aug 9, 2005
You'll find this discussion as an example in the dictionary next to the entry "off-topic"
It's because RT, and to a lesser extent DLSS, are actually useful; Nvidia is good at them, AMD needs to get better at them, and everyone knows it. That's not a great topic if you hate team green, so we get redirected to the same old Nvidia bashing and everyone feels better.

RT needs to be an AMD focus going forward because it is the main reason you need to spend $$$ on a GPU. Most games run fine at medium-high settings on old cards (e.g. a 5700 XT or a 1070). What those cards lack is RT support, so most people are going to look for a card that does RT well to replace them. That is a problem for AMD right now.
 

poke01

Diamond Member
Mar 8, 2022
AMD created resizable BAR for GPUs, which was directly copied by Nvidia and Intel. If you think about it, ReBAR is a miracle technology: +5-10% performance without increasing power consumption or transistor count. If AMD had kept this proprietary, Nvidia and Intel would have been struggling to make up the difference for the next 20 years.
This is not true. ReBAR is part of the PCIe 3.0 spec. AMD did not invent it, but rather was the first to utilise it.

"As it turns out, Resizable BAR (Base Address Register) was actually part of the PCIe 3.0 specification from 2010. "

Many fell for AMD's marketing and assumed AMD invented it. The PCI-SIG actually invented it.
 

Stuka87

Diamond Member
Dec 10, 2010

AMD never claimed to invent it (though they are a member of PCI-SIG and were involved in its development with Intel). AMD was, however, the first to make use of it for GPUs.