Discussion Metro Dev: Ray Tracing Is Doable via Compute Even on Next-Gen Consoles, RT Cores Aren’t the Only Way

Page 2 - AnandTech Forums

maddie

Diamond Member
Jul 18, 2010
I disagree that NVIDIA has become complacent in regards to hardware development. Turing has advances outside of RTX, such as the ability to process floating point and integer instructions simultaneously, which in many cases is a huge efficiency uplift in titles like Wolfenstein II, with the 2080 destroying the 1080 Ti. Yeah, NVIDIA took a huge risk and introduced unproven technology with RTX, but that's not complacency, and only time will tell whether that risk was wise. At the moment it sure doesn't look like it was, but the tech is promising and I'll reserve judgement for the future. The fact of the matter is AMD is VERY far behind, and we have so little information regarding Navi that everything is speculation. We know more or less what NVIDIA's 3000 series will bring, and it should be great from a performance standpoint, albeit likely with less performance per dollar than any of us would like to see.
This provided a good laugh. Facts, and everything is speculation, at once.
 

coercitiv

Diamond Member
Jan 24, 2014
I disagree that NVIDIA has become complacent in regards to hardware development.
If this is not complacency, then what would be? Would it take performance regression instead of stagnation at the same price point for some to admit it?

The improvements in FP & integer processing are indeed a nice highlight, but try telling that with a straight face to people who just found out that ~50% of their GPU went unused in 99% of the games they played in the last 6 months, and that this is unlikely to change significantly in the next 6 months either.

The competitive Nvidia we knew would have made a radically different 12nm product, in which maybe RT cores or Tensor cores would have made the cut. The GTX 2070 would have easily matched the 1080 Ti, and we would have talked a lot more about efficiency-oriented features such as simultaneous FP/INT processing and content-adaptive shading. Ray tracing - if available - would have manifested itself as a less compute-heavy form of global illumination. DLSS - if available - would have focused on offering better image quality instead of more performance. RTX in its current form was meant for a mature 7nm node at best.

What we got instead was a product aggressively marketed as a revolution in visual fidelity based on RT and a revolution on performance based on DLSS: no hard numbers on day 1, just a few tech demos and the promise of a bright super-sampled future ahead. They were so focused on RTX that the content-adaptive tech demo simply had to take a back seat in their marketing priority list.
  • DLSS was the first to show signs of weakness as reviewers analyzed tech demos, then had people in awe over the Port Royal benchmark, only to come crashing down hard in both BF5 and Metro Exodus. Quality is all over the place.
  • RT had a very rocky start in BF5, with few people willing to accept a big performance hit in exchange for proper reflections in a competitive shooter. It is only in Metro Exodus that it comes through as a win, as a less performance-intensive feature in a single-player shooter.
  • Meanwhile, outside of the 4 titles making use of Turing's new features, performance per dollar stagnates, and people see Turing as the chips they must buy in case their old card falls too far behind in current titles.
If today's Nvidia is not complacent in regards to hardware development, then can I please have the less optimized Nvidia of the Maxwell-Pascal era back? Because it seems to me that consumers having to pay in advance for product testing and feature development falls squarely under complacency. Turing was a dream of having your cake and eating it too; it happens to the best of us.
 
  • Like
Reactions: Arkaign

Arkaign

Lifer
Oct 27, 2006
@coercitiv yeah, it makes you think just how slowly the ship moves in the semiconductor world. It's a long road from concept to design to simulation to FPGA proof of concept (e.g. tensor or RT cores on a test die, connected separately to run code), then through the various levels of tapeout, all while incorporating feedback from third-party fabs to secure samples and information on production volume, yield, and how well the design maps onto their lines. Add building drivers, working with AIBs, working with MS, working with devs - endless work everywhere that may not bear fruit for years on any given project, and the final result may look little like what you imagined.

I don't envy people involved in a business that sees so much uncertainty and complexity of execution.

Navi will be really interesting. If they're able to get 1080 Ti performance down to $299 to $349 even without RT, and also offer higher performance tiers, then Nvidia could really face a crisis over whether to keep pushing tensor cores on consumer cards. Getting away from HBM seems wise.
 
  • Like
Reactions: coercitiv

coercitiv

Diamond Member
Jan 24, 2014
yeah, it makes you think just how slow the ship moves in the semiconductor world.
I don't envy people involved in a business that sees so much uncertainty and complexity of execution.
Just to be clear, I'm not throwing dirt here. I know we're second-guessing giants and the work of a lot of talented and invested engineers, but at the end of the day all that hard work needs to deliver a better consumer experience. Fail to do that, and we should be able to have a decent conversation about why it happened, not just repeat the PR talking points about the cost of innovation. Both companies innovate continuously, in both hardware and software, but none of that matters if the innovation fails to shape competitive products.

Navi will be really interesting.
I'll reserve judgement on Navi until I see it. If anything, the recent Radeon VII launch shows RTG is still starving and prone to mistakes.
 
  • Like
Reactions: Arkaign

ozzy702

Golden Member
Nov 1, 2011
If this is not complacency, then what would be? Would it take performance regression instead of stagnation at the same price point for some to admit it?

Price point is completely outside the scope of a discussion of complacency in "hardware development". We wouldn't be having this conversation if NVIDIA had priced their cards lower, and the 2000 series wouldn't be seen as the failure that it is.

NVIDIA felt they could price their 2000 series cards where they did because of the extreme lack of innovation and competition from AMD, period. They took a bet on RTX that at the moment is absolutely a failure, but it remains to be seen whether that stays the case moving forward. We don't know if it's a hardware failure or if software simply needs to catch up (isn't that what the AMD fans have claimed for the past 6+ years, LOL).

Again, I believe that the 3000 series won't have the same ratio of RTX to CUDA cores, and that 7nm will free up space for a better balanced, better price/performance GPU than we have in the 2000 series. Time will tell, but claiming that NVIDIA has been complacent and stagnant in regards to hardware development is laughable in light of AMD's 6+ years of stagnation.
 
Last edited:

ozzy702

Golden Member
Nov 1, 2011
I'll reserve judgement on Navi when I see it. If anything, the recent VII launch shows RTG is still starving and prone to mistakes.


AMD hasn't had an interesting GPU since the 7970. If it weren't for mining, I doubt AMD would even exist at this point. Hopefully some of that sweet CPU $$$ rolls into R&D and by 2022 we see some killer GPU products from AMD. As I've said before, I'd love to rock an AMD GPU again for something other than mining.
 

Arkaign

Lifer
Oct 27, 2006
Just to be clear, I'm not throwing dirt here. I know we're second-guessing giants and the work of a lot of talented and invested engineers, but at the end of the day all that hard work needs to deliver a better consumer experience. Fail to do that, and we should be able to have a decent conversation about why it happened, not just repeat the PR talking points about the cost of innovation. Both companies innovate continuously, in both hardware and software, but none of that matters if the innovation fails to shape competitive products.


I'll reserve judgement on Navi until I see it. If anything, the recent Radeon VII launch shows RTG is still starving and prone to mistakes.

Understood, I didn't get that impression. Observing the results and practical market value of a finished product doesn't mean you're condemning the professionals and executives who made the project happen; it's more a fair assessment of its ramifications.

It's gotta be tough, because in a rapidly evolving industry you have to play several steps ahead, and mistakes prove incredibly dangerous. Nvidia is lucky that AMD's Vega and Polaris were so uninspiring, relatively speaking. Even the Radeon VII sort of makes the 2080 look 'ok' by comparison. And of course the wild success of the 10xx series is one tough act to follow, especially after such a long time on the market finding homes with consumers. Competing with your own past success is hard sometimes, particularly when that success is further warped by the mining boom and subsequent bust. Your shareholders see the new highs and demand continued repetition of such feats, not understanding the vacuity of such a thought.

I counseled several people to buy big on Nvidia when I got word that Vega was encountering rough growing pains before release, and I saw mining exploding. I also told them to follow BTC and ETH mining prices daily, and to sell as soon as the decline was obvious, because it would be a gigantic hit on the stock down the line. They did really well, though one guy held a week or so too long.
 

coercitiv

Diamond Member
Jan 24, 2014
Read with comprehension, price point is completely outside the scope of a discussion of complacency in "hardware development". We wouldn't be having this conversation if NVIDIA had priced their cards lower, and the 2000 series wouldn't be seen as the failure that it is.
Write with comprehension: we would still be having this conversation, as die area still impacts graphics card BoM.
  • We wouldn't be having this conversation if RT features had less impact on overall performance and if DLSS was a pure win in terms of quality vs. performance balance.
  • We would still be having this conversation even if 2000 cards were cheaper, since atm they're still 50% pure muscle and 50% fat. Maybe Nvidia manages to train them into better shape before the next gen arrives; I would definitely prefer that over the current status quo.

Again, I believe that the 3000 series won't have the same ratio of RTX to CUDA cores, and that 7nm will free up space for a better balanced, better price/performance GPU than we have in the 2000 series. Again, time will tell, but claiming that NVIDIA has been complacent and stagnant in regards to hardware development is laughable in light of AMD's 6+ years of stagnation.
Funny how, when faced with arguments based on Nvidia's own track record of sharp focus and product efficiency, the countermeasure is to fall back to pointing fingers at AMD. This discussion was about Nvidia and the path it has taken since Pascal. Compared to the ghost of ATI, even the old Nokia would seem less complacent.

PS: Also, I said "stagnation" in terms of performance per price point, not in terms of hardware development. One can be complacent and still maintain a steady rhythm of hardware development, since complacency is about a false sense of security, one that invites unhealthy risks. Take us as a species, for example: we are innovating at a high rate right now, maybe higher than ever, yet there's a chance we've grown so complacent that we won't act soon enough to save our own planet.
 
Last edited: