Ray tracing the new "PhysX"?


ZeroRift

Member
Apr 13, 2005
195
6
81
What do you think low level coding and specialized paths are?

In a phrase, "Console Specific."

In theory, the same practices could be implemented for specific PC hardware configurations, but that isn't done, due to the highly diverse ecosystem of PC hardware and the relatively high cost of developing such code paths. DX12 / Metal / Vulkan help to mitigate this issue, but do not fully address it, due to the minimum levels of abstraction those APIs require.

That being said, I think the spirit of your question was more in the vein of, "do you think that low level coding on consoles relates directly to PC rasterization?"

To that question, I'd have to answer "No."

I think that the need for abstraction layers in PC software makes such an approach intractable, and that the percentage of low-level code one could theoretically move between disparate hardware platforms is negligible, because that code is so tightly coupled to the hardware itself.

For the same reasons, I would say that the kind of low-level coding which is done on consoles does not directly relate to rasterization in a broad sense. It may help accelerate rasterization for a highly specific hardware implementation, but I don't see compelling evidence that this type of software development has any kind of cross-platform value.
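
To make the abstraction point concrete, here's a minimal C++ sketch (all type and function names are hypothetical, not from any real engine or API) of why the hardware-specific path doesn't travel: the portable layer talks to an interface any driver can implement, while the console-style path bakes one chip's command format into the code itself.

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>

// Portable layer: every backend hides its hardware details behind the
// same interface; this is the code that can move between platforms.
struct GpuBackend {
    virtual ~GpuBackend() = default;
    virtual void submitDrawCalls(const uint32_t* cmds, size_t n) = 0;
};

// PC-style path: runs on any conforming driver, but every call crosses
// an abstraction boundary the driver must translate.
struct PortableBackend : GpuBackend {
    void submitDrawCalls(const uint32_t* cmds, size_t n) override {
        // validation, command translation, and scheduling happen below here
        std::printf("submitted %zu commands via the abstract API\n", n);
    }
};

// Console-style path: writes packets in one specific GPU's native command
// format straight into its ring buffer. Fast, but meaningless on any other
// chip -- this is the low-level code that cannot be reused.
void submitDirectToRingBuffer(volatile uint32_t* ring,
                              const uint32_t* cmds, size_t n) {
    for (size_t i = 0; i < n; ++i)
        ring[i] = cmds[i]; // packet layout is defined by the silicon
}

int main() {
    const uint32_t cmds[] = {1, 2, 3};
    PortableBackend pc;
    pc.submitDrawCalls(cmds, 3); // survives a hardware change untouched
}
```

The first path survives a hardware change untouched; the second has to be rewritten from scratch, which is the sense in which I mean that low-level console code doesn't carry over.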
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Sorry, but no, you are not correct. Rasterization on consoles absolutely has an impact on PC. Developers adapt techniques to specific hardware realities, but the overall approach is not different between the platforms. They re-use code whenever possible, so yes, rasterization techniques on consoles have a direct linkage with those on PC. Both platforms use a mixture of high-level and low-level code in different areas, and in differing amounts.
 

ZeroRift

Member
Apr 13, 2005
195
6
81
Sorry, but no, you are not correct. Rasterization on consoles absolutely has an impact on PC. Developers adapt techniques to specific hardware realities, but the overall approach is not different between the platforms. They re-use code whenever possible, so yes, rasterization techniques on consoles have a direct linkage with those on PC. Both platforms use a mixture of high-level and low-level code in different areas, and in differing amounts.

I think we agree on that point.

It's specifically the re-use of low level code that I was disputing.

Where I think we don't agree is which force is more influential in the context of innovation.

In that regard, I would broadly claim the following:
1) Rasterization innovations on consoles are essentially work-arounds for hardware limitations. That is to say, PC rasterization techniques are the preferred default for contemporary consoles, but many of them are too expensive to run on console hardware. Thus, any rasterization optimizations made for these reasons only apply to situations which are similarly hardware-constrained. When those constraints are removed, the work-arounds are abandoned.
2) As in many other industries, the cost of introducing net-new functionality tends to be fronted by early adopters. Thus, it is only natural for net-new technologies to land on PC first. In other words, net-new features and industry paradigm shifts tend not to be governed by consoles ("late adopters").
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,824
7,187
136
At this point in the game, I would liken RTX to PhysX: a proprietary hardware-based solution that adds eye candy to a scene and that is limited by its availability to consumers and its closed nature.

Not to say the potential of either technology isn't/wasn't vast if leveraged properly.

Hell, even PhysX could, in theory, run on any CUDA-enabled card from the 8 series onward, while RTX looks like it will be restricted for the time being to three very limited and exclusive top-end parts (for consumers), so RTX is already more limited in its install base than PhysX was at its inception.

NV might shock us all by just putting RT/Tensor hardware down the rest of the stack and issuing a challenge to devs to see what they can do with the hardware, but if real-time RT isn't one of those things, interest in this round of RT might wane quickly, much like GPU-accelerated PhysX fizzled out slowly but surely (with the exception of some nice hair here or there).

I feel that, despite my skepticism/cynicism regarding the RTX program, I again have to commend NV for really throwing the wiliest of curve balls our way. The activity on this board is proof positive that, love it or hate it, the RTX series/program has people talking.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I wonder how much of it ties to some of the stuff PeterScott said in the thread regarding node sizes.

What if NV's game plan is to lead the 2000 series with a moon-in-the-sky product on 12nm to give the whales the "you want this" halo (because they will buy it), then once 7nm is nice and ready, release the lower portion of the product stack with TPUs at a more reasonable price?

Something along the lines of:

2018 all on 14nm and 12nm:
GTX 1080 Ti rebranded to $500 GTX 2080 on shelf with $700 RTX 2080 (premium for RTX of course)
GTX 1080 rebranded to $300 GTX 2060

2019 with 7nm pipe cleaner and prepping for next gen cycle or whatever they do:
RTX 2060/3060 - first 7nm card, in the $300-400 price range, with RT hardware (the name will be based on release timing, i.e. if closer to a refresh it might be first for the 3000 series; if no refresh is in sight, it just slots in above the GTX 2060 based on GP104)

Of course it throws a monkey wrench into the historical trends of how NV has handled things, but they've already done that with the Fermi-to-Kepler transition, where GK104 got the x80 moniker. They could have easily named the current series GTX as well, but they are doing something with their marketing here.

We're getting closer to more hard data, so that's exciting.
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
2019 with 7nm pipe cleaner and prepping for next gen cycle or whatever they do:
RTX 2060/3060 - first 7nm card, in the $300-400 price range, with RT hardware (the name will be based on release timing, i.e. if closer to a refresh it might be first for the 3000 series; if no refresh is in sight, it just slots in above the GTX 2060 based on GP104)

I doubt that. Much points to the 2070 being based on the TU106 chip, meaning there are actually 3 Turing dies, and meaning there simply won't be a quick refresh on 7nm, because otherwise NV can't recoup the costs. It will probably be 2020 before we see an NV 7nm mid-to-high-range consumer GPU (same for AMD, as Vega 20 is professional / high-end).

I feel there was/is a lot going on behind the scenes regarding access to TSMC 7nm. I feel that AMD one-upped NV some time ago, 1-2 years back, by getting early access for Zen 2 and GPUs. Hence NV is now stuck on 16/12nm for much longer than we would assume; maybe that's the reason they came up with 12nm to begin with.
And NV didn't care all that much, because they can easily compete vs AMD with 16/12nm products well into 2020.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Nvidia hasn't had a calendar year without a GeForce flagship before (although 2011's was simply the dual-GPU GTX 590). Since we don't get new architectures every year, they have delayed the big chip until the off year. But with the TU102 coming this year, what's in store for next year? I guess it could be a Titan RTX, but there aren't too many enthusiasts who went Titan Xp (launched after the 1080 Ti), so not many would pay huge sums again for just a fully enabled TU102.

Something is coming next year. We just had enthusiasts throw down ~$1300 for RTX 2080 Tis of unknown performance, simply because it's obvious they are a tier faster than the 1080 Ti and people want more performance. If Nvidia can get a decently faster 7nm card out in 2019, then they will, and they will again charge a fortune for it.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Nvidia hasn't had a calendar year without a GeForce flagship before (although 2011's was simply the dual-GPU GTX 590). Since we don't get new architectures every year, they have delayed the big chip until the off year. But with the TU102 coming this year, what's in store for next year? I guess it could be a Titan RTX, but there aren't too many enthusiasts who went Titan Xp (launched after the 1080 Ti), so not many would pay huge sums again for just a fully enabled TU102.

Something is coming next year. We just had enthusiasts throw down ~$1300 for RTX 2080 Tis of unknown performance, simply because it's obvious they are a tier faster than the 1080 Ti and people want more performance. If Nvidia can get a decently faster 7nm card out in 2019, then they will, and they will again charge a fortune for it.

People like to have patterns, so they look at the past trying to find them, and when they see one, they think it is some kind of rule that can't be violated. But it isn't. In reality, companies just do what makes the most sense to them at the time, and people establish the patterns in retrospect. So no pattern is set in stone, and they get broken all the time.

Like the "new flagship" every year. What was 2017's "new flagship"? The 1080 Ti? That really wasn't new; it was just a rebranded Titan X at a lower price. If that's the pattern, it's already broken (because there is no Titan). So really there was no new flagship HW in 2017, just a marketing update. I suppose they could fully enable TU102 and call it the 2085 Ti, as Titan seems dead now for gaming cards. But of course a broken pattern on gaming Titans could be re-established, because these are just marketing games and can change on a whim.

As for real new HW products, I feel fairly certain we will see a 7nm GPU from NVidia in 2019. The target might be a lower-end pipe-cleaner GTX 2060: a tiny, non-RTX product with great perf/watt and perf/$, and great for laptops. Or they might really want to die-shrink TU102 ASAP.

2019 should be a VERY interesting GPU year, with new, unknown, unnamed 7nm gaming GPU HW likely from both AMD and NVidia.

It will also be interesting to see how AMD handles ray tracing, since they might not have specific RT HW in their 2019 cards. Do they ship some kind of driver that offers second-class RT capability, or do they play it down and concentrate on delivering better raster perf/$?
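
For context on what "second-class RT capability" would look like in practice: under DXR, an engine queries the driver's ray tracing tier at runtime and picks a path. A minimal C++ sketch, assuming a D3D12 device already exists (the routing function is hypothetical, not any vendor's actual fallback):

```cpp
#include <d3d12.h>

// True when the driver reports hardware DXR support (tier 1.0 or higher).
bool SupportsHardwareRT(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

// Hypothetical engine-side routing: a vendor without RT hardware could
// still expose ray tracing through a (slow) compute-shader implementation,
// or a game could simply fall back to raster-only effects.
void ChooseRenderPath(ID3D12Device* device) {
    if (SupportsHardwareRT(device)) {
        // build acceleration structures and dispatch the DXR pipeline
    } else {
        // compute-based RT fallback, or disable RT effects entirely
    }
}
```

So AMD's choice isn't binary: the query above is the hook that would let them ship anything from a full HW implementation down to a driver-level compute fallback.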

To tie back to the original topic: RT will soon be table stakes. AMD realizes this and is no doubt working on their own RT HW solutions. But given lead times, it will probably be 2020 before they have something competitive.

RT is here to stay and soon everyone will have it on upper end cards.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
2017 offered arguably 3 flagships.

GTX 1080 Ti - for those who ignore Titans (the fastest GTX-numbered GeForce)
Titan Xp - the fastest consumer gaming card, period
Titan V - the fastest gaming card, albeit marketed at prosumers/professionals

I don't think you can brush off the 1080 Ti as a faux flagship. Plenty of people always ignore the Titan, having rightfully figured out it's a Ti early-adopter tax. It was the flagship for the majority of 980 Ti owners, who have since upgraded.

I think we'll see something at least 25% faster than the 2080 Ti next year; worst case, it may only be a 7nm Titan with a higher price. Since they have stopped with the Titan early-adopter Ti tax, maybe we'll see a 7nm-shrunk 2080 Ti (or rebrand) at a lower price point as well.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I doubt that. Much points to the 2070 being based on the TU106 chip, meaning there are actually 3 Turing dies, and meaning there simply won't be a quick refresh on 7nm, because otherwise NV can't recoup the costs. It will probably be 2020 before we see an NV 7nm mid-to-high-range consumer GPU (same for AMD, as Vega 20 is professional / high-end).

I feel there was/is a lot going on behind the scenes regarding access to TSMC 7nm. I feel that AMD one-upped NV some time ago, 1-2 years back, by getting early access for Zen 2 and GPUs. Hence NV is now stuck on 16/12nm for much longer than we would assume; maybe that's the reason they came up with 12nm to begin with.
And NV didn't care all that much, because they can easily compete vs AMD with 16/12nm products well into 2020.

By refresh I mostly mean what NV has done a few times with a bigger die, but I guess with the 2080 Ti hitting the scene in the first batch, that scenario is gone. Is TU102 cut down at all? (I assume it is.)

Interesting to see how it all plays out. The 2080 Ti is most likely going to be my last card until 2020. Hopefully by then we get some competition; I'm dying to see what Intel brings to the fight.
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
By refresh I mostly mean what NV has done a few times with a bigger die, but I guess with the 2080 Ti hitting the scene in the first batch, that scenario is gone. Is TU102 cut down at all? (I assume it is.)

Interesting to see how it all plays out. The 2080 Ti is most likely going to be my last card until 2020. Hopefully by then we get some competition; I'm dying to see what Intel brings to the fight.


Yeah, the Ti is a cut-down TU102. It is what it is.

I want to see what Intel has as well, but I owned their last foray into GPUs and it was underwhelming, to say the least. They aren't going to come in with an NV killer. NO WAY.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Yeah, the Ti is a cut-down TU102. It is what it is.

If 7nm isn't ready for mass consumption, I could see NV doing a refresh with the cut-down TU102 slotting lower in a 3000 series, with a full TU102 taking the top spot. Definitely not commanding the same prices, but who knows anymore.

I want to see what Intel has as well, but I owned their last foray into GPUs and it was underwhelming, to say the least. They aren't going to come in with an NV killer. NO WAY.

Yeah, no kidding. Intel has shown they have more money than brains lately. But Raja is a good asset. I don't expect an NV killer, hell, not even an AMD killer, but at least something that can compete and grow. If they put out what they've done the last few times, I'm writing them off completely and will just tattoo NV's name on my forehead, since they'll be owning me for a while.
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
I think ray tracing will put some lighting artists out of work*. At the moment, ray tracing doesn't seem like much more than a way for Nvidia to hype their latest cards. They need some new whizz-bang tech to justify the exorbitant price of their RTX cards, I guess.

*I hope they have a union.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
If 7nm isn't ready for mass consumption, I could see NV doing a refresh with the cut-down TU102 slotting lower in a 3000 series, with a full TU102 taking the top spot. Definitely not commanding the same prices, but who knows anymore.



Yeah, no kidding. Intel has shown they have more money than brains lately. But Raja is a good asset. I don't expect an NV killer, hell, not even an AMD killer, but at least something that can compete and grow. If they put out what they've done the last few times, I'm writing them off completely and will just tattoo NV's name on my forehead, since they'll be owning me for a while.

Just do the logo... the name would look silly.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I think we agree on that point.

It's specifically the re-use of low level code that I was disputing.

Where I think we don't agree is which force is more influential in the context of innovation.

In that regard, I would broadly claim the following:
1) Rasterization innovations on consoles are essentially work-arounds for hardware limitations. That is to say, PC rasterization techniques are the preferred default for contemporary consoles, but many of them are too expensive to run on console hardware. Thus, any rasterization optimizations made for these reasons only apply to situations which are similarly hardware-constrained. When those constraints are removed, the work-arounds are abandoned.
2) As in many other industries, the cost of introducing net-new functionality tends to be fronted by early adopters. Thus, it is only natural for net-new technologies to land on PC first. In other words, net-new features and industry paradigm shifts tend not to be governed by consoles ("late adopters").

I think that's a pretty fair assessment, and I agree. I think we've seen a bit of a shift over the last 2 console generations (e.g. Xbox One/PS4 and their refreshes), where the console companies are actively pushing new hardware development to support new techniques in advance of PC. I'm thinking specifically of some of the virtualization on Xbox and the checkerboard rendering on PS4, and of low-level APIs generally. Though I agree with you that the new techniques largely show up on PC first for consumers.
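
For anyone unfamiliar with checkerboard rendering, here's a heavily simplified CPU-side C++ sketch of the core idea (my own illustration, not Sony's actual implementation, which also reprojects samples from the previous frame): shade only half the pixels each frame in an alternating checkerboard pattern, and reconstruct the rest from shaded neighbors.

```cpp
#include <cstdint>
#include <vector>

// Shade only pixels where (x + y + frame) is even -- half the pixels,
// half the shading cost -- then fill the unshaded half by averaging the
// horizontally adjacent shaded neighbours (which always have the
// opposite parity, so they were shaded this frame).
void checkerboardFrame(std::vector<uint32_t>& pixels, int w, int h, int frame,
                       uint32_t (*shade)(int x, int y)) {
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            if ((x + y + frame) % 2 == 0)
                pixels[y * w + x] = shade(x, y);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            if ((x + y + frame) % 2 != 0) {
                int xl = (x > 0) ? x - 1 : x + 1;     // clamp at left edge
                int xr = (x < w - 1) ? x + 1 : x - 1; // clamp at right edge
                uint32_t a = pixels[y * w + xl], b = pixels[y * w + xr];
                // approximate average of packed 8-bit channels
                pixels[y * w + x] = ((a >> 1) & 0x7F7F7F7Fu)
                                  + ((b >> 1) & 0x7F7F7F7Fu);
            }
}

int main() {
    int w = 4, h = 4;
    std::vector<uint32_t> frame0(w * h, 0);
    checkerboardFrame(frame0, w, h, /*frame=*/0,
                      [](int x, int y) { return uint32_t(0x10 * (x + y)); });
}
```

The win is that per-frame shading cost is roughly halved while the output stays full resolution, at the price of some reconstruction artifacts, which is exactly the kind of hardware-constrained optimization being discussed above.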
 

biostud

Lifer
Feb 27, 2003
18,251
4,765
136
So it might be that ray tracing will be the limiting factor in most games, but would close-to-perfect NVLink SLI scaling in ray tracing boost sales? And is that also the reason the RTX 2070 does not support SLI, since a pair of them would make the RTX 2080 Ti obsolete as a single card?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
Obviously they are different things, but they also share some similarities.

The adoption of ray tracing depends on widespread hardware support, significant visual improvements in the games that support it, and lots of games that support it.

With the launch prices, and hardware adoption in laptops and mid-range gaming machines, it will take years before Turing technology is widespread, and thus developers will have to weigh whether the extra development cost is worth it.

Will ray tracing even make it to the mid-range cards, and if so, will it be powerful enough to enable in games?

Will there be different levels of ray tracing, just like PhysX?

Will it be worth it visually vs. the fps drop?

Will AMD support ray tracing (or Intel, in the future)?

Or is this simply too big to fail for Nvidia?

What do you say, O mighty forum?

Did you make any kind of similar argument when Nvidia was first to market with T&L and programmable cores? Probably not.

Ray tracing is the "holy grail" of accurate rendering. Whether or not the performance is currently there, or whether the price to pay for its ability is worth it, is another subject. But sooner or later, make no mistake, AMD and Intel will have full hardware ray tracing support.
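
Part of why it's called the holy grail is how simple the core operation is: follow a ray, find what it hits. A minimal C++ sketch of tracing one ray against a sphere (the textbook analytic quadratic intersection, simplified by me); everything hard about real-time RT is doing billions of these per second against full scenes.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };
static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }

// Ray/sphere test: solve |o + t*d - c|^2 = r^2 for t, a quadratic in t.
// Returns the nearest positive hit distance, or -1 when the ray misses.
double hitSphere(Vec3 o, Vec3 d, Vec3 c, double r) {
    Vec3 oc = sub(o, c);
    double a = dot(d, d);
    double b = 2.0 * dot(oc, d);
    double k = dot(oc, oc) - r * r;
    double disc = b * b - 4 * a * k;
    if (disc < 0) return -1.0;                   // no real root: miss
    double t = (-b - std::sqrt(disc)) / (2 * a); // nearer of the two roots
    return t > 0 ? t : -1.0;
}

int main() {
    // Camera at the origin looking down -z at a unit sphere 5 units away.
    double t = hitSphere({0,0,0}, {0,0,-1}, {0,0,-5}, 1.0);
    std::printf("hit distance: %f\n", t); // expect 4.0
}
```

The accuracy comes from the fact that shadows, reflections, and refractions all fall out of recursively firing more of these rays, instead of being approximated with raster tricks.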
 

biostud

Lifer
Feb 27, 2003
18,251
4,765
136
Is it an argument if it is a question?

Believe it or not, I actually wanted to hear arguments, not just bash Nvidia.