[Rumor, Tweaktown] AMD to launch next-gen Navi graphics cards at E3


Thala

Golden Member
Nov 12, 2014
1,355
653
136
Unless there's a version of Navi that matches up against the 2080 Ti, it's a moot discussion anyway. If they're only competing against the 2070 and below, you're not going to purchase either of those cards if you want a good RT experience either.

Except, of course, you can tweak your game settings to your liking anyway - be it with or without DXR enabled. There are less important game settings to reduce before you have to disable DXR, so ray tracing is viable even on an RTX 2060.
 

Glo.

Diamond Member
Apr 25, 2015
5,722
4,581
136
Lol no, AMD Navi GPUs do not even have ray-tracing acceleration. And sorry, I am not going to play the likes of Metro, featuring ray-traced global illumination, without this feature, or even worse - without having the option to enable ray tracing at all!
So Navi must be significantly cheaper than Turing.
You seriously say that before knowing anything about whether Navi even needs RTX cores?
You seriously say that knowing full well that the RTX 2070 and 2080 do not allow for an enjoyable ray-tracing experience? So who cares?

I would 100% not buy into any RT gimmick at this point. The moment $200 GPUs reach RTX 2080 Ti performance levels, both in rasterization and RT - that is the moment we can start getting excited about ray-tracing tech.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
You seriously say that before knowing anything about whether Navi even needs RTX cores?
You seriously say that knowing full well that the RTX 2070 and 2080 do not allow for an enjoyable ray-tracing experience? So who cares?

I am enjoying ray tracing very much, even on an RTX 2060... thanks for asking. And the question of whether Navi needs RTX cores can only be answered once DXR or the corresponding Vulkan extensions are supported on it... and we have not the slightest indication of that.

I would 100% not buy into any RT gimmick at this point. The moment $200 GPUs reach RTX 2080 Ti performance levels, both in rasterization and RT - that is the moment we can start getting excited about ray-tracing tech.

If you don't want to buy into the single biggest IQ increase in games of the last few years, then that's your decision. Have fun with your old-school screen-space effects and crude lighting approximations or missing shadows - but hey, at least you have bad IQ at high resolution :)

The nice thing about RTX GPUs is that you at least have the option of enabling advanced ray-tracing calculations, which you do not have with Navi.
 

Glo.

Diamond Member
Apr 25, 2015
5,722
4,581
136
If you don't want to buy into the single biggest IQ increase in games of the last few years, then that's your decision. Have fun with your old-school screen-space effects and crude lighting approximations or missing shadows - but hey, at least you have bad IQ at high resolution :)

The nice thing about RTX GPUs is that you at least have the option of enabling advanced ray-tracing calculations, which you do not have with Navi.
Games haven't become better by using RT. For those who look for playability in games, like me, image quality is a nice addition but overall meaningless. I will not pick a game to play just because it has ray tracing.

But there may be people who value image quality over the actual game. The Order: 1886 is the best example of this: when it was released it had great graphics, but the game itself was rubbish and boring.

And one more thing: even if I do buy a GPU with RT, which might happen if Nvidia lowers non-Super RTX and GTX 1660 Ti/1660 prices to not-only-for-morons levels, I would be buying it for its performance, not because it has ray tracing, knowing full well that it does not offer anything meaningful in terms of ray-tracing performance.

Buying ANY GPU that is slower than an RTX 2080 Ti for ray tracing is pointless at this moment, regardless of what anyone thinks about it, because performance is always the most important thing when you buy a GPU, be it for rasterization or ray tracing.

At least, it should be...
 

psolord

Golden Member
Sep 16, 2009
1,931
1,194
136
I believe Navi will support DXR. The performance is a whole other question.
Maybe they will add some DXR accuracy settings in the driver in order to make it run faster, leaving it to the user to choose an acceptable RT quality, like they did before with tessellation in Catalyst.

I remember reducing the tessellation level quite a lot for my 5850s, and in Crysis 2 I was getting noticeably better performance with no perceivable image-quality loss.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Buying ANY GPU that is slower than an RTX 2080 Ti for ray tracing is pointless at this moment, regardless of what anyone thinks about it, because performance is always the most important thing when you buy a GPU, be it for rasterization or ray tracing.

At least, it should be...

This argument is flawed: by the same logic you could conclude that buying anything slower than an RTX 2080 Ti is pointless, period - you know, because performance is always the most important thing...

The reality, however, is that there is room for GPUs below the RTX 2080 Ti, because you can simply reduce IQ options until the game plays to your liking. Ray tracing is just one more IQ option, and one that can itself be tuned.

Case in point, Shadow of the Tomb Raider (numbers from Computerbase.de):
RTX 2080 Ti, RT Ultra: 100% fps
RTX 2070, RT Medium: 100% fps
RTX 2080, RT High: 94% fps
RTX 2060, RT Medium: 85% fps

In conclusion, you stay within 15% of the RTX 2080 Ti's fps all the way down to the RTX 2060 if you drop the RT setting from Ultra to Medium - everything else being the same.

On Navi, however? Not a chance at all - at least until DXR support is announced... then we can re-evaluate the situation.
 

Glo.

Diamond Member
Apr 25, 2015
5,722
4,581
136
In conclusion, you stay within 15% of the RTX 2080 Ti's fps all the way down to the RTX 2060 if you drop the RT setting from Ultra to Medium - everything else being the same.

On Navi, however? Not a chance at all - at least until DXR support is announced... then we can re-evaluate the situation.
So basically you still confirmed that it's pointless at this point to buy anything below an RTX 2080 Ti to get an enjoyable experience with ray tracing.

Good. Now we can move on.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
So basically you still confirmed that it's pointless at this point to buy anything below an RTX 2080 Ti to get an enjoyable experience with ray tracing.

Good. Now we can move on.

Not sure where you read this nonsense in my statements - in particular since I clearly said that I am enjoying ray tracing on an RTX 2060 a few posts above!
And no - I am not getting an enjoyable experience from fake screen-space effects, missing shadows or faked/static global illumination.

Are you trolling or what?
 

Glo.

Diamond Member
Apr 25, 2015
5,722
4,581
136
Not sure where you read this nonsense in my statements - in particular since I clearly said that I am enjoying ray tracing on an RTX 2060 a few posts above!
Are you trolling or what?
Well, you may be the only one who enjoys the bare minimum of ray tracing above basic rasterization.

The only GPU that gives you both performance and image quality with ray tracing is the RTX 2080 Ti. Anything else is a waste of money if you buy that GPU solely for ray tracing. That's why I have written, twice, that the only GPU that gives a good EXPERIENCE with RT is the RTX 2080 Ti.

And if a similar thing happens with Navi, I will write the same thing.
 

coercitiv

Diamond Member
Jan 24, 2014
6,217
11,987
136
It doesn't get more ironic than this: claiming supremacy due to IQ features while defending the performance loss of said IQ gain with... IQ loss options.

Long live 1080p gaming, long live Low RT settings and forever live the Medium texture resolution! They are the key to an amazing IQ generational gain!
 

amenx

Diamond Member
Dec 17, 2004
3,944
2,179
136
1080p gaming to me = a degradation of visual fidelity. RT on it is like lipstick on a pig. I bought my RTX 2080 so I can do high-res gaming on larger screens, which obviously is not viable with RT. I would have preferred a GTX 1080 Ti at its original MSRP, but none were around, so I grudgingly went with the RTX 2080.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
And if a similar thing happens with Navi, I will write the same thing.

What should happen with Navi? It will be mediocre in rasterization and a complete failure with ray tracing...

The only GPU that gives you both performance and image quality with ray tracing is the RTX 2080 Ti.

Performance and image quality are not absolutes, otherwise there would be no place for any GPU below the RTX 2080 Ti. Repeating an illogical argument does not make it true.

It doesn't get more ironic than this: claiming supremacy due to IQ features while defending the performance loss of said IQ gain with... IQ loss options.

Because choosing the IQ features to your own liking is always a compromise. What is ironic about this? I increase one feature and reduce another. I am sure you also have a feature which you prioritize over others - is that not just as ironic?

But that was not even the point.
I was merely showing that you can adjust the RT shadows setting to a point where there is literally no IQ difference between an RTX 2080 Ti and an RTX 2060 except the amount of ray-traced shadows itself.
 

soresu

Platinum Member
Dec 19, 2014
2,670
1,874
136
If you don't want to buy into the single biggest IQ increase in games of the last few years, then that's your decision. Have fun with your old-school screen-space effects and crude lighting approximations or missing shadows - but hey, at least you have bad IQ at high resolution :)
It's not nearly as big an IQ increase as Nvidia has been making out, especially if you are not pushing insane amounts of unrealistically glossy reflections, and certainly not when you have game engines like UE4 and Unity capable of such great graphics without needing RT/PT capabilities.
Take the latest UE4 version and implement that Q2 RTX level and texture set in it, and you could get results close enough to the RT demo that the improvement seems fairly trivial. Notice that they did not show an updated PBR raster engine next to the RT version of the demo; they compared it to an engine little better than the original release. Raster engines have simply mastered the good-enough level of PBR graphics approximation at this point, albeit at a high cost in engine complexity.

The main problem is that when you want to push further towards hyper-real rendering, it becomes a real ball ache from a software-complexity perspective to implement these hyper-accurate graphical features in a purely raster engine - I say purely because I have already seen a hybrid raster/machine-learning dynamic (character mesh) AO technique from EA/SEED that produces results extremely close to RT accuracy.

At the end of the day, RT simplifies the software implementation of hyper-real graphics, but at the cost of greatly increasing the compute cost of rendering - even on RTX it would not be truly viable/playable without denoising, and even then most current implementations are a hybrid of RT and rasterisation techniques.
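For a sense of what such a hybrid pass could look like, here is a minimal PyTorch sketch - a toy under stated assumptions (a made-up channel layout and untrained weights, not EA/SEED's actual network): a small CNN reads rasterized G-buffer channels and predicts a per-pixel AO term that the raster pipeline could composite like any other screen-space buffer.

```python
# Toy hybrid raster/ML AO pass: a small CNN maps G-buffer channels
# (1x linear depth + 3x view-space normal -- a hypothetical layout)
# to a per-pixel ambient-occlusion factor in [0, 1].
import torch
import torch.nn as nn

class GBufferAO(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),  # AO in [0, 1]
        )

    def forward(self, gbuffer):          # gbuffer: (batch, 4, H, W)
        return self.net(gbuffer)         # -> (batch, 1, H, W) AO map

model = GBufferAO()                      # untrained; illustrative only
gbuffer = torch.rand(1, 4, 270, 480)     # stand-in for a real G-buffer
with torch.no_grad():
    ao = model(gbuffer)
print(ao.shape)                          # torch.Size([1, 1, 270, 480])
```

Whether the quality holds up on geometry the net never saw during training is of course the open question with these techniques.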
 

DrMrLordX

Lifer
Apr 27, 2000
21,649
10,871
136
@soresu

Agreed, RT is not the "single biggest IQ increase in games of the last few years". It also seems out of place in a Navi thread. Not that, you know, I'm the best at keeping things on-topic here all the time, but still.

We have had ample discussions about RT in other threads. RT is not what's driving sales of Turing at the moment, and it certainly won't hold back Navi. It's a checkbox feature that can actually hinder performance enough that many people simply don't use it. I know I wouldn't, even on a 2080 Ti.
 

soresu

Platinum Member
Dec 19, 2014
2,670
1,874
136
It doesn't get more ironic than this: claiming supremacy due to IQ features while defending the performance loss of said IQ gain with... IQ loss options.

Long live 1080p gaming, long live Low RT settings and forever live the Medium texture resolution! They are the key to an amazing IQ generational gain!
Future implementations of machine learning (AI/deep learning) may mitigate the performance cost of RT, though likely with some degree of pre-training computation, either at load time or during game authoring.

This recent research from EA/SEED seems to be about getting higher-quality AO via a sort of hybrid raster/machine-learning technique, though it could likely benefit an RT hybrid method as well.

There are already many research avenues being explored to decrease RT/PT render times for offline VFX CG (non-real-time).
The last few years have shown great improvements in this area that have nothing to do with denoising; it's simply a matter of optimising those techniques for low-latency rendering at low sample rates (denoising mostly takes care of that part, which is why faster denoising is also being researched all the time).
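The common thread in most of those avenues is putting samples where the radiance actually is. Here is a minimal numpy sketch of that idea (a toy one-dimensional "radiance", nothing from a real renderer): estimate the same integral with uniform samples versus samples drawn from a roughly matching proposal, and compare the standard error.

```python
# Uniform vs. "guided" Monte Carlo estimation of a peaky integrand.
import numpy as np

rng = np.random.default_rng(0)
peak = 0.8                                # where the "light" comes from
L = lambda x: np.exp(-((x - peak) ** 2) / (2 * 0.02 ** 2))

N = 1_000
# Uniform sampling on [0, 1]: pdf(x) = 1.
xs = rng.uniform(0.0, 1.0, N)
est_u = L(xs)

# Guided sampling: a Gaussian proposal roughly matching the peak.
mu, sigma = 0.8, 0.03
xg = rng.normal(mu, sigma, N)
pdf = np.exp(-((xg - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
inside = (xg >= 0.0) & (xg <= 1.0)        # the integrand lives on [0, 1]
est_g = np.where(inside, L(xg) / pdf, 0.0)

print("uniform:", est_u.mean(), "+/-", est_u.std() / np.sqrt(N))
print("guided :", est_g.mean(), "+/-", est_g.std() / np.sqrt(N))
```

Both estimators converge to the same value, but the guided one gets there with a markedly smaller standard error, i.e. far fewer samples for the same noise; learned path guiding is, loosely, automating the construction of that proposal per scene region.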
 

soresu

Platinum Member
Dec 19, 2014
2,670
1,874
136
@soresu

Agreed, RT is not the "single biggest IQ increase in games of the last few years". It also seems out of place in a Navi thread. Not that, you know, I'm the best at keeping things on-topic here all the time, but still.

We have had ample discussions about RT in other threads. RT is not what's driving sales of Turing at the moment, and it certainly won't hold back Navi. It's a checkbox feature that can actually hinder performance enough that many people simply don't use it. I know I wouldn't, even on a 2080 Ti.
Crytek's efforts show that RT on cards without fixed-function hardware is very possible, so software and machine-learning hybrid optimisations can certainly tip the game-dev world towards RT. So it's still relevant to Navi, I think, as long as game-engine devs work hard on optimising RT instead of releasing first and optimising later.
 

DrMrLordX

Lifer
Apr 27, 2000
21,649
10,871
136
Crytek's efforts show that RT on cards without fixed-function hardware is very possible, so software and machine-learning hybrid optimisations can certainly tip the game-dev world towards RT. So it's still relevant to Navi, I think, as long as game-engine devs work hard on optimising RT instead of releasing first and optimising later.

Right, but the game-engine work hasn't exactly happened yet. What we have right now is ephemeral DXR on non-NV cards (and on NV cards without RT features), and RT on the RTX 2060/2070/2080/2080 Ti that slows down performance considerably unless you also try DLSS, which is very hit-and-miss, to say the least.

I think it's safe to say that the majority of purchasing decisions surrounding Navi will not be made with reference to DXR. The current expectation is that you'll have no RT at all (per @Thala's commentary), or you'll have to buy into NV's RTX ecosystem, which is still sort of a mess. It could be years before DXR is a real option for everyone.
 

soresu

Platinum Member
Dec 19, 2014
2,670
1,874
136
Right, but the game-engine work hasn't exactly happened yet. What we have right now is ephemeral DXR on non-NV cards (and on NV cards without RT features), and RT on the RTX 2060/2070/2080/2080 Ti that slows down performance considerably unless you also try DLSS, which is very hit-and-miss, to say the least.
What's pretty entertaining is the fact that people have made fun of Crytek for years, and now their SVOGI implementation and expertise is probably what enabled them to make such an effective RT implementation. I'm not sure it even uses DXR at present, so there's that too.

As you say, the trouble is that the others have largely not bothered with such high-end gfx techniques until now, so they are essentially playing catch-up to Crytek - though there's some possibility that the guy who moved from them to id Software has used the benefit of that experience to kickstart the RT effort in the id Tech 7 engine; the lack of more recent trailers makes me think it's been delayed a bit for RT.
 

DrMrLordX

Lifer
Apr 27, 2000
21,649
10,871
136
there's some possibility that the guy who moved from them to id Software has used the benefit of that experience to kickstart the RT effort in the id Tech 7 engine; the lack of more recent trailers makes me think it's been delayed a bit for RT.

I don't really want to see "one engine to rule them all" per se, but id is already interesting for its ability to support mGPU in DX12/Vulkan titles. If they wound up with the industry's leading RT/DXR support plus mGPU, I can see few reasons for publishers NOT to look hard at id Tech engines in the future.

So back on-topic-ish:

3 days until E3. Any new rumours before AMD does their thing at the show?
 

soresu

Platinum Member
Dec 19, 2014
2,670
1,874
136
I don't really want to see "one engine to rule them all" per se, but id is already interesting for its ability to support mGPU in DX12/Vulkan titles. If they wound up with the industry's leading RT/DXR support plus mGPU, I can see few reasons for publishers NOT to look hard at id Tech engines in the future.
You are confusing publishers with developers - though I can understand why, given that today most major developers are handcuffed to a specific publisher.
id Tech seems to be landlocked to Bethesda-based developers, unfortunately, which is a shame, as Raven used to be able to work magic with id's engines. This is why id Tech is now only found in Doom, Wolfenstein and Fallout (Rage 2 uses something else).

As to RT and mGPU: in theory RT is already a highly parallel, scalable compute problem - if you can do 1080p with one Vega 56, then 2-4 Navi cards should eat 4K for breakfast, even without FF ray traversal/intersection acceleration. The main problem there is rasterising each frame, as even pure RT still needs to rasterise each frame to make it viewable - unless you are using a lightfield display, that is.
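To illustrate why rays split so cleanly, a small numpy sketch (a hypothetical single-sphere scene; the four-way chunking stands in for four GPUs): each intersection test reads only its own ray, so the batch can be partitioned arbitrarily and the per-chunk results concatenated at the end.

```python
# Embarrassingly parallel ray/sphere hit testing over a ray batch.
import numpy as np

def ray_sphere_hits(origins, dirs, center, radius):
    """Boolean hit mask per ray (discriminant test only; for brevity it
    ignores whether the hit lies in front of or behind the origin)."""
    oc = origins - center
    b = np.einsum("ij,ij->i", oc, dirs)            # per-ray dot(oc, d)
    c = np.einsum("ij,ij->i", oc, oc) - radius ** 2
    return b * b - c >= 0.0

rng = np.random.default_rng(1)
n = 1_000_000
origins = np.zeros((n, 3))
dirs = rng.normal(size=(n, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)  # unit directions

# "mGPU": split the batch, process each chunk independently, gather.
chunks = np.array_split(np.arange(n), 4)
hits = np.concatenate([
    ray_sphere_hits(origins[idx], dirs[idx], np.array([0.0, 0.0, -5.0]), 1.0)
    for idx in chunks
])
print(hits.mean())                                  # fraction of hits
```

The gather step at the end is the serial part being pointed at here: however well the ray work scales across devices, the frame still has to be assembled and presented as one image.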
 

soresu

Platinum Member
Dec 19, 2014
2,670
1,874
136
3 days until E3. Any new rumours before AMD does their thing at the show?
There's little point speculating much only 2 days out. The Next Horizon disclosure on Zen 2 was heavy on uArch details - I'd expect the one on Monday to do the same for Navi/RDNA, though the absolute nitty-gritty will likely be saved for Hot Chips in August.
 

Ajay

Lifer
Jan 8, 2001
15,473
7,882
136
Future implementations of machine learning (AI/deep learning) may mitigate the performance cost of RT, though likely with some degree of pre-training computation, either at load time or during game authoring.

Pretraining requires huge datasets and too much compute power to be done on the fly with a single consumer-grade GPU.
 

soresu

Platinum Member
Dec 19, 2014
2,670
1,874
136
Pretraining requires huge datasets and too much compute power to be done on the fly with a single consumer-grade GPU.
Depends on the use case. We're not talking about a neural net that can optimise light sampling for all possible scene configurations, only for specific geometry and environments, which, as I mentioned, could be trained during game production, similar to how UE4 or Unity build light maps when changes are made.
The training dataset size scales with the ambiguity of the features/patterns you wish the NN to recognise in a given image; if the NN is pre-trained on first light samples of a piece of geometry from all angles, and is only used for that specific piece of geometry, then the pattern is not ambiguous at all.
As to whether it could be done at load time or not, I would say never say never, given how much ML techniques have improved over the last few years.

Look up "Path Guiding" on Google, its quite an interesting subject given the huge compute demands of RT and PT rendering.
I researched the current state of the field (ML in rendering and DCC) for my masters dissertation, and the benefits are already looking extremely promising for the likes of Pixar/Disney working on it, and likely every developer of offline renderers are already salivating at the prospect of decreased render times - I'd expect to find ML and GFX focused hardware manufacturers like AMD/nVidia close behind, looking for ways to exploit similar techniques in real time rendering.

Some of the path-guiding, or neural importance sampling, methods I have seen were trained on a CPU only, given that optimising method variants for the GPU seemingly takes longer - ergo, once ideal methods are discovered, heavy optimisation can be applied to the NN for GPU/Tensor Cores, or whatever works best, usually in both memory size and running speed.
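As a toy of how small such a per-configuration learning problem can be, here is a CPU-only PyTorch sketch; the one-peak "radiance" function, the network size and the step count are all made up for illustration, not taken from any published path-guiding method.

```python
# A tiny MLP fits the first-bounce "radiance" of ONE fixed scene
# configuration as a function of direction -- on the CPU, from a
# small dataset, in seconds.
import math
import torch
import torch.nn as nn

def radiance(theta):                      # stand-in for rendering a sample
    return torch.exp(-((theta - 1.2) ** 2) / 0.1)  # one bright direction

theta = torch.rand(2048, 1) * math.pi     # small, scene-specific dataset
target = radiance(theta)

net = nn.Sequential(nn.Linear(1, 32), nn.ReLU(),
                    nn.Linear(32, 32), nn.ReLU(),
                    nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for _ in range(500):                      # converges in seconds on a CPU
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(theta), target)
    loss.backward()
    opt.step()

print(float(loss))                        # the net now knows where the light is
```

Scaling this up to real geometry is obviously the hard part, but the pattern is the same: a narrow, unambiguous function fitted offline (or at load time), then evaluated cheaply while rendering.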
 