Raytracing is worth the money. (Opinion)

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

killster1

Banned
Mar 15, 2007
LOL yeah, everyone will do that. The target market is so tiny it isn't worth the R&D costs.

If it worked, the target wouldn't be so tiny. All of my friends ran SLI / Crossfire back in the day. I would definitely buy 4 cards if it scaled correctly. If it doesn't, then I am not even interested in ray tracing, as FPS and resolution are more important. But if I could get ray tracing with high res and constant FPS... YES.
 

aigomorla

CPU, Cases & Cooling Mod; PC Gaming Mod; Elite Member
Super Moderator
Sep 28, 2005
If it doesn't, then I am not even interested in ray tracing, as FPS and resolution are more important. But if I could get ray tracing with high res and constant FPS... YES.

This is exactly my point too.
We want FPS... not to be deprived of it.
This is why we SLI'd, until Nvidia's drivers lagged so badly that by the time they were optimized for one game, we had moved on to the next AAA title, which again had no stable SLI drivers.

It's a constant case of the dog chasing its tail.

This is why I am so tired of NV's shams.
And honestly, $1,200 for a GPU is, IMO, too much.
A GPU that costs half the value of your entire PC is seriously overstepping its bounds.

$1,200 in the enterprise sector... not a problem.
$1,200 for gamers... WTH are you smoking? And on top of that, we get an FPS decrease?
 

Alpha One Seven

Golden Member
Sep 11, 2017
The first cards out are supposed to beat the older cards on FPS, but the cards they will release after the 10 series dries up will not even have ray tracing as an option; those will yield better FPS than their current-generation counterparts.
 

Alpha One Seven

Golden Member
Sep 11, 2017
This is exactly my point too.
We want FPS... not to be deprived of it.
This is why we SLI'd, until Nvidia's drivers lagged so badly that by the time they were optimized for one game, we had moved on to the next AAA title, which again had no stable SLI drivers.

It's a constant case of the dog chasing its tail.

This is why I am so tired of NV's shams.
And honestly, $1,200 for a GPU is, IMO, too much.
A GPU that costs half the value of your entire PC is seriously overstepping its bounds.

$1,200 in the enterprise sector... not a problem.
$1,200 for gamers... WTH are you smoking? And on top of that, we get an FPS decrease?
It would take a lot more than a $1,200 card to make the GPU worth half the cost of the PC for me, though; I use big storage (4x 4 TB Samsung SSDs) and a lot of fast memory for my video production and music studio.
 
May 13, 2009
How do you know that?


Opinion:
Nobody should purchase any NV hardware (as long as NV does not collaborate fully with open source developers)
I did buy a 1080 recently. I agree they are a shady company. I was semi-boycotting them for a good while, but they offered the most FPS for the money. AMD is pricing their cards like mining is still going strong.
 

Alpha One Seven

Golden Member
Sep 11, 2017
How do you know that?


Opinion:
Nobody should purchase any NV hardware (as long as NV does not collaborate fully with open source developers)
From an interview I read with the head of distribution. (I think she is the head of distribution at least, maybe a different title, but same thing.)
 

Alpha One Seven

Golden Member
Sep 11, 2017
How do you know that?


Opinion:
Nobody should purchase any NV hardware (as long as NV does not collaborate fully with open source developers)
I don't think anyone should buy any products from anyone but open source suppliers, including your water, sewer, garbage, electrical and phone services.
 

Alpha One Seven

Golden Member
Sep 11, 2017

Image: Nvidia
I remember the first time I learned about ray tracing, using 3ds Max many years ago to render scenes with realistic lighting. It took a long time with the highest-end computers you could build at the time, and professional studios used render farms of 20 to 30 PCs working together to solve the complex equations required to perform a rendering of a video scene. I would often set up the scene animations, configure the ray-tracing calculations and how often it would recompute the scene as objects moved after every frame, then hit the render button, shut off the monitor, and go to bed, letting it do its thing all night. In the morning there was a new, fully rendered animation to view with ray-traced lighting. Sometimes it was fine, and other times there would need to be some tweaking done and a new render performed. It was, to say the least, very time-consuming, as I only had 4 computers to set to the rendering task.
Today Nvidia has created a new GPU that is capable of performing those computations in real time as you play a game. To anyone with a background in ray-traced scene rendering like mine, this is the most amazing thing since sliced bread.
I am buying the first generation of these cards partly because I want to support the huge amount of resources and expertise that have culminated in this new product, but also because, even though the technology will only get better as time goes on, I want to own the first generation of GPU with this exciting new technology, to have a piece of computer history in my collection. I still have old computers, from an Altair to a Sinclair to a VIC, and many, many generations of graphics cards, from the old MX series through to today's most recent versions, and all still work.
To me technology is art as much as it is science.


Here's an article on Gizmodo that has some more information you may be interested in:
Why Ray Tracing On Nvidia's New GPUs Is So Exciting
 

amenx

Diamond Member
Dec 17, 2004

Image: Nvidia
I remember the first time I learned about ray tracing, using 3ds Max many years ago to render scenes with realistic lighting. It took a long time with the highest-end computers you could build at the time, and professional studios used render farms of 20 to 30 PCs working together to solve the complex equations required to perform a rendering of a video scene. I would often set up the scene animations, configure the ray-tracing calculations and how often it would recompute the scene as objects moved after every frame, then hit the render button, shut off the monitor, and go to bed, letting it do its thing all night. In the morning there was a new, fully rendered animation to view with ray-traced lighting. Sometimes it was fine, and other times there would need to be some tweaking done and a new render performed. It was, to say the least, very time-consuming, as I only had 4 computers to set to the rendering task.
Today Nvidia has created a new GPU that is capable of performing those computations in real time as you play a game. To anyone with a background in ray-traced scene rendering like mine, this is the most amazing thing since sliced bread.
I am buying the first generation of these cards partly because I want to support the huge amount of resources and expertise that have culminated in this new product, but also because, even though the technology will only get better as time goes on, I want to own the first generation of GPU with this exciting new technology, to have a piece of computer history in my collection. I still have old computers, from an Altair to a Sinclair to a VIC, and many, many generations of graphics cards, from the old MX series through to today's most recent versions, and all still work.
To me technology is art as much as it is science.


Here's an article on Gizmodo that has some more information you may be interested in:
Why Ray Tracing On Nvidia's New GPUs Is So Exciting
No different from the Tom's op-ed you posted earlier. We KNOW that RT is THE shizzle and the next revolutionary graphics leap forward. It's just that the new RTX cards are very weak in demonstrating its practical application at this point in time, from all the evidence we've seen so far. If 2080 Ti owners will be thrilled with ditching their high-res, high-refresh-rate displays to settle for 60 fps @ 1080p just to enjoy a little RT under these limitations, it would certainly be interesting to hear their experiences with it... and to see how long before the novelty wears off and they become keen to get back to their high-res, high-refresh gaming.
 

Alpha One Seven

Golden Member
Sep 11, 2017
No different from the Tom's op-ed you posted earlier. We KNOW that RT is THE shizzle and the next revolutionary graphics leap forward. It's just that the new RTX cards are very weak in demonstrating its practical application at this point in time, from all the evidence we've seen so far. If 2080 Ti owners will be thrilled with ditching their high-res, high-refresh-rate displays to settle for 60 fps @ 1080p just to enjoy a little RT under these limitations, it would certainly be interesting to hear their experiences with it... and to see how long before the novelty wears off and they become keen to get back to their high-res, high-refresh gaming.
It's actually completely different and is more to the point of what is going on in the industry and what makes it such an important turning point. I take it you have not done a lot of rendering of ray traced scenes before so you probably won't get that.
You are also assuming that every consumer has the most expensive display available, I doubt that is true.
I personally don't care if no one else is excited about the tech, the OP was to share why I am so excited about it and what it means to me and a lot of others but not everyone, that would be boring.
 

amenx

Diamond Member
Dec 17, 2004
It's actually completely different and is more to the point of what is going on in the industry and what makes it such an important turning point. I take it you have not done a lot of rendering of ray traced scenes before so you probably won't get that.
You are also assuming that every consumer has the most expensive display available, I doubt that is true.
I personally don't care if no one else is excited about the tech, the OP was to share why I am so excited about it and what it means to me and a lot of others but not everyone, that would be boring.
You are simply deflecting and avoiding the main points raised. I think it's fair to say we are all excited about RT. Just not excited about weak, early implementations of it at high cost. I doubt many who spend $1,000 and up on a GPU would be saddled with a cheap 1080p display. They may exist; it would just be weird to see how they prioritize their component choices. There are a LOT of decently priced displays above 1080p, and a LOT of high-powered GPU owners wishing to utilize them over 1080p.

And from your earlier post... "I am buying the first generation of these cards partly because I want to support the huge amount of resources and expertise that have culminated in this new product but also because, even though the technology will only get better as time goes on, I want to own the first generation of GPU that has this exciting new technology to have a piece of computer history in my collection." THAT is what many here may not agree with.
 

Alpha One Seven

Golden Member
Sep 11, 2017
You are simply deflecting and avoiding the main points raised. I think it's fair to say we are all excited about RT. Just not excited about weak, early implementations of it at high cost. I doubt many who spend $1,000 and up on a GPU would be saddled with a cheap 1080p display. They may exist; it would just be weird to see how they prioritize their component choices. There are a LOT of decently priced displays above 1080p, and a LOT of high-powered GPU owners wishing to utilize them over 1080p.

And from your earlier post... "I am buying the first generation of these cards partly because I want to support the huge amount of resources and expertise that have culminated in this new product but also because, even though the technology will only get better as time goes on, I want to own the first generation of GPU that has this exciting new technology to have a piece of computer history in my collection." THAT is what many here may not agree with.
I would hope that not many would agree, I like to think I am an individual and not a member of a flock.
 

mv2devnull

Golden Member
Apr 13, 2010
professional studios used render farms of 20 to 30 PCs together to solve the complex equations required to preform a rendering of a video scene.
I thought that RT is extremely simple math. Nothing complex. There is just a lot of it, and much of that is sequential.
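That point can be made concrete: the arithmetic at the heart of ray tracing is as simple as solving a quadratic for each ray-object test; the cost comes from doing it billions of times, with each bounce depending on the previous hit. A minimal sketch in Python (all names here are invented for illustration, not any real renderer's API):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere intersection,
    or None on a miss.  Just the quadratic |o + t*d - c|^2 = r^2."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None          # only hits in front of the ray count

# A unit sphere 5 units down the z-axis, hit dead-on from the origin:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

Simple math per ray, but a 4K frame at 60 fps needs this (and the follow-on bounce logic) for hundreds of millions of rays per second, which is why dedicated hardware matters.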
 

Geegeeoh

Member
Oct 16, 2011
As it presents itself in Turing, real-time raytracing doesn’t completely replace traditional rasterization-based rendering, instead existing as part of Turing’s ‘hybrid rendering’ model. In other words, rasterization is used for most rendering, while ray-tracing techniques are used for select graphical effects.
[…]
Essentially, this style of ‘hybrid rendering’ is a lot less raytracing than one might imagine from the marketing material. Perhaps a blunt way to generalize might be: real time raytracing in Turing typically means only certain objects are being rendered with certain raytraced graphical effects, using a minimal amount of rays per pixel and/or only raytracing secondary rays, and using a lot of denoising filtering; anything more would affect performance too much.

https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive


Keep this word in mind: HYBRID. We are a loooooooong way from "pure" ray tracing.
It's still 99.999% rasterization.
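The budget the article describes (rasterize most of the frame, spend only a few rays per pixel on select effects, then denoise the sparse result) can be sketched like this. Everything here is invented for illustration; it is the shape of hybrid rendering, not Nvidia's actual pipeline:

```python
RAYS_PER_PIXEL = 2   # hybrid renderers budget very few rays per pixel

def shade_pixel(base_color, reflectivity, trace_ray, denoise):
    """Hybrid shading for one pixel: a rasterized base colour plus a
    small, noisy ray-traced reflection term that is then denoised."""
    if reflectivity == 0.0:
        return base_color                      # pure rasterization path
    samples = [trace_ray() for _ in range(RAYS_PER_PIXEL)]
    reflection = denoise(samples)              # filter the sparse samples
    return tuple(b * (1 - reflectivity) + r * reflectivity
                 for b, r in zip(base_color, reflection))

# Toy usage: a 50% reflective red pixel, rays that all return a fixed
# sky colour, and "denoising" by simple averaging.
sky = lambda: (0.2, 0.4, 0.8)
avg = lambda s: tuple(sum(c) / len(s) for c in zip(*s))
print(shade_pixel((1.0, 0.0, 0.0), 0.5, sky, avg))
```

Note the early return: most pixels never spawn a ray at all, which is exactly why the quoted article calls it hybrid rendering rather than ray tracing.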
 

Alpha One Seven

Golden Member
Sep 11, 2017

https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive


Keep this word in mind: HYBRID. We are a loooooooong way from "pure" ray tracing.
It's still 99.999% rasterization.


The 'hybrid' label comes from the fact that there are three different types of cores doing the rendering: the RT cores, the CUDA cores, and the Tensor cores. It does not mean the ray tracing is not true ray tracing in real time; it is. It means there are different cores working together to do the work in real time, just as a hybrid car has electric motors and a gas engine working together to do the job; that does not mean it is not a real car! lol You seem to be confusing the word "hybrid" (noun: a thing made by combining two or more different elements; a mixture. "the final text is a hybrid of the stage play and the film". adjective: of mixed character; composed of mixed parts. "Mexico's hybrid postconquest culture") with the word "pseudo" (adjective: not genuine; sham).

NVIDIA RETHINKS THE GRAPHICS CARD WITH THE RTX 2080

From the original article:



The big new feature of Nvidia’s Turing architecture is the ability to support real-time ray tracing in games. Ray tracing is a rendering technique used by movie studios to generate light reflections and cinematic effects. Cars, released back in 2006, was the first extensively ray-traced movie, and many studios now use ray tracing in modern films. Even rendering tools like Adobe After Effects, Maya, and 3ds Max all support some form of ray tracing, and it’s a technique that’s considered the “holy grail” for video games.
Turing sees Nvidia introduce new Tensor cores and RT cores. The RT cores are dedicated to ray tracing, while the tensor cores will be used for Nvidia’s AI work in games. Turing also includes some big changes to the way GPU caches work. Nvidia has moved to a unified memory structure with larger unified L1 caches and double the amount of L2 cache. It’s essentially a rewrite of the memory architecture, and the company claims the result is 50 percent performance improvement per core. Nvidia has also moved to GDDR6 with its RTX 2080, which means there’s as much as 50 percent higher effective bandwidth for games.

“This is a new computing model, so there’s a new way to think about performance,” said Nvidia CEO Jensen Huang at the company’s unveiling event in Germany last month. That new way to think about performance underlines many of the changes that Nvidia is making with RTX, in an era where Moore’s law is over. It’s not just how much faster a CPU or GPU is anymore that matters, nor the number of cores it has. It’s now the big architectural changes that will make a difference for both image quality and performance for years to come. Nvidia, Intel, AMD, Samsung, Apple, and many others will increasingly need to do more with the existing transistors on chips instead of continuing to shrink their size. Nvidia has clearly realized this inevitability, and it’s time for a change of pace.

Nvidia is offloading some of the work that would normally be handled by the company’s CUDA cores to these separate Tensor and RT cores. Nvidia is also introducing new rendering and texture shading techniques that will allow game developers to avoid having to continually shade every part of a scene when the player moves around. This will help improve performance, but Nvidia is going a step further with the help of supercomputers.


Nvidia Deep Learning Super-Sampling (DLSS) could be the most important part of the company’s performance improvements. DLSS is a method that uses Nvidia’s supercomputers and a game-scanning neural network to work out the most efficient way to perform AI-powered antialiasing. The supercomputers will work this out using early access copies of the game, with these instructions then used by Nvidia’s GPUs. Think of it like the supercomputer working out the best way to render graphics, then passing that hard-won knowledge onto users’ PCs. It’s a complex process, but the end result should be improved image quality and performance whether you’re playing online or offline.

The real test for DLSS will be how well it improves performance for 4K games. While the GeForce GTX 1080 Ti struggled to hit 60 fps in 4K for modern games, Nvidia is promising some impressive performance gains at 4K with DLSS enabled on the RTX 2080. We only recently received cards for testing, so we’ll be checking to see just how well some of these DLSS games perform in the coming weeks.
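As a toy illustration of the idea behind DLSS (not Nvidia's actual algorithm): render fewer pixels, then let an upscaler reconstruct the full-resolution frame. Here a nearest-neighbour upscale stands in for the trained network, and the checkerboard renderer is a made-up placeholder, purely to show the shape of the pipeline:

```python
def render_low_res(width, height, scale=2):
    """Stand-in for the game's renderer, drawing at reduced resolution.
    The checkerboard pattern is just placeholder pixel data."""
    return [[(x + y) % 2 for x in range(width // scale)]
            for y in range(height // scale)]

def upscale(image, factor=2):
    """Stand-in for the learned upscaler: plain nearest-neighbour here.
    DLSS instead infers plausible detail with a per-game trained network."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in image for _ in range(factor)]

# An 8x8 frame reconstructed from a 4x4 render: a quarter of the pixels
# were actually shaded, which is where the performance headroom comes from.
frame = upscale(render_low_res(8, 8))
print(len(frame), len(frame[0]))  # → 8 8
```

The win is that shading cost scales with the rendered pixel count, so a 2x upscale roughly quarters the per-frame shading work; the open question the article raises is how much image quality the learned reconstruction preserves.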
 

Alpha One Seven

Golden Member
Sep 11, 2017
They use a principle of light called reciprocity, which states that the inverse of a light beam behaves the same way as the original, to cast rays from the virtual camera out into the scene. That means only rays which will contribute to the final scene are cast, greatly increasing efficiency. Those rays are then followed (traced) as they bounce around until they either hit a light source or exit the scene. Even when they exit the scene, it could be at a point that adds light (like the sky), so either way, the amount of illumination added to each surface the ray hits is then added to the scene. The software may also limit how many reflections it will follow for a ray if the light contribution is likely to be small.

How Nvidia’s RTX Real-Time Ray Tracing Works
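The bounce-and-accumulate loop described above fits in a few lines. This toy version (all names invented; the scene is abstracted to surfaces that each forward a ray to one other surface) exists only to show the three termination conditions: hit a light, exit to the sky, or exhaust the bounce budget:

```python
from dataclasses import dataclass

MAX_BOUNCES = 3      # rays bouncing more than this likely contribute little
SKY_LIGHT = 0.2      # a ray exiting the scene can still pick up sky light

@dataclass
class Surface:
    emission: float      # > 0 only for light sources
    reflectance: float   # fraction of bounced light that survives
    bounces_to: int      # index of the surface the reflected ray hits (-1 = exits)

def trace(scene, index, depth=0):
    """Backward ray tracing: follow a camera ray from surface to surface,
    attenuating by reflectance at each bounce until light is found."""
    if depth > MAX_BOUNCES:
        return 0.0                       # bounce budget exhausted
    if index == -1:
        return SKY_LIGHT                 # ray exits the scene
    s = scene[index]
    if s.emission > 0:
        return s.emission                # ray reached a light source
    return s.reflectance * trace(scene, s.bounces_to, depth + 1)

# Camera ray hits a mirror (surface 0) that reflects toward a lamp (surface 1):
scene = [Surface(0.0, 0.8, 1), Surface(5.0, 0.0, -1)]
print(trace(scene, 0))   # → 4.0  (the lamp's 5.0 attenuated by 0.8 reflectance)
```

Casting from the camera rather than from the lights is exactly the reciprocity trick the quoted article describes: only rays that can reach the eye are ever computed.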
 

Geegeeoh

Member
Oct 16, 2011
Ray tracing is going to be used for some little things; this is not the CGI from the movies. We are still stuck with rasterization.
If you can't process the few lines I quoted from the AnandTech article, you just can't face reality.
 

Alpha One Seven

Golden Member
Sep 11, 2017
Ray tracing is going to be used for some little things; this is not the CG from the movies. We are still stuck with rasterization.
If you can't process the few lines I quoted from the AnandTech article, you just can't face reality.
Do you have a source you can link to about this brand-new revelation you are espousing? I have never heard that the new ray-tracing GPU does not do ray tracing... lol. I have been seeing the exact opposite, in fact, but I would be interested in reading what you found from someone who knows, like Nvidia themselves.

BTW, ray tracing and CG are completely different things, not related other than that a computer is used in both. :)