Raytracing is worth the money. (Opinion)


Geegeeoh

Member
Oct 16, 2011
145
126
116
Source? FFS, did you read the AnandTech article linked? Or any other Turing architecture review?

CGI in movies is done with ray (or path) tracing.
Pixar movies are basically just raytracing ( https://graphics.pixar.com/library/RayTracingCars/paper.pdf )
Iron Man suits, Hulk, etc... raytracing.

You swallowed the nVidia marketing without a critical thought.
 

mv2devnull

Golden Member
Apr 13, 2010
1,498
144
106
CG as in CGI (computer-generated imagery)? Plain "CG" might be mistaken for garbage collection.

I have never heard that the new ray tracing gpu does not do ray tracing.
I think Geegeeoh pointed out that only a fraction of the new GPU is RT cores, and that ray tracing will be used to augment rather than replace the raster process. Nvidia's CEO himself said in his keynote that rendering a frame will be one part raster, one part rays, and one part tensor math.

That is different from the fully ray-traced video rendering that you and Pixar have done.
 
  • Like
Reactions: Headfoot

Geegeeoh

Member
Oct 16, 2011
145
126
116
Yes, CGI. When we're talking about "movies", it shouldn't be that confusing...

Just read what I quoted from the AnandTech article; it's not that long.
Reading the whole thing wouldn't hurt...
 
  • Like
Reactions: Headfoot

Geegeeoh

Member
Oct 16, 2011
145
126
116
If you want more:
Despite Nvidia's description of ray-tracing as the holy grail of computer graphics during its introduction of the Turing architecture, these graphics cards do not replace rasterization—the process of mapping 3D geometry onto a 2D plane and the way real-time graphics have been produced for decades—with ray-tracing, or the process of casting rays through a 2D plane into a 3D scene to directly model the behavior of light. Real-time ray tracing for every pixel of a scene remains prohibitively expensive, computationally speaking.
https://techreport.com/review/34095/popping-the-hood-on-nvidia-turing-architecture
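To make the ray-casting half of that definition concrete, here's a toy sketch (mine; the tiny framebuffer, the one-sphere scene, and all the names are invented for illustration, and real GPUs do nothing like this in Python):

```python
import math

# Toy per-pixel ray casting against one sphere, per the definition quoted above.
# Everything here (resolution, scene, names) is made up for the example.
WIDTH, HEIGHT = 64, 48                      # tiny "framebuffer"
SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0  # one sphere in front of the camera

def ray_hits_sphere(origin, direction):
    """Standard quadratic ray-sphere test (direction must be unit length)."""
    oc = [origin[i] - SPHERE_C[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - SPHERE_R * SPHERE_R
    return b * b - 4.0 * c >= 0.0           # non-negative discriminant = hit

hits = 0
for y in range(HEIGHT):
    for x in range(WIDTH):
        # Map the pixel to a point on the image plane at z = -1, then normalize.
        dx = (2.0 * (x + 0.5) / WIDTH - 1.0) * (WIDTH / HEIGHT)
        dy = 1.0 - 2.0 * (y + 0.5) / HEIGHT
        n = math.sqrt(dx * dx + dy * dy + 1.0)
        hits += ray_hits_sphere((0.0, 0.0, 0.0), (dx / n, dy / n, -1.0 / n))

print(f"{hits} of {WIDTH * HEIGHT} primary rays hit the sphere")
```

Even one boolean test per pixel is a loop over the whole frame; real ray tracing needs many rays per pixel plus secondary bounces, which is the "prohibitively expensive" part.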
 

SirCanealot

Member
Jan 12, 2013
87
1
71
Do you have a source you can link to about this brand new revelation you are espousing? I have never heard that the new ray tracing gpu does not do ray tracing...

This thread is amazing.

So hold on, can I confirm something?

Do you understand that the new GPUs are going to be 95% (chosen numerical value 'for the sake of argument') rasterization with 5% ray tracing?

Or do you understand that the new GPUs are going to be 100% ray traced?

If you can let me know which one you think is correct, that would be great!
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,190
126
....

Are you really enjoying that soapbox you've got set up there, made out of Nvidia boxes?

I honestly don't know who you're trying to convince, or who you're trying to incite, but 90% of us on this forum feel the RTX 2080 is, at this moment, a bad investment until we get real numbers.

Did you also miss how the original owner of Tom's Hardware shot down that editor who wrote the RTX article, saying that if he still owned Tom's he probably would have AXED him?

Real numbers mean benchmarks run by multiple independent reviewers on real drivers, not some beta hash that wcf has posted from a leaked benchmark.

Lastly, did you really need to create a new thread?
This is honestly still just your opinion, and it didn't deserve a thread of its own.
 
  • Like
Reactions: DAPUNISHER

Elfear

Diamond Member
May 30, 2004
7,097
644
126
Nope.
If you don't believe me, look up the words for yourself. :) lol

I don't think the term "hybrid" in this case means what you think it means. Here is a pic from Nvidia's Turing blog post that shows what Geegeeoh is getting at.

[Image: hybrid rendering pipeline diagram from Nvidia's Turing blog post]
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
The NV white paper gives a 50/50 split between ray tracing and rasterization as typical in hybrid rendering, which together takes up about 80% of the total frame time.

• Using DLSS as a representative DNN workload (purple), we observe that it takes about 20% of the frame time. The remaining 80% time is doing rendering (yellow).
• Of the remaining rendering time, some time will be spent ray tracing (green) while some time is spent in traditional rasterization or G-Buffer evaluation. The amount of time will vary based on content. Based on the games and demo applications we’ve evaluated so far, we found that a 50/50-time split is representative. So, in Figure 43, Ray Tracing is about half of the FP32 shading time. In Pascal, ray tracing is emulated in software on CUDA cores, and takes about 10 TFLOPs per Giga Ray, while in Turing this work is performed on the dedicated RT cores, with about 10 Giga Rays of total throughput or 100 tera-ops of compute for ray tracing.
• A third factor to consider for Turing is the introduction of integer execution units that can execute in parallel with the FP32 CUDA cores. Analyzing a breadth of shaders from current games, we found that for every 100 FP32 pipeline instructions there are about 35 additional instructions that run on the integer pipeline. In a single-pipeline architecture, these are instructions that would have had to run serially and take cycles on the CUDA cores, but in the Turing architecture they can now run concurrently. In the timeline above, the integer pipeline is assumed to be active for about 35% of the shading time.
Given this workload model, it becomes possible to understand the usable ops in Turing and compare vs a previous generation GPU that only had one kind of operation instead of four. This is the purpose of RTX-OPS—to provide a useful, workload-based metric for hybrid rendering workloads.
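Those duty cycles are enough to reproduce the RTX-OPS number on the spec sheet. A back-of-envelope sketch; the peak-rate inputs are my approximate RTX 2080 Ti Founders Edition figures, so treat them as ballpark:

```python
# Back-of-envelope RTX-OPS using the workload model quoted above.
# Peak rates are approximate RTX 2080 Ti Founders Edition figures (assumed).
fp32_tflops   = 14.0    # FP32 shading throughput
int32_tops    = 14.0    # integer-pipeline throughput
rt_tops       = 100.0   # ~10 GigaRays/s, quoted as ~100 tera-ops of compute
tensor_tflops = 114.0   # tensor throughput used for DLSS

# Duty cycles from the quote: tensor cores run ~20% of the frame; FP32 shading
# is counted over the ~80% rendering window; ray tracing is half of that
# window (0.8 * 0.5 = 0.4); the INT pipe runs ~35% of the shading time
# (0.8 * 0.35 = 0.28).
rtx_ops = (fp32_tflops * 0.80
           + int32_tops * 0.28
           + rt_tops * 0.40
           + tensor_tflops * 0.20)
print(f"~{rtx_ops:.0f} RTX-OPS")  # ~78, the figure Nvidia quotes for the card
```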
 

Geegeeoh

Member
Oct 16, 2011
145
126
116
That's 50/50 in computational cost, not in work done.
Rasterization still draws almost everything.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
"Both Ray tracing and Rasterization pipeline operate simultaneously and cooperatively in Hybrid Rendering model used in Turing GPUs."
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
"The RT Cores in Turing can process all the BVH traversal and ray-triangle intersection testing, saving the SM from spending the thousands of instruction slots per ray, which could be an enormous amount of instructions for an entire scene. The RT Core includes two specialized units. The first unit does bounding box tests, and the second unit does ray-triangle intersection tests. The SM only has to launch a ray probe, and the RT core does the BVH traversal and ray-triangle tests, and return a hit or no hit to the SM. The SM is largely freed up to do other graphics or compute work."
 

mv2devnull

Golden Member
Apr 13, 2010
1,498
144
106
Yeah, ray tracing costs a ton and could not be done without the RT cores...
Not sure about the last part. Ray tracing algorithms have existed for decades, and the CEO's keynote did claim how much faster the 20xx is at it than the 10xx. That implies that 10xx CUDA cores, or a CPU, can trace rays (just not as fast/efficiently). Feature films have been rendered on server farms sans RT cores. (The impact of RT cores on the movie industry... do we speculate about that already?)
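Putting rough numbers on "not as fast/efficiently", reusing the ~10 TFLOPs-per-GigaRay rule of thumb from the whitepaper quote above (the 1080 Ti peak FP32 rate is my approximate figure):

```python
# Rough sanity check of "can trace rays, just slower", per the whitepaper's
# quoted rule of thumb: software ray tracing costs ~10 TFLOPs per GigaRay.
SW_TFLOPS_PER_GIGARAY = 10.0

pascal_fp32_tflops = 11.3    # approx GTX 1080 Ti peak FP32 rate (assumed)
pascal_gigarays = pascal_fp32_tflops / SW_TFLOPS_PER_GIGARAY
turing_gigarays = 10.0       # quoted Turing RT-core throughput

print(f"Pascal (software): ~{pascal_gigarays:.1f} GigaRays/s, "
      f"and that saturates the CUDA cores")
print(f"Turing (RT cores): ~{turing_gigarays:.0f} GigaRays/s, "
      f"with the SMs left free for shading")
```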

"Could not be done realtime"? The initial NVidia "realtime ray" demo was on Voltas, was it not? Sans RT.

"Could not be done realtime in consumer system without RT cores"? Yes.

Alas, the definitions of "realtime" and "consumer price" are a bit subjective; the OP has one opinion on them. Even the Ents of Middle-earth were acting in real time...
 

Geegeeoh

Member
Oct 16, 2011
145
126
116
I mean, RT Cores or any other specialized hardware for that task.
Programmable cores can do anything, but that's too much work… at least for this job and for now.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,815
7,173
136
In response to the title of the thread, I would have to answer: Ray Tracing will be worth it... Eventually.

I hope the hardware for this feature becomes ubiquitous across vendors and powerful enough so that lower high end/mainstream peasants such as myself can eventually enjoy glorious lighting for ourselves.

I do envision a time when games will simply have a slider that determines how many initial rays are cast and how many subsequent bounces are processed, with all lighting/shadowing raycast.

That's a long way off though, and for the moment the initial ask for the 2xxx series cards is a bit much, especially without any playable games to show at launch. I'm also worried that the high launch price will keep used-market prices inflated above what many people would be willing to spend, hurting PC gaming as a whole.

So right now, I'd have to say Ray Tracing definitely is not worth it.
 

Geegeeoh

Member
Oct 16, 2011
145
126
116
If I got it right, they can now trace ~3 rays per pixel (at FHD).
We are so far from replacing rasterization...
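Back-of-envelope on why (my arithmetic; 10 GigaRays/s is the quoted Turing peak, and real achieved rates are well below it):

```python
# Rays-per-pixel budget at the quoted peak throughput -- theoretical best case.
peak_rays_per_s = 10e9               # ~10 GigaRays/s, Turing's quoted peak
width, height, fps = 1920, 1080, 60  # FHD at 60 fps

rays_per_pixel = peak_rays_per_s / (width * height * fps)
print(f"~{rays_per_pixel:.0f} rays per pixel per frame at peak")  # ~80
# Achieved rates fall well short of peak once BVH updates, shading, and
# incoherent secondary rays enter the picture, while offline path tracers
# converge with hundreds or thousands of samples per pixel, not a handful.
```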
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
It may be worth it eventually, in which case: wait for the second-generation products. Wait for the GeForce RTX 2180 before you buy.

There are no games you can buy right now that use it. The hardware is in its infancy. I guarantee the RTX 2180 will have technology that lets it do things the 2080 cannot do at all. It won't simply be faster at raytracing; it will have a superior implementation of it.

So, don't pay the early adopter tax.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Now that the reviews are out, we discover that not even one game has RTX effects live at card launch. Rome wasn't built in a day, but at the same time, as a consumer it makes a lot of sense for me to wait until Nvidia has built up much more adoption and benchmarks are out showing how actual games perform with RTX effects as implemented. We're back to a holding pattern.