What's your opinion of real-time ray tracing?


Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
I don't believe RtRT will ever fully take off. Like BFG said, we will see hybrids like Doom III here and there. And if it ever does take off, I believe it will be a lot longer than two years from now. We will also have to see what Bulldozer can do at that time.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
You could be right, but Intel seems to think just two years. As an enthusiast I want to believe it, so really, until we hear more about Larrabee, it's a tough call. The one thing we all know is that Intel and Daniel believe they can do it in two years. For me that's good enough.


This concept is very exciting to me. Yes, naysayers will outnumber the believers. Let us remember that before Merom came out, only a handful of people believed it would destroy the X2. That wasn't based on blind loyalty to Intel, but on the information that was available on the web, which only a few could comprehend.

But if one looks at the new Energizer Bunny (Intel) and the way they have delivered over the past year and a half, it's hard not to believe. If Intel does deliver on this statement, I really don't care what Bulldozer brings to the table. Besides, this really wasn't a discussion about AMD vs. Intel.

But there is no way AMD is looking at ray tracing in the near future.
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Inefficient ray tracing algorithms don't have to stay that way; for example, ones that don't calculate indirect light, or ones that cast 30+ rays per pixel.

It is never good to force every solution under a single methodology, ray tracing included. For example, instead of discarding a light's contribution to the eye because one or more opaque surfaces sit between the source and the eye, ray tracing can predict collisions in future frames. Its strength is interpolation, so if it has a place in the rendering pipeline, it most likely won't be as a redesign of an entire feature such as specular lighting, texture mapping, or anti-aliasing, but rather as a faster tool for mapping elements along a curve.
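
(A minimal sketch of the cost point above, my own illustration and not code from any engine mentioned here: the per-pixel sample count multiplies straight into the total ray count, which is why 30+ rays per pixel gets expensive fast. The trace() stub is hypothetical.)

    // Illustration only: how rays-per-pixel multiplies into total rays per frame.
    #include <cstdint>
    #include <cstdio>

    struct Color { float r, g, b; };

    // Hypothetical stub; a real tracer would intersect scene geometry here.
    static Color trace(int /*x*/, int /*y*/, int /*sample*/) { return {0.f, 0.f, 0.f}; }

    int main() {
        const int width = 1024, height = 768;
        const int raysPerPixel = 30;               // primary + shadow/reflection samples
        std::uint64_t raysTraced = 0;

        for (int y = 0; y < height; ++y)
            for (int x = 0; x < width; ++x)
                for (int s = 0; s < raysPerPixel; ++s) {
                    trace(x, y, s);                // one ray per sample
                    ++raysTraced;
                }

        // 1024 * 768 * 30 = 23,592,960 rays for a single frame
        std::printf("%llu rays per frame\n", (unsigned long long)raysTraced);
        return 0;
    }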

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nemesis 1
You could be right, but Intel seems to think just two years. As an enthusiast I want to believe it, so really, until we hear more about Larrabee, it's a tough call. The one thing we all know is that Intel and Daniel believe they can do it in two years. For me that's good enough.


This concept is very exciting to me. Yes, naysayers will outnumber the believers. Let us remember that before Merom came out, only a handful of people believed it would destroy the X2. That wasn't based on blind loyalty to Intel, but on the information that was available on the web, which only a few could comprehend.

But if one looks at the new Energizer Bunny (Intel) and the way they have delivered over the past year and a half, it's hard not to believe. If Intel does deliver on this statement, I really don't care what Bulldozer brings to the table. Besides, this really wasn't a discussion about AMD vs. Intel.

But there is no way AMD is looking at ray tracing in the near future.

Let's assume you are right ... and in 2 years we have the HW to do this.

Game devs may then *start* to develop games using it ... probably more so as it is used today as a hybrid ...
... and Engines may even become "RtRT friendly" ...

... so then we can conclude at the MINIMUM it will be five years to see it used in games ...

What else will happen in 5 years in game engines?
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Or we could assume that the brand new engine Daniel just invented (not the one used to do Doom 3/4) will be in developers' hands NOW, and games are already being ported or created using this tech so that when Intel is ready, a few games will be available.

If they're the right games, it will be a good start and I would jump on board. One must remember Intel has 80% of the market. Game developers should, and probably will, be all over a tech that consumes less time and $$$, AND is far less complex than present-day programming. As I said, this is a win-win for all; those who don't adapt early will fall behind. $$$$ is a motivator. IF developers believe Intel will have the processing power to run games at acceptable frame rates, THEY will be motivated.

I wonder what it was that brought Carmean on board with Intel. It wouldn't be, or couldn't be, that RtRT is blind to the OS. He does run the Larrabee project, ya know.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nemesis 1
Or we could assume that the brand new engine Daniel just invented (not the one used to do Doom 3/4) will be in developers' hands NOW, and games are already being ported or created using this tech so that when Intel is ready, a few games will be available.

If they're the right games, it will be a good start and I would jump on board. One must remember Intel has 80% of the market. Game developers should, and probably will, be all over a tech that consumes less time and $$$, AND is far less complex than present-day programming. As I said, this is a win-win for all; those who don't adapt early will fall behind. $$$$ is a motivator. IF developers believe Intel will have the processing power to run games at acceptable frame rates, THEY will be motivated.

I wonder what it was that brought Carmean on board with Intel. It wouldn't be, or couldn't be, that RtRT is blind to the OS. He does run the Larrabee project, ya know.

of course you could assume anything .. and you are jumping aboard the intel ship with your new [future] rig, i believe

i have been following the developments with RtRT for a long time ...
- my analysis leads me to believe that this IS what Intel is aiming for ... BUT it will never replace the GPU - not in the foreseeable [5-7 year] future.

i believe, as games get much [much] more *complex*, we will find an increasing need for both the GPU and CPU - along with complex physics calculations whose tasks are not even clearly assigned to HW yet.
And don't write off AMD [yet] ... they are looking at this future also, and they believe they have it [eventually] covered with Fusion too. This is where their philosophy diverges from intel's.

RtRT is an excellent addition to game engines - with increasing importance - and i believe we will see it and new technology ... eventually right past ST:TNG Holodecks [if we survive].
But it is very difficult to predict what will happen after 5 years ... i don't think RtRT will be more than another dev "tool"; and i don't believe it "is far less complex than present day programming"


EDIT: i wonder what MS thinks .. and how it will fit in with their schemes?
:Q
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
It may well be as you say, I don't know. But what we can surmise is this: Intel SAYS it can be done. Also, Intel is working on the project now, spending possibly billions in development, and at the same time filing for patents on their research efforts. Intel is ahead in this game. Being as you stated you have been following RtRT for years, which I too have done, tell me where the Elbrus compiler fits into this picture.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nemesis 1
It may well be as you say, I don't know. But what we can surmise is this: Intel SAYS it can be done. Also, Intel is working on the project now, spending possibly billions in development, and at the same time filing for patents on their research efforts. Intel is ahead in this game. Being as you stated you have been following RtRT for years, which I too have done, tell me where the Elbrus compiler fits into this picture.

Intel has proposed many things in the past ... i remember NetBust and promises of scaling to 10GHz; Rambust was supposed to be THE new memory tech endorsed by intel. Many things that seem likely now will never happen ... and a new tech could blindside everyone

i am not planning for 5 years ahead in my gaming ... in fact i am *still* about 8 games behind where i want to be [btw, 2Worlds is decent in the Oblivion/G3 "vein"] ... i only plan ahead about 3 years and i still have about 2-1/2 years to go on this rig [with Penryn and GPU upgrades being my only likely upgrades to this Vista 32 rig].

i think you are looking ahead well over 5 years ... and do you think that the GPU will really be *gone* by then?
--is it even possible? intel will get resistance from the industry; it always happens. Can they do it? -not without MS' blessing.

Ray Tracing is not in my crystal ball as the next "big thing" for gaming
:p
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Apoppin

i wonder what MS thinks .. and how it will fit in with their schemes?

Scheme was the correct word. I wonder what Intel thought when MS helped AMD develop AMD64. I can tell you what I think Intel thought and said, but youngsters read these forums so I can't use those words.

But rest assured, Intel has to play nice with MS for the time being, but they are working very hard to do for MS what MS did for Intel on the AMD64 issue.

1) RtRT is blind to the OS. Apple has a large smile on its face. 2) The Elbrus compiler changes the future of 64-bit computing and leaves MS out in the cold. Now, this will take longer than RtRT will. But I can't think of any company that deserves to be B-I-T-C-H slapped more than MS, nor can I think of a company that deserves to deliver that blow more than Intel.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nemesis 1
Apoppin

i wonder what MS thinks .. and how it will fit in with their schemes?

Scheme was the correct word. I wonder what Intel thought when MS helped AMD develop AMD64. I can tell you what I think Intel thought and said, but youngsters read these forums so I can't use those words.

But rest assured, Intel has to play nice with MS for the time being, but they are working very hard to do for MS what MS did for Intel on the AMD64 issue.

1) RtRT is blind to the OS. Apple has a large smile on its face. 2) The Elbrus compiler changes the future of 64-bit computing and leaves MS out in the cold. Now, this will take longer than RtRT will. But I can't think of any company that deserves to be B-I-T-C-H slapped more than MS, nor can I think of a company that deserves to deliver that blow more than Intel.
i see ... and i now know exactly where you are coming from [a position of *hope*]
-but MS is not called the "evil empire" for nothing ... and they will NOT get left out in the cold ... certainly not by intel; even if it means backing AMD. ;)

you're talking politics ... and there is no guessing the outcome
- but you can bet on it ... buy stock in the company you think will "win"


 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
You're correct about NetBurst being a bust. But I can't help but think: if Intel had stayed with the Northwood processor, what kind of GHz would we be seeing on the new 45nm tech? I think it would have been pretty damn impressive.

On the Rambus thing, we already know the cartel conspired against Rambus. Prices were way too high; otherwise it is excellent tech.

Hay, who is adding the smileys to the date and time? I am not.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nemesis 1
You're correct about NetBurst being a bust. But I can't help but think: if Intel had stayed with the Northwood processor, what kind of GHz would we be seeing on the new 45nm tech? I think it would have been pretty damn impressive.

On the Rambus thing, we already know the cartel conspired against Rambus. Prices were way too high; otherwise it is excellent tech.

*Exactly*

Predicting future trends more than a few years out is purely speculation ... we don't have all the facts and "politics" is probably as important in shaping future HW as engineering.

i DO think we will see RTRT commonly used in games as multicore goes mainstream .. but it's probably not the 'next big thing' in gaming - imo

... anyway ... We'll see ... link to this old thread when/if it happens

EDIT:
Hay, who is adding the smileys to the date and time? I am not.
if you don't *change* it, you get the emoticon from the previous poster's Topic/Message Rating
-- see ... i changed mine to 'plain' ... if you reply, there will be no emoticon unless you add it.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Predicting future trends more than a few years out is purely speculation. That's exactly what this thread is about, and it's a pretty good thread to boot.

HAY, we need better emoticons if that's how it works. Devious mind at work.


 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nemesis 1
Predicting future trends more than a few years out is purely speculation. That's exactly what this thread is about, and it's a pretty good thread to boot.

HAY, we need better emoticons if that's how it works. Devious mind at work.

Hey :p
[hay is for horses :D]

i have been saying that for months ... years :(
-FT emoticons SUCK
- and i have been accused of being beyond devious

yeah ... decent thread ... let's see if intel can pull it off

[damn i *hate* that mod edit feature :|
--maybe i can get rid of it if i ask nicely
:Q
... sorry ... i changed nothing in your post]
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Can't slip anything by you. But what the hay. I will keep trying. Damn grammar cops.

They have a thread on this subject in the Highly Technical section, but if we keep it simple and give good links, this is the right place for it.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Nemesis 1
Here is a short description of the system Intel was using at IDF.

The demo system on display at IDF was a dual quad-core (8 cores total) system, as you can see from the 8 threads being processed in the Task Manager on screen. The image here is from a map in Quake 4 and is being rendered completely on the Intel CPUs, while the GPUs are only taking the final image and sending it to the monitor.

Now, on the Nehalem processor there will be 8 cores with HT, so that will = 16 threads. Then you add in the SSE4 speed-up of 4x and things become very interesting fast.

It's still unknown what vectorization improvements exist with Nehalem, so that's just +++.

So Intel's claim in the above link of RtRT in 2 years isn't looking undoable from my perspective.

Couple that with the fact that creating games for RtRT is easier than for the present-day GPU, and it's looking very good.

Nehalem C is the CPU in the Nehalem family I want.

I believe this is why AMD won't use all of the SSE4 and SSE4.1 instruction sets, and why Intel won't use SSE5.

Intel and AMD are going in different directions. Who wins this race is unknown at this time. My money is on Intel.

The clear loser here is NV.


I don't think that CPU rendering is the best way to do ray tracing; they can even use a 16-core CPU and it will always trail well behind a GPU, which is massively parallel.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
OK, I wasn't aware of any GPU on the market that can do RtRT.

Ya, I see your point: why would anyone want the CPU to render almost perfect lighting effects in games at playable FPS, at high resolutions? I for one am all for upgrading to a new GPU every 6 months. We most certainly don't need CPUs with that kind of power on the desktop. I mean, why do we even need 4-core CPUs when hardly any apps use the extra cores? So what if there are applications out there that scale with the number of processors, even if it's something as irrelevant as RtRT. Right!

All kidding aside, I am pretty sure Intel's Larrabee will be used in conjunction with the 16 threads the CPUs have to offer, or however Intel specs it out.

Then of course we will have the Nehalem version of Skulltrail, with 32 threads running up to 4 Larrabee cards. So the power Intel will have to offer in '09 will be massive compared to 2 months from now. Be honest now, wouldn't you like to play a game such as Doom 4 on a 30" screen, at playable FPS, with the effects I linked for the Doom 3 video? WOULDN'T you enjoy that?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
In two years' time, we might be able to get an octal-core CPU in a single socket for the same price, running at a higher frequency with better IPC and faster system RAM.
If things are really that parallelizable then ray tracing is far more likely to run better on the GPU, like Folding@Home, which ran about 50 times faster on a GPU than on a CPU last time I checked.

People have been predicting that CPUs will render GPUs obsolete since the Voodoo days but so far every GPU generation has simply increased that gap.

16 cores using system memory is not going to match 128 stream processors backed by over 100 GB/sec memory bandwidth, and the latter is what we already have now. Who knows what we'll have by the time 16-core CPUs are available.

Also, doing ray tracing for lighting is great, but what about filtering, AA, textures, etc.? System RAM simply can't cope with that load.
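
(A rough back-of-envelope sketch of that bandwidth point; the bytes-touched-per-ray figure is purely my own assumption, and real tracers lean heavily on caches, so the true traffic is lower, but the order of magnitude is the argument.)

    // Back-of-envelope only; the bytes-per-ray number is an assumed figure.
    #include <cstdio>

    int main() {
        const double raysPerSecond = 1.0e9;    // the ~1 billion rays/sec target quoted elsewhere in this thread
        const double bytesPerRay   = 200.0;    // assumption: BVH nodes + triangle data touched per ray
        const double systemRamGBps = 10.0;     // roughly dual-channel DDR2-class bandwidth
        const double gpuMemGBps    = 100.0;    // the "over 100 GB/sec" GPU figure above

        const double neededGBps = raysPerSecond * bytesPerRay / 1.0e9;

        std::printf("needed (no caching): %.0f GB/s\n", neededGBps);    // ~200 GB/s under these assumptions
        std::printf("system RAM:          %.0f GB/s\n", systemRamGBps);
        std::printf("GPU memory:          %.0f GB/s\n", gpuMemGBps);
        return 0;
    }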

You do make an interesting point about drivers though.

Ya, I see your point: why would anyone want the CPU to render almost perfect lighting effects in games at playable FPS, at high resolutions?
But where are these scenarios you list? Where are these games running at playable FPS at high resolutions?

Like I said, they've only just managed to get Quake 3 running at playable speeds, and I don't think they even used any AA or AF, and that game is almost ten years old.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
OK, I wasn't aware of any GPU on the market that can do RtRT.
I think it's possible to do it with shaders; the problem is no one is going that route, likely because it's too expensive even for GPUs.

Then of course we will have the Nehalem version of Skulltrail, with 32 threads running up to 4 Larrabee cards. So the power Intel will have to offer in '09 will be massive compared to 2 months from now.
Uh-huh. Have you been paying attention to the GPU industry? They basically double performance every 12 months. In 2009 we are likely going to see GPUs that we can't even imagine today.

Be honest now, wouldn't you like to play a game such as Doom 4 on a 30" screen
Again where in practice are these magical scenarios you list? You haven't even posted any links to the information you've copied from other websites.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
BFG10K
I am only speculating, as anything about the future is basically speculation.

If you read all the source links on the first page, it will keep you reading for quite some time.

As was said on the first page, until we know more about Larrabee, speculating is all that much harder.

Intel has bought some companies through the years that, on the surface, make no sense.

One such company is Elbrus. What was it that Elbrus had that Intel wanted so badly that they bought the company to obtain the rights and patents for it? The Elbrus compiler.

Once you understand what the capabilities of this compiler are, the first thing you realize is that large, fast cache on a CPU is huge. Couple that with what happens when you add a vertex unit to a multi-core processor.

On cache: if you read about RtRT you will find that large, fast cache is again not a small thing, it's a big deal. Then there is SIMD; the SSE4.1 accelerators are crucial to RtRT performance increases. Intel is getting the CPU structure in place with each new generation of the Merom processor. Intel has also introduced the SSE4 family of instructions very strangely. They could have introduced the entire family all at the same time, but instead they did it in a way that's rather odd. I assume AMD was the reason for this.
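
(To make the SIMD point concrete: one way SSE-class instructions help a ray tracer is "packet tracing", testing several rays per instruction. This is only a toy sketch of my own, four rays against a ground plane, not anything from Intel's demo.)

    // Toy example of 4-wide SSE ray work; illustrative only.
    #include <xmmintrin.h>   // SSE intrinsics
    #include <cstdio>

    int main() {
        // Four rays sharing the origin (0, 2, 0); x and y direction components, one ray per lane.
        const __m128 dirX  = _mm_set_ps(0.1f, -0.1f, 0.2f, -0.2f);
        const __m128 dirY  = _mm_set_ps(-1.0f, -1.0f, -0.5f, -0.25f);
        const __m128 origY = _mm_set1_ps(2.0f);

        // Intersection with the ground plane y = 0:  t = -origin.y / dir.y, all four rays at once.
        const __m128 t = _mm_div_ps(_mm_sub_ps(_mm_setzero_ps(), origY), dirY);

        // Hit-point x coordinate: x = origin.x + t * dir.x (origin.x is 0 here).
        const __m128 hitX = _mm_mul_ps(t, dirX);

        float tOut[4], xOut[4];
        _mm_storeu_ps(tOut, t);
        _mm_storeu_ps(xOut, hitX);
        for (int i = 0; i < 4; ++i)
            std::printf("ray %d: t = %.3f, hit x = %.3f\n", i, tOut[i], xOut[i]);
        return 0;
    }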

Then we come to a dedicated vertex unit on a modular multi-core processor; again, this is very significant insofar as RtRT is concerned. It's funny that what helps performance for RtRT is exactly the same thing the Elbrus compiler needs to become fully functional without a performance penalty. For me this gets a bit confusing, trying to put the puzzle together, but it would seem both require the same functions on a CPU. Where it really gets confusing is the x86 core itself, as the Elbrus compiler is for VLIW;
it would seem that Intel isn't leaving x86 CPUs anytime soon. This in itself makes it very hard to decipher what Intel is doing. Now, if you read what has been said about Larrabee, as I understand it, it's the first tera-scale processor from Intel, having 16 simple in-order mini x86 cores all sharing a large cache. This is where the Elbrus compiler gets interesting: simple in-order x86 mini cores. You have to ask yourself, what did Intel leave out of this core to make it simple? We know from available info it wasn't SIMD. Now, if you read up on the Elbrus compiler, it basically tells you what Intel can leave off the logic unit and still have a very powerful core. It is in fact interesting; I would urge you to read up on it.

Here's what it comes down to. We have a lot of smart people saying RtRT at high res is not possible anytime in the near future.

Then we have INTEL and Daniel saying they can do this in 2 years.

So here's what it boils down to.

1) Intel is flat-out lying, which I doubt.

2) This is where I put my foot in a bunch of shit: all the smart people on the web posting on this subject haven't put the puzzle together to see the big picture, so they are basically clueless. Now, before you bash me for this, please remember that before any Merom benchies came out, many of these same people were in denial over its performance capabilities, yet all the information was right there in front of them on the web. All they had to do was assimilate the information that was available and Merom would not have been a surprise. This, to me, is more than likely occurring again with RtRT.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Regardless of how fast a CPU can do ray tracing, a modern high-end GPU can do it orders of magnitude faster. But even so, there's little incentive for game developers to adopt ray tracing. Just look at some modern games like WiC, which bring any GPU in existence today to its knees, even without ray tracing. There will have to be a major leap in technology before ray tracing in action games becomes an option, and neither Nehalem nor SSE4 is enough to make that happen. No matter how much power modern hardware packs, game devs will find a way to use it up and ask for more.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I would like a link to these modern-day GPUs doing RtRT.

When talking about Nehalem C, we must include Larrabee in the subject. No one knows what Larrabee is exactly, other than it's Intel's first tera-scale project with 16 simple in-order x86 cores. Not sure if it's 2 threads per core or 4.

ALL I know is Intel says in 2 years they can do RtRT in games. Unless you're on the Larrabee project, you haven't a clue. Naysayers will undoubtedly outnumber the people saying "oh boy, RtRT in '09."

What is the resistance to Intel's claims?

If it's true and Intel can pull this off, is it that Intel will have everyone else pwned: AMD, ATI, NV? I don't care which company pulls it off.

But from reading the replies, if Intel pulls this off, everyone would have to agree Intel would have a computational monster on their hands. The very fact that Intel says they can do this means they must believe it. It's going to be an interesting year in '08; '09 looks to be a killer.

http://news.softpedia.com/news...e-16-Cores-48746.shtml
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
What I know is that Intel is mainly using this ray tracing subject as a marketing gimmick to pursue its own interests. I'm willing to bet money that two years from now, game devs will still be using the same two APIs, DirectX and OpenGL, to make games using rasterized polygons and pixel shaders.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I wouldn't bet the house on that. What about Capcom? You think maybe they might jump right in? They are already doing 8 threads in their latest release. I hate when developers lock FPS, though. I heard, from what I now consider a great source, that they're going to develop an RtRT game.
 

her34

Senior member
Dec 4, 2004
581
1
81
Originally posted by: Nemesis 1
I just found this. Great read. RtRT may be only 2 years away.

http://www.pcper.com/article.php?aid=455


According to Daniel and the Intel team, the magic number of rays they'll need to process each second to achieve that "game quality" and frame rate is around one billion (though interesting designs can be done with considerably fewer). That would allow for about 30 rays per pixel to be processed for each frame, with different rays necessary for colors, lighting and other special effects. Doing that math, at a 1024x768 resolution for a total of 786,432 pixels times 30 rays per pixel and 60 frames per second, you get 1.415 billion rays per second required. That is an impressive amount of processing horsepower and a level that we just are not yet at. The dual Clovertown system running our live demo was pushing approximately 83 million rays per second plus the work of standard trilinear filtering.
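
(The arithmetic in that quote is easy to check, and it also shows the gap: the demo managed about 83 million rays/sec, so the target is roughly 17x away. A quick sketch, using only the numbers from the article:)

    // Sanity check of the quoted ray budget; demo figure taken from the article above.
    #include <cstdio>

    int main() {
        const long long pixels          = 1024LL * 768;    // 786,432
        const long long raysPerPixel    = 30;
        const long long framesPerSecond = 60;

        const long long target = pixels * raysPerPixel * framesPerSecond;  // 1,415,577,600
        const double    demo   = 83.0e6;                                   // ~83 million rays/sec at IDF

        std::printf("target: %lld rays/sec (~1.415 billion)\n", target);
        std::printf("gap vs. demo: about %.0fx\n", target / demo);         // roughly 17x
        return 0;
    }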

taking into account clock speed increases, ipc increases, and architecture changes/refinements, i think we'll be looking at real games when 16 cores are released, which is 5-6 years away. and that's 720p for ray tracing, whereas gpu games will be doing 1200p 4xAA 16xAF on all games with ease

gpu games will continue to improve and gamers won't care about ray tracing unless the games look better

also, the use of cores for physics and ai is already increasing, which will leave fewer cores for ray tracing. by the time 8-core cpus are released, games will be using 4 cores for non-graphics and we're almost back to square 1


it's still a long way off before cpus do what gpus can't.

http://en.wikipedia.org/wiki/Image:Glasses_800_edit.png