Originally posted by: Nemesis 1
You could be right. But Intel seems to think just 2 years. As an enthusiast I want to believe it. So really, until we hear more about Larrabee, it's a tough call. The one thing we all know is Intel and Daniel believe they can do it in 2 years. For me that's good enough.
This concept is very exciting to me. Yes, the naysayers will outnumber the believers. Let us remember that before Merom came out, only a handful of people believed that it would destroy X2. That wasn't based on blind loyalty to Intel, but on the information that was available on the web, which only a few could comprehend.
But if one looks at the new Energizer bunny (Intel) and the way they have delivered over the past year and a half, it's hard not to believe. If Intel does deliver on this statement I really don't care what Bulldozer brings to the table. Besides, this really wasn't a discussion about AMD vs Intel.
But there is no way AMD is looking at ray tracing in the near future.
Originally posted by: Nemesis 1
Or we could assume that the brand new engine that Daniel just invented (not the one used to do Doom 3/4) will be in developers' hands NOW, and games are already being ported or created using this tech, so that when Intel is ready a few games will be available.
If they're the right games it will be a good start and I would jump onboard. One must remember Intel has 80% of the market. Game developers should, and probably will, be all over a tech that consumes less time and $$$, AND is far less complex than present-day programming. As I said, this is a win-win for all; those who don't adapt early will fall behind. $$$$ is a motivator. IF developers believe Intel will have the processing power to run games at acceptable frame rates, THEY will be motivated.
I wonder what it was that brought Carmack onboard with Intel. It wouldn't be, or couldn't be, that RtRT is blind to the OS? He does run the Larrabee project, ya know.
Originally posted by: Nemesis 1
It may well be as you say, I don't know. But what we can surmise is this: Intel SAYS it can be done. Also, Intel is working on the project now, spending possibly billions in development, and at the same time filing for patents on their research efforts. Intel is ahead in this game. Seeing as how you stated you have been following RtRT for years, which I too have done, tell me where the Elbrus compiler fits into this picture.
i see ... and i now know exactly where you are coming from [a position of *hope*]
Originally posted by: Nemesis 1
Apoppin
i wonder what MS thinks .. and how it will fit in with their schemes?
Scheme was the correct word. I wonder what Intel thought when MS helped AMD develop AMD64. I can tell you what I think Intel thought and said, but youngsters read these forums so I can't use those words.
But rest assured, Intel has to play nice with MS for the time being, but they are working very hard to do for MS what MS did for Intel on the AMD64 issue.
1). RtRT is blind to the OS. Apple has a large smile on its face.
2). The Elbrus compiler changes the future of 64-bit computing and leaves MS out in the cold. Now this will take longer than RtRT will. But I can't think of any company that deserves to be B-I-T-C-H slapped more than MS. Nor can I think of a company that deserves to deliver that blow more than Intel.
Originally posted by: Nemesis 1
You're correct about NetBurst being a bust. But I can't help but think: if Intel had stayed with the Northwood processor, what kind of GHz would we be seeing on the new 45nm tech? I think it would have been pretty damn impressive.
On the Rambus thing, we already know the cartel conspired against Rambus. Prices were way too high. Otherwise it is excellent tech.
if you don't *change* it, you get the emoticon from the previous poster's Topic/Message Rating
Hey, who is adding the smiles to the date and time? I am not.
Originally posted by: Nemesis 1
Predicting future trends more than a few years out is purely speculation. That's exactly what this thread is about, and it's a pretty good thread to boot.
HEY, we need better emoticons if that's how it works. Devious mind at work.
Originally posted by: Nemesis 1
Here is a short description of the system Intel was using at IDF.
The demo system that was on display at IDF was running a dual-quad core (total 8 cores) system as you can see from the 8-threads being processed in the task manager on screen. The image here is from a map on Quake 4 and is being rendered completely on the Intel CPUs while the GPUs are only taking the final image and sending it to the monitor.
Now, the Nehalem processor will have 8 cores with HT, so that will = 16 threads. Then you add in the SSE4 speedup of 4x and things are becoming very interesting, fast.
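The thread-count arithmetic above can be sketched quickly. This is a back-of-envelope illustration only; the 2x Hyper-Threading factor and the hoped-for 4x SSE4 vector speedup are the poster's assumed figures, not measured or official Intel numbers.

```python
# Back-of-envelope parallelism estimate for a hypothetical 8-core Nehalem.
# The SIMD speedup factor is an optimistic assumption from the post above.

cores = 8
threads_per_core = 2      # Hyper-Threading: two hardware threads per core
simd_speedup = 4          # assumed best-case SSE4-style vector speedup

hw_threads = cores * threads_per_core
effective_parallelism = hw_threads * simd_speedup

print(hw_threads)             # 16 hardware threads
print(effective_parallelism)  # 64-way effective parallelism (best case)
```

Of course, real ray-tracing workloads rarely scale perfectly with either threads or vector width, so this is an upper bound, not a prediction.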
It's still unknown what vectorization improvements exist in Nehalem, so that's just a bonus.
So Intel's claim in the above link of RtRT in 2 years isn't looking undoable from my perspective.
Couple that with the fact that creating games for RtRT is easier than for present-day GPUs, and things are looking very good.
Nehalem-C is the CPU in the Nehalem family I want.
I believe this is why AMD won't use the full SSE4 and SSE4.1 instruction sets, and why Intel won't use SSE5.
Intel and AMD are going in different directions. Who wins this race is unknown at this time. My money is on Intel.
The clear loser here is NV.
If things are really that parallelizable then ray tracing is far more likely to run better on the GPU, like Folding@Home, which ran about 50 times faster on a GPU than on a CPU last time I checked. In two years' time, we might be able to get an octal-core CPU in a single socket for the same price, running at a higher frequency with better IPC and faster system RAM.
But where are these scenarios you list? Where are these games running at playable FPS at high resolutions?
Ya, I see your point: why would anyone want the CPU to render almost-perfect lighting effects in games at playable FPS, at high resolutions?
I think it's possible to do it with shaders; the problem is no-one is going that route, likely because it's too expensive even for GPUs.
OK, I wasn't aware of any GPU out on the market that can do RtRT.
Uh-huh. Have you been paying attention to the GPU industry? They basically double performance every 12 months. In 2009 we are likely going to see GPUs that we can't even imagine today.
Then of course we'll have the Nehalem version of Skulltrail, with 32 threads, running up to 4 Larrabee cards. So the power Intel will have to offer in '09 will be massive compared to 2 months from now.
Again, where in practice are these magical scenarios you list? You haven't even posted any links to the information you've copied from other websites.
Be honest now, wouldn't you like to play a game such as Doom 4 on a 30" screen?
Originally posted by: Nemesis 1
I just found this. Great read. RtRT may be only 2 years away:
http://www.pcper.com/article.php?aid=455
According to Daniel and the Intel team, the magic number of rays they'll need to process each second to achieve that "game quality" and frame rate is around one billion (though interesting designs can be done with considerably fewer). That would allow for about 30 rays per pixel to be processed for each frame, with different rays necessary for colors, lighting and other special effects. Doing the math, at a 1024x768 resolution for a total of 786,432 pixels, times 30 rays per pixel and 60 frames per second, you get 1.415 billion rays per second required. That is an impressive amount of processing horsepower and a level that we just are not yet at. The dual Clovertown system running our live demo was pushing approximately 83 million rays per second plus the work of standard trilinear filtering.
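The article's ray-budget arithmetic checks out, and it can be reproduced in a few lines. The figures (1024x768, 30 rays per pixel, 60 fps, ~83 million rays/s on the demo system) come straight from the quoted article; the gap calculation at the end is my own addition.

```python
# Ray-budget arithmetic from the quoted pcper.com article:
# pixels x rays-per-pixel x frames-per-second = required rays per second.

width, height = 1024, 768
rays_per_pixel = 30
fps = 60

pixels = width * height                          # 786,432 pixels
rays_per_second = pixels * rays_per_pixel * fps  # 1,415,577,600 (~1.415 billion)

print(pixels)
print(rays_per_second)

# The dual-Clovertown IDF demo managed roughly 83 million rays/s,
# so the hardware was about 17x short of the stated target.
demo_rate = 83_000_000
print(rays_per_second / demo_rate)
```

That ~17x shortfall is the gap the 2-year prediction is betting on: more cores, Hyper-Threading, and wider vector units closing it together.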