
Intel GPU in the PS4?

Originally posted by: SickBeast
I guess I just fail to see how a bunch of puny Atom-like CPUs can somehow emulate a GPU and run somewhat fast. I just don't like the direction Intel is going; Larrabee does not look to be a good product to me. It's too radical. I don't know why Sony keeps going with radical designs lately.

Heard of a thing called an SP?
http://en.wikipedia.org/wiki/Stream_processing

A single Larrabee core is very similar to a bank of SP units from Nvidia or ATI... the only difference is that it wastes a little bit more space on x86 decoding hardware too.

And Intel already has a driver and engineering samples. They just need to tape it out a bit more and then release.

Technically, if you knew all the laws of the universe, had infinite RAM, and took a snapshot of the entirety of reality, you could write an x86 (or ARM, or punch cards, or whatever) program that could emulate the entirety of existence... it would just be really, really REALLY slow.

But they are not emulating a GPU here... they are emulating a FEW GPU components, adding an x86 component, and the rest is just an Intel SP instead of an Nvidia or ATI one... and an SP is basically an array of floating-point units.
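To make the SP idea concrete, here is a minimal sketch in Python (purely illustrative; the kernel and function names are made up, and a real GPU or Larrabee core would run this across wide vector hardware, not a Python loop): one small kernel applied independently to every element of a data stream.

```python
# Toy stream-processing model: one small "kernel" runs independently on
# every element of a data stream. A bank of SP units executes these
# iterations in parallel in hardware; Python just makes the idea visible.

def kernel(vertex, scale):
    """Per-element work: a trivial scale of one 3D vertex."""
    x, y, z = vertex
    return (x * scale, y * scale, z * scale)

def run_stream(stream, scale):
    # Hardware would map each iteration onto its own SP lane.
    return [kernel(v, scale) for v in stream]

vertices = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0), (7.0, 8.0, 9.0)]
print(run_stream(vertices, 0.5))
# -> [(0.5, 1.0, 1.5), (2.0, 2.5, 3.0), (3.5, 4.0, 4.5)]
```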
 
Originally posted by: taltamir
Technically, if you knew all the laws of the universe, had infinite RAM, and took a snapshot of the entirety of reality, you could write an x86 (or ARM, or punch cards, or whatever) program that could emulate the entirety of existence... it would just be really, really REALLY slow.

You might miss a few subtleties of the universe if downsampled to fit within the limitations of 64-bit computing 😛 😉 But yeah, I know exactly what you are saying, and in spirit you are right: an x86 processor running Universe Emulator v1.0 would be damn slow, and hopefully it's not an OC'ed processor that dies from instability after the first million years of computing the first picosecond of the universe's life :laugh:
 
There is one reason you'll never see another Intel processor in a console.

Cost.

As long as consoles target the $300-and-below market, PC CPUs are too expensive. They sell for far more in PCs than commodity devices could absorb.

Manufacturing-wise I suppose Intel could handle it, but unless there's a big downturn in PC demand that they don't foresee, it wouldn't be the most economical decision for Intel.

I could see Atom in a console, however. In the Wii's successor.
 
Originally posted by: Fox5
There is one reason you'll never see another Intel processor in a console.

Cost.

As long as consoles target the $300-and-below market, PC CPUs are too expensive. They sell for far more in PCs than commodity devices could absorb.

Manufacturing-wise I suppose Intel could handle it, but unless there's a big downturn in PC demand that they don't foresee, it wouldn't be the most economical decision for Intel.

I could see Atom in a console, however. In the Wii's successor.

What was wrong with the 733MHz P3 used in the Xbox? It worked out just fine then; what's to keep a 22nm Intel octo-core chip from being used in consoles come 2012?

Let's face it, who else on the planet is going to be able to produce ICs on a leading-edge 22nm process before 2014 other than Intel? To not sign them up at this time is to leave that advantage to your competition.

I pity the CEO that will be looking back saying "well, we really thought TSMC would have 22nm ready for ATI's GPU to go into our console in 2012; who knew the foundries would not be able to deliver on their roadmaps..." Or IBM for a 22nm Cell, etc.
 
Originally posted by: taltamir
Originally posted by: SickBeast
I guess I just fail to see how a bunch of puny Atom-like CPUs can somehow emulate a GPU and run somewhat fast. I just don't like the direction Intel is going; Larrabee does not look to be a good product to me. It's too radical. I don't know why Sony keeps going with radical designs lately.

Heard of a thing called an SP?
http://en.wikipedia.org/wiki/Stream_processing

A single Larrabee core is very similar to a bank of SP units from Nvidia or ATI... the only difference is that it wastes a little bit more space on x86 decoding hardware too.

And Intel already has a driver and engineering samples. They just need to tape it out a bit more and then release.

Technically, if you knew all the laws of the universe, had infinite RAM, and took a snapshot of the entirety of reality, you could write an x86 (or ARM, or punch cards, or whatever) program that could emulate the entirety of existence... it would just be really, really REALLY slow.

But they are not emulating a GPU here... they are emulating a FEW GPU components, adding an x86 component, and the rest is just an Intel SP instead of an Nvidia or ATI one... and an SP is basically an array of floating-point units.

taltamir, I would like to see where you get your information on Larrabee. I have been looking for some up-to-date information on it for about a month now (the last AT article was what, September?). Hearing that they already have engineering samples of their product out is great news in my book, since I am looking at a GPU upgrade around the time these are slated for release, and even if they don't perform at the performance-crown level, it could drive prices on other hardware down further.
 
Originally posted by: Idontcare
Originally posted by: Fox5
There is one reason you'll never see another Intel processor in a console.

Cost.

As long as consoles target the $300-and-below market, PC CPUs are too expensive. They sell for far more in PCs than commodity devices could absorb.

Manufacturing-wise I suppose Intel could handle it, but unless there's a big downturn in PC demand that they don't foresee, it wouldn't be the most economical decision for Intel.

I could see Atom in a console, however. In the Wii's successor.

What was wrong with the 733MHz P3 used in the Xbox? It worked out just fine then; what's to keep a 22nm Intel octo-core chip from being used in consoles come 2012?

Let's face it, who else on the planet is going to be able to produce ICs on a leading-edge 22nm process before 2014 other than Intel? To not sign them up at this time is to leave that advantage to your competition.

I pity the CEO that will be looking back saying "well, we really thought TSMC would have 22nm ready for ATI's GPU to go into our console in 2012; who knew the foundries would not be able to deliver on their roadmaps..." Or IBM for a 22nm Cell, etc.

Yes... Intel provided the P3 for the Xbox, and the triple-core 3.2GHz Xenon used in the Xbox 360... and most of the world's chipsets...
Intel, while expensive to the enthusiast system builder, has rebates in place to make them cheap to system integrators like Dell... Intel is out to make a buck, and if they have to undercut AMD to score millions of consoles, then they will...
 
Originally posted by: faxon
taltamir, I would like to see where you get your information on Larrabee. I have been looking for some up-to-date information on it for about a month now (the last AT article was what, September?). Hearing that they already have engineering samples of their product out is great news in my book, since I am looking at a GPU upgrade around the time these are slated for release, and even if they don't perform at the performance-crown level, it could drive prices on other hardware down further.

mmm... this statement makes me think I might be using the term incorrectly...
Let me check wiki...
Engineering samples are the beta versions of CPUs that are meant to be used as demonstrators. Usually, they are picked out of a very large bunch and perform well. However, they may have many flaws that were fixed in the production model.
Yep... this is NOT what I meant. I did not mean that they have an almost-ready sample used for benchmarks and showcasing and to give to select partners.

I meant that I read that they fabbed some early hardware and are using it for various performance testing as part of the ongoing development process.
Wiki mentions the ES were supposed to come at the end of '08, but there is no follow-up, and the article they cite makes no mention of it:
http://en.wikipedia.org/wiki/Larrabee_(GPU)

The point I was making is that Intel is testing and building structures and designing intelligently; they are not just shooting in the dark...
Oh, interestingly enough, that article mentions that in addition to the arrays of processors, Intel will include fixed-function stuff...
 
Originally posted by: taltamir
Yes... Intel provided the P3 for the Xbox, and the triple-core 3.2GHz Xenon used in the Xbox 360... and most of the world's chipsets...
Intel, while expensive to the enthusiast system builder, has rebates in place to make them cheap to system integrators like Dell... Intel is out to make a buck, and if they have to undercut AMD to score millions of consoles, then they will...
The Xenon is an IBM CPU, not Intel. Intel is looking for a console deal to get developers to code directly for Larrabee. Let's face it, more and more titles are getting ported to the PC; having developers code directly for an Intel GPU could benefit it a lot down the road when Larrabee 2 hits the PC.
 
The Xbox 1 CPU wasn't even a Pentium 3; it was a Celeron with a measly 128KB of L2 cache, half of what the Pentium 3 originally had.
 
Originally posted by: evolucion8
The Xbox 1 CPU wasn't even a Pentium 3; it was a Celeron with a measly 128KB of L2 cache, half of what the Pentium 3 originally had.
It was actually a mobile Pentium 3 with half the cache.

 
The Celeron has always been a lower-cache version of a mainstream processor. The first was based on the Pentium 2, the latest is a C2D-based Celeron, and every chip in between had a "Celeron" version of it... this is why it is better to describe it as a P3 than a Celeron, since otherwise you don't know which Celeron it is based on.
 
Originally posted by: chizow
They're actually both based on existing RISC processors, which have always been better than their general-purpose long instruction set x86 counterparts for gaming consoles.

I don't remember ever seeing x86-based processors in consoles, though I also haven't watched the console market closely, so tell me: which consoles used x86-based processors and were so horrible because of it?
 
Originally posted by: nosfe
Originally posted by: chizow
They're actually both based on existing RISC processors, which have always been better than their general-purpose long instruction set x86 counterparts for gaming consoles.

I don't remember ever seeing x86-based processors in consoles, though I also haven't watched the console market closely, so tell me: which consoles used x86-based processors and were so horrible because of it?

The only console that featured an x86 processor was the original Xbox. And no, it wasn't horrible because of the CPU.

Oh yeah, I almost forgot the Infinium Labs Phantom. It was supposed to use an x86 CPU.
 
Originally posted by: Idontcare
Originally posted by: Fox5
There is one reason you'll never see another Intel processor in a console.

Cost.

As long as consoles target the $300-and-below market, PC CPUs are too expensive. They sell for far more in PCs than commodity devices could absorb.

Manufacturing-wise I suppose Intel could handle it, but unless there's a big downturn in PC demand that they don't foresee, it wouldn't be the most economical decision for Intel.

I could see Atom in a console, however. In the Wii's successor.

What was wrong with the 733MHz P3 used in the Xbox? It worked out just fine then; what's to keep a 22nm Intel octo-core chip from being used in consoles come 2012?

Let's face it, who else on the planet is going to be able to produce ICs on a leading-edge 22nm process before 2014 other than Intel? To not sign them up at this time is to leave that advantage to your competition.

I pity the CEO that will be looking back saying "well, we really thought TSMC would have 22nm ready for ATI's GPU to go into our console in 2012; who knew the foundries would not be able to deliver on their roadmaps..." Or IBM for a 22nm Cell, etc.

The issue isn't that Intel can't make a competitive part; it's that Intel makes more money selling their chips on the x86 commodity market than they could in a console. Several times as much. They also haven't shown much willingness to sell designs wholesale as IBM does, which is a major problem for cost reductions in consoles.
Profit margins on x86 chips are way higher than on most CPUs. The profit margins that graphics cards see are closer to what is reasonable to include in a console, but they also cost way more to produce (for a top-end chip, anyway), leaving the 360 and PS3 with rather small graphics chips compared to what was available on the PC at the time.

In case you're curious, an X1900 XT had a die size of 315mm^2, while Xenos is about 210mm^2 and RSX is about 240mm^2. Additionally, both Xenos and RSX had built-in redundancy, since failed chips couldn't be sold as cut-down versions like on the PC. Overall, slightly better than the mid-range chips (comparable to the $200-$300 parts, not the $300-$400 ones) that were available when the 360 launched, and rather low-end by the time the PS3 launched. In Microsoft's case, they paid to own the design outright, and in Sony's case there are rumors that they're getting a bit of a raw deal, with Nvidia selling the chips to them directly. Not quite applicable, though, because Nvidia and ATI don't own their own fabs; Intel does, and generally can make good use of just about all of its fab capacity without selling cheap chips into the console market.
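To put those die sizes in perspective, here is a rough dies-per-wafer estimate in Python (a sketch only: the 300mm wafer is my assumption, and it ignores edge loss, scribe lines, and yield, so real numbers would be lower):

```python
import math

# Rough dies-per-wafer comparison for the die sizes quoted above.
# Ignores edge loss, scribe lines, and yield; an illustration, not
# real foundry math.
WAFER_DIAMETER_MM = 300.0
wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2  # ~70,686 mm^2

for name, die_mm2 in [("X1900 XT", 315), ("Xenos", 210), ("RSX", 240)]:
    print(f"{name}: ~{wafer_area / die_mm2:.0f} gross dies per wafer")
# Smaller dies mean more (and cheaper) chips per wafer, which is why
# the console GPUs were sized well below the top PC parts of the day.
```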

Yes... Intel provided the P3 for the Xbox, and the triple-core 3.2GHz Xenon used in the Xbox 360... and most of the world's chipsets...
Intel, while expensive to the enthusiast system builder, has rebates in place to make them cheap to system integrators like Dell... Intel is out to make a buck, and if they have to undercut AMD to score millions of consoles, then they will...

Intel dumped P3-based Celerons on the Xbox that were unsold due to AMD's surprising success in the PC market at the time. After the initial supply of cheap Celerons dried up, subsequent orders were expensive for Microsoft.
The triple-core in the 360 is provided by IBM. The chipset is provided by SiS.
And Intel usually doesn't have to undercut AMD. AMD only competed for the original Xbox and generally isn't involved in consoles. As cheap as AMD chips are, console chips have to sell for cheaper.
Additionally, Intel has always had a higher average selling price than AMD regardless of the performance situation, so no, they don't need to undercut AMD. They also tend to have lower manufacturing costs, sometimes significantly. Atoms may not be much to look at performance-wise, but they're a third the size of a single-core Athlon 64 and sell for roughly what a console CPU would have to sell for (yet would be way inferior for the purpose).

Though considering the multithreading revolution didn't happen quite as much as was expected, and that Intel currently leads everybody in single-threaded performance, I could see Intel getting in if it offers enough of a performance boost. Give the i7 some time to come down in price, or alternatively maybe scoop up some Core 2 Quads cheap (but I'd assume not, since those will be phased out by the time the next console launches). The i7's Hyper-Threading also works extremely well, which would make it attractive even if heavily threaded programs are the future.
 
Well, I will put this here. Thank you, moderator. :thumbsup:

When I think of Intel and Larrabee, I think of the claim from Intel that rasterization is coming to an end and that, if it's up to Intel, every game will be ray traced. Daniel Pohl did it with the Quake 3 engine and has done it with the Quake 4 engine. This could be a selling point for the PS4: ray-traced games. Intel has done a lot of research on it, and if anybody knows how to make compilers, it's Intel. Thus, in my opinion, development tools for Larrabee will not be an issue. Together with Intel's advanced process node, Sony might have a winner. But they will need a game producer willing to make a ray-traced game engine.

larrabee

quake 4 engine raytracing

raytracing the Intel way

Carmack's idea on raytracing.


I wonder if a beefed-up Cell together with Larrabee can pull this off.
And I find this very interesting:
In a neat bending of technology to an unintended use, Daniel Pohl did one really cool thing, he used the same rays that you use for graphics to do collision detection. You cast rays out from the player and everything they hit may be an object. Since the math is being done already, collision detection, one of the harder problems with 3D games, is done for you. It isn't free, but considering how many millions of pixels there are on a screen, 1600*1200 would be almost 2 million pixels, a few hundred more per object is rounding error. You can do much more accurate collisions for every bullet and bit of debris spinning around, and do it right.

When I read the quote above, I think this could be used for physics as well.
I forgot to mention that Havok is also owned by Intel. Havok builds physics and game engines for games and movies. So we could have Havok and id Software both doing research and actually designing a ray-tracing game engine where the ray tracing is used not only for the visual output but also for collision detection and physics?

havok
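Here is a minimal sketch of that collision-detection idea in Python (the scene and numbers are made up, and this is a plain ray-sphere test rather than anything from Pohl's engine): the same intersection routine that decides what a pixel's ray sees can also report whether, and where, a bullet travelling along that ray hits something.

```python
import math

# Reusing rendering rays for collision detection: the ray-sphere test
# that shades a pixel also tells you whether a projectile along that
# ray would hit. Scene values below are made up for illustration.

def ray_sphere_hit(origin, direction, center, radius):
    """Distance t to the nearest hit, or None if the ray misses.
    `direction` is assumed normalized, so the quadratic's a == 1."""
    ox = origin[0] - center[0]
    oy = origin[1] - center[1]
    oz = origin[2] - center[2]
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Rendering asks: what does this pixel's ray see?
# Physics reuses the answer: a bullet fired along the same ray
# collides with the sphere at the same distance.
t = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 10), 2.0)
print(t)  # -> 8.0 (sphere at z=10 with radius 2; first hit at z=8)
```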


If this happens, we will see the same shift on the PC as well.

AMD/ATI and Intel both have the same idea about future gaming, and gfx chips lend themselves more and more to these kinds of calculations.

The future sure looks promising...
 
Interesting. I don't know if Intel can put together a GPU, but at least they won't put in a massively parallel computer with poor programmer support, as in the badly designed PS3. However, if the PS4 does DX11, that would be a plus, since developers can use the same code for both the Xbox 360's successor and the PS4; it can practically leech games off the MS console.
 
Originally posted by: nyker96
Interesting. I don't know if Intel can put together a GPU, but at least they won't put in a massively parallel computer with poor programmer support, as in the badly designed PS3. However, if the PS4 does DX11, that would be a plus, since developers can use the same code for both the Xbox 360's successor and the PS4; it can practically leech games off the MS console.

nyker96, this is presumably going to be a Larrabee-based solution for the PlayStation. Larrabee has been in the works at Intel for nearly 4 years now.
 
So much discussion over a work of fiction.

When LarryB does see the light of day (next year? the year after?), I doubt it will rival whatever ATI/Nvidia have to offer.
 
Well, ray tracing seems to scale almost perfectly linearly with the number of cores: 4x the cores means 4x faster ray-tracing calculations, but not 4x faster rasterization. Maybe the many-core era needs some killer application, and ray tracing just might be it. From what I have read, current GPUs with their advanced shader capabilities could maybe lend themselves to the same kind of calculations, or at least future generations could. I am just guessing here, though; gathering information and hoping is mostly what I do here 🙂. And since Nvidia wants to jump into the x86 business, it all kind of fits together. AMD wants to go the path of (x86?) many-core, Intel is on the path of x86 many-core, and both AMD and Nvidia already have a lot of experience. Look at how the current generation of gfx chips is built.
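A quick back-of-the-envelope version of that scaling argument, in Python (the parallel fractions below are made-up illustrations, not measurements): per-pixel ray tracing has almost no serial part, so by Amdahl's law its speedup tracks the core count, while a workload with a larger serial share flattens out.

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n) for parallel fraction p
# on n cores. The fractions below are illustrative guesses only.

def amdahl_speedup(p, cores):
    return 1.0 / ((1.0 - p) + p / cores)

for cores in (1, 4, 16, 64):
    rt = amdahl_speedup(0.99, cores)     # ray tracing: nearly all parallel
    other = amdahl_speedup(0.80, cores)  # workload with a 20% serial part
    print(f"{cores:3d} cores: ray tracing ~{rt:4.1f}x, serial-heavy ~{other:4.1f}x")
# At 64 cores: ray tracing ~39.3x vs only ~4.7x for the serial-heavy case.
```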
 
This whole discussion reminds me of my surprise a bit over two years ago upon hearing that Apple was going to launch new Macs using Intel CPUs. If you recall, that was when Intel was still selling NetBurst processors and AMD was winning the war with their K8 architecture. I wondered about the decision to use Intel chips for a month or so, until the C2D/C2Q processors hit the market. Then, suddenly, the decision made a lot of sense.

Could we be about to see another major market-shifting release in the next couple of years? That could easily spell the end for AMD (if they're even still around when it happens), and maybe even Nvidia, if Larrabee scales as well as it appears and they can get enough cores packed on-die to push games properly.
 
Originally posted by: William Gaatjes
Well, I will put this here. Thank you, moderator. :thumbsup:

When I think of Intel and Larrabee, I think of the claim from Intel that rasterization is coming to an end and that, if it's up to Intel, every game will be ray traced. Daniel Pohl did it with the Quake 3 engine and has done it with the Quake 4 engine. This could be a selling point for the PS4: ray-traced games. Intel has done a lot of research on it, and if anybody knows how to make compilers, it's Intel. Thus, in my opinion, development tools for Larrabee will not be an issue. Together with Intel's advanced process node, Sony might have a winner. But they will need a game producer willing to make a ray-traced game engine.

larrabee

quake 4 engine raytracing

raytracing the Intel way

Carmack's idea on raytracing.


I wonder if a beefed-up Cell together with Larrabee can pull this off.
And I find this very interesting:
In a neat bending of technology to an unintended use, Daniel Pohl did one really cool thing, he used the same rays that you use for graphics to do collision detection. You cast rays out from the player and everything they hit may be an object. Since the math is being done already, collision detection, one of the harder problems with 3D games, is done for you. It isn't free, but considering how many millions of pixels there are on a screen, 1600*1200 would be almost 2 million pixels, a few hundred more per object is rounding error. You can do much more accurate collisions for every bullet and bit of debris spinning around, and do it right.

When I read the quote above, I think this could be used for physics as well.
I forgot to mention that Havok is also owned by Intel. Havok builds physics and game engines for games and movies. So we could have Havok and id Software both doing research and actually designing a ray-tracing game engine where the ray tracing is used not only for the visual output but also for collision detection and physics?

havok


If this happens, we will see the same shift on the PC as well.

AMD/ATI and Intel both have the same idea about future gaming, and gfx chips lend themselves more and more to these kinds of calculations.

The future sure looks promising...

I liked how you picked up on the collision detection and then looked right at Havok. Yeah, Carmack is still singing the same song. But if you look at the timeframe, when Larrabee appears, Westmere appears also. Looks like Intel has all the pieces in place. Read about the company Neoptica and why Nehalem/X58 and Westmere are so important to the success of Larrabee. Also, there is a new demo of Project Offset; it just hasn't been released to the public yet.

Yep. It's even possible AMD could beat Intel to the punch. ATI has a very good ray-tracing card there. With DX11 it will be possible for AMD to offload a lot of work to the CPUs, just like Intel is doing. The big difference between Intel and ATI/AMD is something you already touched upon: Intel's compilers. That's why a company like Neoptica is so much needed.
9 months. Seems like forever to me.

Thought you guys might like this link.

http://game-on.intel.com/eng/index.aspx

 