AnandTech Forums > Hardware and Technology > Video Cards and Graphics

Old 02-09-2009, 10:36 AM   #51
taltamir
Lifer
 
taltamir's Avatar
 
Join Date: Mar 2004
Posts: 13,509
Default Intel GPU in the PS4?

Quote:
Originally posted by: Wreckage
So much discussion over a work of fiction.

When LarryB does see the light of day (next year? the year after?), I doubt it will rival whatever ATI/NVIDIA have to offer.
Considering Intel's amazing recent releases (Nehalem and the X25 SSD)... they might pull a similar rabbit out of the hat with Larrabee.
__________________
How to protect your data guide
AA Naming Guide

I do not have a superman complex; for I am God, not superman!
The internet is a source of infinite information; the vast majority of which happens to be wrong.
taltamir is offline   Reply With Quote
Old 02-09-2009, 11:11 AM   #52
nosfe
Senior Member
 
Join Date: Aug 2007
Posts: 424
Default Intel GPU in the PS4?

Making developers optimize for Larrabee shouldn't be too much of a problem considering how big Intel is, but the real problem with Larrabee will be the drivers. Even though they're not being made by the IGP driver team, the Larrabee driver team still has to start from scratch, whereas ATI/NVIDIA have been working on theirs for years on end. All that said, I'm really looking forward to seeing how it'll perform in Photoshop and the like, as initially it'll most likely suck at gaming.
nosfe is offline   Reply With Quote
Old 02-09-2009, 11:35 AM   #53
Idontcare
Administrator
Elite Member
 
Idontcare's Avatar
 
Join Date: Oct 1999
Location: 台北市
Posts: 20,115
Default Intel GPU in the PS4?

Quote:
Originally posted by: Denithor
This whole discussion reminds me of my surprise a bit over two years ago upon hearing that Apple was going to launch a new Mac using Intel CPUs. If you recall, that was when Intel was still using Netburst processors & AMD was winning the war with their K8 architecture. I wondered about the decision to use Intel chips for like a month or so until the C2D/C2Q processors hit the market. Then, suddenly, the decision made a lot of sense.

Could we be about to see another major market-shifting release in the next couple of years? That could easily spell the end for AMD (if they're even still around when it happens) and maybe even nVidia, if Larrabee scales as well as it appears & they can get enough cores packed on-die to push games properly.
I like your thinking here. Very logical. I will use it to replace my prior vision of the likely future reality until someone posts a superior alternative.
Idontcare is offline   Reply With Quote
Old 02-09-2009, 12:00 PM   #54
Denithor
Diamond Member
 
Denithor's Avatar
 
Join Date: Apr 2004
Location: Charlotte, NC
Posts: 5,479
Default Intel GPU in the PS4?

Quote:
Originally posted by: Idontcare
Quote:
Originally posted by: Denithor
This whole discussion reminds me of my surprise a bit over two years ago upon hearing that Apple was going to launch a new Mac using Intel CPUs. If you recall, that was when Intel was still using Netburst processors & AMD was winning the war with their K8 architecture. I wondered about the decision to use Intel chips for like a month or so until the C2D/C2Q processors hit the market. Then, suddenly, the decision made a lot of sense.

Could we be about to see another major market-shifting release in the next couple of years? That could easily spell the end for AMD (if they're even still around when it happens) and maybe even nVidia, if Larrabee scales as well as it appears & they can get enough cores packed on-die to push games properly.
I like your thinking here. Very logical. I will use it to replace my prior vision of the likely future reality until someone posts a superior alternative.
The big question is how many cores can Intel pack onto the die? Keep in mind they're competing (today) with a GPU containing 216/240 or 160/800 "cores" depending on which architecture you choose.

I have to wonder how many GPU stream processors each Larrabee core will equal?
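As a very rough way of framing that, here is a back-of-the-envelope peak single-precision FLOPS comparison. This is only a sketch: the 16-wide vector unit per Larrabee core comes from Intel's published material, but the 32-core count, the 2 GHz clock and the exact GPU shader clocks below are assumptions, not confirmed specs.

Code:
# Peak single-precision GFLOPS = units x flops per unit per clock x clock (GHz).
# All Larrabee figures are guesses for illustration only.

def peak_gflops(units, flops_per_unit_per_clock, clock_ghz):
    return units * flops_per_unit_per_clock * clock_ghz

gtx_280  = peak_gflops(240, 3, 1.296)    # 240 SPs, MAD + MUL per shader clock
hd_4870  = peak_gflops(800, 2, 0.750)    # 800 ALUs, MAD per clock
larrabee = peak_gflops(32 * 16, 2, 2.0)  # 32 cores x 16 SP lanes, FMA, 2 GHz (assumed)

print(f"GTX 280  ~{gtx_280:.0f} GFLOPS")   # ~933
print(f"HD 4870  ~{hd_4870:.0f} GFLOPS")   # 1200
print(f"Larrabee ~{larrabee:.0f} GFLOPS")  # 2048, hypothetical

By that crude yardstick, one 16-wide Larrabee core counts per clock roughly like 16 of AMD's ALUs or about ten of NVIDIA's SPs, but peak FLOPS says nothing about how well texturing, ROP work and the rest of the fixed-function load map onto it.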
__________________
GTX 650 Ti Boost -- i7 3770 (4.2GHz) -- Z77 Pro4-M -- Intel 330 180GB -- 16GB DDR3-1866 -- 50" LG plasma

Originally posted by: ironwing
Adam should'a bought a PC instead. Eve fell for the marketing hype.
Denithor is offline   Reply With Quote
Old 02-09-2009, 12:20 PM   #55
BTRY B 529th FA BN
Lifer
 
BTRY B 529th FA BN's Avatar
 
Join Date: Nov 2005
Posts: 12,747
Default Intel GPU in the PS4?

How will hyper-threading affect scaling?
__________________
Rig1: Xeon X5660 6core 32nm @ 4GHz, Asrock X58, GTX 670 FTW, Samsung 128Gb 840 Pro, Vertex2 100Gb, Cherry MX Reds, 2009 Seasonic X 650w, Aluminum Full tower
Rig2: i7 970 6core 32nm @ Stock, EVGA 760 Classified,
Nvidia 0db 210, Elpida Hyper CL6.6.6.15, Samsung Evo 120Gb, Vertex LE 100Gb, Cherry MX Blacks, Sidewinder X3, Enermax Revo 950w, Obsidian 800D - F@H machine
BTRY B 529th FA BN is offline   Reply With Quote
Old 02-09-2009, 12:29 PM   #56
William Gaatjes
Diamond Member
 
William Gaatjes's Avatar
 
Join Date: May 2008
Location: In a house.
Posts: 7,224
Default Intel GPU in the PS4?

Quote:
Originally posted by: Nemesis 1
Quote:
Originally posted by: William Gaatjes
Well, I will put this here, thank you moderator.

When I think of Intel and Larrabee, I think of Intel's claim that rasterization is coming to an end and that, if it is up to Intel, every game will be ray traced...

The future sure looks promising...
I liked how you picked up on the collision detection and then looked right at Havok. Yeah, Carmack is still singing the same song. But if you look at the timeframe, Westmere appears around the same time Larrabee does. It looks like Intel has all the pieces in place. Read about this company Neoptica and why Nehalem, X58 and Westmere are so important to the success of Larrabee. Also, there is a new demo of Project Offset; it just hasn't been released to the public yet.

Yep, it's even possible AMD could beat Intel to the punch. ATI has a very good ray-tracing card there. With DX11 it will be possible for AMD to offload a lot of work to the CPUs, just like Intel is doing. The big difference between Intel and ATI/AMD is something you already touched upon: Intel's compilers. That's why this company is so badly needed:
Neoptica.
9 months. Seems like forever to me.

Thought you guys might like this link.

http://game-on.intel.com/eng/index.aspx
That was a good one about Neoptica, thank you. :lips:
I have found some information about Neoptica:

Neoptica acquisition

Well, that sure gives me a stronger feeling that ray tracing or a mixed ray tracer/rasterizer is on its way to the gamer sooner than we think...

A quote from the article...

Quote:
8 employees (including the two co-founders) according to Neoptica's official website. 3 have a background with NVIDIA's Software Architecture group: Matt Pharr (editor of GPU Gems 2) and Craig Kolb who were also Exluna co-founders, and Geoff Berry. Tim Foley also worked there as an intern. 2 are ex-Electronic Arts employees: Jean-Luc Duprat (who also worked at Dreamworks Feature Animation) and Paul Lalonde. Nat Duca comes from Sony, where he led the development of the RSX tool suite and software development partnerships. Aaron Lefohn comes from Pixar, where he worked on GPU acceleration for rendering and interactive film preview. Pat Hanrahan also was on the technical advisory board. He used to work at Pixar, where he was the chief architect of the Renderman Interface protocol. His PhD students were also responsible for the creation of both Brook and CUDA.
It seems to me Intel wants to make sure Larrabee has a "job to do" when released into the wild. The Neoptica people say in this PDF more or less the same as Carmack does in the link in my other post, if I have interpreted it correctly...

Neoptica whitepaper

Some more information.


__________________
To expand ones knowledge is to expand ones life.
<< Armchair Solomon >>
(\__/)
(='.'=)
(")_(")
William Gaatjes is offline   Reply With Quote
Old 02-09-2009, 01:05 PM   #57
Nemesis 1
Banned
 
Join Date: Dec 2006
Posts: 11,379
Default Intel GPU in the PS4?

Yep, that's the way I read it. The difference is that very high-speed link that's needed between the CPU and the GPU. That's the key. That's what John just doesn't get.
Nemesis 1 is offline   Reply With Quote
Old 02-09-2009, 01:20 PM   #58
taltamir
Lifer
 
taltamir's Avatar
 
Join Date: Mar 2004
Posts: 13,509
Default Intel GPU in the PS4?

Quote:
Originally posted by: nosfe
Making developers optimize for Larrabee shouldn't be too much of a problem considering how big Intel is, but the real problem with Larrabee will be the drivers. Even though they're not being made by the IGP driver team, the Larrabee driver team still has to start from scratch, whereas ATI/NVIDIA have been working on theirs for years on end. All that said, I'm really looking forward to seeing how it'll perform in Photoshop and the like, as initially it'll most likely suck at gaming.
They have WORKING DX10 drivers for their IGP...
And Larrabee will include fixed-function parts; it is not one hundred percent x86 emulation...
__________________
How to protect your data guide
AA Naming Guide

I do not have a superman complex; for I am God, not superman!
The internet is a source of infinite information; the vast majority of which happens to be wrong.
taltamir is offline   Reply With Quote
Old 02-09-2009, 01:31 PM   #59
William Gaatjes
Diamond Member
 
William Gaatjes's Avatar
 
Join Date: May 2008
Location: In a house.
Posts: 7,224
Default Intel GPU in the PS4?

Quote:
Originally posted by: Nemesis 1
Yep, that's the way I read it. The difference is that very high-speed link that's needed between the CPU and the GPU. That's the key. That's what John just doesn't get.
It's funny you mention that: AMD wanted to integrate a GPU-like core together with the CPU on one die. Is AMD Fusion not the same idea as Larrabee? That would need a high-speed link in between. A possibility would be a 32-bit HyperTransport 3 channel between CPU and GPU; at 32 bits wide that would be 25.6 GB/s per direction, or a total aggregate bandwidth of 51.2 GB/s. That is the easy way. Well, interesting times are coming again. That is, if the prophecy of Jim Rogers does not come completely true.
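For anyone wondering where those numbers come from, here is the arithmetic; it assumes the 3.2 GHz HyperTransport 3.1 link clock, and slower link clocks scale the figures down accordingly.

Code:
# 32-bit HyperTransport 3.x link, double data rate, full duplex.
clock_ghz = 3.2                      # assumed HT 3.1 link clock
transfers_per_clock = 2              # DDR signalling
link_width_bytes = 32 / 8            # 32 bits in each direction

per_direction = clock_ghz * transfers_per_clock * link_width_bytes
aggregate = 2 * per_direction        # both directions running at once

print(per_direction, "GB/s per direction")  # 25.6
print(aggregate, "GB/s aggregate")          # 51.2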

__________________
To expand ones knowledge is to expand ones life.
<< Armchair Solomon >>
(\__/)
(='.'=)
(")_(")
William Gaatjes is offline   Reply With Quote
Old 02-09-2009, 02:40 PM   #60
Cookie Monster
Diamond Member
 
Cookie Monster's Avatar
 
Join Date: May 2005
Location: Down Under
Posts: 4,749
Default Intel GPU in the PS4?

Quote:
Originally posted by: taltamir
Quote:
Originally posted by: nosfe
Making developers optimize for Larrabee shouldn't be too much of a problem considering how big Intel is, but the real problem with Larrabee will be the drivers. Even though they're not being made by the IGP driver team, the Larrabee driver team still has to start from scratch, whereas ATI/NVIDIA have been working on theirs for years on end. All that said, I'm really looking forward to seeing how it'll perform in Photoshop and the like, as initially it'll most likely suck at gaming.
They have WORKING DX10 drivers for their IGP...
And Larrabee will include fixed-function parts; it is not one hundred percent x86 emulation...
You would be grateful if you could even get past the Start menu without crashing. 3D performance has always been the downfall of Intel IGPs, even today.

I agree with nosfe on the various points made about the transition to software rendering a la Larrabee. The current market is an old one, with rasterization at its core. Even a company like Intel will have a hard time penetrating this market, or "replacing" rasterization with new methods like ray tracing/software rendering. Intel is going to bleed money on their first and maybe their second attempt. I guess it's almost like MS and their attempt at the console business.

Fact is, if Larrabee is slower than what nVIDIA/AMD have at the time of its release (whether at ray tracing or rasterization, mostly the latter), there will be absolutely no incentive for software devs to code for Larrabee. Devs aren't going to abandon the current consumer base either, who are stuck with old DX10/9 hardware.

I think people are failing to see the bigger picture here. Intel has a big mountain to climb, and yet some people make it out like it's the end of everything within the graphics market because it's Intel. It is going to be interesting for sure, but I don't have high expectations of Intel's first attempt at what they call a video card.

I wonder, from a physical perspective, whether the card will look vaguely like the Voodoo series cards from 3dfx, somewhere along the lines of the almighty Voodoo 5 6000.
__________________
Heatware
Cookie's Rig:
Intel i5 2500K@4.5GHz/1.36V//Corsair H80 || Asus P8Z68-V PRO || G.Skill RipJaws 2x4GB@DDR3-2133 || EVGA GTX680 SC@1200/6500MHz || 2xSamsung 840 250GB || 2xWD Green 1TB || Seasonic X-560 || Asus Xonar Essence STX || Silverstone FT02 LE || 2xDell U2412M || Windows 7 Pro x64 || Logitech M950 || Sennheiser HD595 || Audioengine A2
Cookie Monster is offline   Reply With Quote
Old 02-09-2009, 03:03 PM   #61
Arkaign
Lifer
 
Arkaign's Avatar
 
Join Date: Oct 2006
Posts: 19,109
Default Intel GPU in the PS4?

Quote:
Originally posted by: taltamir
The Celeron has always been a lower-cache version of a mainstream processor. The first was based on the Pentium II; the latest is a C2D-based Celeron. Every chip in between had a "Celeron" version of it... this is why it is better to describe it as a P3 than a Celeron, since otherwise you don't know which Celeron it is based on.
True. The difference between P3 and Celeron went beyond cache.

Other differences:

SSE
FSB
L2 Latency
__________________
Death is the answer.
Arkaign is offline   Reply With Quote
Old 02-09-2009, 03:09 PM   #62
nosfe
Senior Member
 
Join Date: Aug 2007
Posts: 424
Default Intel GPU in the PS4?

Quote:
Originally posted by: taltamir
They have WORKING DX10 drivers for their IGP...
And Larrabee will include fixed-function parts; it is not one hundred percent x86 emulation...
I've used Intel IGPs; the drivers are "working," but they're still crap, and I've yet to meet someone who likes them.
nosfe is offline   Reply With Quote
Old 02-09-2009, 04:05 PM   #63
Nemesis 1
Banned
 
Join Date: Dec 2006
Posts: 11,379
Default Intel GPU in the PS4?

Quote:
Originally posted by: Cookie Monster
Quote:
Originally posted by: taltamir
Quote:
Originally posted by: nosfe
Making developers optimize for Larrabee shouldn't be too much of a problem considering how big Intel is, but the real problem with Larrabee will be the drivers. Even though they're not being made by the IGP driver team, the Larrabee driver team still has to start from scratch, whereas ATI/NVIDIA have been working on theirs for years on end. All that said, I'm really looking forward to seeing how it'll perform in Photoshop and the like, as initially it'll most likely suck at gaming.
They have WORKING DX10 drivers for their IGP...
And Larrabee will include fixed-function parts; it is not one hundred percent x86 emulation...
You would be grateful if you could even get past the Start menu without crashing. 3D performance has always been the downfall of Intel IGPs, even today.

I agree with nosfe on the various points made about the transition to software rendering a la Larrabee. The current market is an old one, with rasterization at its core. Even a company like Intel will have a hard time penetrating this market, or "replacing" rasterization with new methods like ray tracing/software rendering. Intel is going to bleed money on their first and maybe their second attempt. I guess it's almost like MS and their attempt at the console business.

Fact is, if Larrabee is slower than what nVIDIA/AMD have at the time of its release (whether at ray tracing or rasterization, mostly the latter), there will be absolutely no incentive for software devs to code for Larrabee. Devs aren't going to abandon the current consumer base either, who are stuck with old DX10/9 hardware.

I think people are failing to see the bigger picture here. Intel has a big mountain to climb, and yet some people make it out like it's the end of everything within the graphics market because it's Intel. It is going to be interesting for sure, but I don't have high expectations of Intel's first attempt at what they call a video card.

I wonder, from a physical perspective, whether the card will look vaguely like the Voodoo series cards from 3dfx, somewhere along the lines of the almighty Voodoo 5 6000.
If you take the time to look at the whole Larrabee picture, I think you are too quick to be a naysayer. First of all, I won't link to all the people involved, but Intel has great names working on this project, the best in the business. Their driver team is top notch.

Their GCPU team is great too. Project Offset has a wonderful team assembled and a new mod engine, lol. This is the key: Project Offset is Intel exclusive because no other card at the time will be able to run the game. If the game is a smash hit, so will Larrabee be. Plus, Project Offset will be out of the gate first, with a huge lead, unless other developers are hearing the same stories blowing in the wind about Larrabee as everyone else. Good times are coming for gamers. AMD/ATI aren't lagging that far behind either.

Nemesis 1 is offline   Reply With Quote
Old 02-09-2009, 04:37 PM   #64
Cookie Monster
Diamond Member
 
Cookie Monster's Avatar
 
Join Date: May 2005
Location: Down Under
Posts: 4,749
Default Intel GPU in the PS4?

Intel surely has some of the most brilliant engineers in the world today, but so do ATI/AMD and NVIDIA. I'm not sure whether Larrabee has its own software team, but I wouldn't call Intel's software team for their current IGP lineup "top notch."

Nemesis, you're suggesting that this "Project Offset" game, an Intel-exclusive title, is going to give Larrabee a huge lead if it becomes successful, with the masses buying Larrabee hardware and software devs abandoning their old ways? I think it's quite an understatement to say that you are optimistic about Larrabee!

Let's see, I'm a very, very casual gamer, so I don't tend to keep up with many of the titles people talk about in these forums, yet I still follow some of the big anticipated titles launching this year and next, such as SC2 for one. Quite frankly, "Project Offset" isn't on that list; hell, I only just remembered what the game was about because you reminded me of it. This ain't the 1990s, when the market was wide open for games to turn into smash hits like Doom or HL. These days, when genres/ideas/plots have been beaten to death in the gaming world, I doubt a game of the calibre of "Project Offset" will somehow turn into a smash hit. What I consider a smash hit isn't FC2 or Crysis, but something along the lines of World of Warcraft or the HL/Doom series.

Exclusivity is also a bad thing. Why alienate consumers by rendering their current cards useless? Sure, some might go out and buy new hardware, but most won't just for one mere game (all the more so if this hardware performs subpar in other games). This is the same problem as with PhysX, CUDA and other proprietary standards, although that is for another discussion.

IMHO, Intel's main goal should be to perform above or on par with the competing products at rasterization (if successful, it can then start to spread its wings with its other features like ray tracing, etc.), not to rely on this Larrabee-exclusive game being a smash hit.

Think this has gone OT though. My bad.
__________________
Heatware
Cookie's Rig:
Intel i5 2500K@4.5GHz/1.36V//Corsair H80 || Asus P8Z68-V PRO || G.Skill RipJaws 2x4GB@DDR3-2133 || EVGA GTX680 SC@1200/6500MHz || 2xSamsung 840 250GB || 2xWD Green 1TB || Seasonic X-560 || Asus Xonar Essence STX || Silverstone FT02 LE || 2xDell U2412M || Windows 7 Pro x64 || Logitech M950 || Sennheiser HD595 || Audioengine A2
Cookie Monster is offline   Reply With Quote
Old 02-09-2009, 05:11 PM   #65
Idontcare
Administrator
Elite Member
 
Idontcare's Avatar
 
Join Date: Oct 1999
Location: 台北市
Posts: 20,115
Default Intel GPU in the PS4?

Quote:
Originally posted by: Cookie Monster
Fact is, if Larrabee is slower than what nVIDIA/AMD have at the time of its release (whether at ray tracing or rasterization, mostly the latter), there will be absolutely no incentive for software devs to code for Larrabee. Devs aren't going to abandon the current consumer base either, who are stuck with old DX10/9 hardware.
I agree. And the fact of the matter is that we aren't smart people, we certainly aren't educated/experienced people like the business and engineering whizzes who are currently operating in the industry.

So if what you state is blatantly obvious to even us, then surely it is ridiculously blatantly obvious to the Intel folks working on Larrabee, yes?

So for sure we can feel like safely concluding they aren't going to bother releasing Larrabee until it is capable of doing exactly what it needs to do whenever it is finally released. (which, among MANY other things, includes trumping whatever Nvidia and AMD have on the market at that time)

If 45nm silicon does not meet the performance requirements then I fully expect Intel will eat the development costs, eat the timeline costs, and push-out the release timeline to the 32nm iteration of Larrabee. If 32nm is not the cat's meow then 22nm it will be, etc.

And why can we rest assured this is what will happen with Larrabee? Because Intel learned this lesson with Merced (the first Itanium) as evidenced by their willingness to cancel the 45nm Itanium and skip directly from 65nm (Tukwila) to the 32nm iteration (Poulson). So doing it with Larrabee will not be a foreign decision tree with the executive management at Intel.

Quote:
Originally posted by: Cookie Monster
I think people are failing to see the bigger picture here. Intel has a big mountain to climb, and yet some people make it out like it's the end of everything within the graphics market because it's Intel. It is going to be interesting for sure, but I don't have high expectations of Intel's first attempt at what they call a video card.
It may be true, some folks are no doubt ill-equipped to conceptualize the engineering and project management challenges involved in Intel's effort to climb 20yrs of GPU development learning curve in a mere 4 yrs of their own development time.

But on the other hand it could also be said that we outsiders are ill-equipped to conceptualize the scale and scope of Intel's resources put behind making Larrabee a success on the timeline they've set out to deliver on.

So who stands to be the bigger fools here? The people who are ignorant of the challenge set before Intel as well as being arrogant regarding the resources they assume Intel is putting to work on the challenge, or the people who are arrogant enough to assume they know well the magnitude of the challenge set before Intel but are wholly ignorant of the relative magnitude of the resources Intel is putting towards surmounting the challenge?

I for one am not about to assume I know more than the people at Intel who actually work on the business end of this topic, so I can only assume my most salient of observations regarding their challenges and my most basic expectations of their resources being allocated to the project are mere child's play in comparison to the level of intellect and dollars they are truly operating with. It would be remarkably arrogant of me to think otherwise of myself.
Idontcare is offline   Reply With Quote
Old 02-09-2009, 05:50 PM   #66
Nemesis 1
Banned
 
Join Date: Dec 2006
Posts: 11,379
Default Intel GPU in the PS4?

Quote:
Originally posted by: William Gaatjes
Quote:
Originally posted by: Nemesis 1
Yep, that's the way I read it. The difference is that very high-speed link that's needed between the CPU and the GPU. That's the key. That's what John just doesn't get.
It's funny you mention that: AMD wanted to integrate a GPU-like core together with the CPU on one die. Is AMD Fusion not the same idea as Larrabee? That would need a high-speed link in between. A possibility would be a 32-bit HyperTransport 3 channel between CPU and GPU; at 32 bits wide that would be 25.6 GB/s per direction, or a total aggregate bandwidth of 51.2 GB/s. That is the easy way. Well, interesting times are coming again. That is, if the prophecy of Jim Rogers does not come completely true.
Yes, exactly. That's why, when I heard Intel had a QPI link between the CPU and GPU on X58 and i7, I got really excited; that's when I knew in my mind what Larrabee is and what it represents with respect to Nehalem and beyond. Right now Intel is building their base for Larrabee, in games like whatever Project Offset will be.

This is a statement I got out of another thread here, about an article I pasted. This is what got me:

Neoptica's Vision

Neoptica published a technical whitepaper back in March 2007. Matt Pharr also gave a presentation at Graphics Hardware 2006 that highlights some similar points.
It explained their perspective on the limitations of current programmable shading and their vision of the future, which they name "programmable graphics". Much of their point resides on the value of 'irregular algorithms' and the GPU's inability to construct complex data structures on its own.
They argue that a faster link to the CPU is thus a key requirement, with efficient parallelism and collaboration between the two. Only the PS3 allows this today.
They further claim that the capability to deliver many round trips between the CPU and the GPU every frame could make new algorithms possible and improve efficiency. They plead for the demise of the unidirectional rendering pipeline.
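Here is a toy sketch of what "many round trips per frame" would mean in practice. Every pass name below is invented for illustration; it is not from Neoptica's whitepaper or any real API.

Code:
# Made-up stubs; only the control flow matters here.
def gpu_pass(name, data):
    print(f"GPU: {name} ({len(data)} items)")
    return data[: len(data) // 2]      # pretend half the items survive the pass

def cpu_pass(name, data):
    print(f"CPU: {name} ({len(data)} items)")
    return data

scene = list(range(1000))

# Classic unidirectional pipeline: one hand-off per frame, the CPU never sees results.
gpu_pass("rasterize", cpu_pass("build command buffer", scene))

# The collaborative model: several CPU<->GPU round trips inside a single frame,
# which is only attractive if the CPU-GPU link is fast and low latency.
visible = gpu_pass("coarse visibility", scene)
structs = cpu_pass("build irregular data structures", visible)  # trees/lists a GPU builds poorly
hits    = gpu_pass("trace / shade", structs)
cpu_pass("resolve collisions", hits)                            # read back within the same frame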
Nemesis 1 is offline   Reply With Quote
Old 02-09-2009, 06:33 PM   #67
BenSkywalker
Elite Member
 
Join Date: Oct 1999
Posts: 8,955
Default Intel GPU in the PS4?

The notion that Sony would use an Intel GPU in the PS4 is rather absurd on many different levels.

First off, Sony doesn't buy chips like that off of people; they license the technology and fab it themselves. Can anyone see Intel handing off IP to Sony? I don't see it happening. Sony won't lock themselves into a contract with Intel to provide the chips to them. The XBox, no matter your personal views on how it was as a gaming platform, was an absolute failure in a business sense due to MS buying parts off of other people. Sony is smart in the way they handle their supply chain: it assures that they are capable of manipulating whatever aspect they want to, when they want to.

Second, x86 won't be able to emulate older PS titles. Emulating little endian using big endian is fairly simple; going the other way is not. I know people have less of an expectation in terms of BC these days, but the PS3, even the non-BC ones, can still emulate the PS1 at the very least. Using a ray-tracing-based GPU would compound the problem enormously, as emulating rasterizers with ray tracing is not effective by any means; the entire code structure will not run well at all. We aren't talking about running the horror-show-style code PCs must deal with to remain compatible; console code is meant to run on one exacting configuration and oftentimes will not work if any changes are made (the PS3, despite the Cell sharing its MIPS roots with the EE, cannot run PS2 games, as an example).
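To make the byte-order point concrete, here is a toy Python sketch, purely illustrative and nothing to do with any real emulator, of what a little-endian x86 host has to do for every multi-byte load from big-endian guest memory:

Code:
import struct

guest_ram = bytearray(16)             # pretend this is guest (big-endian) memory

def guest_store32(addr, value):
    struct.pack_into(">I", guest_ram, addr, value)       # stored as the guest sees it

def naive_host_load32(addr):
    return struct.unpack_from("<I", guest_ram, addr)[0]  # host byte order: wrong

def emulated_load32(addr):
    return struct.unpack_from(">I", guest_ram, addr)[0]  # byte-swapped: correct

guest_store32(0, 0x12345678)
print(hex(naive_host_load32(0)))  # 0x78563412 - garbage without the swap
print(hex(emulated_load32(0)))    # 0x12345678

Paying that swap (or having a JIT fold it into every generated load/store) on each guest memory access is part of why straight BC for console code on a different architecture gets ugly fast.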

Third, and I think the biggest factor: to date, Cell has proven vastly superior, by orders of magnitude, to anything Intel has displayed to the public. Yes, I'm sure Larrabee will be faster than what we have in the PS3 today, but it isn't like no progress has been made with the Cell platform since that launch. A 64-core Cell running at 5 GHz or so would be rather brutally effective at doing exactly what Intel is trying to do with Larrabee, likely more effective than what Intel could offer them. Intel has a significantly weaker track record in visual computing, not only compared to ATi or nVidia, but even compared to Sony themselves. At this moment, Intel is at best a third-rate GPU company (and that is being as kind as possible: ATi/nV make up the top tier, with Sony/IBM/HP in the second tier in terms of what they can offer the consumer market).

The market trends should also be considered when looking at this. We simply need to look at the 360 to see where the market doesn't want to go: high-priced hardware that has trouble selling significantly below what market value should dictate. With the 360 at 50% of the price of the PS3 and a year's head start only pulling an 8-million-unit lead, while the much-inflated price of the Wii has grabbed Nintendo an 18-million-unit lead with a year less on the market, something will be adjusted moving forward. Sony used the PS3 to win the format war, which worked; next round they will likely be pushing to maximize profits and market share more so than using it as a tool to gain leverage in other markets. They are almost certainly going to be looking at building a system on which they can absolutely maximize control over costs. This leads into another issue: currency exchange rates. As it stands now, the yen is simply too strong versus the dollar for Sony to consider buying anything from an American company. It has nothing to do with nationalism; that is simply an economic reality that they must deal with. From an economics point of view, the only truly viable way they could think about Intel would be an IP licensing agreement with a fixed cost per unit based on the yen, not the dollar. Is that something Intel would go for? I find it unlikely.

The fourth reason I see why this is rather foolish to think of as a possibility is the actual marketplace. Is Enix a big name in PC gaming? Naughty Dog? Polyphony? No to all of them, and therein lies the point. All of Sony's big hitters are big hitters exclusively in the console market; they have no presence in the PC space. The current batch of cross-platform titles are those that exited the PC market or those that lean towards XBox development. Intel using Larrabee in the PS4 as a seed to help spread ray-traced rendering won't pay close to the dividends it would if they could land themselves in the next XBox. There it may make sense for Intel to take a loss-leader approach on the GPU for MS in order to advance their viability in the PC market. In the PS4, it simply makes no sense.
BenSkywalker is offline   Reply With Quote
Old 02-09-2009, 07:04 PM   #68
mmnno
Senior Member
 
Join Date: Jan 2008
Posts: 381
Default Intel GPU in the PS4?

Larrabee is not at all a replacement for Cell; it is a replacement for RSX, or whatever nVidia/ATI GPU Sony would otherwise put in the PS4. The comparison isn't between Cell and Larrabee; by functionality there simply is no comparison. The parts do not perform the same tasks. The rumor is that the PS4 would be Cell + Larrabee.

The main obstacle for Larrabee is not the superiority of Cell (come on, a 64-core Cell @ 5 GHz in the PS4? You don't think they learned something from "599 US dollars"?), but the near impossibility of usable PS3 BC. That would definitely be a big hit to Sony's reputation, so that problem needs to be solved first. For the same reason, the PS4 will certainly stick with a Cell evolution for the CPU, and if it's too expensive they will just delay until costs fall enough.
mmnno is offline   Reply With Quote
Old 02-09-2009, 07:11 PM   #69
Nemesis 1
Banned
 
Join Date: Dec 2006
Posts: 11,379
Default Intel GPU in the PS4?

So you're saying I shouldn't buy Intel stock at these prices? Because when Larrabee comes out, Intel is going to tank, so I should sell, sell, sell? That might be good advice. Not. Because if Larrabee isn't good, Intel is in fact in a world of hurt. These are real stakes; the game is cut-throat. What's coming sooner or later gets here. Intel can only try to shape the future to their advantage. That's what they will try to do.
Nemesis 1 is offline   Reply With Quote
Old 02-09-2009, 07:23 PM   #70
Cookie Monster
Diamond Member
 
Cookie Monster's Avatar
 
Join Date: May 2005
Location: Down Under
Posts: 4,749
Default Intel GPU in the PS4?

Quote:
Originally posted by: Idontcare
Quote:
Originally posted by: Cookie Monster
Fact is, if Larrabee is slower than what nVIDIA/AMD have at the time of its release (whether at ray tracing or rasterization, mostly the latter), there will be absolutely no incentive for software devs to code for Larrabee. Devs aren't going to abandon the current consumer base either, who are stuck with old DX10/9 hardware.
I agree. And the fact of the matter is that we aren't smart people, we certainly aren't educated/experienced people like the business and engineering whizzes who are currently operating in the industry.

So if what you state is blatantly obvious to even us, then surely it is ridiculously blatantly obvious to the Intel folks working on Larrabee, yes?

So for sure we can feel like safely concluding they aren't going to bother releasing Larrabee until it is capable of doing exactly what it needs to do whenever it is finally released. (which, among MANY other things, includes trumping whatever Nvidia and AMD have on the market at that time)

If 45nm silicon does not meet the performance requirements then I fully expect Intel will eat the development costs, eat the timeline costs, and push-out the release timeline to the 32nm iteration of Larrabee. If 32nm is not the cat's meow then 22nm it will be, etc.

And why can we rest assured this is what will happen with Larrabee? Because Intel learned this lesson with Merced (the first Itanium) as evidenced by their willingness to cancel the 45nm Itanium and skip directly from 65nm (Tukwila) to the 32nm iteration (Poulson). So doing it with Larrabee will not be a foreign decision tree with the executive management at Intel.

Quote:
Originally posted by: Cookie Monster
I think people are failing to see the bigger picture here. Intel has a big mountain to climb, and yet some people make it out like it's the end of everything within the graphics market because it's Intel. It is going to be interesting for sure, but I don't have high expectations of Intel's first attempt at what they call a video card.
It may be true, some folks are no doubt ill-equipped to conceptualize the engineering and project management challenges involved in Intel's effort to climb 20yrs of GPU development learning curve in a mere 4 yrs of their own development time.

But on the other hand it could also be said that we outsiders are ill-equipped to conceptualize the scale and scope of Intel's resources put behind making Larrabee a success on the timeline they've set out to deliver on.

So who stands to be the bigger fools here? The people who are ignorant of the challenge set before Intel as well as being arrogant regarding the resources they assume Intel is putting to work on the challenge, or the people who are arrogant enough to assume they know well the magnitude of the challenge set before Intel but are wholly ignorant of the relative magnitude of the resources Intel is putting towards surmounting the challenge?

I for one am not about to assume I know more than the people at Intel who actually work on the business end of this topic, so I can only assume my most salient of observations regarding their challenges and my most basic expectations of their resources being allocated to the project are mere child's play in comparison to the level of intellect and dollars they are truly operating with. It would be remarkably arrogant of me to think otherwise of myself.
I agree with everything you say here, although I'm guilty of being one of those people you've described.

It's nice to see that, with the "Intel GPU" topic bringing up the talk of Larrabee, AT users from the CPU subforum are engaging in discussions with Video subforum members more often, and this will probably become more common as we inch closer to Larrabee's release.
__________________
Heatware
Cookie's Rig:
Intel i5 2500K@4.5GHz/1.36V//Corsair H80 || Asus P8Z68-V PRO || G.Skill RipJaws 2x4GB@DDR3-2133 || EVGA GTX680 SC@1200/6500MHz || 2xSamsung 840 250GB || 2xWD Green 1TB || Seasonic X-560 || Asus Xonar Essence STX || Silverstone FT02 LE || 2xDell U2412M || Windows 7 Pro x64 || Logitech M950 || Sennheiser HD595 || Audioengine A2
Cookie Monster is offline   Reply With Quote
Old 02-09-2009, 07:26 PM   #71
Nemesis 1
Banned
 
Join Date: Dec 2006
Posts: 11,379
Default Intel GPU in the PS4?

Quote:
Originally posted by: BenSkywalker
The notion that Sony would use an Intel GPU in the PS4 is rather absurd on many different levels.

First off, Sony doesn't buy chips like that off of people; they license the technology and fab it themselves. Can anyone see Intel handing off IP to Sony? I don't see it happening. Sony won't lock themselves into a contract with Intel to provide the chips to them. The XBox, no matter your personal views on how it was as a gaming platform, was an absolute failure in a business sense due to MS buying parts off of other people. Sony is smart in the way they handle their supply chain: it assures that they are capable of manipulating whatever aspect they want to, when they want to.

Second, x86 won't be able to emulate older PS titles. Emulating little endian using big endian is fairly simple; going the other way is not. I know people have less of an expectation in terms of BC these days, but the PS3, even the non-BC ones, can still emulate the PS1 at the very least. Using a ray-tracing-based GPU would compound the problem enormously, as emulating rasterizers with ray tracing is not effective by any means; the entire code structure will not run well at all. We aren't talking about running the horror-show-style code PCs must deal with to remain compatible; console code is meant to run on one exacting configuration and oftentimes will not work if any changes are made (the PS3, despite the Cell sharing its MIPS roots with the EE, cannot run PS2 games, as an example).

Third, and I think the biggest factor: to date, Cell has proven vastly superior, by orders of magnitude, to anything Intel has displayed to the public. Yes, I'm sure Larrabee will be faster than what we have in the PS3 today, but it isn't like no progress has been made with the Cell platform since that launch. A 64-core Cell running at 5 GHz or so would be rather brutally effective at doing exactly what Intel is trying to do with Larrabee, likely more effective than what Intel could offer them. Intel has a significantly weaker track record in visual computing, not only compared to ATi or nVidia, but even compared to Sony themselves. At this moment, Intel is at best a third-rate GPU company (and that is being as kind as possible: ATi/nV make up the top tier, with Sony/IBM/HP in the second tier in terms of what they can offer the consumer market).

The market trends should also be considered when looking at this. We simply need to look at the 360 to see where the market doesn't want to go: high-priced hardware that has trouble selling significantly below what market value should dictate. With the 360 at 50% of the price of the PS3 and a year's head start only pulling an 8-million-unit lead, while the much-inflated price of the Wii has grabbed Nintendo an 18-million-unit lead with a year less on the market, something will be adjusted moving forward. Sony used the PS3 to win the format war, which worked; next round they will likely be pushing to maximize profits and market share more so than using it as a tool to gain leverage in other markets. They are almost certainly going to be looking at building a system on which they can absolutely maximize control over costs. This leads into another issue: currency exchange rates. As it stands now, the yen is simply too strong versus the dollar for Sony to consider buying anything from an American company. It has nothing to do with nationalism; that is simply an economic reality that they must deal with. From an economics point of view, the only truly viable way they could think about Intel would be an IP licensing agreement with a fixed cost per unit based on the yen, not the dollar. Is that something Intel would go for? I find it unlikely.

The fourth reason I see why this is rather foolish to think of as a possibility is the actual marketplace. Is Enix a big name in PC gaming? Naughty Dog? Polyphony? No to all of them, and therein lies the point. All of Sony's big hitters are big hitters exclusively in the console market; they have no presence in the PC space. The current batch of cross-platform titles are those that exited the PC market or those that lean towards XBox development. Intel using Larrabee in the PS4 as a seed to help spread ray-traced rendering won't pay close to the dividends it would if they could land themselves in the next XBox. There it may make sense for Intel to take a loss-leader approach on the GPU for MS in order to advance their viability in the PC market. In the PS4, it simply makes no sense.
Doesn't Sony do movies? Even 3D movies? There are a lot of reasons for Sony to play nice with Intel or AMD, not so much with NV though. I think Sony feels the same about NV as Microsoft did with the Xbox. So it will be AMD/ATI in all the consoles, because I just don't see NV getting Sony back.

Nemesis 1 is offline   Reply With Quote
Old 02-09-2009, 07:53 PM   #72
Nemesis 1
Banned
 
Join Date: Dec 2006
Posts: 11,379
Default Intel GPU in the PS4?

This is about as direct an answer as you'll ever see from Intel until they say so, I guess.

http://www.crn.com/hardware/213000161


http://www.tomshardware.com/fo...intel-nehalem-larrabee
Nemesis 1 is offline   Reply With Quote
Old 02-09-2009, 07:58 PM   #73
Nemesis 1
Banned
 
Join Date: Dec 2006
Posts: 11,379
Default Intel GPU in the PS4?

http://news.cnet.com/nanotech/?keyword=Larrabee
Nemesis 1 is offline   Reply With Quote
Old 02-09-2009, 08:11 PM   #74
Idontcare
Administrator
Elite Member
 
Idontcare's Avatar
 
Join Date: Oct 1999
Location: 台北市
Posts: 20,115
Default Intel GPU in the PS4?

Quote:
Originally posted by: BenSkywalker
First off, Sony doesn't buy chips like that off of people, they license the technology and fab it themselves. Can anyone see Intel handing off IP to Sony? I don't see it happening. Sony won't lock themselves into a contract with Intel to provide the chips to them. The XBox, no matter your personal views on how it was as a gaming platform, was an absolute failure in a business sense due to MS buying parts off of other people. Sony is smart in the way they handle their supply chain- assures that they are capable of manipulating whatever aspect they want to when they want to.
Would you agree that the XBOX was quite effective in disrupting Sony's revenue stream and presumed dominance in the console market?

Taken on its own, excluding all the knock-on benefits that could not have transpired since XBOX's release had XBOX never been released, sure we can make the argument that XBOX failed in the business sense, but that would require us to assume that the business case Microsoft envisioned for XBOX stopped with XBOX and would not carry forward in the form of momentum for the XBOX 360 and the disruption/mayhem of Nintendo's and Sony's plans.

I would argue Microsoft knew XBOX would merely be the first salvo shot in a multi-salvo match-up with Sony and Nintendo. I would argue that it performed to their business expectations - it got them into the console market, established themselves as a viable third contender and not a wanna-be destined to fade away.

Surely this momentum and confidence carried forward into assisting them with their XBOX 360 contracts, roadmaps/timelines, as well going some distance to destabilizing Sony's presumed path to dominance and had a little something to do with their profit situation.

Quote:
Originally posted by: BenSkywalker
The market trends should also be considered when looking at this. We simply need to look at the 360 to see where the market doesn't want to go. High priced hardware that has trouble selling significantly below what market value should dictate. With the 360 at 50% of the price of the PS3 and a year head start and only pulling a 8Million unit lead while the much inflated price on the Wii has grabbed Nintendo a 18Million unit lead with a year less on the market something will be adjusted movinig forward. Sony used the PS3 to win the format war, which worked, next round they will likely be pushing to maximize profits and marketshare moreso then using it as a tool to gain leverage in other markets. They are almost certainly going to be looking at building a system that they can absolutely maximize control over costs on. This leads into another issue, currency exchange rates. As it stands now, the Yen is simply too strong versus the dollar for Sony to consider buying anything from an American company. It has nothing to do with nationalism, that is simply an economic reality that they must deal with. From an economics point of view the only truly viable way that they could think about Intel would be an IP licensing agreement with a fixed cost per unit based on the Yen, not the Dollar. Is that something Intel would go for? I find it unlikely.
The way I see it, Sony has major cause for effecting change and interjecting a new strategy given the profit situation with the company as a whole and the PS3 as a product line.

Given the trajectory of Microsoft/XBOX and the unforeseen force that Nintendo morphed into, combined with the sizable billions of losses at Sony, in addition to the management at the top saying old Sony is to be replaced with new Sony... if new Sony was ever going to go maverick and change things up, this is the time they would do it.

The logic tree you lay out is quite logical, and had Sony reported profits instead of losses this past quarter and had the management team that brought PS3 to the market not started talking about old vs. new Sony then I would be inclined to subscribe to the logic you laid out.

But I just can't help but think this is the time, if there ever was going to be one, that Sony's management is going to assess the situation with PS3, XBOX 360, and Wii, look to the future and say to themselves "gents, we got to do something different, this just isn't working".

So in my view what you describe (and you do it well I agree) is the old Sony, the Sony that existed for the past five years when Microsoft was just a "possible threat, but not credible" and Nintendo was "nearly extinct, guaranteed dead within 3 yrs". But management is now touting the age of new Sony is dawning, and that doesn't sound like business as usual to me.

And if a new Sony aims not to be a business-as-usual old Sony, well then your opus (regardless of how logical it is) on the old Sony just went out the window, along with the decision processes that got the old Sony into the predicament its management currently feels needs to be evicted from future decision-making processes. (/preparing self for baby-with-bathwater rebuttal )
Idontcare is offline   Reply With Quote
Old 02-09-2009, 08:28 PM   #75
Idontcare
Administrator
Elite Member
 
Idontcare's Avatar
 
Join Date: Oct 1999
Location: 台北市
Posts: 20,115
Default Intel GPU in the PS4?

Quote:
Originally posted by: Nemesis 1
http://news.cnet.com/nanotech/?keyword=Larrabee
This is a very intriguing blog Nemesis, thanks for posting it.

I had not realized how aggressive Pat's statements on Larrabee have been; this really caught my eye:

Quote:
Intel's Gelsinger relishes the challenge. "We've not been bashful about saying we want to win Nehalem," Gelsinger said. Nehalem is Intel's next-generation chip architecture that will roll out over the next 12 months. "(Larrabee) will plug into Nehalem, and into Westmere, and into Sandy Bridge," he said, referring to future Intel chip platforms. "And volume consumer applications as well," he said.
The reality-check blog entry was entertaining too, not because I disagreed with the blog's author, but because of the analyst's apparent attempt to assume they understood the technical complexity of Nvidia's world well enough to then be a viable authority on assessing and characterizing the (purported) deficiencies of Intel's efforts. Arrogance oft has its roots in ignorance.
Idontcare is offline   Reply With Quote