
"Larrabee now needs HD decoders to keep up"

I find this strange to the point of unlikely. I mean, if Larrabee comes out with 32+ cores consuming in the range of 100-150W combined, how many of those are needed to decode HD? Only a few I'd say, and the rest can be turned off like on Nehalem, so it would only end up consuming a fraction of its maximum power. It's not like they are planning a mobile version of the first Larrabee, right?
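A rough back-of-envelope sketch (every number here is my own assumption, not from the article):

# All assumptions: 32 cores sharing a ~150 W budget, and a guess
# that software HD decode keeps only 3 of them awake.
TOTAL_CORES = 32
TOTAL_POWER_W = 150.0        # assumed top of the 100-150 W range
CORES_FOR_HD_DECODE = 3      # pure guess

power_per_core = TOTAL_POWER_W / TOTAL_CORES
decode_power = CORES_FOR_HD_DECODE * power_per_core
print(f"~{power_per_core:.1f} W per core, ~{decode_power:.1f} W during playback "
      f"({decode_power / TOTAL_POWER_W:.0%} of the full budget)")

So even with a handful of cores awake you'd be looking at maybe 10-15 W, not the full TDP.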
 
If you read the article, they say they are adding the decoders to reduce power consumption during playback. My guess is that this will make it use next to no power, so it will be virtually dead silent.
 
Originally posted by: jones377
I find this strange to the point of unlikely. I mean, if Larrabee comes out with 32+ cores consuming in the range of 100-150W combined, how many of those are needed to decode HD? Only a few I'd say, and the rest can be turned off like on Nehalem, so it would only end up consuming a fraction of its maximum power. It's not like they are planning a mobile version of the first Larrabee, right?

I don't know much about Larrabee's architecture, but maybe they can't turn off cores like Nehalem, because the 32 cores are arranged in 4 groups of 8 cores each.

Maybe they can only turn off whole groups, so 1 group (8 cores) would be active during decoding.
I am just guessing, I don't know.
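To illustrate that guess with the same assumed numbers as above (the 4x8 grouping is pure speculation):

import math

# Hypothetical gating model: waking whole groups of 8 vs individual cores.
CORES_PER_GROUP = 8
POWER_PER_CORE_W = 150.0 / 32  # same assumed budget as the earlier estimate

def playback_power(cores_needed, per_core_gating):
    """Power draw depending on gating granularity."""
    if per_core_gating:
        active = cores_needed
    else:
        # must wake a whole group even for a few cores
        active = math.ceil(cores_needed / CORES_PER_GROUP) * CORES_PER_GROUP
    return active * POWER_PER_CORE_W

print(f"per-core gating, 3 cores: {playback_power(3, True):.1f} W")
print(f"group gating,    3 cores: {playback_power(3, False):.1f} W")

If decode only needs a couple of cores, group-level gating would still burn a whole group's worth of power (~37 W here vs ~14 W), which would explain why a dedicated decoder is worth the die area.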

Originally posted by: footballrunner800
If you read the article, they say they are adding the decoders to reduce power consumption during playback. My guess is that this will make it use next to no power, so it will be virtually dead silent.

My concern is what pricing model they are going to follow, because it could end up making Larrabee virtually dead on arrival.

Also, besides pricing, my other concern is driver quality.
 
Well if they ARE adding an actual G45 processor, it might be able to help with performance with the right drivers.
 
Originally posted by: firewolfsm
Well if they ARE adding an actual G45 processor, it might be able to help with performance with the right drivers.

I think this is what's going to make or break Larrabee. If it can compete with what we have today at a reasonable price, it might just be a hit; anything less than that and nobody will be ditching ATI/Nvidia anytime soon.
 
and here's me putting in a vote for ray tracing!!!

urg...maybe I'll hop aboard after the second or third generation Larrabee works out the kinks and takes off (if it does)
 
If it matches up to the midrange ATI and nV offerings right away I'll jump ship immediately...as long as I can watercool it. 🙂
 
Originally posted by: thilan29
If it matches up to the midrange ATI and nV offerings right away I'll jump ship immediately.

That wouldn't be the first time it happened. But in the past the others always failed in their drivers and follow-up products. Intel has a lot to prove there and needs to establish a track record for me to jump ship.
 
It's quite odd that Intel, with such huge R&D teams, huge amounts of cash, and one of the best hardware engineering teams in the world (probably the best one), can't come up with decent drivers for their hardware.

Meanwhile the much more limited AMD and nVidia, with their very talented hardware engineers, smaller R&D teams, and much smaller budgets, can make great drivers that maximize their hardware's potential.
 
Originally posted by: evolucion8
It's quite odd that Intel, with such huge R&D teams, huge amounts of cash, and one of the best hardware engineering teams in the world (probably the best one), can't come up with decent drivers for their hardware.

Meanwhile the much more limited AMD and nVidia, with their very talented hardware engineers, smaller R&D teams, and much smaller budgets, can make great drivers that maximize their hardware's potential.

Good point, if you mean that Nvidia & ATI have very talented S/W engineers (although in ATI's case they weren't performing that well before the DX9 days).

I still remember that in an interview on Beyond3D, the ATI guy in charge of the 2000 series' development had the nerve to say that the design choice of a 1:1 pixel/texel ratio in the 2900 was fine, that the problem was on the S/W side, and that he had confidence in the driver team (BS, the driver team was just fine). He even defended the 2000 series' shader antialiasing method. Well, history disproved all his points; even at the time you could see from the HD2600 what a 1:2 pixel/texel ratio could do.

I can't open Beyond3D to provide a link (something wrong with their servers?). If someone else can find it, please do so.
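To put rough numbers on the ratio argument above (the TMU/ROP counts and clocks are from memory, so treat them as assumptions):

# Rough fill-rate arithmetic; the unit counts and clocks below are
# recalled from memory, not checked against a spec sheet.
def rates(tmus, rops, clock_mhz):
    texel = tmus * clock_mhz / 1000.0  # Gtexels/s
    pixel = rops * clock_mhz / 1000.0  # Gpixels/s
    return texel, pixel

for name, tmus, rops, clock in [
    ("HD 2900 XT (1:1)", 16, 16, 742.0),
    ("HD 2600 XT (1:2)", 8, 4, 800.0),
]:
    texel, pixel = rates(tmus, rops, clock)
    print(f"{name}: {texel:.1f} Gtexels/s vs {pixel:.1f} Gpixels/s, "
          f"{texel / pixel:.0f} texels per pixel")

The 2900's texturing throughput only matches its pixel rate, while the 2600 can fetch two texels per pixel per clock; that extra texturing headroom per pixel is the point about what the HD2600 could do.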
 