Nvidia announces x86 chip *edit: not true*


magreen (Golden Member, joined Dec 27, 2006)
Keys, you just have to accept it -- listening to Nemesis is like listening to the Sphinx. If you can figure out his riddle, you know the future.
 

Idontcare (Elite Member, joined Oct 10, 1999)
Originally posted by: magreen
Keys, you just have to accept it -- listening to Nemesis is like listening to the Sphinx. If you can figure out his riddle, you know the future.

Give me ten days and I can use my supercomputer here to predict tomorrow's weather.
 

apoppin (Lifer, joined Mar 9, 2000, alienbabeltech.com)
Nah, you don't get it. Intel is on a set path, where others are still trying to find their way.

It doesn't matter what NV or AMD/ATI does. Intel has a course and it must stay on it.

You think like them

and like them you cannot accept anyone who has a reasonable and sometimes Superior alternative .. nor can you change your thinking as situations sometimes quickly demand.
Worse, intel is run by a corporate board - a behemoth of a ship that cannot change direction even with an iceberg in front of it - so they blast away at it

Nvidia is much different and they respond more quickly to changing situations

i believe they can respond much better than intel can .. and they have a much better vision .. of client computing losing its relevance and becoming supplanted by *their* cloud computing

it is indeed a battle for the soul of the PC
- and i believe Jensen has the bigger and better vision than you guys stuck in the mud

:rose:
 

Nemesis 1 (Lifer, joined Dec 30, 2006)
You're not looking at this correctly at all. You think that because I like what Intel is doing, I'm a fan of THAT COMPANY. Not true: I am a fanboi of tech. You have to realize what Sandy means to everyone in the industry. OK.

Sandy, even though it runs x86 code, isn't an x86 CPU as we currently define it. SSE has to be ported to AVX, or vectorized. OK.
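(For the curious: the "port" being described is mostly a recompile from the legacy SSE encodings to the new VEX/AVX ones. A minimal sketch in intrinsics form; illustrative only, the function names here are ours:

    #include <immintrin.h>

    /* Legacy SSE: 128-bit, two-operand, destructive (destination is also a source). */
    __m128 scale_sse(__m128 v, __m128 s) {
        return _mm_mul_ps(v, s);      /* MULPS xmm, xmm */
    }

    /* AVX: VEX-encoded, 256-bit, three-operand and non-destructive. */
    __m256 scale_avx(__m256 v, __m256 s) {
        return _mm256_mul_ps(v, s);   /* VMULPS ymm, ymm, ymm */
    }

Same source-level idea, twice the vector width; the three-operand VEX form also saves the register copies the old encodings forced.)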

You know all this.

Here's the part you're ignoring: with an x86 processor, what can AMD do to fight AVX?

1) AMD can no longer copy Intel instructions that are not related to an x86 processor. Since Sandy is a vectoring processor, it's a new game between Intel and AMD.

What can AMD do? They either have to go to a vectoring processor, then try to get around all of Intel's patents. But by then Intel will be on Haswell, where FMA is activated. Etc., etc.

So let's cut the crap. You tell me what AMD can do to compete with Intel's AVX processor.

2) There is only one thing I know of that could work, and I am not even sure it's doable: FMA x86. Sun has FMA x86, but it's not working the best. Must remember the Sun connection with Russia's SPARC = Boris = Intel compilers.

So those are AMD's only two options. AMD will try FMA x86. If they succeed, Intel will get FMA x86 for free, equal to AMD's, plus still have FMA (AVX). FMA works really well with Itanics. I see, said the blind man. Works great with GPUs. Doesn't play well with x86. Sandy is four-operand; when it's released, only three operands will be active. The fourth operand is for FMA and will be activated later, when needed. Intel covered every base this time. With Itanic they were betrayed. Now that everyone else wants to leave x86, Intel has said F--- you.
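(The "fourth operand" he keeps pointing at is the extra source an FMA needs: d = a*b + c has three inputs plus a destination. A hedged sketch of what that looks like in intrinsics form, using the FMA3 variant that ultimately shipped with Haswell; the name is illustrative:

    #include <immintrin.h>

    /* Fused multiply-add: three source operands, one instruction, one rounding. */
    __m256 fmadd(__m256 a, __m256 b, __m256 c) {
        return _mm256_fmadd_ps(a, b, c);   /* VFMADD213PS ymm, ymm, ymm */
    }

Versus a separate _mm256_mul_ps followed by _mm256_add_ps, the fused version rounds once and keeps the intermediate product at full precision.)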

You wanted x86? We're going to drown ya in x86. At the same time, we're moving away from x86. But you're not, because it's what YOU ALL WANTED.

I remember it clearly as if it were yesterday. All you guys wanted AMD64 not EPIC.

I thought you guys were crazy. Now I am sitting back, old and broken, laughing at what Intel is going to shovel down your throats. You wanted AMD64? Well, you're going to get it in spades. The best part: it won't even be an x86 processor. Everything is ported.

This is so cool. Nobody wants x86. Intel is holding the world ransom to x86. So ironic that Itanic could cause so many to fail without itself sinking. LOL. NEMESIS

3) You get one other alternative: AMD has no choice other than turning to ATI CAL/Brook/DX11. Now you know why AMD bought ATI. They had no choice. Hector may be a lot smarter than ya think.


As for NV: proprietary CUDA will never pass in the industry. Plus, NV thinking that a simple x86 CPU is all they need is just insane thinking. China should start popping out x86 CPUs any time now, if x86 is public domain. LOL.
 

apoppin (Lifer, joined Mar 9, 2000, alienbabeltech.com)
as i said you are looking NARROWLY at intel's emerging technology - LIKE a fan - while *ignoring* the HUGE inroads Nvidia has ALREADY made with CUDA :p

CUDA is "x86 for GPU" and it is the No1 programming language for it - taught in 50 universities now
- they have nailed down the pro market and there are NO alternatives for medical real-time 3D imaging

while intel is dreaming, Nvidia went to work - 1-3 years and they challenge intel on their own turf
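(For anyone who hasn't looked at it: the "programming language" here is C with a few extensions. A minimal sketch of a CUDA kernel; illustrative only, the names are ours, not from any shipping app:

    /* One GPU thread per array element: y = a*x + y, the classic SAXPY. */
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    /* Host-side launch: enough 256-thread blocks to cover n elements. */
    /* saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, d_x, d_y);             */

The appeal is exactly the pitch above: thousands of these threads execute concurrently on the GPU.)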
:rose:
 

Nemesis 1 (Lifer, joined Dec 30, 2006)
Stop with NV and Cuda. Cuda died when DX11 was announced.

I thought in the server markets and such, efficiency counts. When did the GPU become efficient? In CPUs it's a big deal; in GPUs it's not??? CUDA works better than what's out now, but not in six months. Not a chance in hell. I will bet whatever you want on that.

x86 apps will pwn CUDA. With Larrabee, NV has a chance in gaming. Outside of gaming it's game over.

I've seen NV's last quarterly report. People are knocking themselves out to try and get hold of CUDA. NOT. Let's talk about CUDA and codecs. How's that working out? LOL.
 

Nemesis 1 (Lifer, joined Dec 30, 2006)
You're missing it with CUDA. For CUDA to work it has to be open. When AMD gets CAL running and their compilers up to snuff, any program written for AMD's GPCPU will run perfectly on Larrabee. Same goes for AMD: any program written for Larrabee will run on AMD's GPCPU. The same can't be said for CUDA. The game Project Offset will not run on ATI or NV graphics, but later AMD's GPCPU will be able to run it; maybe even just an ATI GPU with the right compiler. This still applies with Intel AVX. You didn't think Intel was going to lock AMD out, did ya? Again, once you write native for AVX, look out.

NV is their own worst enemy.
 

apoppin (Lifer, joined Mar 9, 2000, alienbabeltech.com)
You are dreaming along with intel

GPU is going to supplant CPU .. CPU cannot do what the GPU can do .. unless you want 100 cores - then it is not practical any longer
- Cloud computing is the future
- you client guys are stuck in the mud


Think MUCH bigger
--change the ways humans interact

intel's vision is limited and small and based on their PAST glories

You are thinking that Nvidia is gonna roll over like ViA and Transmeta
- their graphics will always kick-ass over intels

intel is its own worst enemy
[you can say that about anyone :p]

:rose:


and the proof of the pudding is in the eating .. we shall see who has the better vision
- soon enough; my money is on Nvidia .. although i wish intel well
=)
 

Nemesis 1 (Lifer, joined Dec 30, 2006)
Fair enough. It's wait and see. But just for the hell of it, let's not leave AMD/ATI out of this. IF NV can whip both the boys, NV deserves it. Odds: 0%.
 

Nemesis 1 (Lifer, joined Dec 30, 2006)
Originally posted by: Nemesis 1
Fair enough. It's wait and see. But just for the hell of it, let's not leave AMD/ATI out of this. IF NV can whip both the boys, NV deserves it. Odds: 0%.

Truth is, if NV was a threat to Intel, Intel would simply buy them. It won't happen because NV is not a threat. The threat will come from another country.

 

Keysplayr (Elite Member, joined Jan 16, 2003)
Originally posted by: Nemesis 1
Stop with NV and Cuda. Cuda died when DX11 was announced.

It's killing me that I can say certain things, but boy oh boy does Microsoft have a surprise for you. I'm just glad I see you post things like this. It shows "me", at least, that you haven't a clue what's really going on. Oh, and just to make you aware, you are reverting back to your Turtle1/Intelia days where Intel was your religion. Apoppin is closer to the mark than you are, that is for certain.

 

Nemesis 1 (Lifer, joined Dec 30, 2006)
Originally posted by: apoppin
You are dreaming along with intel

GPU is going to supplant CPU .. CPU cannot do what the GPU can do .. unless you want 100 cores - then it is not practical any longer
- Cloud computing is the future
- you client guys are stuck in the mud


Think MUCH bigger
--change the ways humans interact

intel's vision is limited and small and based on their PAST glories

You are thinking that Nvidia is gonna roll over like ViA and Transmeta
- their graphics will always kick-ass over intels

intel is its own worst enemy
[you can say that about anyone :p]

:rose:


and the proof of the pudding is in the eating .. we shall see who has the better vision
- soon enough; my money is on Nvidia .. although i wish intel well
=)

Well, it's pretty clear Intel agrees with ya. No one is denying the compute power of a GPU. But you state a CPU can't compete with a GPU unless it has 100+ cores, lol; how many compute engines are in a GPU?

Intel does agree with you, or Larrabee wouldn't have been started back in '05 after Intel bought Elbrus, after Intel looked over its shoulder at Sun and said holy shit. Point is, Intel is doing it with x86 processors, using state-of-the-art compilers, using software rendering, using ALL the compute performance of the entire system, and with only a third of the puzzle in place; that would be Larrabee's vectoring process. The next third is in place with Sandy's AVX, and the SSE2 pretext of VEX is for Larrabee mostly. Then the final chapter: at maybe 22nm, but likely 16nm, Haswell with FMA. By the time we see Haswell, Intel will have moved to 512-bit vectors. You're either dreaming or you can't see where NV missed the boat. AMD/ATI didn't; why is that? When AMD bought ATI, Intel had to have felt relieved. Had AMD bought NV, this would be a very different story. But as it is, AMD bought ATI and an EPIC processor. That allowed Intel to move deliberately and without hesitation against CUDA. NOT NV. BUT CUDA. Intel knew that any program written for ATI, they could also run. All Intel had to do was show AMD "we're not locking you out." That's all it took to keep Intel/AMD in the SAME game, which keeps NV out. NV made that choice.
 

Nemesis 1 (Lifer, joined Dec 30, 2006)
Originally posted by: keysplayr2003
Originally posted by: Nemesis 1
Stop with NV and Cuda. Cuda died when DX11 was announced.

It's killing me that I can say certain things, but boy oh boy does Microsoft have a surprise for you. I'm just glad I see you post things like this. It shows "me", at least, that you haven't a clue what's really going on. Oh, and just to make you aware, you are reverting back to your Turtle1/Intelia days where Intel was your religion. Apoppin is closer to the mark than you are, that is for certain.

Easy, keys. I am not reverting back to anything. I am looking at the TECH, software and hardware. Intel/AMD have the best solution. NV going proprietary with CUDA was damaging to themselves.

NO, nothing MS does will surprise me. Intel has put MS under its thumb. They know it too. Intel doesn't need MS anymore. Intel has a full library except CUDA! Intel's compilers are better than MS's. I know how AMD and MS schemed on AMD64. So does Intel. Basically, with Haswell this war is over. Haswell = FMA = Itanic on the desktop.

If I can make it that far: when you see my name change from Zinn2b to Zinn, you will know Itanic is on your desktop. You will know how you were owned, and who owned you.

Just for fun, keys, it's interesting: how many branches do you think one thread on Larrabee has? You should find out, then understand what each branch subset is capable of. It's staggering. We don't have long to wait now; the week of March 24 should prove entertaining. I wish I could tell ya more, but the fun's about to begin. You guys get a break, tho; I should be offline that week. LOL

Keys, if NV opened up CUDA, Intel would change its stance. It's that simple. Is CUDA hardware? If so, why did NV want to help AMD/ATI's EPIC run on CUDA?

This is CUDA's problem. I am sure it works great, but it's closed. Getting AMD or Intel to bite on closed software would be a tough trick. AMD/Intel depend on each other. Intel knows full well what AMD has with ATI. It's got Intel nervous as all hell. The right compiler: look out.

 

apoppin (Lifer, joined Mar 9, 2000, alienbabeltech.com)
Originally posted by: Nemesis 1
Originally posted by: apoppin
You are dreaming along with intel

GPU is going to supplant CPU .. CPU cannot do what the GPU can do .. unless you want 100 cores - then it is not practical any longer
- Cloud computing is the future
- you client guys are stuck in the mud


Think MUCH bigger
--change the ways humans interact

intel's vision is limited and small and based on their PAST glories

You are thinking that Nvidia is gonna roll over like ViA and Transmeta
- their graphics will always kick-ass over intels

intel is its own worst enemy
[you can say that about anyone :p]

:rose:


and the proof of the pudding is in the eating .. we shall see who has the better vision
- soon enough; my money is on Nvidia .. although i wish intel well
=)

Well, it's pretty clear Intel agrees with ya. No one is denying the compute power of a GPU. But you state a CPU can't compete with a GPU unless it has 100+ cores, lol; how many compute engines are in a GPU?

Intel does agree with you, or Larrabee wouldn't have been started back in '05 after Intel bought Elbrus, after Intel looked over its shoulder at Sun and said holy shit. Point is, Intel is doing it with x86 processors, using state-of-the-art compilers, using software rendering, using ALL the compute performance of the entire system, and with only a third of the puzzle in place; that would be Larrabee's vectoring process. The next third is in place with Sandy's AVX, and the SSE2 pretext of VEX is for Larrabee mostly. Then the final chapter: at maybe 22nm, but likely 16nm, Haswell with FMA. By the time we see Haswell, Intel will have moved to 512-bit vectors. You're either dreaming or you can't see where NV missed the boat. AMD/ATI didn't; why is that? When AMD bought ATI, Intel had to have felt relieved. Had AMD bought NV, this would be a very different story. But as it is, AMD bought ATI and an EPIC processor. That allowed Intel to move deliberately and without hesitation against CUDA. NOT NV. BUT CUDA. Intel knew that any program written for ATI, they could also run. All Intel had to do was show AMD "we're not locking you out." That's all it took to keep Intel/AMD in the SAME game, which keeps NV out. NV made that choice.

i know .. OF COURSE intel realizes that Nvidia is right; the soul of "their" PC migrating to Cloud computing best performed on a GPU is what *spurred* Intel's return to [real] 3D graphics in '05 when their DEAL with Nvidia *fell thru*
.. at least the way intel TALKs about it

But intel has NO UNIFIED VIEW .. their board is clearly at odds with each other as to the DIRECTION they are going

i say [imo] they are too late .. Nvidia got the JUMP on them; Nvidia has clearly been PREPARING for x86 and their OWN CPU-GPU for a LONG time - before AMD acquired ATi to do the SAME THING .. 2-1/2 years ago [and they are well on their way with their OWN programming language and their Fusion CPU-GPU; ATi can teach CPU engineers this kind of thing much easier than for a CPU engineer to think like a GPU engineer]

Intel tried to do it; Remember when intel and Nvidia were briefly "in bed" together? --
--Only intel couldn't cut the deal with Nvidia .. Nvidia got to see that their engineers are LOST when it comes to GPU
- so out came the intel PR team .. the SAME ONE that "did netbust" - P4; with all of their broken promises of 10Ghz "easily"

they simply could not pull off what their MARKETING-DRIVEN management demanded back then .. i see a precedent for Larrabeast
- back then it was an engineering FAILURE [to do the impossible with P4]
- don't tell me their engineers did not know they needed LUCK

well they need it now; Larrabeast is a "shot" at a GPU accelerator. if they are LUCKY and get massive industry support, it will fly

BUT if they do not BEAT *at least* g80 with their midrange .. they are not going to get it .. not right away

and then they pour more and more resources into the GPU *black hole* while Nvidia CONTINUES to do what they do best
- expand the GPU market into *everything* including the automobile and handhelds. AND i am certain they already have their x86 GPU[CPU] done .. that is WHY he said 1-3 years

UNLESS Nvidia had a breakthrough - which i was waiting for - we would continue to see Jensen pull off his "distraction dance"
- masterfully
-- he should be on "Dancing with the Stars"
:D

AND .. you are full of it if you think CUDA ends with DX11
- i ASSURE you that is NOT the case at all
:rose:


CUDA's future is already ASSURED
 

Nemesis 1 (Lifer, joined Dec 30, 2006)
YOU got me convinced.
:rose:


I should google this; it's just a thought. When one recompiles x86 apps for AVX, here's the question: after that app is recompiled for Intel AVX, will it run on a Mac? That's a huge question.
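(The answer hinges on the CPU and the OS, not on the recompile: an AVX binary runs anywhere the CPUID AVX bit is set and the OS saves the YMM registers on context switches, which a Mac can do as well as anything else once Apple ships the support. A minimal sketch of the standard runtime check, assuming a GCC-style compiler:

    #include <cpuid.h>     /* GCC/Clang wrapper for the CPUID instruction */
    #include <stdint.h>

    /* AVX needs the CPUID AVX bit, the OSXSAVE bit, and XCR0 confirming
       that the OS actually saves XMM+YMM state on context switches.     */
    static int has_avx(void) {
        unsigned int eax, ebx, ecx, edx;
        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) return 0;
        if (!((ecx >> 27) & 1) || !((ecx >> 28) & 1)) return 0;  /* OSXSAVE, AVX */
        uint32_t lo, hi;
        __asm__("xgetbv" : "=a"(lo), "=d"(hi) : "c"(0));
        return (lo & 0x6) == 0x6;   /* XMM (bit 1) and YMM (bit 2) enabled */
    }

A well-behaved recompiled app would keep its SSE path and take the AVX path only when has_avx() reports true.)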
 

apoppin (Lifer, joined Mar 9, 2000, alienbabeltech.com)
darn

didn't mean to :p



:D

i would miss your metaphysical mumbo jumbo
[to everyone else]
:rose:


i hear you .. and we shall *see*
- i just think i have a little bigger picture of it and the shift to cloud computing is going Jensen's way

 

Nemesis 1 (Lifer, joined Dec 30, 2006)
Well, to be honest, I do believe you're correct. When it comes to compute hardware and software, your head's in the clouds. ;):gift:
 

apoppin (Lifer, joined Mar 9, 2000, alienbabeltech.com)
Originally posted by: Nemesis 1
Well, to be honest, I do believe you're correct. When it comes to compute hardware and software, your head's in the clouds. ;):gift:

no need for apology brother :p

i started it by saying you client guys are stuck in the mud
:eek:

the clouds are my home
.. humans crawled out of the slime long ago
:rose:
 

Keysplayr (Elite Member, joined Jan 16, 2003)
Originally posted by: Nemesis 1
Originally posted by: keysplayr2003
Originally posted by: Nemesis 1
Stop with NV and Cuda. Cuda died when DX11 was announced.

It's killing me that I can say certain things, but boy oh boy does Microsoft have a surprise for you. I'm just glad I see you post things like this. It shows "me", at least, that you haven't a clue what's really going on. Oh, and just to make you aware, you are reverting back to your Turtle1/Intelia days where Intel was your religion. Apoppin is closer to the mark than you are, that is for certain.

Easy, keys. I am not reverting back to anything. I am looking at the TECH, software and hardware. Intel/AMD have the best solution. NV going proprietary with CUDA was damaging to themselves.

NO, nothing MS does will surprise me. Intel has put MS under its thumb. They know it too. Intel doesn't need MS anymore. Intel has a full library except CUDA! Intel's compilers are better than MS's. I know how AMD and MS schemed on AMD64. So does Intel. Basically, with Haswell this war is over. Haswell = FMA = Itanic on the desktop.

If I can make it that far: when you see my name change from Zinn2b to Zinn, you will know Itanic is on your desktop. You will know how you were owned, and who owned you.

Just for fun, keys, it's interesting: how many branches do you think one thread on Larrabee has? You should find out, then understand what each branch subset is capable of. It's staggering. We don't have long to wait now; the week of March 24 should prove entertaining. I wish I could tell ya more, but the fun's about to begin. You guys get a break, tho; I should be offline that week. LOL

Keys, if NV opened up CUDA, Intel would change its stance. It's that simple. Is CUDA hardware? If so, why did NV want to help AMD/ATI's EPIC run on CUDA?

This is CUDA's problem. I am sure it works great, but it's closed. Getting AMD or Intel to bite on closed software would be a tough trick. AMD/Intel depend on each other. Intel knows full well what AMD has with ATI. It's got Intel nervous as all hell. The right compiler: look out.

What I was referring to before was DirectX Compute in Windows 7. I think you only need one mega giant to have your back in this industry, and MS certainly is a mega giant.
An operating system that can utilize the computing power of the GPU for certain apps and tasks? If GPUs can be used to speed up everyday OS tasks, including but not limited to gaming, then that is a pretty big deal. Can you see where this could be headed? I don't know all of the details of DX Compute, but more and more data is surfacing. And the closer Windows 7 comes to launch, the more we'll know about it and what it can offer in its first iteration. The latest OSes are already optimized as much as they can be for the CPU, but now there is a "new" resource to tap into that could prove far more powerful and robust for certain tasks. I'm really anxious to see what it will bring. Sounds very interesting. I posted a link over in the vid forum about a DH interview with Nvidia's software product manager. Cool stuff.
Linky: DH Interview
Not bad eh?
 

IntelUser2000 (Elite Member, joined Oct 14, 2003)
Originally posted by: keysplayr2003
Originally posted by: Nemesis 1
Originally posted by: keysplayr2003
Originally posted by: Nemesis 1
Stop with NV and Cuda. Cuda died when DX11 was announced.

It's killing me that I can say certain things, but boy oh boy does Microsoft have a surprise for you. I'm just glad I see you post things like this. It shows "me", at least, that you haven't a clue what's really going on. Oh, and just to make you aware, you are reverting back to your Turtle1/Intelia days where Intel was your religion. Apoppin is closer to the mark than you are, that is for certain.

Easy, keys. I am not reverting back to anything. I am looking at the TECH, software and hardware. Intel/AMD have the best solution. NV going proprietary with CUDA was damaging to themselves.

NO, nothing MS does will surprise me. Intel has put MS under its thumb. They know it too. Intel doesn't need MS anymore. Intel has a full library except CUDA! Intel's compilers are better than MS's. I know how AMD and MS schemed on AMD64. So does Intel. Basically, with Haswell this war is over. Haswell = FMA = Itanic on the desktop.

If I can make it that far: when you see my name change from Zinn2b to Zinn, you will know Itanic is on your desktop. You will know how you were owned, and who owned you.

Just for fun, keys, it's interesting: how many branches do you think one thread on Larrabee has? You should find out, then understand what each branch subset is capable of. It's staggering. We don't have long to wait now; the week of March 24 should prove entertaining. I wish I could tell ya more, but the fun's about to begin. You guys get a break, tho; I should be offline that week. LOL

Keys, if NV opened up CUDA, Intel would change its stance. It's that simple. Is CUDA hardware? If so, why did NV want to help AMD/ATI's EPIC run on CUDA?

This is CUDA's problem. I am sure it works great, but it's closed. Getting AMD or Intel to bite on closed software would be a tough trick. AMD/Intel depend on each other. Intel knows full well what AMD has with ATI. It's got Intel nervous as all hell. The right compiler: look out.

What I was referring to before was DirectX Compute in Windows 7. I think you only need one mega giant to have your back in this industry, and MS certainly is a mega giant.
An operating system that can utilize the computing power of the GPU for certain apps and tasks? If GPUs can be used to speed up everyday OS tasks, including but not limited to gaming, then that is a pretty big deal. Can you see where this could be headed? I don't know all of the details of DX Compute, but more and more data is surfacing. And the closer Windows 7 comes to launch, the more we'll know about it and what it can offer in its first iteration. The latest OSes are already optimized as much as they can be for the CPU, but now there is a "new" resource to tap into that could prove far more powerful and robust for certain tasks. I'm really anxious to see what it will bring. Sounds very interesting. I posted a link over in the vid forum about a DH interview with Nvidia's software product manager. Cool stuff.
Linky: DH Interview
Not bad eh?

People say that Intel will fail with GPUs, but they are basing that on their IGP status. Well, unlike their IGP, Larrabee is actually meant to make money, so they'll put much more resources toward it and care how it turns out.

You can't compare Larrabee with their IGP. For Larrabee they have a separate group working on it, people who are top-notch in graphics knowledge. Plus, unlike their IGP, the software-engineering investment, aka the driver team, is supposed to be substantial (500 for the driver team alone). It might be right to just assume failure for future Intel IGPs, but assuming the same for Larrabee based on their IGP is a mistake. Nobody knows how it will turn out.

See, Nvidia has a problem too. If Intel gets their IGP up and eventually tightly integrated with the CPU, like the FPU is, then the majority of the benefits they claim will be back on the CPU side. People are saying that many cores, GPU-style, are the future, yet they fail to realize that the "end of parallelism" wall that ILP already hit will soon be hit by TLP as well.
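(The standard way to put a number on that wall is Amdahl's law: if only a fraction p of the work parallelizes, N cores can never deliver more than 1/((1-p) + p/N). A quick illustrative sketch:

    #include <stdio.h>

    /* Amdahl's law: best-case speedup from n parallel units when only
       a fraction p of the runtime can be parallelized at all. */
    static double amdahl(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main(void) {
        printf("%.1fx\n", amdahl(0.95, 100));  /* 95%-parallel code on 100 cores: ~16.8x, not 100x */
        return 0;
    }

That saturation is the TLP version of the ILP wall being described.)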

The reason x86 survived over other ISAs is because of the massive software support it had, so big that even Intel gave in and embraced it for everything.

Same with the IGP. Currently it's regarded as a piece of s*it for anything but office work, but when it becomes part of the CPU and is used to accelerate parallel work, the term "IGP" will be redefined.

The days of Intel "slacking" and purposely crippling their products (possibly to make Itanium look better) ended the moment they canceled NetBurst and worked on the Core uarch.
 

apoppin (Lifer, joined Mar 9, 2000, alienbabeltech.com)
People say that Intel will fail with GPUs, but they are basing that on their IGP status. Well, unlike their IGP, Larrabee is actually meant to make money, so they'll put much more resources toward it and care how it turns out.
Here is where i stopped reading

intel makes huge money off their IG .. all of their products are meant to make money .. especially P4; it did - before their engineers failed to meet their marketing department's unrealistic dreams for it

all i will say is that Larrabeast's success will depend on how well it PERFORMS in 3d apps
-compared directly to AMD's and Nvidia's graphics

:rose:
 
soccerballtux (joined Dec 30, 2004)
I completely agree with NV. I don't need anything faster than my dual core for desktop usage. Gaming, encoding, maybe; but how often do I do that? Hardly ever.
 

Idontcare (Elite Member, joined Oct 10, 1999)
Originally posted by: soccerballtux
I completely agree with NV. I don't need anything faster than my dual core for desktop usage. Gaming, encoding, maybe; but how often do I do that? Hardly ever.

Intel agrees too, hence Larrabee. But this is more about creating/catering to a new market segment that cares more about GPU things and not so much about CPU things...it really isn't about eliminating the existing market segment that cares about CPU things. (IMO)

NV would love the new emerging market segment to displace the existing one as that means higher revenue potential for them, so their PR rhetoric is understandable and logical if not expected.

But for sure the new segment of gaming and a few CUDA apps is a high-growth-potential market, just as netbooks are, and high-growth opportunities are what get shareholders all tingly. The last thing Intel's shareholders want is for Intel to be dominant in a stagnant industry segment while all the high-growth opportunities go to AMD/ATI and NV.
 

IntelUser2000 (Elite Member, joined Oct 14, 2003)
Originally posted by: apoppin
People say that Intel will fail with GPUs, but they are basing that on their IGP status. Well, unlike their IGP, Larrabee is actually meant to make money, so they'll put much more resources toward it and care how it turns out.
Here is where i stopped reading

intel makes huge money off their IG .. all of their products are meant to make money .. especially P4; it did - before their engineers failed to meet their marketing department's unrealistic dreams for it

all i will say is that Larrabeast's success will depend on how well it PERFORMS in 3d apps
-compared directly to AMD's and Nvidia's graphics

:rose:

You are stupid. Sure, they make money off their IG, but they charge a whole $4-5 more than for their non-IGP variants. It's going to be nothing like Larrabee, which is an entirely different product and a separate segment in itself (unlike IG, which comes with a chipset and you don't even need to use it).

They don't need to waste silicon making an IGP as fast as mid-level discrete graphics for $4 when they can do that with lower-end discrete devices, which will make real money.