I find it interesting that you first bash x86, and then you go on to bash Itanium.
Probably because you are trying to read it a certain way. I bashed the success of Itanium: Intel spent billions on engineering and years hyping it, and in no uncertain terms it was a failure. Everyone outside of the Intel loyalists knew in advance that it would be and could explain why. Everyone except the Intel loyalists was right.
Why so much hate for Intel?
No hate at all for Intel. Sometimes I do something stupid too; my friends are usually the ones who point it out to me, while the people who genuinely don't like me just sit back and wait to get a good laugh.
If we shouldn't use x86 and we shouldn't use Itanium, what should we use?
VLIW would actually work quite well in a Larrabee-style setup; it sucked when it was trying to replace x86 CPUs for the masses. It still wouldn't be ideal, but it would be much, much better than x86.
The way I look at it is that if there was something so much better out there, people would use it.
You are not even close to that ignorant.

Geothermal power production is vastly superior to every other method we use in the world today, yet outside of Iceland it isn't really used anywhere. There is the question of the massive amounts of money that need to be put into something to make it viable for the mass market. Intel is banking on economies of scale making Larrabee a viable platform. The problem is, they need it to be competitive in the GPU space before they will realize that goal. If it isn't, their whole plan falls apart.
The way I look at it, 80 Atom CPUs will encode video faster than my G80-based GPU with 80 stream processors running CUDA.
80 Atom CPUs utterly dwarf the die space of 80 stream processors, by a huge margin. Besides that, 80 Atom CPUs come in at about 1/6th the raw processing power of 80 stream processors. The two aren't even close to comparable in terms of raw power or die size; x86 is utterly demolished in your given comparison on both fronts. (Larrabee has far better designed cores than Atom, which would be an abject failure in every way possible if it were the design route they were taking.)
You bash x86, yet it is the language upon which just about everything today is written.
Because it has been around for so long and it works extremely well for general purpose OoO code. For the specific usage we are talking about, it sucks, badly. Intel already had to add a bunch of instructions to be able to push Larrabee with a straight face; it is simply a bad instruction set to use for a vector processor.
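To put the "bunch of instructions" in concrete terms: the widest SIMD stock x86 offered was 128-bit SSE, four floats per operation, while Larrabee's new instructions (LRBni) are 512-bit, sixteen floats wide. Here is a minimal SAXPY sketch at the stock width; everything in it is plain SSE, and nothing Larrabee-specific is assumed.

#include <xmmintrin.h>  /* SSE intrinsics: 128-bit registers, 4 x float */

void saxpy_sse(float *y, const float *x, float a, int n)
{
    __m128 va = _mm_set1_ps(a);           /* broadcast a into all 4 lanes */
    int i;
    for (i = 0; i + 4 <= n; i += 4) {     /* 4 floats per op is the ceiling */
        __m128 vx = _mm_loadu_ps(x + i);
        __m128 vy = _mm_loadu_ps(y + i);
        vy = _mm_add_ps(vy, _mm_mul_ps(va, vx));
        _mm_storeu_ps(y + i, vy);
    }
    for (; i < n; i++)                    /* scalar tail */
        y[i] += a * x[i];
}

Larrabee's 512-bit vector unit handles sixteen floats per operation, four times this width, and none of that is expressible in baseline x86; hence the bolted-on instruction set.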
I would certainly call NV's approach more radical.
Not in the space they are in. HPC application architectures are closer to stream processors than to x86 OoO-based ones.
If Intel had more involvement in rasterization, all current GPUs would probably run some variant of x86 already.
No chance honestly. x86 is flat out bad for highly parallel code. Variable-length instructions, right off the top, are a bad idea for the type of architecture they are using. Requiring decode hardware to translate code into something machine-level friendly isn't a good idea when you could just create an architecture whose code is machine-level friendly in the first place. Millions of transistors per Larrabee chip will be spent on a decode front end that is required because of x86 and wouldn't be if they used an architecture better suited to the job.
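A toy sketch of that decode serialization. The encoding here is made up (a length nibble standing in for x86's far messier prefix/ModRM/SIB rules), but the structural point carries over: with variable-length instructions, finding where instruction k starts means decoding everything before it, whereas a fixed-width ISA lets every decoder index its instruction directly and work in parallel.

#include <stddef.h>

/* Hypothetical toy encoding: the low 4 bits of the first byte give the
 * instruction's total length in bytes, standing in for x86's length
 * rules (prefixes, ModRM, SIB, immediates, ...). */
static size_t instr_len(const unsigned char *p)
{
    size_t len = *p & 0x0F;
    return len ? len : 1;
}

/* Variable-length ISA: the start of instruction k depends on the length
 * of every instruction before it, so boundary-finding is inherently
 * sequential; this loop is what the predecode hardware has to unwind. */
size_t nth_instr_offset_variable(const unsigned char *code, size_t k)
{
    size_t off = 0;
    for (size_t i = 0; i < k; i++)
        off += instr_len(code + off);   /* must walk instructions 0..k-1 */
    return off;
}

/* Fixed-width ISA: instruction k starts at 4*k, no lookup needed, so any
 * number of decoders can fetch their instructions independently. */
size_t nth_instr_offset_fixed(size_t k)
{
    return 4 * k;
}

In hardware, breaking that sequential dependence is exactly what the length-marking front end exists to do, and it is pure overhead that a fixed-width design never pays.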
From a graphics standpoint, I agree that Intel's approach is problematic and wasteful, but I think there are side benefits that counteract this and actually make the idea quite viable.
By the time Larrabee hits, will it have any advantage over the GPUs? Given that Larrabee requires a recompile and won't run code written for OoO cores properly, what edge do you see it having over the DX11-class GPUs from a non-graphics standpoint? Larrabee cannot run existing code, and even with a recompile, outside of very limited exceptions, it can't run existing code remotely decently either.
There's also the idea that games programmed specifically for Larrabee will not take a performance hit.
They won't take a performance hit compared to software rendering. Developers could make games run faster on Larrabee if they sought to cripple them on rasterizers, but that is what it would take for Larrabee to be competitive.