Anand has Lynnfield Preview Up


ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: SickBeast
Originally posted by: Shaq
Originally posted by: SickBeast
I'm really guessing that Intel is trying to create a gaming platform with this i5 socket. For GPGPU purposes, they say that you don't need as much PCIe bandwidth. That may explain why Intel has put a half-bandwidth PCIe controller on Lynnfield. It probably saves die space, and does not negatively impact Larrabee (but will negatively impact AMD/NV GPUs).

That's pretty underhanded, especially in light of the massive antitrust settlement in Europe. This one is simply blatant IMO.

Interesting. I didn't think of that. Make Nvidia and AMD users buy the most expensive board in order to get full bandwidth for their video cards and offer Larrabee on the cheaper boards. It will make Larrabee even more attractive price vs. performance. And that would force AMD and Nvidia into making chipsets to get that bandwidth back. We'll see if that is part of their strategy with this move.

Yeah, I'm guessing that the Larrabee could benefit 20-30% or more from the on-die PCIe controller as well due to the reduced latency. I have a feeling that it's going to be really lopsided. I could see Larrabee winning a few benchmarks on an i5 motherboard, then losing them all on a different motherboard, and vice versa.

Intel is doing everything that they can to boost Larrabee's performance as they appear to be at a 50% transistor disadvantage by default.

How does that give Intel an advantage? Larrabee would look better than Nvidia and ATI on an Intel board, but the Intel board would be destroyed by an AMD board using the exact same video cards. There's no way Intel would screw themselves like that.
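The bandwidth being argued about here can be sanity-checked with back-of-the-envelope arithmetic. A rough sketch, assuming PCIe 2.0's roughly 500 MB/s of usable bandwidth per lane per direction (5 GT/s signaling with 8b/10b encoding):

```python
# Rough PCIe 2.0 bandwidth arithmetic for the x16 vs. dual-x8 debate.
# Assumes ~500 MB/s usable per lane per direction (5 GT/s, 8b/10b encoding).
PER_LANE_MBPS = 500

def pcie_bandwidth_gbps(lanes):
    """Return one-direction bandwidth in GB/s for a given lane count."""
    return lanes * PER_LANE_MBPS / 1000

full_x16 = pcie_bandwidth_gbps(16)  # a single card on a full x16 link
half_x8 = pcie_bandwidth_gbps(8)    # each card when the link splits to x8/x8

print(f"x16: {full_x16} GB/s, x8: {half_x8} GB/s")
```

Whether an x8 link actually bottlenecks a given GPU depends on the workload; these are just theoretical link peaks.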
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: ShawnD1
How does that give Intel an advantage? Larrabee would look better than Nvidia and ATI on an Intel board, but the Intel board would be destroyed by an AMD board using the exact same video cards. There's no way Intel would screw themselves like that.
If Intel does nothing, then they are screwed on an AMD board anyway. By optimizing the i5 platform for Larrabee, they are at least doing *something* to help boost their own brand. It will give them the nice PR benchmarks that they can slap on the packaging to make Larrabee look good on paper, just like they did the last time they released a discrete GPU.
 

Seramics

Junior Member
Apr 30, 2009
2
0
0
Why is it that in tests that demand intensive multithreading, Lynnfield loses out to Bloomfield, while in less intensive multithreaded or single-threaded tests Lynnfield can equal Bloomfield? Could it be that in intensive multithreaded workloads that spin up more than 4 threads (thereby requiring HT for better performance), it is able to use the extra memory bandwidth provided by triple-channel memory on the Bloomfield platform? Maybe in situations where the software really demands intensive multithreaded processing, Bloomfield's extra memory bandwidth provides an edge. An easy way to test this is to increase Lynnfield's memory bandwidth and/or decrease Bloomfield's, and see what happens in the intensive multithreaded tests. I'm suspicious of this. Let me know what you guys think.
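The bandwidth gap being hypothesized is easy to put numbers on. A rough sketch of theoretical peak memory bandwidth, assuming typical configurations of the era (dual-channel DDR3-1333 on Lynnfield vs. triple-channel DDR3-1066 on Bloomfield; actual review setups vary):

```python
# Theoretical peak memory bandwidth: channels x transfer rate x 8 bytes/transfer
# (each DDR3 channel is 64 bits = 8 bytes wide).
def peak_bandwidth_gbs(channels, mt_per_s):
    """Peak bandwidth in GB/s for a DDR3 configuration."""
    return channels * mt_per_s * 8 / 1000

lynnfield = peak_bandwidth_gbs(2, 1333)   # dual-channel DDR3-1333, ~21.3 GB/s
bloomfield = peak_bandwidth_gbs(3, 1066)  # triple-channel DDR3-1066, ~25.6 GB/s
print(lynnfield, bloomfield)
```

On paper the triple-channel platform holds a modest edge, which is consistent with it mattering most in heavily threaded workloads that keep the memory controller saturated.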
 

soonerproud

Golden Member
Jun 30, 2007
1,874
0
0
Originally posted by: Nemesis 1
Sorry, only AMD fanbois leave out details. I posted; clearly I said beginning of Q3, one month for the 4-core, beginning of Q4 for the 32nm 2-core/4-thread. Yeah, a month really gives AMD breathing room.

I am going to help AMD out. I have a Phenom now. After they cut prices to the bone, I'm going to pick up a very cheap Phenom II, along with a cut-to-the-bone-priced motherboard. I already got all the DDR3 I need.

There's more than one post in the thread. I don't need to keep putting down dates after the time frame is established.

Sorry then, I missed the part where you said Q3; it was an honest mistake. I also have a C2D PC, which I specced out and built for my brother in the house, so I am hardly a fanboy of one company or another.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: SickBeast
Originally posted by: ShawnD1
How does that give Intel an advantage? Larrabee would look better than Nvidia and ATI on an Intel board, but the Intel board would be destroyed by an AMD board using the exact same video cards. There's no way Intel would screw themselves like that.
If Intel does nothing, then they are screwed on an AMD board anyway. By optimizing the i5 platform for Larrabee, they are at least doing *something* to help boost their own brand. It will give them the nice PR benchmarks that they can slap on the packaging to make Larrabee look good on paper, just like they did the last time they released a discrete GPU.


I like you. But ya do know what happens when Larrabee kicks ass, don't ya? I'm gonna put SickBeast out of his misery. It's the human thing to do. You should not suffer so ;)

 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Originally posted by: Nemesis 1
Originally posted by: SickBeast
Originally posted by: ShawnD1
How does that give Intel an advantage? Larrabee would look better than Nvidia and ATI on an Intel board, but the Intel board would be destroyed by an AMD board using the exact same video cards. There's no way Intel would screw themselves like that.
If Intel does nothing, then they are screwed on an AMD board anyway. By optimizing the i5 platform for Larrabee, they are at least doing *something* to help boost their own brand. It will give them the nice PR benchmarks that they can slap on the packaging to make Larrabee look good on paper, just like they did the last time they released a discrete GPU.


I like you. But ya do know what happens when Larrabee kicks ass, don't ya? I'm gonna put SickBeast out of his misery. It's the human thing to do. You should not suffer so ;)

Kick ass at the high end, or midrange bang for the buck? I highly doubt it will beat a G300 on the initial run. And even if it is that good, they have to be able to write the drivers to use the power. lol With 5-10 or more on a board it will be a little trickier to do.
 

alyarb

Platinum Member
Jan 25, 2009
2,444
0
76
Larrabee is shooting for 2 TFLOPs at 300 W. If RV870 is 1200 shaders @ 950 MHz and G300 is going to be 512 shaders at 1500 MHz, it's already over.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: alyarb
Larrabee is shooting for 2 TFLOPs at 300 W. If RV870 is 1200 shaders @ 950 MHz and G300 is going to be 512 shaders at 1500 MHz, it's already over.

Ya got a link? The latest is that Larrabee is 4 TFLOPs single-precision, 2 TFLOPs double-precision, with 48 cores, and it sips power. That's the performance model. The 64-core model is over 5 SP TFLOPs. My info is as valid as yours. Fact is, the latest rumor has the top Larrabee at between 50 and 80 cores; that would be 64 cores. I love the way Intel is handling this.
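The TFLOP figures being thrown around follow from simple arithmetic on the rumored specs. A sketch, assuming 2 flops per shader per clock (one multiply-add); actual per-vendor counting conventions differ, so treat these as rough peaks:

```python
# Peak single-precision FLOPS = shaders x clock x flops-per-shader-per-clock.
# Assumes 2 flops/clock (one MAD) per shader; vendors count differently.
def peak_tflops(shaders, clock_mhz, flops_per_clock=2):
    """Theoretical peak throughput in TFLOPs."""
    return shaders * clock_mhz * 1e6 * flops_per_clock / 1e12

rv870 = peak_tflops(1200, 950)  # rumored RV870 config from this thread
g300 = peak_tflops(512, 1500)   # rumored G300 config from this thread
print(rv870, g300)
```

By this naive count the rumored RV870 lands around 2.28 TFLOPs, so a 2 TFLOP Larrabee target would be in the same ballpark on paper, with everything hinging on how much of that peak each architecture actually delivers.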

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Nemesis 1
Originally posted by: SickBeast
Originally posted by: ShawnD1
How does that give Intel an advantage? Larrabee would look better than Nvidia and ATI on an Intel board, but the Intel board would be destroyed by an AMD board using the exact same video cards. There's no way Intel would screw themselves like that.
If Intel does nothing, then they are screwed on an AMD board anyway. By optimizing the i5 platform for Larrabee, they are at least doing *something* to help boost their own brand. It will give them the nice PR benchmarks that they can slap on the packaging to make Larrabee look good on paper, just like they did the last time they released a discrete GPU.


I like you. But ya do know what happens when Larrabee kicks ass, don't ya? I'm gonna put SickBeast out of his misery. It's the human thing to do. You should not suffer so ;)

Is that a death threat? Perhaps the mods will BAN you so you can be re-incarnated yet again. :light:
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Originally posted by: SickBeast
Originally posted by: Nemesis 1
I like you. But ya do know what happens when Larrabee kicks ass, don't ya? I'm gonna put SickBeast out of his misery. It's the human thing to do. You should not suffer so ;)

Is that a death threat? Perhaps the mods will BAN you so you can be re-incarnated yet again. :light:

I know you are joking sickbeast, as is Nemesis, but you really need to be more liberal with the emoticons when using those words and phrases as that can be a particularly sensitive topic for some forum visitors. Make the sarcasm a bit more obvious is all I'm asking.
 

Avalon

Diamond Member
Jul 16, 2001
7,565
150
106
A very impressive product. I was a little disappointed that Intel wasn't shooting for much lower of a price range, but I guess expecting mainstream quad cores priced into dual core territory was a bit unrealistic. Maybe with their next iteration, and I'll upgrade.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Originally posted by: SickBeast
I don't appreciate sarcasm when it comes to comments like that.

He's kidding, IMO; I think you might be taking it a bit too personally. It's your right to take it as personally as you like, of course, and no one can tell you how you should feel, but taking it too personally won't change whether or not there is actually any true malice behind those words.

And based on Nemesis' online persona, I'm pretty sure he's just kidding with you and wouldn't have used such comments if he suspected you would take it as an actual threat of harm.

Just my opinion of course; Nemesis can speak for himself. I am merely trying to head off at the pass what appears to me to be an honest miscommunication between you two, in hopes of keeping this situation from degrading.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
As I said in my PM, IDC, I have no intention of starting a thread in PFI or contacting the mods. I would appreciate getting back on topic; this nonsense is derailing the thread.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: SickBeast
Originally posted by: Nemesis 1
Originally posted by: SickBeast
Originally posted by: ShawnD1
How does that give Intel an advantage? Larrabee would look better than Nvidia and ATI on an Intel board, but the Intel board would be destroyed by an AMD board using the exact same video cards. There's no way Intel would screw themselves like that.
If Intel does nothing, then they are screwed on an AMD board anyway. By optimizing the i5 platform for Larrabee, they are at least doing *something* to help boost their own brand. It will give them the nice PR benchmarks that they can slap on the packaging to make Larrabee look good on paper, just like they did the last time they released a discrete GPU.


I like you. But ya do know what happens when Larrabee kicks ass, don't ya? I'm gonna put SickBeast out of his misery. It's the human thing to do. You should not suffer so ;)

Is that a death threat? Perhaps the mods will BAN you so you can be re-incarnated yet again. :light:

Oh no! Please, beasty, have mercy. :brokenheart: I will reincarnate soon enough without the mods' aid.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Originally posted by: Viditor
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...

I thought Intel had some 400 or 500 engineers/programmers dedicated to just writing the drivers for Larrabee? Or is that an internet urban legend that has no basis in reality?
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Back to the ballgame. So, SickBeast, you believe Intel has crippled the i5 in a way that would hurt ATI/NV performance.

Let me tell you why that's not a good theory.

Because of Intel's very own game company. Intel doesn't have to pull any crap, and they won't. How Intel does on today's games is unknown, but I suspect that if ATI/NV play Intel's game, it may not run at all, or run slower than on Intel. Point is, Intel has spent more than a million on this project. Intel KNOWS it needs a kick-ass game to get public attention. Intel is spending a lot on this game, on every aspect, but they're taking real pride in the artwork itself; they're keying on it. I like that. They hired great graphics artists, and the collaboration between all the gaming companies they bought is all about one game to start with. If the game is a hit, that's all Intel needs. This game is Intel exclusive, until ATI/NV can make it work on their hardware.


I don't know why you did this publicly, SickBeast. You emailed me the other day and asked me to honestly answer a question. I did. Then you turn around and try to hurt me with it. Ya know, beast, that kinda hurts. My comment was directly focused on your tag; you should have, and probably did, understand that. I am deeply disappointed in you. If you want me to go away, beast, just ask me; you don't need mods. I would say I am sorry about the comment, but I am not. It was meant in the spirit of fun, and clearly written as such.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Intel was recruiting graphics people en masse a year or so ago, but I haven't heard anything since then.

I don't see why the drivers can't mostly produce themselves with the right tools; the hardware is essentially software to begin with.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Viditor
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...

True to form :Q Ya, but Viditor, Larrabee will still do well with only NV to compete with, since AMD is about to get the boot from the CPU business :lips: But not to worry, Viditor. I understand Apple is snooping around the watering hole.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Nemesis 1
Originally posted by: Viditor
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...

True to form :Q Ya, but Viditor, Larrabee will still do well with only NV to compete with, since AMD is about to get the boot from the CPU business :lips: But not to worry, Viditor. I understand Apple is snooping around the watering hole.

Does this happen before or after the giant aliens land? :D
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Spicedaddy
Originally posted by: taltamir
With a 633 MHz turbo mode and a $100-or-lower mobo, no, you are not better off getting an i7.

Turbo mode is for people who don't overclock, and good P55 mobos won't be $100 or lower when they come out, while X58 boards keep getting cheaper every month.

If you're on Core 2: wait and see.
If you're on older stuff and want to upgrade: not worth the 3-4 month wait IMO.

If you are on something older than Core 2, then wait a few months and buy a used C2Q system for peanuts.

Anyway, the X58 is just so WASTEFUL: it does NOTHING except connect the CPU, the southbridge, and the video cards. Moving the video card interconnect onto the CPU and having the CPU connect to the southbridge directly just makes a whole lot of sense.

The problem with disabling turbo mode and overclocking the chip is that you end up using a lot more power.
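The power penalty of a manual overclock follows from the usual first-order model: dynamic power scales roughly with frequency times voltage squared. A sketch with hypothetical numbers (the function, baseline, and voltages below are illustrative, not measured Lynnfield figures):

```python
# First-order dynamic power model: P = C * V^2 * f, so relative power scales
# as (f / f0) * (V / V0)^2. All operating points here are hypothetical.
def relative_dynamic_power(freq_ghz, volts, base_freq=2.93, base_volts=1.2):
    """Dynamic power relative to a baseline operating point."""
    return (freq_ghz / base_freq) * (volts / base_volts) ** 2

# Turbo hits a high clock at stock voltage on lightly threaded loads;
# an all-core manual overclock to the same clock often needs extra voltage.
turbo = relative_dynamic_power(3.2, 1.2)
overclock = relative_dynamic_power(3.2, 1.35)
print(f"turbo: {turbo:.2f}x, overclock: {overclock:.2f}x")
```

The quadratic voltage term is why a voltage-bumped overclock costs disproportionately more power than turbo reaching the same frequency at stock voltage, and turbo additionally only applies the boost when few cores are loaded.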
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Idontcare
Originally posted by: Viditor
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...

I thought Intel had some 400 or 500 engineers/programmers dedicated to just writing the drivers for Larrabee? Or is that an internet urban legend that has no basis in reality?

Firstly, (as you know) it's not always quantity; quality in engineers is at least as important. The rumours I have heard are that the best engineers are still working for ATI and Nvidia.
Also, I believe the quantity is a bit short as well; IIRC, Nvidia has over 1,800 software engineers on the books.

Of course this is all rumour and speculation...