
How does nehalem affect future Intel offerings?

Comdrpopnfresh

Golden Member
Jul 25, 2006
1,202
2
81
Does Nehalem have enough performance, or do Vista or Win7 take advantage of quad cores well enough, to put an end to dual cores the way duals did to single cores?
Thoughts?

My reason for wondering is that, in general, a dual-core system seems more practical than a quad-core one. Obviously there are niches where quads have a foothold, such as some games, intensive calculations, and heavy multitasking environments. But unless there is some architectural tweak, or Win7 increases the performance/efficiency of quad cores, would it be foolhardy to believe EVERY developer of programs or games will, on their own, optimize their releases for quads?
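One way to frame the "will quads matter" question is Amdahl's law: the speedup from extra cores is capped by the serial fraction of the program. A quick illustrative sketch (the parallel fractions here are made-up examples, not measurements of any real game or application):

```python
# Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Unless most of a program is parallel, four cores barely beat two:
for p in (0.5, 0.8, 0.95):
    print(f"p={p}: 2 cores -> {speedup(p, 2):.2f}x, 4 cores -> {speedup(p, 4):.2f}x")
```

At p=0.5 the jump from two cores to four is only about 1.33x to 1.6x, which is roughly why a program that is only half parallel shows little benefit from a quad.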

Also, if Nehalem is designed for multi-core performance, how would a dual-core derivative of the i7 architecture fare against current E8XXX offerings? Obviously, it'd be quite easy to create a binning of CPUs with cores disabled to increase their margins. But:
-Will we see an intel tri-core?
-Will a dual core i7-derivative suffer from an architecture aimed at multi-core optimizations?
-Will a 2/3-core i7 chip retain any of the L3 cache?
-How will core-disabled binnings utilize or alter the native i7 cache hierarchy?

One thing I'd say i7 is good for, independent of the questions above, is driving down the price of DDR3 by furthering its adoption. This is definitely good for low- to mid-range graphics cards: soon there'll be no reason to offer DDR2 over DDR3 on cards not equipped with GDDR variants.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
I know that Intel has dual-core chips planned, especially those with a GPU integrated into the package. (Integrated into the die?)

But Nehalem was designed first and foremost as a server chip. That explains why it doesn't exceed existing high-end desktop chips in every category: tradeoffs had to be made in Nehalem's design.

 

Comdrpopnfresh

Golden Member
Jul 25, 2006
1,202
2
81
Well, if they're planning to phase out current-architecture chips in favor of i7 offerings, I'd think they'll have to offer a dual-core chip that isn't vaporware. I've heard of these 'fused' dies with a GPU+CPU. But didn't AMD, for instance, drop theirs and turn 'Fusion' into a platform tweak?
 

hans007

Lifer
Feb 1, 2000
20,212
18
81
They probably will not release the dual-core version at first. I mean, it would be pointless. Dual core would imply midrange and low-end computers, and Nehalem's memory controller is DDR3, which is not low-end or midrange.

And it would cost them more to make a dual-core Nehalem at 45nm than to just keep selling 45nm Core 2s with chipsets that support low-cost DDR2.

I would figure eventually they will do it, but it probably will be a year from now.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Comdrpopnfresh
One thing I'd say i7 is good for, independent of the questions above, is driving down the price of DDR3 by furthering its adoption. This is definitely good for low- to mid-range graphics cards: soon there'll be no reason to offer DDR2 over DDR3 on cards not equipped with GDDR variants.

Actually, DDR3 and GDDR3 aren't the same thing. As a matter of fact, they have so many differences that I can't imagine the price of one affecting the price of the other. In other words, GDDR3 isn't based on desktop DDR3 the way that GDDR2 is based on desktop DDR2. Wikipedia explains it pretty well, in basic terms.
 

Comdrpopnfresh

Golden Member
Jul 25, 2006
1,202
2
81
I was saying 'budget' cards typically have DDR2 on them. With higher adoption of DDR3 driving down prices, we'll see that DDR2 turn to DDR3 on new offerings. Those 'budget' cards are typically the ones without the GDDRx variants.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
That's because DDR2 and GDDR2 are either exactly or almost exactly the same thing. DDR3 and GDDR3 are far from the same thing, which is why the price of one falling won't affect the price of the other much, if at all.
 

Sureshot324

Diamond Member
Feb 4, 2003
3,370
0
71
Intel is going to release a smaller socket (LGA1160) in 2009 for more mainstream products. The Havendale core is basically a dual core Nehalem with integrated graphics on die and will be on LGA1160. The Lynnfield core is apparently a cheaper quad core that will also be on LGA1160. All future 8+ core Nehalems will only be on LGA1366.

It's too bad Intel decided to go this route and have two sockets at once, though I can understand it could be hard to put 8 or more cores on a small die.
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Originally posted by: myocardia
That's because DDR2 and GDDR2 are either exactly or almost exactly the same thing. DDR3 and GDDR3 are far from the same thing, which is why the price of one falling won't affect the price of the other much, if at all.

You may be right, but there are definitely versions of AMD's lower-binned cards that use DDR3 instead of GDDR3. It was one part of the AnandTech review of the HD 4650 that I remember reading. AMD felt the performance was so similar in the actual operating range that the cost of the GDDR wasn't worth it.

For a while now we've been seeing either DDR2 or GDDR3 on most graphics cards. The future standard looks to be GDDR5 at this point (with 4 being skipped due to its not-that-much-better-ness when compared to GDDR3), but currently board makers are faced with an interesting situation: DDR3 is coming down in price due to its adoption on the desktop.

The push to GDDR3 was to fill the need in the graphics industry for faster DRAM, so it came along a little ahead of DDR3 and has served us well for the past few years. DDR3 is now finding traction in the desktop world and prices are starting to come down as every major platform will support DDR3 going forward. Performance is apparently fairly similar at the speeds DDR3 can hit (we don't really have a way of testing this ourselves right now, but this is what AMD is telling us), so price is really the only differentiator.
Anandtech Article Link
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: myocardia
That's because DDR2 and GDDR2 are either exactly or almost exactly the same thing. DDR3 and GDDR3 are far from the same thing, which is why the price of one falling won't affect the price of the other much, if at all.

It wasn't about affecting the price at all.
GDDR3 stays the same price.
DDR3 falls in price.
DDR3 becomes more attractive and replaces DDR2 and GDDR3 on graphics cards.

Back to the OP's topic:
It's a modular design which can scale, but IMO it's not necessarily something that will work well when scaled down.
Clock for clock it doesn't have that much of an advantage over Core 2 in many situations, although with HT a dual-core part might do better; either way, it's almost certain to be scaled down for mobile processors.

I think I once saw a rumour of late '09 for dual-core Nehalem in laptops, and maybe desktops, since lower-end Nehalem (LGA10xx or whatever, with dual-channel RAM) will be out and DDR3 will be more affordable too.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
-Will we see an intel tri-core?
-Will a dual core i7-derivative suffer from an architecture aimed at multi-core optimizations?
-Will a 2/3-core i7 chip retain any of the L3 cache?
-How will core-disabled binnings utilize or alter the native i7 cache hierarchy?

1. I don't see why it would. The L3 cache still serves the dual cores the same way it does the quad cores: as a snoop filter and victim cache.
2. We already know certain plans for future Nehalems: Lynnfield, which is a quad core without QPI (but still featuring an integrated memory controller, dual-channel in this case), and Havendale, which is a dual core with exactly half the L3 cache (4MB) whose CPU cores connect to the memory controller/GPU over a QPI link.
3. I don't think it will change much. We have a dual-core version that is exactly half the quad-core version, with half the caches etc., and possibly value versions with no L3 cache.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Seems to me that CPU speeds have crested around the 4GHz mark for the last 2-3 years; might as well increase cores, with new instruction sets and pathways, to get more performance. Fewer cores/less cache will be the next budget CPUs....

2c
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Originally posted by: SolMiester
Seems to me that CPU speeds have crested around the 4GHz mark for the last 2-3 years; might as well increase cores, with new instruction sets and pathways, to get more performance. Fewer cores/less cache will be the next budget CPUs....

Not a direct response to you, but it's easy to see why clock speeds aren't increasing. It's very simple.

Every process tech generation decreases power by 35-40% in practice. If the number of cores increases by 2x per generation, the CPU will eventually run extremely hot. Add clock speed increases on top of that and power goes to the stratosphere.
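The arithmetic behind that is easy to sketch. Assuming the ~35-40% per-generation power reduction cited above (so each core drops to roughly 0.6-0.65x) while core counts double, total package power still climbs every generation. The starting wattage below is an arbitrary illustration, not a real CPU spec:

```python
# Toy model: per-core power shrinks ~0.625x per process generation,
# but core count doubles, so net package power grows ~1.25x per generation.
def chip_power(core_watts, generations, per_gen_scale=0.625, start_cores=2):
    per_core = core_watts * per_gen_scale ** generations  # shrinking cores
    cores = start_cores * 2 ** generations                # doubling count
    return per_core * cores

for gen in range(4):
    print(f"gen {gen}: {chip_power(25, gen):.1f} W")
```

Starting from a hypothetical 25W core, the package goes 50W, 62.5W, 78.1W, 97.7W over three shrinks, and that is before any clock increase.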

This is why Nehalem focuses on power conservation and on clocking down as much as possible when necessary. Turbo mode is another part of the same equation: by increasing clock speed when the chip is cool, they are essentially raising clock speeds without officially raising them :p. Sandy Bridge, the next architectural revision, has an extreme focus on Turbo mode. It'll be much more advanced, and the default will clock much higher.
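The turbo idea can be illustrated with a toy controller: raise the multiplier one bin at a time while an estimated package power stays under the TDP budget, so fewer active cores leave headroom for higher clocks. All the numbers and the linear power model here are invented for illustration; this is not Intel's actual algorithm or its real tables:

```python
# Toy turbo controller: step the multiplier up one bin at a time while the
# estimated package power stays under the TDP budget. All numbers invented.
def turbo_multiplier(active_cores, tdp_watts=100.0, base_mult=24, max_mult=28,
                     base_watts_per_core=20.0, watts_per_bin=1.5):
    mult = base_mult
    while mult < max_mult:
        # estimate package power if we stepped up one more bin
        bins_up = mult + 1 - base_mult
        estimate = active_cores * (base_watts_per_core + bins_up * watts_per_bin)
        if estimate > tdp_watts:
            break  # no thermal headroom left
        mult += 1
    return mult
```

With these made-up numbers, a single active core turbos to the 28x ceiling while four active cores stop at 27x: the same "clock up when there's headroom" behavior described above.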

That aside, there are supposed to be faster Core i7s in the future, so we'll probably see Bloomfields top out at 3.46GHz or so.
 

DanDaMan315

Golden Member
Oct 25, 2004
1,366
0
0
Eventually quad core is going to work right and take a foothold. If the software isn't working right for it now, then I'd say it's not worth your time or money. I doubt triple-core Intels will be released, just because Intel is really behind a lot of the programming needed to make their chips work right. Hopefully, in the meantime, we'll see Microsoft putting out updates to make 3+ cores work right. I'd like to see some articles published about what actually makes multi-core processors work.