Why aren't there multi-CPU mobos?


WildW

Senior member
Oct 3, 2008
If we're on the subject of multi-processor porn, I must reminisce about my old dual Xeon (Nocona core, 3GHz; basically Pentium 4s).

Pic

The first dual-core Athlon 64 chips were starting to appear for £LOTS, and it was a little cheaper to build a dual Xeon system. And yes, that's a 6800 Ultra... ahh, great days. Four threads in Task Manager thanks to Hyper-Threading, and I could run Battlefield 2 on high when it first came out.

I later bought a proper Athlon X2, and it was a good bit faster for gaming, but it would grind to a halt more readily than the dual Xeon when running a bazillion applications.

Hmm, you're right, that was a lot of hard drives.
 

Gillbot

Lifer
Jan 11, 2001
Originally posted by: waffleironhead
Originally posted by: Gillbot

Asus P2B-D?
Tyan Tomcat III?

See my earlier post ;) I still have my P2B-D, but I doubt I have memory for it anymore.

I saw; I was addressing Aigo's post dismissing us old-timers! ;) :laugh:

If I'm not mistaken, I still have a Tomcat III with twin 166s and RAM somewhere, though I doubt I have a PSU to power it, since I believe it takes an AT one.
 

Zap

Elite Member
Oct 13, 1999
Originally posted by: aigomorla
The only reason I keep it is because it's a 2P laptop-processor system that's based off a server platform. It only consumes about 100W at full load.

It's called Sammy:
http://i125.photobucket.com/al...aigomorla/IMG_1083.jpg

Hey, that sounds pretty cool! I'm sure some of the properties of mobile chips make them suitable for high-density (easier cooling) or performance-per-watt-sensitive server stuff.

Originally posted by: Meelis S
Why not join 4 ITX mobos: take off 3 of the integrated video chips, join the power connections, and remove the back-panel connectors. So: 8 memory slots, 4 CPU sockets, 1 integrated video, nice and small for work.
I believe there is space for more memory.

SGI Octane III?
 

Modelworks

Lifer
Feb 22, 2007
As has been said, it is because home users are not using the processing power they have even now. Lots of people with quad cores rarely use all 4 cores. I could use it for 3D work, though. 3D renderers are like crack addicts when it comes to CPU power; there is no such thing as too much.

It has come a long way, though, in terms of ease of access to multiple cores. I remember using the Abit BP6 with Celeron 300A chips. Intel had disabled SMP support, so some of us put our very expensive CPUs under a drill press, drilled out a CPU pin, and then soldered a wire to +5V to pull the pin up and re-enable SMP. That is when overclocking and modding a PC was pretty hardcore :)

 

taltamir

Lifer
Mar 21, 2004
Originally posted by: Meelis S
<<<<<Because 99.999999% of "home users" don't need that kind of computational power?
How many of them go for multi-GPU boards for speed? I believe they, plus the people who need rendering power for 3DCG or video, aren't that low a %.

<<<<<Keep in mind that a single i7 or i5 quad will handle nearly anything.
But it would be nice to reduce processing time from a month to a weak, or from a weak to below 2 days,
if only there were multiple slots for CPUs and you had the possibility to choose more if you need.

<<<<<What do you plan on doing with all that horsepower?
3DCG rendering.

1. Multi-GPU users do not need multiple CPUs... that is because the GPU contributes a lot more to gaming performance than the CPU does.
2. 3DCG that takes WEEKS to render is NOT home usage. Home users don't make full-length animated movies... Disney and Pixar do.
3. 3DCG is one of the rare few things that scales to that many cores, and no home user needs it.

When it comes right down to it, it is simply a matter of how you define "home user"... apparently you define it as the one in a million who actually renders that much, which does not justify the cost of development. Get something that renders in CUDA.
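
An aside on point 3: offline rendering scales so well because each frame (or tile) is an independent job. Here is a minimal sketch of that idea in Python; render_frame is a hypothetical stand-in for a real renderer call, not any particular package's API:

```python
import multiprocessing as mp
import time

def render_frame(frame_number: int) -> str:
    # Hypothetical stand-in for minutes of ray tracing per frame.
    time.sleep(0.1)
    return f"frame_{frame_number:04d}.png"

if __name__ == "__main__":
    frames = range(240)  # 10 seconds of animation at 24 fps
    # Frames are independent, so a process pool keeps every core busy;
    # doubling the core count roughly halves the total render time.
    with mp.Pool(processes=mp.cpu_count()) as pool:
        for name in pool.imap_unordered(render_frame, frames):
            print("finished", name)
```

This is exactly the workload shape that rewards more sockets: no shared state, no synchronization, just more workers.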
 

Modelworks

Lifer
Feb 22, 2007
Originally posted by: taltamir
Get something that renders in CUDA

Unfortunately, when it comes to 3D work, CUDA is all but dead for the time being. Nothing uses it for final rendering.

 

Meelis S

Member
Mar 14, 2009
Originally posted by: taltamir
When it comes right down to it, it is simply a matter of how you define "home user"... apparently you define it as the one in a million who actually renders that much, which does not justify the cost of development.

That's a good point; who knows how many home users would pay for speed.
80%+ just browse the internet.
 

Denithor

Diamond Member
Apr 11, 2004
Which is exactly the logic behind my first post to you.

If you seriously have a use for multiple i7 chips in a single system, then you are a 1-in-1,000,000 or even 1-in-10,000,000 user. And no one is going to spend serious money developing a system targeted at the home market that is going to sell like 10 machines per year.

So if you do actually need that kind of power, you're going to have to buy a server-level machine.
 

mlah384

Senior member
Dec 17, 2008
Originally posted by: Arkaign
Originally posted by: aigomorla
Originally posted by: ChaosDivine
Originally posted by: aigomorla
Third... we did have a consumer 2P board for the home user. It was called Skulltrail, and it was insanely expensive.
Tsk, tsk. Forgetting about ye olde BP6 are we? :p

lol

not many people reading this thread have been in the hobby long enough to remember an Abit BP6. :p

Heh, I had a BP6 with 512MB of RAM and two slot-1 Celeron 333s running at 500MHz. It was a helluva experience using Win2k and trying to get a game to take advantage of it. I think I finally succeeded with a version of Quake 2, but then a patch broke the SMP. It's been a long damned time, but I just remember it being frustrating and a bit of fun as well.

HA! I remember the old 333 Celeron... it was like the best OC processor ever at the time!
 

PlasmaBomb

Lifer
Nov 19, 2004
Originally posted by: wwswimming
Originally posted by: Meelis S
<<<<Keep in mind that a single i7 or i5 quad will handle nearly anything.
But it would be nice to reduce processing time from a month to a weak, or from a weak to below 2 days,
if only there were multiple slots for CPUs and you had the possibility to choose more if you need.

plus the 2P systems are compatible with 64 bit spellcheckers. :sun:

To be fair, he spelt the word correctly, although he used the wrong word ;)
 

PlasmaBomb

Lifer
Nov 19, 2004
Originally posted by: aigomorla
Originally posted by: ChaosDivine
Originally posted by: aigomorla
Third... we did have a consumer 2P board for the home user. It was called Skulltrail, and it was insanely expensive.
Tsk, tsk. Forgetting about ye olde BP6 are we? :p

lol

not many people reading this thread have been in the hobby long enough to remember an Abit BP6. :p

I don't remember anything from 1999...

DOH!
 

deimos3428

Senior member
Mar 6, 2009
Consumer grade multi-socket boards went the way of the dodo with the advent of multi-core packages. They haven't exhausted the number of cores they can squeeze into a single package yet, but when that day comes we'll no doubt start seeing consumer-level boards with multiple sockets again. I think there might be a niche market for redundant sockets, though... (Why should a computer halt just because the processor fails? There are video games to play, dammit.)
 

Meelis S

Member
Mar 14, 2009
Originally posted by: Denithor
Which is exactly the logic behind my first post to you.

If you seriously have a use for multiple i7 chips in a single system, then you are a 1-in-1,000,000 or even 1-in-10,000,000 user. And no one is going to spend serious money developing a system targeted at the home market that is going to sell like 10 machines per year.

So if you do actually need that kind of power, you're going to have to buy a server-level machine.

One question: how was Nvidia able to make multi-GPU single video cards that you can run in SLI, when the market for that is even lower than for CPUs?
Maybe 1 person in 100,000,000 will buy one.

I don't see a multi-CPU machine as one extreme solution, but as a feature for every mobo: room to add CPUs on a slightly larger board, or a PCIe card, like a video card, that has 2+ CPUs and memory integrated on it. With the PCIe "slot-like" approach you would need a CPU card with an integrated waterblock. So you could have more power in a smaller volume.

But I understand it might kill the 2-socket Xeon solution. Or maybe not, because you can have more memory on a workstation mobo, and memory really is like 150-200x faster than swapping to a single HD.
 

zephyrprime

Diamond Member
Feb 18, 2001
There's just no market demand for them. If there were, users would be buying server MBs (which you can easily do for some extra $$) and using them in custom-built PCs. Manufacturers would then notice this happening and start marketing multi-socket MBs to consumers. Guess what? None of this actually happens.
 

drizek

Golden Member
Jul 7, 2005
Originally posted by: deimos3428
Consumer grade multi-socket boards went the way of the dodo with the advent of multi-core packages. They haven't exhausted the number of cores they can squeeze into a single package yet, but when that day comes we'll no doubt start seeing consumer-level boards with multiple sockets again. I think there might be a niche market for redundant sockets, though... (Why should a computer halt just because the processor fails? There are video games to play, dammit.)

I have never had a processor fail. HDDs, RAM, PSUs, video cards, motherboards: all that stuff can go out at any time, but CPUs generally will not fail unless you go out of your way to make them.
 

Meelis S

Member
Mar 14, 2009
Originally posted by: drizek
Originally posted by: deimos3428
Consumer grade multi-socket boards went the way of the dodo with the advent of multi-core packages. They haven't exhausted the number of cores they can squeeze into a single package yet, but when that day comes we'll no doubt start seeing consumer-level boards with multiple sockets again. I think there might be a niche market for redundant sockets, though... (Why should a computer halt just because the processor fails? There are video games to play, dammit.)

I have never had a processor fail. HDDs, RAM, PSUs, video cards, motherboards: all that stuff can go out at any time, but CPUs generally will not fail unless you go out of your way to make them.

A friend of mine had an AMD 500MHz CPU working for half a year without any cooling. But GPUs burn out most often, in my experience.
 

ChaosDivine

Senior member
May 23, 2008
Originally posted by: Meelis S
A friend of mine had an AMD 500MHz CPU working for half a year without any cooling. But GPUs burn out most often, in my experience.
Try that with a Thunderbird / Duron. Your oops time is ~1s.
 

MikeShunt

Member
Jun 21, 2007
Originally posted by: Meelis S
The memory speed is not that important; the amount of memory and the CPU speed are.

A lot of people would disagree with that. No use having blazing-fast CPU(s) if they can't get data to/from RAM fast enough.

 

Fox5

Diamond Member
Jan 31, 2005
Originally posted by: MikeShunt
Originally posted by: Meelis S
The memory speed is not that important; the amount of memory and the CPU speed are.

A lot of people would disagree with that. No use having blazing-fast CPU(s) if they can't get data to/from RAM fast enough.

Benchmarks typically disagree.
 

Dravic

Senior member
May 18, 2000
One question: how was Nvidia able to make multi-GPU single video cards that you can run in SLI, when the market for that is even lower than for CPUs?
Maybe 1 person in 100,000,000 will buy one.

You have that wrong; there is much higher demand for GPU power than CPU power from any home user who games at all.

SLI and Crossfire have real, tangible benefits for the average gamer; they're just more expensive. Going from 40 to 70 fps provides a decent improvement when gaming at higher resolutions. SLI'ing $150 GPUs is actually getting to be a pretty common way of getting close to $500 single-GPU performance. The single-card SLI solutions are even nicer because you don't require a more expensive SLI mobo.

Now, tri- and quad-SLI would be a better comparison, as the benefits of those setups require an i7 with HT on and 3-4 GPUs to see the payoff, and that is only at very high resolutions.

There is literally no benefit to home consumers from 8 cores, as very few applications even scale to 4 cores, and you can get 4 cores in a single socket.

Anyone saturating 4 cores and in need of 8-core processing power is no longer a home user, and is looking for workstation-type power. Trust me, it sucks... I have more processing power in my gaming rig (in sig) than I do in my dual-socket Opteron workstation with ECC RAM, which cost a lot more but was needed for virtualization power at the time.

Technology has the same shelf life as a ripe banana... $5-10k workstations are outclassed by the latest desktops every 3-5 years; no way around it.
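
Dravic's scaling point can be made concrete with Amdahl's law: if a fraction p of a program's work can run in parallel, the speedup on n cores is 1 / ((1 - p) + p/n). A quick sketch with purely illustrative values of p:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
# fraction of the workload that can run in parallel.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.50, 0.90, 0.99):  # illustrative parallel fractions
    row = ", ".join(f"{n} cores: {speedup(p, n):.2f}x" for n in (2, 4, 8))
    print(f"p = {p:.2f} -> {row}")
# A half-serial app gains only ~1.8x from 8 cores, while a ~99%-parallel
# renderer gets ~7.5x, which is why renderers want sockets and most
# desktop software does not.
```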

 

Meelis S

Member
Mar 14, 2009
Originally posted by: Fox5
Originally posted by: MikeShunt
Originally posted by: Meelis S
The memory speed is not that important; the amount of memory and the CPU speed are.

A lot of people would disagree with that. No use having blazing-fast CPU(s) if they can't get data to/from RAM fast enough.

Benchmarks typically disagree.

It's so because the i7's max memory speed is ~25 GB/s, but one 1333MHz DDR3 module is ~10 GB/s. So 3x DDR3 is ~30 GB/s, and 6x would be ~60 GB/s.
You may buy faster DDR3 in a 2-module config, but there's no point in that, because it's more expensive than a larger amount of cheaper memory. Most important, more memory is faster.
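
The arithmetic behind those figures: DDR3 moves 64 bits (8 bytes) per transfer, so peak bandwidth is the transfer rate times 8 bytes per channel. A quick sanity check in Python, using theoretical peaks (measured i7 numbers come in lower):

```python
# Peak DDR3 bandwidth: transfers per second * 8 bytes (64-bit bus),
# times the number of channels. These are theoretical ceilings; a real
# i7 measures closer to ~25 GB/s with triple-channel DDR3-1333.
def ddr3_peak_gb_s(mt_per_s: int, channels: int = 1) -> float:
    return mt_per_s * 8 * channels / 1000.0

print(ddr3_peak_gb_s(1333))              # ~10.7 GB/s, single channel
print(ddr3_peak_gb_s(1333, channels=3))  # ~32.0 GB/s, triple channel
```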

Originally posted by: Dravic
One question: how was Nvidia able to make multi-GPU single video cards that you can run in SLI, when the market for that is even lower than for CPUs?
Maybe 1 person in 100,000,000 will buy one.

You have that wrong; there is much higher demand for GPU power than CPU power from any home user who games at all.

SLI and Crossfire have real, tangible benefits for the average gamer; they're just more expensive. Going from 40 to 70 fps provides a decent improvement when gaming at higher resolutions. SLI'ing $150 GPUs is actually getting to be a pretty common way of getting close to $500 single-GPU performance. The single-card SLI solutions are even nicer because you don't require a more expensive SLI mobo.

Now, tri- and quad-SLI would be a better comparison, as the benefits of those setups require an i7 with HT on and 3-4 GPUs to see the payoff, and that is only at very high resolutions.

There is literally no benefit to home consumers from 8 cores, as very few applications even scale to 4 cores, and you can get 4 cores in a single socket.

Anyone saturating 4 cores and in need of 8-core processing power is no longer a home user, and is looking for workstation-type power. Trust me, it sucks... I have more processing power in my gaming rig (in sig) than I do in my dual-socket Opteron workstation with ECC RAM, which cost a lot more but was needed for virtualization power at the time.

Technology has the same shelf life as a ripe banana... $5-10k workstations are outclassed by the latest desktops every 3-5 years; no way around it.

That's the problem: the price of a workstation. If you compare speed tests, they aren't so fast. You can have 2+ i7 computers instead of DP Xeons for less money, and it's faster.
I believe that's the problem, and why there is no market for home users in the multi-CPU range.
Just as 2 low-end video cards in SLI can be faster, so can CPUs.
Compare the speed and price of the i7 920 and 965, or of low-end/high-end 5500-series DP Xeons.

But many CPU-demanding programs can use multiple CPUs and cores just fine. Some not so much.
 

deimos3428

Senior member
Mar 6, 2009
There's always an improvement in system performance from adding more cores, until you actually have more cores than processes. (Single-application performance may or may not improve.)

The problem is obviously cost. If quad-socket boards with 12- or 16-core processors were available and affordable, we could ditch sucky pre-emptive multitasking outright. (Most of us, anyway.) In a decade or so, we might actually be there.
 

davidrees

Senior member
Mar 28, 2002
Multi-GPU and multi-CPU are not at all the same.

GPUs run highly parallel, highly repetitive, abstracted code, and they all run it a little differently. By that, I mean the shader output from Nvidia and ATI will always look a little different. On the other hand, if you are running code on a CPU, the output has to be consistent and predictable.

GPUs render as many frames and as much detail as possible in a specific time window. CPUs do the opposite: they use as much time as needed to execute specific instructions (generally).

If you want to do major rendering, why not just use a lot of machines? Don't they still use render farms? Wouldn't the best solution be the one with the most calculations per second per dollar, with energy and cooling costs possibly factored in?

You can build a single-processor i7 system for under $1k. Great: stack them floor to ceiling. Would you pay $3200 for a dual-processor i7 system? It would save space and might save some power, but it does not seem like the ideal approach to me.
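
That last comparison boils down to throughput per dollar, with power folded in. A small sketch of the calculation; every price, wattage, and frame rate below is a made-up placeholder, not benchmark data:

```python
# Render-farm economics: frames delivered per dollar of total cost
# (hardware price plus electricity). Every number here is hypothetical.
def frames_per_dollar(frames_per_hour: float, price_usd: float,
                      watts: float, hours: float,
                      usd_per_kwh: float = 0.12) -> float:
    energy_cost = watts / 1000.0 * hours * usd_per_kwh
    return frames_per_hour * hours / (price_usd + energy_cost)

HOURS = 3 * 365 * 24  # assume three years of continuous rendering
print(frames_per_dollar(10, 1000, 250, HOURS))   # cheap 1P i7 box
print(frames_per_dollar(19, 3200, 400, HOURS))   # pricier 2P system
# With these made-up inputs, the stack of cheap boxes wins per dollar,
# which is the point of the floor-to-ceiling suggestion.
```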