AND SO BEGINS the (unstoppable?) evolution of Multi-GPU gaming


wanderer27

Platinum Member
Aug 6, 2005
2,173
15
81
Originally posted by: nitromullet
Having owned both the current ATI and NVIDIA dual gpu single cards, I have to say that both companies have done a very nice job of making the dual gpu nature of the card very transparent to the end user. The only indication that you're running a dual gpu with the GX2 is the fact that the radio button next to SLI is checked. Otherwise, it feels like a single card. The 3870 X2 is the same way... Nothing to mess with.

I think that if dual gpu is going to be adopted by the mainstream (or even just the enthusiast gamer), it will be in the form of dual gpu single cards, and not dual card setups. Dual gpu cards have all the advantages of a dual card setup (except maybe re-sale options), but they don't require a specific motherboard to run. This gives consumers a lot more flexibility because it doesn't lock them into a platform. Plus, as I mentioned before, the drivers for the dual gpu cards are designed to make the existence of the dual gpus transparent to the end user, which is really what it's all about. If you can drop a single or multi-gpu card into your rig and the only difference is that the multi-gpu card runs faster, then there isn't really that much of a hurdle left. Granted, we aren't quite there yet, but both SLI and Crossfire continue to get better and better every time I try them.

I think this pretty well sums it up - one card but multiple GPUs. That's the only real way this is going to take off.

Heck, I've had these SLI MB's for a couple years now and have never felt the need or desire to go SLI.

Why add the additional Heat & Power when most everything I run works fine?

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
this is a ridiculous thread...
Multi GPU has existed for over 10 years, and it STILL sucks.
Multi CPU is useful because each core can be running a different application; the more cores, the better.
Multi GPU tends to suck because you have multiple cores all trying to run ONE application, and since that app is a game it is even worse, because there is no multitasking: when you play a game you are in full screen mode and rendering only one thing. At least a multi-core system can use the extra cores on a single-threaded game to do background tasks. You could be downloading things, compressing files, etc. Not so with a GPU.
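
To put the CPU side of that in concrete terms, here is a minimal sketch (nothing GPU-specific, and the file names are made up purely for illustration): a single-threaded "game" loop keeps one core busy while a process pool compresses files on whatever other cores are free.

import bz2
import os
from concurrent.futures import ProcessPoolExecutor

def game_frame(n):
    # Stand-in for a single-threaded game: burns one core.
    return sum(i * i for i in range(n))

def compress(path):
    # Background task that an otherwise idle core can run while the game plays.
    with open(path, "rb") as f:
        return len(bz2.compress(f.read()))

if __name__ == "__main__":
    files = ["a.log", "b.log"]  # hypothetical files queued for compression
    with ProcessPoolExecutor() as pool:
        pending = [pool.submit(compress, p) for p in files if os.path.exists(p)]
        for frame in range(1000):        # the "game" stays on a single core
            game_frame(10_000)
        sizes = [job.result() for job in pending]

A single GPU rendering a full-screen game has no comparable pile of unrelated work to hand to a second GPU, which is the asymmetry being described above.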

It is interesting to note that I have seen marketing refer to the SPs on video cards as cores... I don't remember if it was "beauty is 320 cores deep" or if it said "beauty is 128 cores deep"
 

Jax Omen

Golden Member
Mar 14, 2008
1,654
2
81
taltamir, the problem with that argument is that multi GPU is as much a software problem as a hardware problem. Graphics is a highly parallel problem in the first place, so logically it's not much of a leap to add another GPU. The problem is in the implementation and the drivers.
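
For context on what the drivers actually do: the common approach is alternate frame rendering (AFR), where the driver sends even frames to one GPU and odd frames to the other, then has to present them in order. A rough sketch of the idea - render_on and wait_for are hypothetical stand-ins, not any real driver API:

from collections import deque

NUM_GPUS = 2
MAX_IN_FLIGHT = 2  # how far the GPUs may run ahead before we must sync

def render_on(gpu, frame):
    # Hypothetical stand-in for submitting one frame to one GPU.
    return (gpu, frame)

def wait_for(job):
    # Hypothetical stand-in for blocking until that frame is finished.
    gpu, frame = job
    return f"frame {frame} from GPU {gpu}"

def afr_loop(total_frames):
    in_flight = deque()
    for frame in range(total_frames):
        gpu = frame % NUM_GPUS            # round-robin: even/odd frames alternate
        in_flight.append(render_on(gpu, frame))
        if len(in_flight) >= MAX_IN_FLIGHT:
            # Presenting in submission order is the sync point where
            # micro-stutter and most of the driver complexity live.
            print(wait_for(in_flight.popleft()))
    while in_flight:
        print(wait_for(in_flight.popleft()))

afr_loop(8)

The hardware parallelism is the easy part; keeping that queue fed and presented smoothly for every game is the part that ends up in the drivers.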
 

ajaidevsingh

Senior member
Mar 7, 2008
563
0
0
Well, what's it called, Intel Larrabee or something similar? Maybe it can kill multi-GPU power with multi-CPU power...!!
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
It is a leap, it is a WASTEFUL leap.

The GPU/CPU isn't a single math unit. It has many, MANY individual processors structured together. Multi GPU/CPU is an extremely INEFFICIENT way of doing things: you aren't just duplicating the various processing units (stream processors, ALUs, FPUs, etc.), you are replicating the entire core, including redundant things like memory controllers. AND THEN you have the EXTRA problem of syncing the various cores and communicating between them.

It makes more sense to have a single core and keep multiplying the number of actual processing units within it.
With a CPU, on the other hand, you get significantly larger benefits due to the multi-application nature of the workload.

At the end of the day, multi core is just a software hack to inefficiently string together multiples of the same CPU/GPU as a cheaper way of expanding current power, compared to developing a core with more calculation units balanced for modern tasks.
Some of it is just marketing jargon too: as multi cores become more integrated, you end up with more and more of their components shared. The only difference is that they are clustered together.

What is better: a core with 128 SPs, 64 ROPs, etc.; two dies each with 64 SPs and 32 ROPs as well as redundant controllers; or a single die with two separate clusters of 64 SPs and 32 ROPs, each separately addressable but actually part of the same whole?
For a CPU, having the different clusters addressable SEPARATELY allows better multitasking. For a GPU it just adds unnecessary overhead.

CPUs also have the issue where the standards seem to be locked and non-scalable. You can't just add ALUs or FPUs or whatever to a CPU. From what I understand, part of the reason that 64-bit is so much faster for CERTAIN applications is that the number of general-purpose registers was increased (x86-64 doubles them from 8 to 16), and if it is running in 32-bit mode only part of them are accessible.

Imagine if using DX9 imposed a hard cap of only 32 SPs.
Only 1/4 of the 128 SPs on the 9800 GTX would function, with the rest sitting idle. Well, you could make a quad core GPU with 32 SPs each and then use inefficient software to quad SLI it to get 128 working SPs - not as efficient as directly using 128 SPs, but much better than being limited to 32.
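
To put rough numbers on that trade-off, here is a back-of-the-envelope sketch; the 0.8 scaling factor per extra GPU is an assumption standing in for sync and driver overhead, not a measured figure:

def effective_sp(sp_per_gpu, gpus, scaling_per_extra_gpu=0.8):
    # Each GPU past the first contributes only a fraction of its SPs,
    # representing synchronization and duplicated-controller overhead.
    total = sp_per_gpu
    for _ in range(gpus - 1):
        total += sp_per_gpu * scaling_per_extra_gpu
    return total

print(effective_sp(128, 1))  # one monolithic 128 SP core -> 128
print(effective_sp(64, 2))   # 2 x 64 SP                  -> 115.2
print(effective_sp(32, 4))   # 4 x 32 SP "quad SLI"       -> 108.8, still far better than a 32 SP cap

Which is the argument in miniature: stringing cores together recovers most of the throughput, but a single wider core wastes none of it.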
 

Jax Omen

Golden Member
Mar 14, 2008
1,654
2
81
Ok, so you want a videocard to be basically a second, smaller mobo, and let the user populate it with VRAM and a GPU, and upgrade that at will?

Actually, that could be really cool. And completely unmarketable.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Jax Omen
Ok, so you want a videocard to be basically a second, smaller mobo, and let the user populate it with VRAM and a GPU, and upgrade that at will?

Actually, that could be really cool. And completely unmarketable.

I don't know if the idea was like a video 'daughter card', but I think I recall some discussion previously about gpu sockets.

From Oct. 2005:

NVIDIA's Secret Flip Chip GPU

Manufacturers seem to think G72 and G73 will be an easy tool over from NV40/43, but another vendor claims NVIDIA has bigger plans. They claim that NVIDIA is working on flip chip GPU sockets for motherboards. Apparently, inside NVIDIA, engineering teams have several prototypes where the GPU, rather than the CPU, is the main focus of a motherboard with two sockets: one for the GPU and another for the CPU. Whether or not such a machine will ever see the light of day is difficult to say right now. However, the idea of pin compatible GPUs already suggests that we are halfway there when it comes to buying GPUs the same way we buy CPUs: in flip chips. We have plenty of questions, like how the memory interface will work and how that will affect performance, but GPU sockets are likely less a question of "if" than "when".

http://www.anandtech.com/video/showdoc.aspx?i=2570

 

sgrinavi

Diamond Member
Jul 31, 2007
4,537
0
76
Multi is cool so long as the price stays reasonable... I paid $450 for a new MSI 9800 gx2 and $310 for a Sapphire-3870x2 -- my crossfire set up was about $320...

Considering that I paid $550 for an 8800 GTX, I find it to be a reasonable alternative... ya know?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: CP5670
Originally posted by: gersson
On the reverse, I am an SLI/Crossfire user gone single card :p

I felt like I was still @ work when I got home...too much tweaking, bugs, underwhelming performance.
One 8800GT -- set it to HQ: Done.

This says it all for me. I had the exact same experience with multi GPU.

Not me ... i love my Crossfire
:heart:

even though i experience almost no issues - i'm getting great performance in most games with no micro stutter [because at 16x10 i max out Crossfire AA] - i am *still dying* to get back to a more powerful single GPU

then i will get another one :p

 

BDawg

Lifer
Oct 31, 2000
11,631
2
0
Originally posted by: m0mentary
*conspiracy hat on*

SLI/Crossfire should never become mainstream. If multi-gpu does, then gpu makers will no longer attempt to create high-performing single card solutions, forcing gamers to buy a second card to get respectable performance

*conspiracy hat off*

See, that's not the conspiracy I think of. I think that returns on CPU / GPU research are becoming more elusive and more costly to realize. Multi-core / Multi-GPU allows companies to cheaply increase performance with minimal technology increases.
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Originally posted by: Jax Omen
Ok, so you want a videocard to be basically a second, smaller mobo, and let the user populate it with VRAM and a GPU, and upgrade that at will?

Actually, that could be really cool. And completely unmarketable.

That's actually how they used to be... with ram anyway. Pretty much every card with onboard memory worked that way.
 

Owls

Senior member
Feb 22, 2006
735
0
76
I don't mind multi GPU setups because it helps add inches to my penis. At least this is what I tell my wife.

BRB, going to the gym in 26 minutes.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
SLI and xFire will always serve a niche. Basically getting within 20-30% of the next generation single processor performance 12-18 months before it arrives.

Single processor imo will always be mainstream because it is easier to work with and cheaper for both parties.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
I'm a big fan of mid-range or budget cards used in multi-GPU configurations substituting for high-end performance. I think that's a great benefit from the competition and recent releases by both camps. I'm not a big fan of multi-GPU solutions trying to pass themselves off as high-end solutions though. I also dislike the artificial limitations placed on chipsets/boards when it comes to GPU vendor, limiting or requiring you to purchase certain chipsets for multi-GPU support.

Also, you have to look at why gaming requirements jumped so much in the last few years. Sure, DX9 and DX10 have introduced much more demanding games, but I also think the mainstreaming of 20-24" wide aspect LCDs has increased resolution and GPU requirements as much if not more so. As 1080p settles in as the standard, I think you'll see performance requirements normalize a bit, much as they did with 1280x1024. In any case, it doesn't look like multi-GPU will be the requirement, at least for another generation.
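
Quick arithmetic backs that up - pixels per frame relative to the old 1280x1024 standard:

resolutions = {
    "1280x1024": 1280 * 1024,
    "1680x1050": 1680 * 1050,
    "1920x1080": 1920 * 1080,
    "1920x1200": 1920 * 1200,
}
base = resolutions["1280x1024"]
for name, pixels in resolutions.items():
    # Fill-rate and memory pressure scale roughly with pixel count.
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x)")

A 24" 1920x1200 panel pushes about 76% more pixels per frame than 1280x1024, before any extra shader work from newer DX9/DX10 titles is even counted.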
 

Jax Omen

Golden Member
Mar 14, 2008
1,654
2
81
Very true. Monitors have pushed GPU requirements more than games ever could.

Hell, if you still game at 1280x1024, the X1950/GeForce 7900 will still run any game just fine, barring Crysis, I suppose.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
There is one way to end all this, you know. A 1 on 1 online deathmatch. Anyone who feels multi-card is wasteful could take a lesson or two playing against somebody with a multi-card setup at the same settings. I offer my services in this respect :cool:

I have CoD4 that can be used. :D
Even CoD2 in DX9 with full settings might work.

And if you're chugging along when a grenade goes off in your field of view, and then you get clipped for it, oh well. :D

Taltamir, you up for it? Let me show you how wasteful 2x8800GTS 640's are. Or 2x9800 GTX's. C'mon, it will be fun. For one of us anyways. Muhwahahahaaa.... ;)

Anyone that has a Crossfire setup and feels like going against single card users is most welcome!!
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Genx87
SLI and xFire will always serve a niche. Basically getting within 20-30% of the next generation single processor performance 12-18 months before it arrives.

Single processor imo will always be mainstream because it is easier to work with and cheaper for both parties.

What about GeForce Boost?

http://www.nvidia.com/object/hybrid_sli.html

There might come a time when end users don't even know that they are using something like SLI; they just know that they have a discrete card and an onboard chip. Or they might not even know that.

I'm actually hoping that some day NVIDIA will put HybridPower on a single card instead of just using the integrated chips on motherboards. It would be pretty beneficial for heat and noise (and I wouldn't mind paying $20-30 more) if, say, my GX2 turned off its two G92 gpus and fan and enabled a low-power gpu while I was doing 2D stuff. The same would be true for an 8800GTX/Ultra. There is no reason for a PC to suck that much juice while I'm web surfing.