4x 9800GX2s being used by scientists

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
GPU User IQ

Single - Normal
SLI - Borderline Retard
Tri SLI - Severe Retardation
Quad SLI - neanderthalensis
More - under investigation :D

 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
It's nothing that doesn't already exist as an Nvidia Tesla product. You can put enough 1U Tesla boxes in just a single 40U rack that you won't be able to reliably cool the thing.
 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
Originally posted by: Aberforth
GPU User IQ

Single - Normal
SLI - Borderline Retard
Tri SLI - Severe Retardation
Quad SLI - neanderthalensis
More - under investigation :D


Perfectly done. Be careful - the Nvidia team might snap you up for a job in advertising :)

 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: SniperDaws
Originally posted by: Aberforth
GPU User IQ

Single - Normal
SLI - Borderline Retard
Tri SLI - Severe Retardation
Quad SLI - neanderthalensis
More - under investigation :D


Perfectly done. Be careful - the Nvidia team might snap you up for a job in advertising :)

You know, I was an SLI user once, but I recovered and gave the other card to my pal.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Err... well, there are thousands of companies starting to invest in GPGPU. The company I work for has invested $3 million in R&D to create a new team for this task. Currently they are doing a study on which parts of our IS they can improve performance on and reduce cost for.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
I think this is a much more promising use for multi GPU setups than gaming.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Not because I want to derail this thread even more, but what's up with the bashing of SLI? It's not a flawless setup by any means, but I know for damn sure that my two 8800 GTs run Crysis much better than my solo GTX did. To me, that means that running SLI isn't "borderline retarded" :roll:
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
lavaheadache, I see you didn't get my joke :D don't take it personally.

Whatever SLI is, it isn't worth the money. Come on - paying double the price and not even getting 50% more performance... is a seriously flawed decision by a customer.
 

hooflung

Golden Member
Dec 31, 2004
1,190
1
0
Originally posted by: Aberforth
lavaheadache, I see you didn't get my joke :D don't take it personally.

Whatever SLI is, it isn't worth the money. Come on - paying double the price and not even getting 50% more performance... is a seriously flawed decision by a customer.

I didn't pay much for my second 8800GS. I traded in a broken 1900GT that only had 7 days left on its 1-year replacement policy to Newegg for it. It's the sweetest SLI setup I've ever seen. It doesn't do 'much' more than the single card, but it's worth every penny I paid for shipping to Cali.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Originally posted by: Aberforth
lavaheadache, I see you didn't get my joke :D don't take it personally.

Whatever SLI is, it isn't worth the money. Come on - paying double the price and not even getting 50% more performance... is a seriously flawed decision by a customer.


Because it is the only way to get extra performance at the present time, until a new architecture comes out, and some people need that extra performance. For everyone else it's a waste of money.

It's a perfectly good reason to get SLI, and if you don't understand it, it's ok. Just don't shit talk about it.

PS. I have no idea where you are getting your figures, but in my games I almost always get double the FPS. Take DOD Source as an example: 70 fps with one card, slowing to 40-45 in a firefight; 130 fps with SLI, slowing to 90-100 in a firefight. HUGE DIFFERENCE. Same with PES2008, same with TF2, same with COD4, same with Crysis on medium, and a LOT of other games.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: JAG87
Originally posted by: Aberforth
lavaheadache, I see you didn't get my joke :D don't take it personally.

Whatever SLI is, it isn't worth the money. Come on - paying double the price and not even getting 50% more performance... is a seriously flawed decision by a customer.

It's a perfectly good reason to get SLI, and if you don't understand it, it's ok. Just don't shit talk about it.

I have every right to express my opinion on a subject as much as you do. Every gamer needs that *extra* performance; I just don't think it's worth the price - that's all (unless you get one cheap/free, etc.). For further details, read my first post.

BTW, I used to have GTX SLI.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Aberforth
lavaheadache, I see you didn't get my joke :D don't take it personally.

Whatever SLI is, it isn't worth the money. Come on - paying double the price and not even getting 50% more performance... is a seriously flawed decision by a customer.

Even if you were right and 50% more was all you could get, it's the only upgrade you can buy that will give you a 50% increase.

Try using one video card with a $200 E8400 CPU, then try the same game with a $1400 QX9770. Do you think you would see a 50% performance increase in any game on the planet?

I don't think I've ever seen a 50% jump if you start with a $200 CPU, period - even in games optimized for quad cores.

Since 99.9% of games aren't optimized for quad cores, you wouldn't be able to tell the difference between those two processors, and one costs 7x as much.

The fact of the matter is, multi-card GPU configurations (SLI and CF) are a good buy based on bang per buck - a way smarter choice than other component upgrades.

 

Nathelion

Senior member
Jan 30, 2006
697
1
0
Depends on what you mean by bang per buck. I personally think that SLI is indeed retarded - unless you already have the best single card money can buy and you are planning to get another one. Otherwise, don't bother.

And it won't work in Linux. :(
 

emilyek

Senior member
Mar 1, 2005
511
0
0
That box is hot.

They have nice taste in components.

I wonder what kind of software they had to write to get it to work properly, and if they could get 8x SLI running on it? o.o
 

ChaosDivine

Senior member
May 23, 2008
370
0
0
Coincidentally, I just saw an empty Tesla box sitting in my dept's corridor outside some grad student offices / labs. Lucky b*stards!
 

adlep

Diamond Member
Mar 25, 2001
5,287
6
81
IMO, this is the future of computing.

Edit: I would like to see some distributed computing clients that are able to utilize the parallel processing power of the GPU.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Aberforth
Originally posted by: JAG87
Originally posted by: Aberforth
lavaheadache, I see you didn't get my joke :D don't take it personally.

Whatever SLI is, it isn't worth the money. Come on - paying double the price and not even getting 50% more performance... is a seriously flawed decision by a customer.

It's a perfectly good reason to get SLI, and if you don't understand it, it's ok. Just don't shit talk about it.

I have every right to express my opinion on a subject as much as you do. Every gamer needs that *extra* performance; I just don't think it's worth the price - that's all (unless you get one cheap/free, etc.). For further details, read my first post.

BTW, I used to have GTX SLI.

I can appreciate that everyone has their own definition of value, but things are changing in the SLi/CF scene:

http://www.techreport.com/articles.x/14686

"Finally, perhaps our most unexpected observation is that multi-GPU setups have the potential to deliver solid value. Mid-range cards like the GeForce 9600 GT and Radeon HD 3850 offer strong value propositions, and that effect is multiplied by pairing two of them together. Two 9600 GTs can be faster than a single GeForce 8800 Ultra, despite the fact that they cost substantially less. Similarly, two Radeon HD 3850s are a better deal than a single Radeon HD 3870 X2, if your motherboard can accommodate them. Of course, SLI and CrossFire bring with them a whole stable of caveats involving chipset compatibility, multi-monitor support, and the need for driver profiles. High-end multi-GPU configs can add additional expense in the form of higher PSU and cooling requirements, as well. But with both AMD and Nvidia now offering high-end cards with dual GPUs onboard, multi-GPU looks like it's here to stay."

There are more options than ever; it's not just a "buy two of the fastest cards" kind of deal anymore.

 

ther00kie16

Golden Member
Mar 28, 2008
1,573
0
0
Originally posted by: heymrdj
Originally posted by: SteelSix
A room full of scientists and one chesty chick :p

[Dreams of being a scientist...]

"Dude, run some calculations with Fastra while I perform required experiments on, er, with her."

"We'll be in the lab..."

Couch cushion

I wonder what mobo they use?

The original link had a page with all the hardware they used. It's an MSI K9A2, for the lazy. They explain that it was one of only two motherboards with 4 double-spaced PCIe slots (both AMD). There are 3 now, because Foxconn just released a 780a-chipset board with 4 double-spaced PCIe slots. Someone should up the ante and utilize the integrated graphics as well. That would be what... the equivalent of another 3 or 4 Intel CPUs? Geez... why is the CPU so far behind?
 

heymrdj

Diamond Member
May 28, 2007
3,999
63
91
Originally posted by: ther00kie16
Originally posted by: heymrdj
Originally posted by: SteelSix
A room full of scientists and one chesty chick :p

[Dreams of being a scientist...]

"Dude, run some calculations with Fastra while I perform required experiments on, er, with her."

"We'll be in the lab..."

Couch cushion

I wonder what mobo they use?

The original link had a page with all the hardware they used. It's an MSI K9A2, for the lazy. They explain that it was one of only two motherboards with 4 double-spaced PCIe slots (both AMD). There are 3 now, because Foxconn just released a 780a-chipset board with 4 double-spaced PCIe slots. Someone should up the ante and utilize the integrated graphics as well. That would be what... the equivalent of another 3 or 4 Intel CPUs? Geez... why is the CPU so far behind?

Because the GPU has its programs built around it. The GPU is far... stupider in architecture than a CPU, and it can't run the instructions a CPU can. A CPU has to be a jack of all trades when it comes to the code that runs through it (unless you're Solaris or Intel Itanium :D), and with that compatibility comes a loss of die space.

Farid over at http://forum.beyond3d.com/showthread.php?t=29645
To measure something, you need to know what it is, and in the case of GPU versus CPU FLOPS it's not an easy task even to define what a FLOP is.

In a GPU you have both programmable and fixed-function hardware capable of floating-point operations, yet you can't directly compare the non-programmable FLOPS to the programmable ones.

And even the FLOPS from the programmable ALUs of a GPU can't be easily compared to the FLOPS of a CPU... or even to the FLOPS from the ALUs of another GPU with a different architecture.

At best you can compare the number and type of floating-point operations a GPU can do with those a CPU can do.

But there's no meaningful way to encompass the results in a comprehensive benchmark, with one number for the CPU and one for the GPU. One could of course do it, but in practice it would mean strictly nothing.
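
(Purely as a back-of-the-envelope illustration of why the paper numbers diverge so much: assuming the commonly quoted G92 figures of 128 stream processors at a 1.5 GHz shader clock, counting the dual-issue MAD+MUL as 3 single-precision FLOPs per clock, and assuming a 3 GHz quad-core CPU retiring 8 packed SSE FLOPs per core per clock, the peaks come out roughly like this - and neither number says anything about how much of that peak a real algorithm can sustain.)

// peak_flops.cu - marketing-style peak FLOPS, not a benchmark.
// Every figure here is an assumption for illustration, not a measurement.
#include <cstdio>

int main()
{
    // Assumed G92: 128 SPs x 1.5 GHz x 3 FLOPs/clock (MAD + MUL).
    double g92_gflops = 128 * 1.5 * 3;      // ~576 GFLOPS per GPU
    double gx2_gflops = 2 * g92_gflops;     // a 9800GX2 carries two G92s

    // Assumed quad-core CPU: 4 cores x 3.0 GHz x 8 single-precision SSE FLOPs/clock.
    double cpu_gflops = 4 * 3.0 * 8;        // ~96 GFLOPS

    printf("Paper peaks: G92 ~%.0f GFLOPS, 9800GX2 ~%.0f GFLOPS, quad-core CPU ~%.0f GFLOPS\n",
           g92_gflops, gx2_gflops, cpu_gflops);
    printf("Four GX2s: ~%.1f TFLOPS on paper\n", 4 * gx2_gflops / 1000.0);
    return 0;
}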

mhouston over at http://forum.beyond3d.com/showthread.php?t=29645
The forums and the linked papers at GPGPU.org sometimes go into deep detail about performance differences. Usually the comparisons are done at the architecture level, not the app level - i.e. latency hiding, bandwidth, peak FLOP rates, etc. There isn't a "physics" benchmark per se, but there are fluid codes, bioinformatics codes, etc. that you can get from GPGPU.org and the various paper links and run against the CPU.

At the moment, very few algorithms on current GPUs are more than 10x faster than a CPU, and nothing should be 100x faster than a CPU unless the comparison is against untuned CPU code, or the algorithm is actually very different between the processors. Stay tuned for the next round of high-performance GPGPU apps over the next year at conferences like Graphics Hardware, Supercomputing, ASPLOS, PACT, Micro, etc. I'm sure you'll also hear more from Nvidia/ATI and companies like Havok and Microsoft over the coming year.

Now, if you are going to render your simulation, doing the sim + render all on the GPU might be MUCH faster than sim on the CPU + render on the GPU, since you've taken the costly feedback loop between the GPU and CPU out of the equation.

And remember, not everything will be fast on the GPU - only applications that can be shoehorned into fitting the characteristics of the architecture and its limitations.
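
(And as an illustration of what "shoehorned into fitting" means in practice, here is a minimal, purely hypothetical CUDA sketch of the kind of loop that maps well to a GPU: every element is independent, one thread handles one index, and there is no feedback to the CPU until the end. This is not code from the FASTRA project, just the textbook SAXPY pattern.)

// saxpy.cu - illustrative only: a data-parallel loop that suits the GPU model.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];   // no branching, no inter-thread dependencies
}

int main()
{
    const int n = 1 << 20;
    float *x = 0, *y = 0;
    cudaMalloc(&x, n * sizeof(float));
    cudaMalloc(&y, n * sizeof(float));
    // ... copy input data into x and y with cudaMemcpy (omitted) ...

    // Launch 4096 blocks of 256 threads - far more threads than any CPU has cores.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    cudaFree(x);
    cudaFree(y);
    return 0;
}

An algorithm with heavy branching or serial dependencies between elements doesn't look like this, and that's exactly where the GPU advantage evaporates.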