nVidia Bags IBM - To Ship Tesla for Datacenters.

cbn

Lifer
Mar 27, 2009
12,968
221
106
This could be very interesting.

IBM even has chip-level watercooling technology. How long until they are able to combine the two?
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Shouldn't this be in General Hardware or something? Those aren't really video cards, are they?
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
If IBM were to render all the graphics for my games, and I could use it as much as I want, I would give them about $10 a month.
 

Paratus

Lifer
Jun 4, 2004
17,522
15,567
146
It did nothing but show how dumb he is and how much you agree with stupid comments and thread crapping.

Here ya go, read and learn something.
Graphics cards are not just for gaming.

:( I was just having a litt...r the Tesla market and not the gaming market.
 

shangshang

Senior member
May 17, 2008
830
0
0
I've been saying NV will eventually exit the desktop gaming market completely, and this IBM news is just another piece that I think shows why. No doubt in my mind that NV's overall strategy is mobile/handheld and HPC computing. AMD will eventually own the desktop gaming market, which seems to be losing out to consoles.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I've been saying NV will eventually exit the desktop gaming market completely, and this IBM news is just another piece that I think shows why. No doubt in my mind that NV's overall strategy is mobile/handheld and HPC computing. AMD will eventually own the desktop gaming market, which seems to be losing out to consoles.

I think Intel will eventually be the leader in CPU/GPU performance, as someday they will be on one chip.
There will be no real video cards.

So Intel will dominate AMD in both markets, just like they do in the CPU division. :(

Maybe my dream will come true and they will make a console with a mouse and keyboard. I won't need a graphics card then.
 

A_Dying_Wren

Member
Apr 30, 2010
98
0
0
If IBM were to render all the graphics for my games, and I could use it as much as I want, I would give them about $10 a month.

Aren't there a few companies (start-ups) hoping to do just this?

Latency would be a huge problem, though, I imagine. Not to mention frequent dropped frames and lag.
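
A rough back-of-envelope of the latency point (every number below is a guess of mine, not a measurement): streaming adds a network round trip plus encode/decode on top of local rendering, which is already a few frames' worth of lag at 60 fps.

```
// Back-of-envelope only: every number below is my own guess, not a
// measurement. The point is just how much extra input-to-photon delay
// remote rendering adds on top of a local GPU.
#include <cstdio>

int main()
{
    const double frame_ms = 1000.0 / 60.0;  // one frame at 60 fps

    // Assumed extra per-frame costs of streaming (all hypothetical):
    const double network_rtt_ms = 30.0;  // client <-> datacenter round trip
    const double encode_ms      = 5.0;   // video encode on the server
    const double decode_ms      = 5.0;   // video decode on the client

    double added_ms = network_rtt_ms + encode_ms + decode_ms;

    printf("extra latency from streaming: %.1f ms\n", added_ms);
    printf("that is about %.1f frames of lag at 60 fps\n", added_ms / frame_ms);
    return 0;
}
```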
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I think Intel will eventually be the leader in CPU/GPU performance, as someday they will be on one chip.
There will be no real video cards.

So Intel will dominate AMD in both markets, just like they do in the CPU division. :(

Maybe my dream will come true and they will make a console with a mouse and keyboard. I won't need a graphics card then.

1. Consoles use graphics cards.
2. nVidia is heavily branching into non-GPU markets right now, as they predict the same thing.
3. The CEO of the company that makes Unreal Engine predicted the whole lifecycle of GPUs more than 10 years ago, and he has been accurate to within a year. His predictions on the matter for the next 10 years are fascinating, sound like they could be right, and seem to indicate something similar. IIRC, he stated that the next version of Unreal Engine would use the CPU and GPGPU rather than DirectX.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Still early days, as not that much is designed to run on these things, but nVidia now has a distribution network that can get their cards into data centres the world over.

However, someone still needs to port whatever software the data centre uses to work on Tesla before the cards are any use. E.g. if someone like Oracle were to port their DB server software to run on Tesla, then it'll really take off.
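
Purely as a hypothetical sketch of what "porting to Tesla" means at the simplest level (the column-scan example, names and sizes are mine, not anything Oracle ships): the hot loop gets rewritten as a CUDA kernel, and the data is copied to the card and back.

```
// Hypothetical sketch (not real DB code): the kind of hot loop a database
// port might offload to a Tesla card -- scanning a column and flagging rows
// that match a WHERE-style predicate. Names and sizes are made up.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void filter_rows(const int *col, int *match, int n, int threshold)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one row per thread
    if (i < n)
        match[i] = (col[i] > threshold) ? 1 : 0;     // WHERE col > threshold
}

int main()
{
    const int n = 1 << 20;                 // 1M rows, arbitrary
    size_t bytes = n * sizeof(int);

    int *h_col = (int *)malloc(bytes);
    int *h_match = (int *)malloc(bytes);
    for (int i = 0; i < n; ++i) h_col[i] = i % 1000;   // dummy column data

    int *d_col, *d_match;
    cudaMalloc((void **)&d_col, bytes);
    cudaMalloc((void **)&d_match, bytes);
    cudaMemcpy(d_col, h_col, bytes, cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    filter_rows<<<blocks, threads>>>(d_col, d_match, n, 500);
    cudaMemcpy(h_match, d_match, bytes, cudaMemcpyDeviceToHost);

    long hits = 0;
    for (int i = 0; i < n; ++i) hits += h_match[i];
    printf("rows matching predicate: %ld of %d\n", hits, n);

    cudaFree(d_col); cudaFree(d_match);
    free(h_col); free(h_match);
    return 0;
}
```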
 

Scali

Banned
Dec 3, 2004
2,495
0
0
3. The CEO of the company that makes Unreal Engine predicted the whole lifecycle of GPUs more than 10 years ago, and he has been accurate to within a year. His predictions on the matter for the next 10 years are fascinating, sound like they could be right, and seem to indicate something similar. IIRC, he stated that the next version of Unreal Engine would use the CPU and GPGPU rather than DirectX.

I just think he phrased it all wrong.
The way he said it, it was as if GPU technology would disappear altogether, and CPUs as we know them today would just become fast enough to replace them.
That's like saying that FPUs disappeared when Intel launched the 486.
It didn't disappear, it just got integrated with the rest of the CPU. But to this day, we have dedicated FPU units in our CPUs.
With GPUs the only thing that disappears is the fixed-function hardware. At every new generation, more and more functionality becomes programmable. Larrabee would be the endpoint of that trend, with no fixed-function hardware at all, except for basic texture filtering (which you could also view as a special case of caching/prefetching).
But Larrabee was also remarkably similar to the G80 architecture, with very wide SIMD units processing scalar threads in parallel on a large scale.
That is typical GPU technology, as no regular x86 CPU has ever had SIMD processing anywhere near that wide. And for a lot of applications it would be of absolutely no use anyway, so the standard CPU/FPU units are still very much required to keep performance up.
So it's more of a synergy of the technologies. Neither really disappears.
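
A minimal CUDA sketch of the programming model described above (the SAXPY example and launch sizes are mine, just for illustration): each thread is written as plain scalar code for one element, and the hardware executes those threads across wide SIMD lanes, 32-thread warps on G80-class parts.

```
// Rough illustration of the SIMT model: the kernel is plain scalar code for
// one element, but the hardware runs it across wide SIMD lanes. Example and
// launch sizes are my own, purely for illustration.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each thread computes one element: y[i] = a * x[i] + y[i].
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 16;
    size_t bytes = n * sizeof(float);

    float *h_x = (float *)malloc(bytes), *h_y = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_x[i] = 1.0f; h_y[i] = 2.0f; }

    float *d_x, *d_y;
    cudaMalloc((void **)&d_x, bytes);
    cudaMalloc((void **)&d_y, bytes);
    cudaMemcpy(d_x, h_x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_y, h_y, bytes, cudaMemcpyHostToDevice);

    // Scalar-looking code, but each block of 256 threads maps onto the
    // GPU's wide SIMD hardware -- far wider than SSE's 4 floats per op.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, d_x, d_y);
    cudaMemcpy(h_y, d_y, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f (expected 5.0)\n", h_y[0]);

    cudaFree(d_x); cudaFree(d_y);
    free(h_x); free(h_y);
    return 0;
}
```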
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I just think he phrased it all wrong.
The way he said it, it was as if GPU technology would disappear altogether, and CPUs as we know them today would just become fast enough to replace them.
That's like saying that FPUs disappeared when Intel launched the 486.
It didn't disappear, it just got integrated with the rest of the CPU. But to this day, we have dedicated FPU units in our CPUs.
With GPUs the only thing that disappears is the fixed-function hardware. At every new generation, more and more functionality becomes programmable. Larrabee would be the endpoint of that trend, with no fixed-function hardware at all, except for basic texture filtering (which you could also view as a special case of caching/prefetching).
But Larrabee was also remarkably similar to the G80 architecture, with very wide SIMD units processing scalar threads in parallel on a large scale.
That is typical GPU technology, as no regular x86 CPU has ever had SIMD processing anywhere near that wide. And for a lot of applications it would be of absolutely no use anyway, so the standard CPU/FPU units are still very much required to keep performance up.
So it's more of a synergy of the technologies. Neither really disappears.

That is pretty much what he predicted. I think it was more me phrasing my recollections of his predictions wrong.
Anyway, if fixed-function hardware disappears, why is it still a GPU? Not to mention if it also gets integrated into the CPU...
 
Last edited:

Scali

Banned
Dec 3, 2004
2,495
0
0
Anyway, if fixed-function hardware disappears, why is it still a GPU? Not to mention if it also gets integrated into the CPU...

GPU just isn't a good name in itself (which is why we already have extended terms such as GPGPU or CGPU).
Most names describe the function of a processing unit, but a GPU describes an application instead of a function.
FPU is a floating point unit.
DSP is a digital (or dedicated) signal processor.
These terms describe what the processing units do, rather than what you could use them for.
If you were to describe what a GPU does, then it would be something like parallel stream processor, or just SIMD unit :)
Even DSP would be more or less applicable.

But if we look at how GPUs are designed today (actually GPGPU/CGPU architectures), and where things are likely going with Larrabee, CUDA and Fusion... the designs will remain close to what we call 'GPUs' today, rather than what we know as CPUs today.