dug777
Lifer
It is, and surprisingly the power-consumption problem runs into how much a power plant can supply over the wire. I have heard of data centers that are tapped out from a power perspective. The only way for them to get more performance is to find a faster solution within the same power envelope, or to build their own on-site power plant. If a DC had a problem like this, the Nvidia solution would provide four times the performance for the same power envelope.
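(A quick aside on the arithmetic in that claim: a minimal sketch of what a fixed power envelope means for throughput. The 10 MW cap and the throughput units are hypothetical illustration values; only the 4x performance-per-watt ratio comes from the post above.)

```python
# Back-of-the-envelope: throughput under a fixed power envelope.
# POWER_CAP_MW and BASELINE_PERF_PER_MW are hypothetical illustration values;
# only the 4x perf-per-watt ratio comes from the quoted post.
POWER_CAP_MW = 10.0          # hypothetical site power limit
BASELINE_PERF_PER_MW = 1.0   # arbitrary throughput units per MW

for perf_per_watt_ratio in (1.0, 4.0):
    total_perf = POWER_CAP_MW * BASELINE_PERF_PER_MW * perf_per_watt_ratio
    print(f"{perf_per_watt_ratio:.0f}x perf/W -> {total_perf:.0f} units within {POWER_CAP_MW:.0f} MW")
```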
That's a rather romanticised and slightly strange way of describing it.
You can deliver as much power as you like via a 'wire', for all practical intents and purposes; otherwise, how would you get power away from a power plant?
In most cases you'll be limited first by your own wiring and transformer, then perhaps by your local distribution network and substation. Really big users might even run straight into the transmission network, but that's some pretty serious kit right there...
Now, it may be prohibitively expensive to upgrade the distribution and/or the transmission network to deliver that power, but it's certainly entirely possible and practicable.
EDIT: Out of curiosity, it appears that one of the largest data centres in the world uses about 100MW (and is in an amazingly cool old building!):
Lakeside Technology Center (350 East Cermak)
http://www.datacenterknowledge.com/...ters/worlds-largest-data-center-350-e-cermak/
That's no worries at all for the grid to deliver (remember, the grid is being fed by facilities rated at multiple thousands of megawatts each).
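As a rough sanity check on that, here is a sketch of what fraction of a single plant's output a 100 MW data centre represents. The plant capacities below are typical ballpark figures of my own, not from the linked article.

```python
# Rough fraction of one plant's output that a 100 MW data centre draws.
# Plant capacities are ballpark assumptions for illustration only.
DATA_CENTRE_MW = 100  # Lakeside Technology Center figure from the link above

plants_mw = {"single large coal/nuclear unit": 1000,
             "big multi-unit station": 4000}

for name, capacity in plants_mw.items():
    print(f"{name} ({capacity} MW): data centre is {DATA_CENTRE_MW / capacity:.1%} of output")
```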
The diesel gensets they mention will be for backup, I presume; burning diesel is scarily expensive compared to good old coal.
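To put rough numbers on that cost gap, here's a sketch of the marginal fuel cost per kWh. The fuel price, energy content, and genset efficiency below are my own ballpark assumptions, not figures from the thread.

```python
# Approximate marginal fuel cost per kWh: diesel genset vs coal-heavy grid power.
# All figures are ballpark assumptions for illustration.
DIESEL_ENERGY_KWH_PER_L = 10.0   # approx. thermal energy content of diesel
GENSET_EFFICIENCY = 0.35         # typical diesel genset electrical efficiency
DIESEL_PRICE_PER_L = 1.00        # assumed fuel price, $/litre

diesel_cost_per_kwh = DIESEL_PRICE_PER_L / (DIESEL_ENERGY_KWH_PER_L * GENSET_EFFICIENCY)
coal_grid_cost_per_kwh = 0.05    # assumed wholesale price on a coal-heavy grid, $/kWh

print(f"diesel genset: ~${diesel_cost_per_kwh:.2f}/kWh in fuel alone")
print(f"coal grid:     ~${coal_grid_cost_per_kwh:.2f}/kWh")
print(f"ratio:         ~{diesel_cost_per_kwh / coal_grid_cost_per_kwh:.0f}x")
```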