DarkKnightDude
Senior member
I'd imagine it'll be cheaper just to build my own SteamBox and install SteamOS on it.
You seem to have completely missed that a SteamMachine is just a PC (random parts) with SteamOS.
But there won't be any Mantle support in SteamOS; it revolves around OpenGL.
As we talked about last week, the Steam Machines available for sale next year will be made by a variety of companies. Some of those companies will be capable of meeting the demands of lots of Steam users very quickly, some will be more specialized and lower volume. The hardware specs of each of those machines will differ, in many cases substantially, from our prototype.
As a hardware platform, the Steam ecosystem will change over time
So your ridiculous claim is that your PC itself has never pulled more than 290 watts at the wall with a 4770 and 680 SLI? That's a load of crap, as even my dinky PC hits near that in some demanding games.
EDIT: Yeah, you are really full of it, because I just checked Crysis 3 and even just sitting still with no action I was just over 300 watts at the wall. Your system would have to pull at least 125-150 watts more than mine under heavy load.
I accounted for his CPU using less power. Still, there is ZERO chance he has stayed under 290 watts at the wall in demanding games with his 680 SLI PC.
I dunno about that, but:
(1)- Not all power-measurement setups read the same way.
and
(2)- Stock, the 4770K is incredibly low power. OC'd is another story entirely. Weirdly, my 2700K @ 5GHz uses a heap of power. One of my 3770K attempts stalled at 4.6GHz, but it was drawing more power on my Kill A Watt during the same bench runs than the 5GHz 2700K did. Sense? It makes none. A stock 4770K uses something like a third of the power of my 5GHz 2700K, though. And with the Haswells not really OCing all that well, some people just run them stock. Very few things right now run into significant CPU bottlenecks with a stock 4770.
and
(3)- Different games pull dramatically different amounts of power. For some reason I can run Deus Ex: HR, all settings maxed, and it uses about 12% less power than running BF3 MP. Why? I don't know.
I'm guessing Valve is using GTX 780s and Titans because the first retail Steam Machines they make and sell will have 20nm Maxwell GPUs that provide that level of performance at a lower cost. This is just a beta, and I think they are smart enough to know $1000 machines will not fly off the shelves.
Anyways, just as I alluded to in the other thread when I was arguing with Vulgar about Mantle's future significance, Valve is going with Nvidia. Why? Because Nvidia can provide a higher level of support and has a more mature software/driver team.
1. Bought a K series and did not overclock it??
2. The 4770K has an 84-watt TDP.
3. I assume you have an HDD and RAM, so add another 25 watts.
4. You're losing 10+% through the PSU.
5. Fans? Cooling?
Figure you're pulling a bare minimum of 150+ watts without the GPUs. If you're getting 680s to run at 60 watts each, Nvidia and Intel will break down your door to figure out your secret. At that wattage you could just pull power from the PCIe slot and cool passively. Oops, there are another two fans I overlooked...
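The estimate above is easy to sanity-check yourself. Here's a rough sketch of the arithmetic: sum the DC loads of the components, then divide by PSU efficiency to get the draw at the wall. The per-component figures below are assumptions for illustration (the 4770K TDP and the 25 W HDD/RAM figure come from the post; the ~170 W per GTX 680 under game load and 90% PSU efficiency are my own guesses), not measurements.

```python
# Back-of-envelope wall-draw estimate for a 4770K + GTX 680 SLI system.
# Component wattages are rough assumptions, not measured values.

def wall_draw_watts(component_watts, psu_efficiency=0.90):
    """Estimate wall draw: total DC load divided by PSU efficiency."""
    dc_load = sum(component_watts.values())
    return dc_load / psu_efficiency

components = {
    "cpu_4770k_load": 84,   # Intel's rated TDP, per the post above
    "hdd_and_ram": 25,      # figure from the post above
    "fans_cooling": 10,     # guess
    "gtx680_sli": 2 * 170,  # assumed ~170 W per card under game load
}

print(round(wall_draw_watts(components)), "watts at the wall")
```

With those assumed numbers the system lands around 500 watts at the wall under load, which is why a sub-290-watt reading for a 680 SLI rig gaming flat-out looks implausible.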
I was right about the Nvidia GPUs