
Most powerful computer under 100W.

Nvidia GT 630 with a higher number of shaders
Some cheap Intel Celeron for testing
8GB of RAM
120GB Samsung SSD
Gigabyte H87N motherboard
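As a rough sanity check on a build like the one above, one can simply sum worst-case component draws. The wattages below are ballpark TDP assumptions for illustration, not measurements of this build:

```python
# Rough power budget for the build above.
# All wattages are assumed ballpark TDP/draw figures, not measurements.
parts_watts = {
    "Intel Celeron (TDP)": 54,
    "Nvidia GT 630 (TDP)": 50,  # varies widely by variant; some are ~25W
    "8GB RAM": 5,
    "120GB Samsung SSD": 3,
    "Gigabyte H87N motherboard": 15,
}

total_watts = sum(parts_watts.values())
print(f"Estimated worst-case total: {total_watts} W")  # 127 W
```

Under these (pessimistic) assumptions, simultaneous full load on CPU and GPU would exceed 100W, which is why load draw and typical/idle draw have to be considered separately.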
 
I don't think you can use a decent video card and keep any system under 100 watts. Every USB port and every unused port is probably still drawing some power, too. Something like a Gigabyte BRIX might come in under 100W. They offer a wide variety of processors designed for lower power levels. Don't know about a video card, though.

http://www.gigabyte.us/products/product-page.aspx?pid=4604#sp

Well, the OP has built it and says that it meets their needs. So I guess it is under 100W for their use case (or close enough).
 
I don't think anyone is surprised by that. The question is: can you get a quad core + GPU to stay under 100 watts? But I guess if you don't need 4 cores then it doesn't matter. What I'm wondering is what you need that GPU for; i.e., what is it doing that the integrated GPU can't do? Have you used HWMonitor, GPU-Z, or some other program to monitor GPU load to see if you even really need a discrete GPU?
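For a quick look at GPU load and power draw on an Nvidia card, `nvidia-smi` can be polled from a script. A minimal sketch — the query fields are real `nvidia-smi` options, but the parsing helper and its output format assumption (`"42 %, 31.5 W"`) are mine:

```python
import subprocess

def parse_gpu_stats(csv_line: str) -> dict:
    """Parse one nvidia-smi CSV line like '42 %, 31.5 W' into numbers."""
    util_field, power_field = [f.strip() for f in csv_line.split(",")]
    return {
        "utilization_pct": float(util_field.rstrip(" %")),
        "power_w": float(power_field.rstrip(" W")),
    }

def query_gpu() -> dict:
    """Ask nvidia-smi for current GPU utilization and power draw."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,power.draw",
         "--format=csv,noheader"],
        text=True,
    )
    return parse_gpu_stats(out.strip())

if __name__ == "__main__":
    print(query_gpu())
```

If utilization stays low while the application runs, the integrated GPU may well be enough; sustained high load is the sign a discrete card is earning its power budget.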
 
I don't think anyone is surprised by that. The question is: can you get a quad core + GPU to stay under 100 watts? But I guess if you don't need 4 cores then it doesn't matter. What I'm wondering is what you need that GPU for; i.e., what is it doing that the integrated GPU can't do? Have you used HWMonitor, GPU-Z, or some other program to monitor GPU load to see if you even really need a discrete GPU?

Because we are using stereoscopic vision, and the algorithms we use take advantage of CUDA. 🙂
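For context on why stereoscopic vision favors a CUDA-capable card: depth estimation typically slides a window from one camera image across the other and picks the horizontal offset (disparity) with the best match — an independent per-pixel search that maps naturally onto GPU threads. A toy CPU-side version of block matching, purely illustrative (not the OP's actual algorithm), in NumPy:

```python
import numpy as np

def disparity_ssd(left, right, max_disp=4, win=1):
    """Toy stereo block matching: for each pixel in `left`, find the
    horizontal shift d into `right` minimizing the sum of squared
    differences over a (2*win+1)^2 window. Real pipelines run this
    per-pixel search as a CUDA kernel; here it is plain Python loops."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    L = np.pad(left.astype(np.float64), win)
    R = np.pad(right.astype(np.float64), win)
    for y in range(h):
        for x in range(w):
            patch = L[y:y + 2*win + 1, x:x + 2*win + 1]
            best_cost, best_d = np.inf, 0
            for d in range(max_disp + 1):
                if x - d < 0:
                    break  # candidate window would leave the image
                cand = R[y:y + 2*win + 1, x - d:x - d + 2*win + 1]
                cost = np.sum((patch - cand) ** 2)
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Synthetic check: shift a ramp image by 2 pixels, recover disparity 2.
right = np.tile(np.arange(12), (8, 1))
left = np.roll(right, 2, axis=1)
print(disparity_ssd(left, right)[2:6, 4:9])
```

Every (pixel, disparity) cost here is independent, which is exactly the shape of workload where a few hundred CUDA cores beat a CPU at the same power.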
 
This is an absurdly easy question to answer. A Retina MacBook Pro 15" uses 85W, has an i7-4960HQ (Haswell, 4C/8T, 2.6GHz base, 3.8GHz turbo) and a GT 750M (384 CUDA cores), and will easily outperform anything you can build yourself.
 
Compared to the 4960HQ, that's incredibly slow. My rMBP is actually a bit quicker than a stock 4770K because it turbos to 3.8GHz, and for some reason I guess it may run a bit quicker than the turbo in the K.

The 4770K turbos up to 3.9GHz on 2 cores, 3.8GHz on 3 cores, and 3.7GHz across all 4, and as a desktop part it can spend far more time in turbo than the 4960HQ... which turbos up to 3.8GHz on only 1 core.

The only thing that can save the 4960HQ vs. the 4770K is the Iris Pro GPU and Crystalwell (albeit only in niche scenarios, although it is a good argument for having a massive L4 cache on CPUs). But then the 4770K also has its own safety net: overclocking.
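The turbo bins being compared can be summarized as a small lookup. These are the figures quoted in the posts above plus the 4770K's 3.5GHz base clock; treat them as the posters' numbers rather than re-verified spec:

```python
# i7-4770K per-core turbo bins (GHz) as quoted above; base clock 3.5GHz.
# The 4960HQ is quoted as reaching its 3.8GHz turbo only with 1 core active.
TURBO_4770K = {1: 3.9, 2: 3.9, 3: 3.8, 4: 3.7}

def max_clock_ghz(active_cores: int, base_ghz: float = 3.5) -> float:
    """Best-case 4770K clock for a given number of active cores."""
    return TURBO_4770K.get(active_cores, base_ghz)
```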
 
This is an absurdly easy question to answer. A Retina MacBook Pro 15" uses 85W, has an i7-4960HQ (Haswell, 4C/8T, 2.6GHz base, 3.8GHz turbo) and a GT 750M (384 CUDA cores), and will easily outperform anything you can build yourself.

Too bad it has a very expensive screen that's totally useless for this application, and it's too big to fit in the allocated space.
 
Nvidia at GTC just revealed an interesting option:

[Image: slide from Nvidia's GTC 2014 announcement]


Actually, it may not be as powerful as most of what's suggested here, but it should also use less power.
 