Building a Virtual Reality PC: AMD or Nvidia?

holden j caufield

Diamond Member
Dec 30, 1999
6,324
10
81
I know nothing of the current or near-future video cards. Is there a particular one I should get that is better optimized for VR? In the past, AMD was far better optimized for mining coins for some reason. Is there something in particular about VR that AMD or Nvidia excels at? Thanks.
 

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81
If you truly want a VR system, I suggest you wait until the HTC Vive comes along; then you can see what specs we will need for it.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Unless you want to be a bleeding-edge early adopter, just wait. There's nothing available now that will last a year.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Both nVidia and AMD are slated to release new cards this year. It's the first node shrink in many years, so it will be a big jump in performance. AMD is expected to release in the June time frame, and nVidia in the fall. Do not buy cards now unless you have to.

As for which to get, it depends which VR headset you want. HTC has been working with nVidia, and Oculus has been working with AMD.

AMD's current cards are better at VR than nVidia's, but that could easily change once the new-generation cards come out. We simply do not know, and there won't be any VR games released for some time.

So I say sit back and see what unfolds this year.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
AMD stated they want to bring 290/970-level performance to the masses by offering that performance cheap, as that is the only way VR will take off. So wait until the new cards release.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
It has something very simple for a cable-intensive place...

it's wireless :)

They're still the same resolution and refresh rate. Barring some major issue, which headset you get should have no bearing on what you choose to power it with.
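To put rough numbers on that: both launch headsets drive 2160x1200 at 90 Hz, so the pixel throughput your card has to feed is the same either way. A quick comparison in Python (a minimal sketch using the published Rift CV1/Vive panel specs):

```python
# Pixel throughput both headsets demand, vs. a standard 1080p60 monitor.
# Rift CV1 and Vive both drive 2160x1200 (1080x1200 per eye) at 90 Hz.
vr = 2160 * 1200 * 90          # ~233 Mpix/s
monitor = 1920 * 1080 * 60     # ~124 Mpix/s

print(f"VR headset: {vr / 1e6:.0f} Mpix/s")
print(f"1080p60:    {monitor / 1e6:.0f} Mpix/s")
print(f"Ratio:      {vr / monitor:.2f}x")
```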
 

holden j caufield

Diamond Member
Dec 30, 1999
6,324
10
81
Thanks. I'm currently running a Samsung Gear VR with my Galaxy Note 4, so any video card would be a big plus. I'll wait, but in the meantime I like to do research on which one would be better.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Nothing currently out could plausibly be optimised for VR - it wouldn't even really have been an idea when they were designing them.

There is a much faster new generation due to roll out during this year. That'll be better just in terms of brute force. I'm sure they'll all boast about it, but I'd rather doubt that even this new generation of GFX cards will be genuinely optimised for VR.

Presuming VR gets fully established, it'll be the generation or two after that where they really start to target it hard. At that point they'll actually know for sure what sort of things VR most needs in practice.
(And they'll quite probably have major motivation to cater for it!)
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Nothing currently out could plausibly be optimised for VR - it wouldn't even really have been an idea when they were designing them.

There is a much faster new generation due to roll out during this year. That'll be better just in terms of brute force. I'm sure they'll all boast about it, but I'd rather doubt that even this new generation of GFX cards will be genuinely optimised for VR.

I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm... with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance at lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm... with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance at lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.

We're going to see that, but both companies will also release halo cards with much better performance, power be damned.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm... with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance at lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.
Intel has kept the core count the same, and latency and bandwidth have not improved much.

GPUs work on embarrassingly parallel workloads, and we will get double the number of work units, each improved over the previous models.

At the same time, at least the high-end GPUs will move to HBM2, which will help those GPUs in high-bandwidth situations. (Although it might be worse in latency-intensive ones... we will see.)
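For a rough sense of what that bandwidth jump looks like, here's a quick calculation from public spec-sheet numbers (peak bandwidth = bus width x per-pin rate; the HBM2 line assumes a full four-stack configuration):

```python
# Peak memory bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gb_s(512, 6))   # R9 390X, GDDR5:            384 GB/s
print(bandwidth_gb_s(4096, 1))  # Fury X, HBM1:              512 GB/s
print(bandwidth_gb_s(4096, 2))  # HBM2, four stacks (spec): 1024 GB/s
```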

Power efficiency is performance now that chips have hit the power wall. (Which also means massively parallel processors are the way forward.)
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm... with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance at lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.

With GPUs you can pretty much trade power for performance freely, as long as you can afford the die size, just by adding functional units. CPUs don't have that luxury: you can only make cores so wide before you see no benefit for the small numbers of threads we get these days. So CPUs get smaller and more efficient, while graphics companies can cash that efficiency right back into performance. Parallel and serial processing are very different beasts.
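A toy model of that difference, just to illustrate the argument (the numbers are invented, not measurements of any real chip):

```python
# Illustrative only: parallel throughput keeps scaling with unit count,
# while single-thread speed stops scaling once a core is wider than the
# parallelism actually available in the instruction stream.

def gpu_throughput(shader_units, clock_ghz):
    # Embarrassingly parallel work: throughput ~ units * clock
    return shader_units * clock_ghz

def single_thread_speedup(core_width, exploitable_ilp=4):
    # A core wider than the available instruction-level parallelism
    # gains nothing more on a single thread
    return min(core_width, exploitable_ilp)

print(gpu_throughput(2048, 1.0), "->", gpu_throughput(4096, 1.0))  # doubles
print(single_thread_speedup(4), "->", single_thread_speedup(8))    # flat at 4
```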
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm... with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance at lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.

GPUs are not CPUs. Intel is trying to pack their CPUs into smaller and smaller devices.

GPUs are entirely parallel. Drop a node, double the performance; this has been the case with every single node change. We as users don't care if our desktop GPU sucks down 250W of power provided the performance makes it worthwhile.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm... with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance at lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.
Take AMD's Polaris and its claimed 2.5x performance-per-watt figure. AMD stated that 70% of that is due to the 14LPP process and 30% is due to architectural improvements.

If you do the math:

Assume Fiji is 1x; taking it to 2.5x is a 1.5x performance-per-watt increase. 30% of 1.5x = 0.45, so that's 45% more performance from architecture alone.

So if you were to make a Fiji-spec Polaris GPU, you'd gain up to Fiji's FPS + 45%.

That's quite a significant increase, no? 60 FPS would become up to 87 FPS, and 30 FPS up to 43.5 FPS (varying with the game engine and code).
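If you want to sanity-check that arithmetic, here is the same calculation as a few lines of Python (a minimal sketch; the 2.5x figure and the 70/30 split are AMD's marketing claims as quoted above, and the FPS values are just examples):

```python
# Back-of-envelope version of the math above, using AMD's claimed figures.
fiji = 1.0              # Fiji perf/watt, normalized
polaris = 2.5           # claimed Polaris perf/watt
gain = polaris - fiji   # 1.5x of added perf/watt

arch_gain = 0.30 * gain                        # 30% of the gain -> 0.45
print(f"Architectural gain: {arch_gain:.0%}")  # 45%

# Applied to a hypothetical Fiji-spec Polaris at the same power:
for fps in (30, 60):
    print(f"{fps} FPS -> up to {fps * (1 + arch_gain):.1f} FPS")
```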

That's without increasing the number of ROPs, TMUs, SIMDs, clock speed, etc.

Architecturally, the gains are there. Now, as for how big a card AMD will release first? Probably a Hawaii/Grenada (390X)-level replacement GPU (~232mm² die).
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I am waiting to see which combination of card + headset has the best experience, all factors considered (software ecosystem, headset quality, everything). We won't get an unbiased perspective on this until reviewers get hardware in-house with time to test both.
 

timbur666

Junior Member
Jan 5, 2010
1
0
66
I expect my Rift ("free" from the original Kickstarter) to be shipping relatively soon. None of my PCs are powerful enough to use it, since I have been holding off upgrading my gaming PC until I either needed to, or until this moment.

About how long do you expect it will be before the next-gen graphics cards come out? Should I just wait and not use the new Rift until then? Hell, who knows when it will actually ship. I will probably wait at least until that shipping confirmation before ordering parts. Depending on how busy I am with other projects and work, I might end up having it built by a boutique builder instead.
 

dark zero

Platinum Member
Jun 2, 2015
2,655
140
106
Wait for Pascal and Polaris; then you can choose. Neither of the current cards would give a pleasant experience for long.
 

steve wilson

Senior member
Sep 18, 2004
839
0
76
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm... with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance at lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.

CPU and GPU markets are different. There is a need for more GPU horsepower for the average user; graphics have a lot of room to improve. CPU performance is pretty much good enough now on the desktop; mobile is where the money is, now and for the foreseeable future. With that in mind, battery life is what matters, so efficiency is where the R&D dollars go. For GPUs, the money is in games. PC gaming is still big business (not CPU big business, but still big). I don't know whether we are going to get the massive gains people are speaking about, but I do expect some decent gains at least.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
GPUs are not CPUs. Intel is trying to pack their CPUs into smaller and smaller devices.

GPUs are entirely parallel. Drop a node, double the performance; this has been the case with every single node change. We as users don't care if our desktop GPU sucks down 250W of power provided the performance makes it worthwhile.


But we sure love to use power consumption publicly as the end-all statistic, while secretly overclocking to get every ounce of FPS, damn the actual specs. Shh, as long as no one knows... :D
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Thanks. I'm currently running a Samsung Gear VR with my Galaxy Note 4, so any video card would be a big plus. I'll wait, but in the meantime I like to do research on which one would be better.

I would grab an AMD 390X or Fury. Between Nvidia's Kepler cards (780, Titan, etc.) tanking in benchmarks about a year after they came out, and recent developments showing AMD cards as more future-proof with DX12 (and DX12 will factor heavily into VR games!), I think the decision is easy for you or anyone else interested in VR.