
Building a Virtual Reality PC, AMD or Nvidia

holden j caufield

Diamond Member
I know nothing about current or near-future video cards. Is there a particular one I should get that is better optimized for VR? In the past, AMD was far more optimized for mining coins for some reason. Is there something in particular about VR that AMD or Nvidia excels at? Thanks
 
If you truly want a VR system, I suggest you wait till the HTC Vive comes along; then you can see what specs we will need for it.
 
Unless you want to be a bleeding edge early adopter, just wait. There's nothing available now that will last a year.
 
Both nVidia and AMD are slated to release new cards this year. First new node shrink in many years, so it will be a big jump in performance. AMD is expected to release in the June time frame, and nVidia in the fall. Do not buy cards now unless you have to.

As for which to get, it depends which VR headset you want. HTC has been working with nVidia, and Oculus has been working with AMD.

AMD's current cards are better at VR than nVidia's, but that could easily change once the new-generation cards come out. We simply do not know, and there won't be any VR games released for some time.

So I say sit back, and see what unfolds this year.
 
AMD stated they want to bring 290/970-level performance to the masses by making that performance cheap, as that is the only way VR will take off. So wait until the new cards release.
 
Thanks. I'm currently running a Samsung VR headset with my Galaxy Note 4, so any video card would be a big plus. I'll wait, but in the meantime I'd like to do research on which one would be better.
 
Nothing currently out could plausibly be optimised for VR; it wouldn't really even have been an idea when those cards were being designed.

There is a much faster new generation due to roll out during this year. That'll be better just in terms of brute force. I'm sure they'll all try to boast about it, but I'd rather doubt that even this new generation of GFX cards will be genuinely optimised for VR.

Presuming VR gets fully established, it'll be the generation or two after that where they really start to target it hard. At that point they'll actually know for sure what sort of things VR most needs in practice (and will quite probably have major motivation to cater to it!).
 
Nothing currently out could plausibly be optimised for VR; it wouldn't really even have been an idea when those cards were being designed.

There is a much faster new generation due to roll out during this year. That'll be better just in terms of brute force. I'm sure they'll all try to boast about it, but I'd rather doubt that even this new generation of GFX cards will be genuinely optimised for VR.

I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm, with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance with lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.
 
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm, with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance with lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.

We're going to see that, but both companies will release halo cards with much better performance, power be damned.
 
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm, with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance with lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.
Intel has kept the core count the same, and latency and bandwidth have not improved much.


GPUs work on embarrassingly parallel workloads, and we will get double the number of work units, each improved over previous models.

At the same time, at least the high-end GPUs will move to HBM2, which will help those GPUs in high-bandwidth situations (although it might be worse in latency-intensive ones; we will see).

Power efficiency is performance, now that chips have hit the power wall. (Which also means massively parallel processors are the way forward.)
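The "efficiency is performance" point above can be sketched with a toy calculation (the numbers are my own illustration, not figures from the thread):

```python
# Toy sketch (assumed numbers): at a fixed power budget, performance is
# just perf-per-watt times watts, so once chips hit the power wall,
# every efficiency gain converts directly into performance.
POWER_BUDGET_W = 250  # assumed high-end board power

def performance(perf_per_watt):
    """Achievable performance when power, not die area, is the limit."""
    return perf_per_watt * POWER_BUDGET_W

base = performance(1.0)       # current-generation efficiency, normalized
improved = performance(1.5)   # hypothetical +50% perf/watt
print(improved / base)        # 1.5: a 50% efficiency gain is a 50% perf gain
```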
 
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm, with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance with lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.

With GPUs you can pretty much trade power for performance freely as long as you can afford the die size just by adding functional units. CPUs don't have that luxury. You can only make cores so wide before you see no benefit for the small numbers of threads we get these days. So they get smaller and more efficient. Graphics companies can cash that efficiency right back into performance. Parallel and serial processing are very different beasts.
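The serial-vs-parallel contrast above can be illustrated with Amdahl's law (a sketch of my own; the parallel fractions and unit counts are assumptions, not figures from the thread):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction
# of the workload that parallelizes and n is the number of execution units.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A desktop app with only half its work parallel barely benefits from
# doubling cores 4 -> 8: this is why making CPUs wider stops paying off.
print(round(speedup(0.5, 8) / speedup(0.5, 4), 2))        # 1.11

# An embarrassingly parallel graphics workload scales linearly, so extra
# functional units convert straight back into performance.
print(round(speedup(1.0, 2048) / speedup(1.0, 1024), 2))  # 2.0
```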
 
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm, with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance with lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.

GPUs are not CPUs. Intel is trying to pack their CPUs into smaller and smaller devices.

GPUs are entirely parallel. Drop a node, double performance. This has been the case with every single node change. We as users don't care if our desktop GPU sucks down 250 W of power provided the performance makes it worthwhile.
 
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm, with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance with lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.
Take AMD Polaris and its 2.5x performance-per-watt figure. AMD stated that 70% of that is due to the 14LPP process and 30% due to architectural improvements.

If you do the math: assume Fiji is 1x and take it to 2.5x; that's a 1.5x performance-per-watt increase. 30% of 1.5x = 0.45, i.e. 45% more performance from architecture alone.

So if you were to make a Fiji spec Polaris GPU, you'd gain up to Fiji FPS + 45%.

That's quite a significant increase, no? 60 FPS would become up to 87 FPS, and 30 FPS up to 43.5 FPS (varying with the game engine and code).

That's without increasing the number of ROPs, TMUs, SIMDs, clock speed, etc.

Architecturally, the gains are there. Now, how big a card will AMD release first? Probably a Hawaii/Grenada (390X)-level replacement GPU (~232 mm² die).
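The arithmetic in the post above can be checked directly (the 2.5x figure and the 70/30 split are the poster's claims, not verified figures):

```python
# Verifying the back-of-envelope Polaris math from the post above.
# Inputs are the poster's claims: 2.5x total perf/watt over Fiji,
# with 30% of the gain attributed to architecture.
baseline = 1.0            # Fiji perf/watt, normalized
total = 2.5               # claimed Polaris perf/watt
gain = total - baseline   # +1.5x increase over Fiji
arch_gain = 0.30 * gain   # architecture's share: ~0.45, i.e. +45%

print(round(arch_gain, 2))             # 0.45
print(round(60 * (1 + arch_gain), 1))  # 87.0 FPS from a 60 FPS baseline
print(round(30 * (1 + arch_gain), 1))  # 43.5 FPS from a 30 FPS baseline
```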
 
I am waiting to see which combination of Card + Headset has the best experience, all factors considered (including software ecosystem, headset quality, everything). We won't get an unbiased perspective on this until reviewers get hardware in house with time to test both.
 
I expect my Rift ("free" from the original Kickstarter) to ship relatively soon. None of my PCs are powerful enough to use it, since I have been waiting to upgrade my gaming PC until either I needed to, or until this moment.

About how long do you expect it will be before the next gen GPU graphics cards come out? Should I just wait and not use the new Rift until then? Hell who knows when it will actually ship. I will probably wait at least until that shipping confirmation before ordering parts. Depending on how busy I am with other projects and work, I might end up getting it built instead from a boutique builder.
 
Wait for Pascal and Polaris. Then you can choose. Neither of the current cards would give a pleasant experience for long.
 
I still don't know why people keep making this assumption. Intel has gone from 32 nm to 22 nm and then 14 nm, with fairly modest gains.

How do you know the next-generation video cards won't just offer the same level of performance with lower power consumption instead? Perhaps I am being pessimistic, but I'd expect that is what will actually happen.

CPU and GPU markets are different. There is a need for more GPU horsepower for the average user; graphics have a lot of room to improve. CPU performance is pretty much good enough now on the desktop, and mobile is where the money is for now and in the foreseeable future. With that in mind, battery life is what matters, so efficiency is where the R&D dollars go. For GPUs, the money is in games. PC gaming is still big business; not CPU-big business, but still big. I don't know whether we are going to get the massive gains people are speaking about, but I do expect some decent gains at least.
 
GPUs are not CPUs. Intel is trying to pack their CPUs into smaller and smaller devices.

GPUs are entirely parallel. Drop a node, double performance. This has been the case with every single node change. We as users don't care if our desktop GPU sucks down 250 W of power provided the performance makes it worthwhile.


But we sure love to use power consumption publicly as the end-all statistic, while secretly overclocking to get every ounce of FPS, damn the actual specs. Shh, as long as no one knows... 😀
 
Thanks. I'm currently running a Samsung VR headset with my Galaxy Note 4, so any video card would be a big plus. I'll wait, but in the meantime I'd like to do research on which one would be better.

I would grab an AMD 390X or Fury. With Nvidia's "Kepler" cards (780, Titan, etc.) tanking in benchmarks about a year after they came out, and recent developments showing AMD cards as more future-proof with DX12 (and DX12 will factor heavily into VR games!), I think the decision is easy for you or anyone else interested in VR.
 