That's a pretty steep minimum.
I worked with a guy who had been messing with the DK2 for a while. You need to drive the Oculus at something like 75 (or maybe 90?) fps, minimum, or almost everyone gets motion sickness or headaches really fast.
Ironically, he was working on a program that allowed physics teachers to demonstrate labs in a virtual environment. Somehow that project lasted almost 18 months before the guy in charge realized that no high school or community college was going to have the PC specs necessary to drive the Oculus.
And this is a problem that all the VR headsets are going to face: maybe 5% of their potential customers will have the hardware to run them without problems. We probably need another 3-5 years before the hardware to run them smoothly becomes common enough for the mass market. Hopefully the big money behind these will be smart and bide their time.
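To put those fps figures in perspective, the per-frame render budget is just the inverse of the refresh rate. A quick sketch of the arithmetic (nothing Oculus-specific here):

```python
# Per-frame render budget at common VR refresh rates: the engine must
# finish every frame inside this window, every time, or the headset
# judders -- and in VR, judder means nausea.
for hz in (60, 75, 90):
    print(f"{hz} Hz -> {1000.0 / hz:.1f} ms per frame")

# 60 Hz -> 16.7 ms per frame
# 75 Hz -> 13.3 ms per frame   (DK2)
# 90 Hz -> 11.1 ms per frame   (consumer target)
```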
Yes, but a GTX 760 SLI setup beats a GTX 780 in almost every benchmark. The ones where it doesn't are the ones that need more than 2GB of VRAM, hence using the 4GB versions.

GTX 960 SLI should have the performance, as long as there are no issues with the games and SLI. Some games don't exactly double your performance just because you have two cards.
The target is 90Hz. I assume most supported titles will be designed to hit 90Hz with a GTX 970. I'm just hoping that A) 4GB GTX 760 SLI is more capable than a GTX 970 for this, and B) SLI isn't inherently incompatible due to latency or synchronization concerns or something.

The thing is, these are not true requirements; they are guidelines. The Rift isn't going to come with a game that requires a 970. The reason they chose the 970 was: "OK... what video card for the next year can render not one but TWO 720p-ish (or 1080p on a good day) screens of today's triple-A titles at frame rates higher than 30 (I would say nothing lower than 45 if we don't want a noticeably laggy experience, this being the FLOOR, not the average)?"
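As a back-of-the-envelope check on how much heavier that workload is than a normal monitor, here's the pixel throughput using the consumer unit's quoted 1080x1200-per-eye panels (mentioned further down the thread) at 90Hz. The 1.4x pre-warp render-target headroom is my assumption, not an official figure:

```python
# Rough pixel throughput: stereo VR vs. a normal 1080p monitor.
# The 1.4x area headroom for the pre-warp render target is an
# assumption, not an official number.
def pixels_per_second(width, height, hz, eyes=1, oversize=1.0):
    return width * height * eyes * oversize * hz

monitor = pixels_per_second(1920, 1080, 60)
rift    = pixels_per_second(1080, 1200, 90, eyes=2, oversize=1.4)

print(f"1080p @ 60 Hz : {monitor/1e6:>7.1f} Mpix/s")
print(f"Rift   @ 90 Hz: {rift/1e6:>7.1f} Mpix/s  ({rift/monitor:.1f}x)")
```

That lands at roughly 2.6x the pixel throughput of a 1080p/60 monitor, which is why the recommended card sits so far above what most gaming PCs have.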
Ah. I plan to use Google Cardboard for movies.

You won't be held to it, and when the OR comes out it isn't going to have a GPU check (I'm sure there will be many demos and simple games that can play on much lower-end hardware, like the mobile market is doing with Google Cardboard). Just keep in mind, when you've got a game in front of you, to ask yourself: "can this run at 90fps without sacrificing too much?"
Yep. My primary use with Google Cardboard will be simulating IMAX with a 4K mobile display. I don't even care about 3D. I just want to duplicate the IMAX experience at home so I can properly re-watch things like Interstellar.
Not so fast. Some uses for VR glasses require nothing more than 360-degree 3D videos and such. Some solutions, like VR video conferencing, can just be a mix of 2D video streams in a simple 3D room.
I'm an atypical gamer in that I've been playing Elite: Dangerous on a late 2013 iMac (Core i7-4771 CPU, Nvidia GeForce GTX 780M, 16GB of RAM, and a PCIe SSD) in Boot Camp and with an Oculus Rift DK2. Performance is perfectly acceptable. Without the Rift, I can play at my computer's native resolution (2560x1440) with every graphical option at max except for antialiasing, which I leave disabled; the in-game frame rate display sticks at 60 (with v-sync) while doing most things. Sitting docked in station and flying through asteroids are the most graphically demanding tasks; frame rates in those locations are mid-40s.
With the Rift, I have to turn down a number of settings in order to maintain a consistent frame rate (which is important for the head-mounted display because too low a frame rate can lead not just to stuttering, but to nausea-inducing lag), but a constant 75 FPS is generally attainable.
The consumer unit will be about 1080x1200 per eye (1080p is 1920x1080). I want to see a 4K display.
Silly technical question: does the video card effectively have to render two 1080p screens, one for each eye?
It renders two different viewpoints in the game onto a single display, using half the display for each viewpoint; the total res is 1920x1080 and each eye is 960x1080.

I think there's a slight overhead from using two viewpoints in the game and applying various effects, such as the shader that warps the image and the chromatic aberration correction. I have no doubt that in some games the two viewpoints create a bigger total FOV, which can affect how much is on screen. So there is some additional overhead, but it's not a massive impact from what I've casually experienced.
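To picture that layout, here's a minimal sketch of side-by-side stereo on one panel. This is illustrative only, not the actual Oculus SDK API; `render_scene` is a hypothetical callback:

```python
# Minimal sketch of side-by-side stereo on one 1920x1080 panel:
# each eye gets half the display, so each eye ends up 960x1080.
PANEL_W, PANEL_H = 1920, 1080

def eye_viewports(panel_w, panel_h):
    half = panel_w // 2
    return {
        "left":  (0,    0, half, panel_h),   # x, y, width, height
        "right": (half, 0, half, panel_h),
    }

def render_frame(render_scene):
    # render_scene(viewport, eye) is a hypothetical callback that
    # draws the world once per eye; the scene really is drawn twice.
    for eye, viewport in eye_viewports(PANEL_W, PANEL_H).items():
        render_scene(viewport, eye)

# Example: just print what would be drawn where.
render_frame(lambda vp, eye: print(f"{eye:>5} eye -> viewport {vp}"))
```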
I'm mostly guessing here, but there should be a massive overhead from rendering two different viewpoints in-game. It is essentially rendering two different scenes that have to be perfectly synced with each other. Every polygon in each scene would have to be transformed differently to account for the slightly different viewpoint; it would not be as simple as rendering a single scene, splitting it in two, and offsetting the pixels by x each way. Each eye would see a different angle on each object; that is how stereoscopic vision works. That is why, when they film a 3D movie, they need two cameras a fixed distance apart.
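To make "two different viewpoints" concrete: each eye's camera is the head position shifted by half the interpupillary distance, and the whole scene is drawn once per camera. A minimal sketch (the 64mm IPD is an assumed typical average, and `eye_positions` is just an illustrative helper):

```python
import numpy as np

IPD = 0.064  # interpupillary distance in meters (assumed typical average)

def eye_positions(head_pos, right_vec, ipd=IPD):
    # Each eye sits half the IPD away from the head position,
    # along the head's "right" direction vector.
    head_pos  = np.asarray(head_pos,  dtype=float)
    right_vec = np.asarray(right_vec, dtype=float)
    left  = head_pos - right_vec * (ipd / 2)
    right = head_pos + right_vec * (ipd / 2)
    return left, right

left, right = eye_positions(head_pos=[0.0, 1.7, 0.0], right_vec=[1.0, 0.0, 0.0])
print("left eye :", left)   # [-0.032  1.7    0.   ]
print("right eye:", right)  # [ 0.032  1.7    0.   ]
# A full renderer would build a separate view matrix from each of these
# and draw the whole scene once per matrix -- two real renders,
# not one render shifted sideways by some pixels.
```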
I have a feeling the graphics card manufacturers will pull out all the stops and push out a few new generations very quickly once VR hits. There hasn't been this much demand for graphics power from what is eventually going to be a consumer device.
The good news for people is that CPU demands won't go up, so I can see a bundle coming out that pairs the glasses with an upper-midrange card from either of the two top dogs. Ensure the consumer has the power and subsidize the card like they do with games. Give people a reason to upgrade.
I am also thinking that current cards will lose value like crazy.
I am also wondering how HBM will help.
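My guess is the main win is raw memory bandwidth, which matters when you're filling two high-resolution eye buffers 90 times a second. A rough comparison, assuming first-gen HBM figures (4096-bit bus, ~1 Gbps per pin, Fiji-class) against a typical high-end GDDR5 setup (256-bit, ~7 Gbps, GTX 980-class); both sets of numbers are from memory, so treat them as ballpark:

```python
# Rough peak memory bandwidth: bus width (bits) x per-pin rate (Gbps) / 8.
# Figures below are assumed ballpark values, not official specs.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

gddr5 = bandwidth_gb_s(256, 7.0)    # GTX 980-class GDDR5
hbm1  = bandwidth_gb_s(4096, 1.0)   # first-gen HBM (Fiji-class)

print(f"GDDR5 (256-bit @ 7 Gbps):  {gddr5:.0f} GB/s")  # ~224 GB/s
print(f"HBM1 (4096-bit @ 1 Gbps): {hbm1:.0f} GB/s")    # ~512 GB/s
```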