
Dual GPUs and stereoscopic rendering

NTMBK

Lifer
With VR headsets like the Oculus Rift not far from market, I was wondering: has any developer experimented with a new way of rendering with dual GPUs? Since VR requires two independent viewpoints (one for each eye), it seems to me that you could cleanly divide the work between two GPUs, one per eye. Have any games tried this approach, or are they all sticking with AFR (alternate frame rendering)?
 
If the developer is targeting DirectX or OpenGL, there isn't really an option to do anything other than AFR. If they are going with Mantle, then in theory they could render each eye on a different card, although I don't know how much of the per-frame work is common to both eyes and would end up being duplicated.

For the mainstream APIs I don't think it's really possible. They could split other things, such as compute work, across the two cards, but for rendering I think you're stuck with AFR unless they announce some magic that they haven't yet talked about.
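To make the "two independent viewpoints" part concrete: the per-eye split amounts to rendering the same scene twice with view matrices offset by half the interpupillary distance (IPD). Here's a minimal sketch of that offset math; the function name is illustrative and the 64 mm IPD is just a common average, not taken from any particular VR SDK.

```python
import numpy as np

def eye_view_matrices(head_view, ipd=0.064):
    """Given a 4x4 head-centered view matrix, return (left, right)
    per-eye view matrices by translating half the interpupillary
    distance along the eye-space x axis."""
    half = ipd / 2.0

    def offset(dx):
        t = np.eye(4)
        t[0, 3] = dx  # translate along x in eye space
        return t @ head_view

    # The left eye sits at -x relative to the head, so the world
    # shifts by +half in its view matrix, and vice versa.
    return offset(half), offset(-half)

head = np.eye(4)  # head at the origin, looking down -z
left, right = eye_view_matrices(head)
```

Since the two matrices depend only on the shared head pose, each one could in principle be handed to a different GPU; the open question in this thread is whether the API even lets you do that.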
 