Who else uses 3D Vision?


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
The Gamebryo engine used in Oblivion, FO3, etc. is notoriously CPU-limited and single-threaded. Once you throw some mods into play it is very CPU-dependent. I noticed no difference between a 6950 and 6950 crossfire in New Vegas but a pretty nice difference between an i5 760 and an i5 2500K. Borderlands without AA (AA causes a massive performance hit in this game) is also very CPU-limited. X-bit uses it in reviews if you want some benchmarks.

Even in certain graphics-intensive games like Crysis the CPU can make a difference with minimum framerates. I noticed this more in SLI; I'm just without a card for the time being.

I've found this to be a great article on microstutter. I haven't had much time to play with SLI but it is something that I saw with 6950 crossfire.

It's an ancient article - and you really have to stress a system to see microstutter.

i bench with Borderlands .. it is not very CPU-limited at all!
- it uses the UNREAL 3 ENGINE - and with this engine there is no AA that you can force (except for MLAA and soon FXAA). The Unreal 3 engine is CPU-dependent - meaning it scales with more than two cores - however, a video card upgrade will (almost) always make a much bigger performance increase than a CPU upgrade will.

And Fallout 3
is on the Gamebryo engine's last legs - it's time for the trashcan for that engine; it was already old when Oblivion came out .. Thankfully, Skyrim is on a new engine. Call it inefficient, perhaps - but not demanding. And it is multi-threaded. It's just that some people experience the "stutter" and others don't.
 
Last edited:

BababooeyHTJ

Senior member
Nov 25, 2009
283
0
0
It's an ancient article - and you really have to stress a system to see microstutter.

i bench with Borderlands .. it is not very CPU-limited at all!
- it uses the UNREAL 3 ENGINE - and with this engine there is no AA that you can force (except for MLAA and soon FXAA). The Unreal 3 engine is CPU-dependent - meaning it scales with more than two cores - however, a video card upgrade will (almost) always make a much bigger performance increase than a CPU upgrade will.



The article is still relevant no matter the age. It's a side effect of AFR. It makes 40 fps on an SLI/CF setup feel different than 40 fps on a single card.

The CPU affects minimum framerates - in some games more than others - and yes, Borderlands on a modern high-end card like a GTX 570 at a lower resolution like 1920x1080. To say that your CPU has next to no effect on framerate, especially in 3D... I'm not sure what to say. I just know what I have seen. Here are some Borderlands benchmarks with GTX 570 SLI scaling. You can force MSAA via the control panel just fine with both Nvidia and AMD in that game, btw. It just comes at quite the cost.

And Fallout 3
is on the Gamebryo engine's last legs - it's time for the trashcan for that engine; it was already old when Oblivion came out .. Thankfully, Skyrim is on a new engine. Call it inefficient, perhaps - but not demanding. And it is multi-threaded. It's just that some people experience the "stutter" and others don't.

I agree but that contradicts what you said in the last post.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
The article is still relevant no matter the age. It's a side effect of AFR. It makes 40 fps on an SLI/CF setup feel different than 40 fps on a single card.

The CPU affects minimum framerates - in some games more than others - and yes, Borderlands on a modern high-end card like a GTX 570 at a lower resolution like 1920x1080. To say that your CPU has next to no effect on framerate, especially in 3D... I'm not sure what to say. I just know what I have seen. Here are some Borderlands benchmarks with GTX 570 SLI scaling. You can force MSAA via the control panel just fine with both Nvidia and AMD in that game, btw. It just comes at quite the cost.



I agree but that contradicts what you said in the last post.
Let me try again .. if you really load up your graphics subsystem and the CPU also, you can make microstutter pretty evident on many engines and in many gaming situations.

Some people are really sensitive to what is called microstutter - and it also happens on a single GPU, so in practice it cannot be differentiated from multi-GPU stuttering; the detailed Fraps graphs also look the same (you can get your CPU out of sync with your GPU, with the same stuttery results). If you experience it, back off on your settings and notice what happens. You can make that same "40 FPS" feel the same on a single GPU as on a multi-GPU system.

That article was OK but there has been further research since then. This kind of stutter is not generally an issue for most enthusiasts. And my advice to the ones who notice it is to back off on your settings and/or resolution and watch the results. People tend to overestimate their systems and push everything to the edge to get that bit of "extra".
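To make the Fraps point concrete, here is one way to quantify microstutter from a frame-time log. This is a minimal sketch in Python; it assumes a two-column, Fraps-style "frametimes" CSV of frame number and cumulative timestamp in ms, and the file name is hypothetical.

```python
# Minimal sketch: quantify microstutter from a Fraps-style frame-time
# log (assumed format: header row, then "frame, cumulative time in ms").
import csv

def frame_deltas(path):
    """Per-frame render times (ms) from cumulative timestamps."""
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)  # skip the header row
        times = [float(r[1]) for r in rows if r]
    return [b - a for a, b in zip(times, times[1:])]

deltas = frame_deltas("frametimes.csv")  # hypothetical file name
print(f"average FPS: {1000.0 * len(deltas) / sum(deltas):.1f}")

# With AFR, alternating frames usually come from different GPUs, so a
# crude microstutter indicator is how far apart the average even- and
# odd-numbered frame times sit: ~1.0 means evenly paced, well above
# 1.0 means the FPS average hides a long/short frame cadence.
even = sum(deltas[0::2]) / len(deltas[0::2])
odd = sum(deltas[1::2]) / len(deltas[1::2])
print(f"even/odd frame-time ratio: {max(even, odd) / min(even, odd):.2f}")
```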

And what are you trying to show me with Borderlands minimum FPS and the GTX 570? :confused:
- is that 34 minimum - and forcing 8xAA at that?
-- i don't see any issues with that ... and it doesn't scale particularly well with SLI. Forcing AA in Borderlands requires that you rename the .exe to match another Unreal Engine game or use a utility - and it doesn't work particularly well for the performance hit it gives. If you enable it from the control panel, the settings won't be applied. ^_^

i didn't say that the CPU doesn't affect frame rates.
:p
- it does - just nowhere near the practical effect that a graphics card upgrade would give you. If i were in your situation, i'd much rather run a GTX 580 than a faster CPU. And when you run 3D Vision there is a pretty solid hit to performance - not quite half overall, but it depends on the game. If you can play at 1920x1080 now, you may have to drop a couple of notches of resolution or lower settings on the same HW with 3D Vision.

Huh... I never knew they had wired versions. I may have to try it now.
i think $200 was a serious barrier to entry for active shutter glasses. When i first saw them at NVISION08, the projected price of the (then future) glasses was $100. But tech is really expensive for the early adopters. And the regular glasses are now $150 and the wired ones $99. So it is heading in the right direction.
 
Last edited:

gorcorps

aka Brandon
Jul 18, 2004
30,741
456
126
Decided to bite the bullet and just get the wireless ones. Very impressed with what I've seen so far. I've only tried a few games with it:

Witcher 2 - Fantastic. Adds a tremendous amount of depth, which helps the immersion quite a bit if you keep the depth toned down. I'll probably finish the majority of the game in 3D. The biggest drawback is a noticeable framerate hit, but I haven't tweaked anything. Still playable for a slower game like this.

Crysis 2 - actually a little underwhelming. I didn't play much, so I'm sure some parts are better than others, but I find myself maxing depth and still wanting more. It just doesn't have as strong an effect.

Fallout NV - surprisingly good for something I believe was never built into the game. Again, the added sense of depth helps grasp the scale of the surroundings and adds to the immersion.
 

gorcorps

aka Brandon
Jul 18, 2004
30,741
456
126
If you have Batman AA, try that out :D

I have already beaten it on the 360 many times so I never bothered with the PC version. I still have a lot of others to try though: Borderlands, Trine, maybe Grid, Metro 2033, Just Cause 2 and some others.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I have already beaten it on the 360 many times so I never bothered with the PC version. I still have a lot of others to try though: Borderlands, Trine, maybe Grid, Metro 2033, Just Cause 2 and some others.
Batman AA is spectacular if you adjust the convergence and the depth; the default S3D is a bit shallow. Search Nvidia's blogs for the "how to" on tweaking this and many other new games.

Just Cause 2 is my favorite S3D game in Super-Widescreen 3D Vision (5760x1080). The aerial views over Panau are spectacular; highly recommended for S3D. Metro 2033 is pretty good also. Crysis 2 is great in 3D Vision with the DX11 effects - max the depth out as far as you can; i have yet to play it over 3-panel S3D.
- DNF also looks great in 3D :p
 

gorcorps

aka Brandon
Jul 18, 2004
30,741
456
126
Batman AA is spectacular if you adjust the convergence and the depth; the default S3D is a bit shallow. Search Nvidia's blogs for the "how to" on tweaking this and many other new games.

Just Cause 2 is my favorite S3D game in Super-Widescreen 3D Vision (5760x1080). The aerial views over Panau are spectacular; highly recommended for S3D. Metro 2033 is pretty good also. Crysis 2 is great in 3D Vision with the DX11 effects - max the depth out as far as you can; i have yet to play it over 3-panel S3D.
- DNF also looks great in 3D :p

Unfortunately multi-monitor gaming is well beyond me ATM, so I'll just have to stick with the single 1080p for now. :p
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Unfortunately multi-monitor gaming is well beyond me ATM, so I'll just have to stick with the single 1080p for now. :p
Me too. Nvidia sent me 3 x ASUS 1080p displays as part of my (shortly upcoming) Mega 3D Vision evaluation. i figured since they were sending me one, why not ask for two more so that i could evaluate SuperWidescreen (5760x1080) gaming resolutions for my regular reviews (versus Eyefinity), and they would serve a double purpose long-term for this function.
- the surprise was, they agreed and sent me some nice HW for the eval.
:eek:

Anyway, i am not so impressed with multi-display for S3D except for a few titles. i find it tiring - too much action going on across a 6' 3-dimensional display to focus on at one time in action shooters - my favorite genre.

For racing games, it is spectacular; it is also very good for action-RPGs like Oblivion - or JC 2 when the action is concentrated in the middle of the screen. Bulletstorm had too much going on in the periphery for me to enjoy, so i played it on a single screen. Duke Nukem Forever was ideal for a shooter and worked really well in S3D at 5760x1080.

For rich kids who need everything, 3D Vision Surround is pretty cool to play with
:whiste:
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Wow, I had no idea that you actually have to ensure that your (3D-capable) monitor is supported by your video card's 3D implementation. I have an AMD graphics card, and I knew I had heard about a 3D technology from them, so I looked it up. Apparently, it only supports 3 different computer monitors, maybe 20 TVs and 20 projectors.

http://www.amd.com/us/products/technologies/amd-hd3d/pages/supported-hardware.aspx

Then there's the one monitor I found at Newegg, and amusingly enough... it isn't supported by nVidia because the glasses are different or something.
 

gorcorps

aka Brandon
Jul 18, 2004
30,741
456
126
Wow, I had no idea that you actually have to ensure that your (3D-capable) monitor is supported by your video card's 3D implementation. I have an AMD graphics card, and I knew I had heard about a 3D technology from them, so I looked it up. Apparently, it only supports 3 different computer monitors, maybe 20 TVs and 20 projectors.

http://www.amd.com/us/products/technologies/amd-hd3d/pages/supported-hardware.aspx

Then there's the one monitor I found at Newegg, and amusingly enough... it isn't supported by nVidia because the glasses are different or something.

AMD and Nvidia 3D implementations are very different. There's been lots of talk about it, so you can google it or search the forums for details/clarification, but it basically works like this (correct me if I'm wrong, anybody):

AMD uses "passive" 3D, which is similar to what movie theaters use. It renders split frames which are displayed at all times, and each lens of the glasses is polarized to see one of the frames. When brought together you see the 3D effect. I'm not sure of the specifics of what the display does to properly polarize or split these frames so the glasses can filter them, but the support isn't great, as you're finding.

Additional info: there are different ways of splitting the screen, which I believe is the difference between the theaters and this. One way is to have each individual pixel alternate which eye it's displaying, giving each eye a "checkerboard" type of pattern. The other way is to have each line of pixels (either horizontal or vertical lines) display a different eye. The theaters do one and AMD does the other IIRC, and I'm not sure which is which. However, in either case the image you're seeing is actually half the resolution of your display, since each eye only sees half of the pixels regardless of which way it does it.
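To picture the two splitting schemes just described, here is a minimal sketch in Python/numpy; the resolution and the random stand-in images are purely illustrative.

```python
# Minimal sketch of the two passive-3D splitting patterns described
# above. The left/right "renders" are random stand-ins.
import numpy as np

left = np.random.rand(1080, 1920, 3)   # left-eye frame (stand-in)
right = np.random.rand(1080, 1920, 3)  # right-eye frame (stand-in)

# Row-interleaved: even pixel rows carry one eye, odd rows the other.
interleaved = left.copy()
interleaved[1::2] = right[1::2]

# Checkerboard: neighboring pixels alternate eyes in both directions.
rows, cols = np.indices(left.shape[:2])
mask = (rows + cols) % 2 == 1
checkerboard = np.where(mask[..., None], right, left)

# Either way, each eye is shown only half the display's pixels, which
# is why passive 3D halves the effective resolution per eye.
```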

Nvidia uses "active" 3D. It requires powered shutter glasses (either wired or battery-powered) which alternate blocking one eye or the other. This happens so fast the human eye can't detect it (120 times a second). The game renders one eye at a time and plays them back-to-back in rapid succession, and the shutters have to sync with the monitor, which is what the IR emitter helps to do with the battery-powered glasses. When synced, the shutter glasses change eyes exactly when the game changes frame, and since each eye is getting a different frame, the blend of both gives the 3D effect. It requires a true 120Hz monitor to give you 60Hz per eye. Your eye is also only seeing the image for half the time it normally would with no glasses so you usually have to bump up the brightness on the display. In this way you also get the full resolution of your display in 3D instead of half, but you get half the effective framerate.
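And a minimal sketch of the active-shutter timing described above. This is purely illustrative; the real sync happens in hardware between the display, the IR emitter, and the glasses, not in application code.

```python
# Illustrative timing of active-shutter 3D on a true 120 Hz display:
# the display alternates left/right frames and the glasses black out
# the opposite eye in lockstep, so each eye effectively sees 60 Hz
# and is dark half the time (hence the dimmer picture).
REFRESH_HZ = 120
FRAME_MS = 1000.0 / REFRESH_HZ  # ~8.33 ms per displayed frame

for n in range(4):
    eye = "left" if n % 2 == 0 else "right"
    blocked = "right" if eye == "left" else "left"
    print(f"t={n * FRAME_MS:6.2f} ms: show {eye} frame, shutter {blocked} eye")

print(f"per-eye refresh: {REFRESH_HZ // 2} Hz")
```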

So there ya go. Two different ways of doing the same thing: make each eye see something slightly different so that when your brain blends them it gives the appearance of depth. Both play the same trick on your eyes, which gives some people a headache, but it's rare and most get used to it.
 
Last edited:

KaOTiK

Lifer
Feb 5, 2001
10,877
8
81
Your eye is also only seeing the image for half the time it normally would with no glasses so you usually have to bump up the brightness on the display.

Actually, the 3D driver automatically bumps your monitor's brightness to full (and turns it back down to what you had it at afterwards) when you activate 3D or when it starts on launching a game (if you have it set to auto-start).

Some dark games are a bit darker due to the 3D even with the brightness at full, but I haven't had any problems with it myself.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
The issues with S3D being fatiguing are real. Here is the synopsis of a study that was published recently:
http://arstechnica.com/gadgets/news...hnica/index+(Ars+Technica+-+Featured+Content)

i sent the study on to Nvidia for comment and got a response from them acknowledging that they had read the report:
we are aware of the challenges with viewing 3D content. We put a lot of effort into making the experience comfortable for users:


  1. We default to a low 3D depth, 15%, so we ease users into 3D and avoid eyestrain. In other words, our 3D range is reduced and there aren't dramatic differences in the convergence and focal points.
  2. We encourage developers to always develop games with 3D going into the screen, rather than out of the screen, which can cause eye strain.
  3. We encourage developers to use our dynamic convergence APIs to keep a consistent convergence from game scene to cinematic cutscene, to make the game more enjoyable.
  4. We have a group of 3D professionals who determine the optimal settings for things like convergence. We can do this because we also control the driver.
I can see the work that goes into this, and the differences between the games where the devs had S3D in mind from the outset and older games where they never considered it.

Tweaking each individual game can greatly enhance the experience while also reducing ghosting.
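As a toy illustration of what those depth and convergence settings control (a hypothetical sketch, not Nvidia's driver code; the maximum separation and the distances are made-up values): depth scales the eye separation, and convergence picks the distance that lands exactly at screen level.

```python
# Hypothetical sketch of the two user-facing stereo knobs. Depth
# scales eye separation from 0-100% of a fixed maximum (15% is the
# 3D Vision default); convergence is the distance rendered at zero
# parallax, i.e. at screen level. MAX_SEPARATION is a made-up value.
MAX_SEPARATION = 2.0  # percent of image width, illustrative

def parallax(distance, depth_pct, convergence):
    """Signed left/right offset for an object at `distance`."""
    separation = MAX_SEPARATION * depth_pct / 100.0
    # Zero at the convergence distance; negative pops out of the
    # screen, positive recedes into it. Bigger offsets mean stronger
    # 3D but also more visible ghosting on slower panels.
    return separation * (1.0 - convergence / distance)

for d in (1.0, 2.0, 8.0):
    p = parallax(d, depth_pct=15, convergence=2.0)
    print(f"distance {d}: parallax {p:+.2f}% of screen width")
```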
From Nvidia's 3D vision reviewer's guide:
[Image: DNF-HowS3Dworks.jpg - "How S3D works" diagram from Nvidia's 3D Vision reviewer's guide]




 

gorcorps

aka Brandon
Jul 18, 2004
30,741
456
126
Actually, the 3D driver automatically bumps your monitor's brightness to full (and turns it back down to what you had it at afterwards) when you activate 3D or when it starts on launching a game (if you have it set to auto-start).

Some dark games are a bit darker due to the 3D even with the brightness at full, but I haven't had any problems with it myself.

That makes sense. I remember reading about the brightness thing so I posted it, even though I found I didn't need to touch it myself. Guess that explains it. I did boost the game brightness a tad though.


Might go ahead and grab Batman AA today. It's on sale starting today so that seems like a sign :p
 

KaOTiK

Lifer
Feb 5, 2001
10,877
8
81
That makes sense. I remember reading about the brightness thing so I posted it, even though I found I didn't need to touch it myself. Guess that explains it. I did boost the game brightness a tad though.


Might go ahead and grab Batman AA today. It's on sale starting today so that seems like a sign :p

I gotta admit, when I was making the thread and saw it was Batman, I did laugh thinking of this thread haha
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
That makes sense. I remember reading about the brightness thing so I posted it, even though I found I didn't need to touch it myself. Guess that explains it. I did boost the game brightness a tad though.


Might go ahead and grab Batman AA today. It's on sale starting today so that seems like a sign :p
Here is another sign .. *optimize* Batman for 3D Vision
:thumbsup:

http://3dvision-blog.com/optimizing-batman-arkham-asylum-demo-for-best-3d-vision-results/

It's for the demo but it should apply to the full game. There might even be a full-game version optimization guide on Nvidia's 3D Vision blogs.
 

QuantumPion

Diamond Member
Jun 27, 2005
6,010
1
76
Anyone have issues with 3D YouTube videos? Every time I try to turn on 3D for them, it just says HTML5 is not available for this video. However, the embedded YouTube videos on the 3D Vision Live website work fine.
 

gorcorps

aka Brandon
Jul 18, 2004
30,741
456
126
Tried out Just Cause 2, Batman, and L4D2 as I saw my friends playing it.

Batman is quite impressive and a bunch of the little stylistic touches here and there really shine in 3D. Like when you defeat an enemy or hit Joker's teeth with a batarang, and a small group of bats fly out of them right at the screen... looks very cool.

Just Cause 2 looks sweet too. It's always been a good looking game but the 3D makes some of the textures look better than they did before somehow. The 3D adds depth to the textures that normally aren't heavily bump mapped or anything, so it all looks a lot better. There is some slowdown when walking through heavily wooded areas, but it's not terrible. I can probably drop some shadows down or something and be fine, or finally try OCing something.

L4D2 was actually pretty neat, except for the names that hover over everybody's head. They're a lot closer than they should be, but they're also kind of necessary to tell friendlies from not... so I left them on. One time a zombie that was on fire ran right at me and it looked awesome. I was trying to get a smoker to tie me up head on to see how that would look but it didn't happen.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
AMD and Nvidia 3D implementations are very different. There's been lots of talk about it, so you can google it or search the forums for details/clarification, but it basically works like this (correct me if I'm wrong, anybody):

Ah, that all makes sense. I've heard of those terms before, as I've considered buying a 3D starter kit for my Mitsubishi 3D-capable TV. Although I've always been so turned off by 3D movies that I just kind of passed on it... the original $400 price tag of the 3D kit didn't help either. :p Mostly this thread and people talking about how well the 3D works with games has rekindled my interest.

Although, I'm a bit curious about two things...

1) Say I went and purchased the 3D Kit for my TV... now obviously we're getting more into the realm of console gaming, but I'm actually wondering about the glasses. Since they're also active shutter, would they work with nVidia's 3D Vision or is there a difference between glasses?

2) Does the game need to specifically support a 3D functionality for nVidia and AMD or does 3D support include both companies by default?

I do wish they'd focus more on commonalities rather than trying to segment the market. :\
 

gorcorps

aka Brandon
Jul 18, 2004
30,741
456
126
Ah, that all makes sense. I've heard of those terms before, as I've considered buying a 3D starter kit for my Mitsubishi 3D-capable TV. Although I've always been so turned off by 3D movies that I just kind of passed on it... the original $400 price tag of the 3D kit didn't help either. :p Mostly this thread and people talking about how well the 3D works with games has rekindled my interest.

Although, I'm a bit curious about two things...

1) Say I went and purchased the 3D Kit for my TV... now obviously we're getting more into the realm of console gaming, but I'm actually wondering about the glasses. Since they're also active shutter, would they work with nVidia's 3D Vision or is there a difference between glasses?

2) Does the game need to specifically support a 3D functionality for nVidia and AMD or does 3D support include both companies by default?

I do wish they'd focus more on commonalities rather than trying to segment the market. :\

1) I don't think so. The glasses and display have to sync together in order to minimize ghosting and get the proper 3D effect. On the PC this is done with an IR emitter connected via USB, and the glasses have an IR receiver. They can "talk" to each other to sync up. The TV one works the same way, except its IR emitter obviously has no USB connection, so there's no way to plug it into the PC. At a minimum you would need separate IR emitters for the TV and PC (and I don't think they're sold separately). Even with that, I'm not sure that one set of glasses can utilize both emitters. I doubt it.

2) This I'm not sure about. There are 3D-enabled games, but I think those are a function of what Nvidia or AMD decided to support on their end via drivers. Newer games are being designed with 3D in mind so they work better, but older games are being "adapted" to each 3D tech by either Nvidia or AMD... so it's up to them what they support. I'm sure there are lists of supported games for both out there. Regardless of new or old games, I'm pretty sure it's up to Nvidia or AMD to pick up the support. Devs can and will help with the 3D on either side (Crytek worked w/ Nvidia a lot), so support will vary.
 

KaOTiK

Lifer
Feb 5, 2001
10,877
8
81
1) Say I went and purchased the 3D Kit for my TV... now obviously we're getting more into the realm of console gaming, but I'm actually wondering about the glasses. Since they're also active shutter, would they work with nVidia's 3D Vision or is there a difference between glasses?

2) Does the game need to specifically support a 3D functionality for nVidia and AMD or does 3D support include both companies by default?

I do wish they'd focus more on commonalities rather than trying to segment the market. :\

1. No; they sync up differently and use different tech to do the same thing (I could be wrong about this, but that was the general understanding I got like a year ago when I read up on that stuff).

2. For Nvidia, the only requirement, from what I can remember, is that the game needs to use DirectX; it won't work in OpenGL. The games that specifically support the 3D make sure that everything works correctly (like in-game icons rendered at the proper depth instead of in 2D, so they are not floating above everything, for example). A lot of old games do look great in 3D, though it is hit or miss; generally, though, the experience for me has been positive with older games. I have no idea about ATI, i didn't even know they had their own 3D till earlier in this thread :p
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
1) I don't think so. The glasses and display have to sync together in order to minimize ghosting and get the proper 3D effect. On the PC this is done with an IR emitter connected via USB, and the glasses have an IR receiver. They can "talk" to each other to sync up. The TV one works the same way, except its IR emitter obviously has no USB connection, so there's no way to plug it into the PC. At a minimum you would need separate IR emitters for the TV and PC (and I don't think they're sold separately). Even with that, I'm not sure that one set of glasses can utilize both emitters. I doubt it.

Hmm, that makes sense. I did a bit of extra research on the Mitsubishi DLP side of things, and it seems using an IR emitter with my DLP TV (WD65C9 : http://www.mitsubishi-tv.com/product/WD65C9 ) is just worthless anyway. It seems any model prior to 2010 (the '9' in "C9" stands for 2009) has an awkward 3D implementation where you cannot turn off DLP Link. DLP Link is another method of telling the glasses that they need to switch, which uses a quick flash. The problem with leaving this on alongside the IR emitter is that you get a brighter picture as your eyes pick up that white flash. So to get a decent 3D picture with my TV, I'd need to use different technology than nVidia's 3D Vision kit uses anyway! That's a bit of a bummer.

I'm actually curious whether the Mitsubishi converter box could convert a signal from nVidia or AMD. I know it can do the normal "cinema" formats like side-by-side, top-bottom and Blu-ray's frame packing, but I don't think any of those are the same (although frame packing seems close).

I found the spec sheet on Mitsubishi's website ( http://www.mitsubishi-tv.com/pdf/specsheet-3DA1.pdf ), but without knowing exactly what nVidia uses, I can't be too sure.


2) This I'm not sure about. There are 3D-enabled games, but I think those are a function of what Nvidia or AMD decided to support on their end via drivers. Newer games are being designed with 3D in mind so they work better, but older games are being "adapted" to each 3D tech by either Nvidia or AMD... so it's up to them what they support. I'm sure there are lists of supported games for both out there. Regardless of new or old games, I'm pretty sure it's up to Nvidia or AMD to pick up the support. Devs can and will help with the 3D on either side (Crytek worked w/ Nvidia a lot), so support will vary.

Yeah, it looks like AMD only supports a handful of titles shown here:

http://www.amd.com/US/PRODUCTS/TECHNOLOGIES/AMD-HD3D/Pages/gaming.aspx

I'm not really sure about API capabilities, but it makes me wonder if there'd be a way to generalize 3D into DirectX. Then only the 3D aspect of DirectX would need to be supported.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
3D is already "built-in" to every PC game. The "depth" is already there and is easily read by the driver. Sometimes the devs took shortcuts with the HUD or with other elements, which then show up as 2D and conflict with S3D if the game was not specifically developed for S3D.
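As a sketch of what "reading the depth" can mean in practice: Nvidia's developer materials describe the driver's automatic mode as offsetting each vertex's clip-space x per eye by separation * (w - convergence), where w grows with view depth. The sketch below is illustrative Python with made-up values, not actual driver code.

```python
# Illustrative sketch of driver-side stereoization: each vertex's
# clip-space x is shifted per eye, so the driver can build two views
# from a game's single camera without any game-side changes.
def stereo_x(x_clip, w_clip, eye, separation=0.05, convergence=1.0):
    """eye = -1 for the left view, +1 for the right view."""
    return x_clip + eye * separation * (w_clip - convergence)

# A vertex exactly at the convergence depth gets zero parallax and
# sits at screen level; nearer vertices pop out, farther ones recede.
for w in (0.5, 1.0, 3.0):
    l = stereo_x(0.0, w, eye=-1)
    r = stereo_x(0.0, w, eye=+1)
    print(f"w={w}: parallax {r - l:+.3f}")

# HUD elements drawn at one constant w all land at a single fixed
# depth, which is why 2D HUD shortcuts conflict with S3D as above.
```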

AMD and Nvidia use different methods for S3D. Nvidia uses active shutter glasses for 3D Vision (and they develop their own drivers for it) and AMD uses HDMI 1.4 for HD3D (which their partners develop).

Each manufacturer of 3DTVs uses their own glasses and technology. Nvidia provides software for viewing HD 3D movies also.