Who else uses 3D Vision?


Aikouka

3D is already "built-in" to every PC game. The "depth" is already there and is easily read by the driver. Sometimes the devs took shortcuts with the HUD or with other elements that show up as 2D and conflict with S3D if the game is not specifically developed for S3D.

I guess my thought was that it'd be like 3D movies, where only certain scenes tend to really use the 3D aspects, but I suppose turning everything into 3D wouldn't be much different from filming a movie in 3D. My only gripe about those is that the biggest "added bonus" seems to be that one camera captures the foreground and the other captures the background, reducing the "fuzzy background" aspect of 2D movies. So I'd wonder what benefit 3D would provide in a game when it's "always on."

I guess it's just something that I'd need to see.

Something I was wondering as well... have you tried to get nVidia to send you one of their 3D laptops? I see the option available at places like Dell, and especially after reading your proposed video card requirements above, I can't help but wonder what performance hit these laptops would take (especially since laptop GPUs are shadows of their desktop brethren).

It's interesting that Alienware's m18x doesn't even support 3D, yet it comes with SLi and Crossfire as an option, but the m17x does support 3D. :hmm:

AMD and Nvidia use different methods for S3D. Nvidia uses active shutter glasses for 3D Vision (and they develop their own drivers for it) and AMD uses HDMI 1.4 for HD3D (which their partners develop).

I think you may be misunderstanding what I meant when I was talking about standards. In the TV world, you have standards. There are four main methods of delivering 3D: checkerboard, side-by-side, top-bottom and frame packing. These describe what is sent from the output device (e.g. a PS3) to your display. TVs typically support all four of these, but it's possible for some to only support a subset and require an adapter to translate. As an example, my Mitsubishi DLP only supports checkerboard. I could technically play the Avatar game on my PS3 in 3D as long as I have glasses for it, since the game has a checkerboard 3D setting in its options.
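
To make those formats concrete, here's a rough little sketch of how a left/right pair gets squeezed into a single frame (purely my own illustration - the array shapes and function names are made up for the example; checkerboard just interleaves the two pixel-by-pixel instead):

import numpy as np

def pack_side_by_side(left, right):
    # left/right are H x W x 3 images; squeeze each eye to half width,
    # then place them next to each other in one normal-sized frame.
    half_l = left[:, ::2, :]          # crude 2:1 horizontal downsample
    half_r = right[:, ::2, :]
    return np.concatenate([half_l, half_r], axis=1)

def pack_top_bottom(left, right):
    # Same idea, but each eye loses half its vertical resolution instead.
    return np.concatenate([left[::2, :, :], right[::2, :, :]], axis=0)

def pack_frame_packed(left, right):
    # Frame packing keeps both eyes at full resolution by stacking them into
    # one roughly double-height frame (the real HDMI signal adds a blanking
    # gap between the two), which is why it needs more bandwidth.
    return np.concatenate([left, right], axis=0)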

But anyway... my real gripe was that the difference between nVidia and AMD seems to be causing a splintering in the actual display hardware (i.e. the monitor) as well. Why can't we at least get a standard output option so any 3D monitor will work?

Each manufacturer of 3DTVs uses their own glasses and technology. Nvidia provides software for viewing HD 3D movies also.

I'd say that 3D-capable TVs are a lot more standardized than you're trying to suggest. Sure, there can still be active vs. passive, but the majority of TVs are active. There's also the split between IR-signaled and DLP Link-signaled glasses.
 

apoppin

I guess my thought was that it'd be like 3D movies, where only certain scenes tend to really use the 3D aspects, but I suppose turning everything into 3D wouldn't be much different from filming a movie in 3D. My only gripe about those is that the biggest "added bonus" seems to be that one camera captures the foreground and the other captures the background, reducing the "fuzzy background" aspect of 2D movies. So I'd wonder what benefit 3D would provide in a game when it's "always on."

I guess it's just something that I'd need to see.

Something I was wondering as well... have you tried to get nVidia to send you one of their 3D laptops? I see the option available at places like Dell, and especially after reading your proposed video card requirements above, I can't help but wonder what performance hit these laptops would take (especially since laptop GPUs are shadows of their desktop brethren).

It's interesting that Alienware's m18x doesn't even support 3D, yet it comes with SLi and Crossfire as an option, but the m17x does support 3D. :hmm:



I think you may be misunderstanding what I meant when I was talking about standards. In the TV world, you have standards. There are four main methods of delivering 3D: checkerboard, side-by-side, top-bottom and frame packing. These describe what is sent from the output device (e.g. a PS3) to your display. TVs typically support all four of these, but it's possible for some to only support a subset and require an adapter to translate. As an example, my Mitsubishi DLP only supports checkerboard. I could technically play the Avatar game on my PS3 in 3D as long as I have glasses for it, since the game has a checkerboard 3D setting in its options.

But anyway... my real gripe was that the difference between nVidia and AMD seems to be causing a splintering in the actual display hardware (i.e. the monitor) as well. Why can't we at least get a standard output option so any 3D monitor will work?



I'd say that 3D-capable TVs are a lot more standardized than you're trying to suggest. Sure, there can still be active vs. passive, but the majority of TVs are active. There's also the split between IR-signaled and DLP Link-signaled glasses.

First of all, 3D HDTVs are not standardized - you cannot use glasses interchangeably among ANY of the big TV vendors. And there are two major competing kinds of 3D - especially between Samsung and LG; they have some "dirty" ad campaigns going on in the East.

Secondly, there ARE 120Hz displays that will accept (AMD's) HDMI 1.4 3D signals *and* use Nvidia's 3D Vision glasses (separately). They just came out this year and are pretty expensive. AMD has hinted that they will have an HD3D setup for ABT to evaluate in the future. Right now their emphasis is on Eyefinity.

As to PC games, there is depth already pre-programmed into the game. ALL PC games can work with 3D Vision if the driver supports it. Some games will work great; others - a very few - will make you sick playing them. It is something you have to experience for yourself - ideally in your home, for hours of gaming. And going from 2D to 3D is nearly instantaneous - you press a switch on the emitter and take off/put on your glasses. Simple as that!
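
To make that a bit more concrete, here is a rough toy sketch (not Nvidia's actual driver code, just an illustration of the idea) of why the depth is "already there": the driver can render the same scene from two slightly offset eye positions, and nearby objects shift more between the two views than distant ones.

# Toy example: project a 3D point for a left and a right eye.
# The eye separation, focal length and screen size are made-up numbers.

def project(point, eye_x, focal=1.0, screen_w=1920, screen_h=1080):
    # point = (x, y, z) in camera space; larger z = farther "into" the screen
    x, y, z = point
    sx = (x - eye_x) * focal / z      # horizontal position depends on the eye offset
    sy = y * focal / z
    return (screen_w / 2 + sx * screen_w / 2,
            screen_h / 2 - sy * screen_h / 2)

separation = 0.03                      # assumed eye separation in world units
for p in [(0.0, 0.0, 2.0), (0.5, 0.2, 10.0)]:
    left = project(p, -separation / 2)
    right = project(p, +separation / 2)
    # The near point (z=2) lands farther apart between the two views than the
    # distant one (z=10) - that disparity is what your brain reads as depth.
    print(p, left, right)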

Also when Nvidia works with devs for "special effects", they encourage the action to take place "inside" the screen - the 'pop out' effects don't work well for small screens and PC gaming although pop outs DO work *great* for movies shown on the Big Screen HD 3DTV in the living room. So ideally games are developed with one kind of 3D and movies for another (and still different for the theater experience).

Coincidentally, i am writing a notebook "makeover" article with an SSD/RAM/OS upgrade. As soon as i am done with that (and my Mega 3D Evaluation), i am going to ask both AMD and Nvidia for S3D-enabled notebooks. AMD already offered them to ABT earlier this year.

There is SO much HW to review, so little time and our staff is small
:'(
 

gorcorps

I think the 3D notebooks are more for 3D pics and movies vs. actual gaming. I can't see a whole lot of 3D gaming happening on a laptop with the games that really shine in it.

Tried Grid tonight and it worked quite well. There is quite a bit of ghosting on the lines on the road, but you're usually going fast enough not to notice.

I also tried out Audiosurf, which is a neat little indie music game I got for cheap. Nvidia doesn't even have a rating for it but it actually works okay once you get through the menus. The menus are all at random depths it seems, and the buttons are at other depths, then the text on them is on an even different plane. But once you get past the menus and into the main game it's pretty cool. There's a ton of ghosting going on because there's huge contrast differences and bright flashing colors, but it's not distracting because there's ALWAYS bright flashing colors going on anyway.
 

Aikouka

First of all, 3D HDTVs are not standardized - you cannot use glasses interchangeably among ANY of the big TV vendors. And there are two major competing kinds of 3D - especially between Samsung and LG; they have some "dirty" ad campaigns going on in the East.

I've mostly been looking at the DLP market, which might not be the best given how small it is, but all the glasses seem to be interchangeable. I might pick up a converter and a pair of glasses to use with my PS3. Mortal Kombat could be fun.

Secondly, there ARE 120Hz displays that will accept (AMD's) HDMI 1.4 3D signals *and* use Nvidia's 3D Vision glasses (separately). They just came out this year and are pretty expensive. AMD has hinted that they will have an HD3D setup for ABT to evaluate in the future. Right now their emphasis is on Eyefinity.

Well... what I don't get is what video feed is AMD sending over the wire when it outputs in 3D? I'm trying to understand what methods they're using for representing the picture for both eyes. :)

As to PC games, there is depth already pre-programmed into the game. ALL PC games can work with 3D Vision if the driver supports it. Some games will work great; others - a very few - will make you sick playing them. It is something you have to experience for yourself - ideally in your home, for hours of gaming. And going from 2D to 3D is nearly instantaneous - you press a switch on the emitter and take off/put on your glasses. Simple as that!

I write software, but I've never really done any game programming. I assumed that while an engine would lay out a 3D world, actual sizes don't really exist... everything is simply perceived as being x, y, z pixels away from you. I mean, I guess it could just take that information since if main_character.z < this_texture.z, then it's further away in the background (not sure what exactly is 0,0,0 though :p).

Have any of the ones you've tried made you sick?

Coincidentally, i am writing a notebook "makeover" article with an SSD/RAM/OS upgrade.

I actually did that with my Dell XPS M1530.... I definitely love how much the SSD improved the laptop, but its 8800GTS is definitely showing its age :(. I'm looking around at what I could replace it with, but finding a laptop with two drive bays that isn't a monster truck in size seems nearly impossible. :p

As soon as i am done with that (and my Mega 3D Evaluation), i am going to ask both AMD and Nvidia for S3D-enabled notebooks. AMD already offered them to ABT earlier this year.

There is SO much HW to review, so little time and our staff is small
:'(

I have some spare time if you want to send them my way. ;)
 

apoppin

I've mostly been looking at the DLP market, which might not be the best given how small it is, but all the glasses seem to be interchangeable. I might pick up a converter and a pair of glasses to use with my PS3. Mortal Kombat could be fun.



Well... what I don't get is what video feed is AMD sending over the wire when it outputs in 3D? I'm trying to understand what methods they're using for representing the picture for both eyes. :)



I write software, but I've never really done any game programming. I assumed that while an engine would lay out a 3D world, actual sizes don't really exist... everything is simply perceived as being x, y, z pixels away from you. I mean, I guess it could just take that information since if main_character.z < this_texture.z, then it's further away in the background (not sure what exactly is 0,0,0 though :p).

Have any of the ones you've tried made you sick?



I actually did that with my Dell XPS M1530.... I definitely love how much the SSD improved the laptop, but its 8800GTS is definitely showing its age :(. I'm looking around at what I could replace it with, but finding a laptop with two drive bays that isn't a monster truck in size seems nearly impossible. :p



I have some spare time if you want to send them my way. ;)
i'd be glad to - if you have the writing skills :p

Projectors are different. But they are not so good for 3D unless they are extremely bright and can handle a high resolution (read: somewhat expensive).

You can toss out your optical drive and use an SSD and your HDD together in your notebook with an external DVD drive; or just use a USB HDD. i only have a GeForce 8200M, so it's light and retro gaming for me - i use my notebook for everything except gaming and video encoding.

AMD's HD3D is completely different from 3D Vision. You are (currently) rather limited in FPS and in resolution compared to 3D Vision; the signal is over HDMI 1.4 while Nvidia uses DVI. AMD's HD3D is not quite ready for prime time from what i can see. Their partners are developing the standards and AMD is facilitating it. Nvidia's solution is truly "plug and play".

Perhaps these two Wikipedia articles will explain how depth is already built into PC games:
http://en.wikipedia.org/wiki/3D_computer_graphics
http://en.wikipedia.org/wiki/Hidden_surface_determination
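
That second link, in its most common form (a z-buffer), boils down to this: every pixel in a rendered frame already carries a depth value. A toy z-buffer example (purely an illustration, nothing driver-specific):

WIDTH, HEIGHT = 8, 4
depth = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
color = [["."] * WIDTH for _ in range(HEIGHT)]

def draw_rect(x0, x1, y0, y1, z, c):
    # Keep a pixel only if this surface is closer than whatever is already there.
    for y in range(y0, y1):
        for x in range(x0, x1):
            if z < depth[y][x]:
                depth[y][x] = z
                color[y][x] = c

draw_rect(0, 8, 0, 4, z=10.0, c="B")   # background wall, far away
draw_rect(2, 6, 1, 3, z=2.0,  c="P")   # "main character", much closer
for row in color:
    print("".join(row))                # the near rectangle hides the wall behind it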

Crysis 2 made me ill until the new drivers came out - now 3D Vision is awesome. Evidently the right image would get "stuck" and lag seriously behind the left eye; it was broken in DX11. Mirror's Edge is difficult for me to play in S3D also. Older games can have issues with text and the HUD being 2D and at a different depth than everything else on the screen.
 

Aikouka

i'd be glad to - if you have the writing skills

I've been tempted to try and write something when Anandtech is looking for writers, but I've always wondered if my writing isn't... flowery enough? Although, theoretically someone can always "fluff up" another person's writing. As long as the meat's there, you can always cook the potatoes later. ;)

Projectors are different. But they are not so good for 3D unless they are extremely bright and can handle a high resolution (read: somewhat expensive).

You can toss out your optical drive and use an SSD and your HDD together in your notebook with an external DVD drive; or just use a USB HDD. i only have a GeForce 8200M, so it's light and retro gaming for me - i use my notebook for everything except gaming and video encoding.

I've heard of laptops supporting removing the optical drive to allow adding a second hard drive, but I didn't think that was a universal thing. Especially since my XPS M1530 has a slot-loading DVD drive. I currently use the USB HDD method, but I don't necessarily like it since I prefer putting things on the HDD, and I don't want to do that if it won't always be connected.

Ouch... I hope you don't have a large resolution to go with that 8200M. Actually, I think mine has an 8600GT in it, not an 8800GT, and I tried playing Trine on it (1680x1050 resolution), but it wasn't working so well. :p

AMD's HD3D is completely different from 3D Vision. You are (currently) rather limited in FPS and in resolution compared to 3D Vision; the signal is over HDMI 1.4 while Nvidia uses DVI. AMD's HD3D is not quite ready for prime time from what i can see. Their partners are developing the standards and AMD is facilitating it. Nvidia's solution is truly "plug and play".

I guess my overall curiosity lies with the question: why do we need separate hardware? For example, I have a 60Hz monitor... so what exactly stops nVidia from simply doing 3D at 60Hz rather than 120Hz? I know that 30 FPS per eye probably isn't ideal, but given the tougher hardware requirements for 3D, a lot of your games probably run between 30-60 FPS anyway. I mean, it sounds like all nVidia is doing is alternating frames (left eye -> right eye -> *repeat*), which means technically any monitor could display it, right? You'd still need the glasses to perform the optical shuttering (+ the signaling, of course).
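
Just to spell out what I mean by alternating frames, here's roughly the timing logic as I understand it (my own sketch, not anything from Nvidia):

def run_frame_sequential(display_hz, seconds=1):
    # One image is shown per refresh, alternating eyes; the emitter tells the
    # glasses to black out the other eye for that refresh.
    frames_shown = {"left": 0, "right": 0}
    eyes = ["left", "right"]
    for i in range(display_hz * seconds):
        frames_shown[eyes[i % 2]] += 1
    return frames_shown

print(run_frame_sequential(120))  # {'left': 60, 'right': 60} -> 60 updates per eye
print(run_frame_sequential(60))   # {'left': 30, 'right': 30} -> only 30 per eye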

That's why I was curious about AMD's signal. If they were outputting using one of the current TV-based standards (checkerboard, top-bottom, side-by-side or frame packing), it would require a display that could process the image versus simply using a display that can display an image.
 

gorcorps

i'd be glad to - if you have the writing skills :p

I'd be willing to have a try out :awe:

I used to write for my HS paper. I won an award one year. That was many moons ago though before I became an engineer and sacrificed my vocabulary for technical writing.
 

Aikouka

Hmm, given how you say AMD's HD3D is limited in resolution and their supported hardware page lists Mitsubishi's 2010 and 2011 3D DLPs, I'm going to assume that my assumption (double assuming? :p) about them using one of the TV's standard 3D methods is correct. The resolution limiting would make sense, since I assume they'd use Blu-ray's frame packing method, which still allows full resolution for each eye. The problem is that 60Hz is limited to 720p, and I assume gamers wouldn't want 1080p, which based on Mitsu's specs is listed at 24Hz.
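
If it helps to see the arithmetic, here's a rough back-of-the-envelope check (my own numbers using the usual CEA frame totals, so double-check them before quoting) of why frame packing gives you 720p at 60Hz or 1080p at 24Hz, but not 1080p at 60Hz:

def frame_packed_clock_mhz(h_total, v_total_2d, refresh_hz):
    # Frame packing roughly doubles the vertical size of the frame (both eyes
    # stacked, full resolution each), so the pixel clock roughly doubles too.
    return h_total * (v_total_2d * 2) * refresh_hz / 1e6

print(frame_packed_clock_mhz(2750, 1125, 24))  # 1080p24 packed: ~148.5 MHz
print(frame_packed_clock_mhz(1650, 750, 60))   # 720p60 packed:  ~148.5 MHz
print(frame_packed_clock_mhz(2200, 1125, 60))  # 1080p60 packed: ~297 MHz
# That last one is roughly double the ~148.5 MHz that the mandatory HDMI 1.4
# 3D modes run at, which (as far as I can tell) is why 1080p60 isn't offered.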
 

apoppin

Hmm, given how you say AMD's HD3D is limited in resolution and their supported hardware page lists Mitsubishi's 2010 and 2011 3D DLPs, I'm going to assume that my assumption (double assuming? :p) about them using one of the TV's standard 3D methods is correct. The resolution limiting would make sense, since I assume they'd use Blu-ray's frame packing method, which still allows full resolution for each eye. The problem is that 60Hz is limited to 720p, and I assume gamers wouldn't want 1080p, which based on Mitsu's specs is listed at 24Hz.
Yes, you are somewhat limited at 1080p with HD3D currently over the HDMI 1.4 connection. i expect that is going to change as AMD works with its partners to refine their own S3D.

That is why Nvidia requires a 120Hz display; you have to keep the refresh rate high. And they also support anaglyph glasses with the 3D Vision drivers. You can just use the cheap red and blue glasses to get a feel for 3D Vision with a 60Hz display - just realize that the color balance is way off.
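
For anyone wondering what the red/blue fallback actually does, here's a tiny sketch using Pillow (purely an illustration, not anything from Nvidia's driver; the filenames are hypothetical):

from PIL import Image

def make_anaglyph(left_path, right_path, out_path):
    left = Image.open(left_path).convert("RGB")
    right = Image.open(right_path).convert("RGB")
    r, _, _ = left.split()             # red channel comes from the left-eye view
    _, g, b = right.split()            # green/blue come from the right-eye view
    Image.merge("RGB", (r, g, b)).save(out_path)
    # The red lens passes only the left image and the cyan lens only the right,
    # so each eye still gets its own view - just with the colors mangled,
    # which is why the color balance looks so far off.

# make_anaglyph("left.png", "right.png", "anaglyph.png")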

If you are serious about writing, PM me.
 

Nebor

But anyway... my real gripe was that the difference between nVidia and AMD seems to be causing a splintering in the actual display hardware (i.e. the monitor) as well. Why can't we at least get a standard output option so any 3D monitor will work?

Yeah, that is a huge WTF. I didn't know Samsung was coming out with a 27" 3d monitor, so when I went to look and see if Asus had come out with theirs on newegg this afternoon, I was amazed to see a Samsung 3d 27" display. Actually, THREE of them. I immediately added the $830 one to the cart, and nearly finalized my order when I started reading Hardforum and overclockers.net and saw that the Samsung displays are only compatible with AMD 3d. Bogus. :|
 

Aikouka

Yes, you are somewhat limited at 1080p with HD3D currently over the HDMI 1.4 connection. i expect that is going to change as AMD works with its partners to refine their own S3D.

No... no... BAD AMD! Go and work with nVidia... us consumers do not like splintering!

That is why Nvidia requires a 120Hz display; you have to keep the refresh rate high. And they also support anaglyph glasses with the 3D Vision drivers. You can just use the cheap red and blue glasses to get a feel for 3D Vision with a 60Hz display - just realize that the color balance is way off.

I did a little Googly searching on "nvision 3d 60hz" to see what reasons people gave, and it seems that the reason is that our eyes won't "make the magic happen" if you try it at 60Hz (with alternating frames, each eye only gets a 30Hz refresh, which flickers badly).

I did order a Mitsubishi 3D Adapter (and glasses)... I should drag my second PC (Radeon 5870) down to my living room and try it out! :D

If you are serious about writing, PM me.

I'll have to consider it. Although, given gorcorps' profession, it seems like he'd probably be better at it than me. :p

Yeah, that is a huge WTF. I didn't know Samsung was coming out with a 27" 3d monitor, so when I went to look and see if Asus had come out with theirs on newegg this afternoon, I was amazed to see a Samsung 3d 27" display. Actually, THREE of them. I immediately added the $830 one to the cart, and nearly finalized my order when I started reading Hardforum and overclockers.net and saw that the Samsung displays are only compatible with AMD 3d. Bogus. :|

If my frame packing assumption is correct, I don't really get why they can't make relatively cheap displays that support both methods. It seems nVidia 3D Vision monitors only require 120Hz, and AMD HD3D should only require being able to process one of those four 3D types that I mentioned earlier... unless they're doing something vastly different.

I really want to try this 3D vision sort of stuff... you guys are tempting me so much, but I may have to settle for AMD's lame version. :p
 

apoppin

Well, try out Nvidia's version with the cheap red/blue glasses. It will give you a good idea of how it works and if you can handle it. Batman or Batman demo works pretty well.

AMD does not like Nvidia's proprietary 3D Vision. They want their partners to put together "the best solution" (and they believe they have). It's just that Nvidia has a headstart here; i saw their 3D vision demoed at Nvision08 with the active shutter glasses long before they were released. i really don't know as much about HD3D as i should (but then i don't have that setup); there are supposed to be some advantages including less of a framerate hit to performance.

Evidently 120Hz is the magic number and 60FPS is the framerate to strive for. You can adjust your displays down to 110Hz or even 100Hz in Nvidia's 3D Vision CP - but then you will probably see flickering.

As to writers, i would say that it is far more important to be dedicated and absolutely passionate about hardware (or whatever you review) and to be really knowledgeable about your subject. You can always learn to write acceptably. i have been a writer since high school and as an editor, i can usually help writers with their work
(unless it is completely hopeless; then we'll just use a lot of pictures, charts and graphs). :D
 

Aikouka

Well, try out Nvidia's version with the cheap red/blue glasses. It will give you a good idea of how it works and if you can handle it. Batman or Batman demo works pretty well.

Unfortunately, my best nVidia card is an old 8800GTX, and it's sitting in a box because it barely works right. :( The reason I moved away from nVidia is that the card always gave me trouble, and I paid over $600 for it! I was actually in contact with a few guys from nVidia's driver team to help solve the issues, but we never did. :( So my two desktops have AMD graphics (5870, 6950).

I wish I could RMA that crappy old 8800GTX, but I forgot to register it within the 30-day period, so I lost my lifetime warranty. That made me refuse to buy eVGA's products ever again. Now I mostly buy XFX with their double lifetime warranty.

But yeah... I can't unfortunately. :(

AMD does not like Nvidia's proprietary 3D Vision. They want their partners to put together "the best solution" (and they believe they have). It's just that Nvidia has a headstart here; i saw their 3D vision demoed at Nvision08 with the active shutter glasses long before they were released. i really don't know as much about HD3D as i should (but then i don't have that setup); there are supposed to be some advantages including less of a framerate hit to performance.

Ugh, I just hope everyone gets on the same boat at some point and they push to use as little extra hardware as possible.

You know what I was thinking? I have two monitors... it would be nifty if one monitor could do each eye for 3D, but that would probably require something like passive RealD that does the whole polarization thing.