Sharing 1 graphics card across multiple guests or machines

Kilgore23

Member
Jun 26, 2014
37
0
0
With today's graphics cards we know it's possible to share one card across multiple guests and machines; products such as the Nvidia GRID can host many different OSes/VMs/machines simultaneously.

Since I'm not in a position to buy an Nvidia GRID K1, which can cost upwards of 1,500-2,000 dollars, I was wondering if there is a cheaper alternative.

Most people I've asked in the past end the conversation there and tell me flat out no, but that's hard to accept. When I pursued my curiosity further I read about individuals who successfully modified their existing consumer-grade graphics cards, such as a GTX 690, to gain sharing capabilities like those of an Nvidia GRID K2.

I was eager to ask one of those individuals more about his success and about whatever limitations there might have been. Sadly, I couldn't get hold of him, which is why I'm here asking an open forum: I'm left with a success story without any explanation for that success.

I would really love to know if anyone here could shed light on how someone goes about taking a graphics card that doesn't natively support sharing itself among more than one guest simultaneously and making it do exactly that.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I haven't heard of such a technology for standard cards yet. It would be big news if it did arrive.
 

zir_blazer

Golden Member
Jun 6, 2013
1,219
511
136
It is possible to mod a regular GeForce into a Quadro or even a GRID. The problem is that the actual support for GPU virtualization that comes with GRID seems to be limited to just a few Hypervisors, and I have no idea about Drivers for the VMs themselves. AMD was going to release their Radeon Sky aimed at the same GPU Virtualization segment, but it went MIA. I've never heard actual experience from anyone experimenting with multiple VMs and a GRID for sharing.

Alternatively, check XenGT. Some guys are trying to virtualize the Haswell IGP for sharing among many VMs.
 

cytg111

Lifer
Mar 17, 2008
25,228
14,719
136
I suppose this is radically different from VT-d. I am not sure how; probably something about resource management, but that should be down to the OS (?). Anyway, please keep us updated on your progress :).
 

tollingalong

Member
Jun 26, 2014
101
0
0
IMO the technology isn't there yet. You can pass through to a limited degree, but afaik you can only have one VM use the card at a time.
 

thecoolnessrune

Diamond Member
Jun 8, 2005
9,673
583
126
There are hardware mods to turn a GeForce into a Tesla, but you aren't going to turn a GeForce into a GRID. A GRID card is a very different beast.

There's passthrough of the GPU, and then there's virtualization of the GPU, which are two completely different things. We can pass through GPUs at the consumer level with a few headaches. I do it myself, passing my ATI graphics card through to a Windows 8.1 VM. But distributing one GPU across multiple VMs is not as simple as you're describing it.
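For anyone curious about the single-VM passthrough side, here's a rough sanity check I'd run on a Linux host first (my own sketch in Python, just reading sysfs; nothing hypervisor-specific). If it prints nothing, VT-d/AMD-Vi isn't actually enabled and passthrough won't work.

Code:
import glob, os

# List IOMMU groups and flag VGA-class devices. Empty output means
# VT-d/AMD-Vi is off (BIOS setting and/or intel_iommu=on / amd_iommu=on
# missing from the kernel command line), so passthrough is a no-go.
devices = sorted(glob.glob("/sys/kernel/iommu_groups/*/devices/*"))
if not devices:
    print("No IOMMU groups found - enable VT-d/AMD-Vi first.")

for dev in devices:
    group = dev.split("/")[4]
    bdf = os.path.basename(dev)
    with open(os.path.join(dev, "class")) as f:
        dev_class = f.read().strip()
    tag = "  <-- VGA controller" if dev_class.startswith("0x0300") else ""
    print("group %s: %s class %s%s" % (group, bdf, dev_class, tag))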

At some point you need a virtualization layer between the GPU and the Hypervisor to control calls to the GPU, manage memory, and allocate resources. VMWare and Parallels both have technologies that do this. Parallels, VMWare Fusion, VMWare Workstation, vSphere, VirtualBox, etc., all support the creation of a software layer called an SGA (Software Graphics Adapter). All these SGAs make calls to the GPU using the SGA's driver, and the SGA management system in the Hypervisor routes those calls to the shared GPU. This doesn't work well at the consumer level because SGAs only support varying degrees of the DirectX stack, with DX9 support fairly solid and DX10 almost completely missing. On top of that, since the VM only sees this virtualized GPU, most games have no idea what to do with it and will often complain that you are not running a supported GPU. You also won't get any of the optimizations that nVidia or AMD put into their drivers.
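A quick way to see what the guest actually got (rough sketch of my own, assumes a Windows guest; under an SGA the adapter name will usually read something like "VMware SVGA 3D" rather than your real GeForce/Radeon):

Code:
import subprocess

# Ask WMI for the display adapters the guest sees and their VRAM.
out = subprocess.check_output(
    ["wmic", "path", "win32_videocontroller", "get", "name,adapterram"],
    universal_newlines=True,
)
print(out)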

So workstation users needed a way to get better performance and assign real GPUs to multiple virtual machines (what you're wanting to do), and the DGA (Direct Graphics Adapter) was born. The difference is that instead of the Hypervisor running a virtualized graphics subsystem to handle calls to the GPU, the GPU itself handles it in hardware. Again, this is not as simple as flipping a switch and BAM, you went from Tesla to GRID. A lot of GRID's development went into extra hardware dedicated to the small hypervisor running on the graphics card itself, which manages the calls from multiple VMs and makes sure no thread locking occurs. Essentially, you need to stop looking at GeForce and GRID like they're the same card. Many cars from the same manufacturer use the same engine under the hood but are built very differently to suit different purposes. That's what's going on with GRID. This is also why the technology is only supported by heavyweights like Citrix and VMWare right now.
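Purely as a mental model of time-slicing one GPU among several VM contexts (a toy round-robin sketch I made up for illustration; this is not how the real GRID scheduler works):

Code:
from collections import deque

def schedule(vm_workloads, slice_ms=4):
    # Each VM gets a fixed slice on the single GPU, in turn, until its
    # queued work is done. Returns the order in which slices ran.
    queue = deque(vm_workloads.items())   # (vm_name, remaining_ms)
    timeline = []
    while queue:
        vm, remaining = queue.popleft()
        run = min(slice_ms, remaining)
        timeline.append((vm, run))
        if remaining > run:
            queue.append((vm, remaining - run))
    return timeline

for vm, ran in schedule({"vm1": 10, "vm2": 6, "vm3": 4}):
    print("%s ran for %d ms" % (vm, ran))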

I too like pushing usable technology in the home, but it will take a while to filter down to us. :) I'm just happy we can at least pass through a GPU at this point with a relatively known success rate. Alternatives to GRID will likely come about through Xen, as mentioned before, since Xen already supports GPU virtualization. It will just come down to building a capable, fast GPU hypervisor that also emulates the GPU being virtualized (unlike VMWare's SGA) so that compatibility is maintained. That will take a long time. In the meantime, the companies will be recouping their large initial investment.

As to your comments about people converting GPUs to GRID GPUs, I have to say that flipping resistors means absolutely nothing. I have read through several posts like these in the past, and all these guys show is benchmarking in Windows, or passing the GPU through in Xen or another Hypervisor to a single dedicated guest. No post that I have seen shows someone hacking apart their GPU and then licensing the incredibly expensive VMWare Horizon View or Xen HDX to allow multi-VM GPU virtualization. Have you seen anyone post about actually doing what you want to do? Because I would be very interested if they accomplished it :) Also, you haven't mentioned whether, if a GRID K1 GPU (about $1,000 used on eBay) is too rich for your blood, you are still going to pay the licensing costs for one of the only two existing GPU virtualization extensions, from VMWare or Citrix. Neither company will let you license those extensions without a full support contract and licensed editions of the Hypervisor. It's all very expensive.

There's a whole chain of things you would have to be "inventive" with here before you could even consider making a random GPU work. :)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Hyper-V's RemoteFX is another method, but it still requires a compatible graphics card.
 

thecoolnessrune

Diamond Member
Jun 8, 2005
9,673
583
126
And Hyper-V RemoteFX has the same problem as the other technologies. It's either a virtualized GPU with specific calls offloaded to the physical GPU (same pitfalls as any other Hypervisor's SGA solution), or it's a 100% software-emulated, CPU-based SGA. Hyper-V does not support passing host devices through to guests, so you can't even send one GPU to one VM, let alone one to many.

That being said, Hyper-V is the only one with good SGA support for DirectX. It supports DX11 decently well, as long as it can virtualize the calls onto a discrete GPU that supports it.
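For completeness, if you do experiment with RemoteFX, the 3D adapter is attached per VM from the Hyper-V PowerShell module; a rough sketch driven from Python (the VM name is a placeholder, and the cmdlet names are from my memory of Server 2012 R2, so verify with Get-Command *RemoteFx* first):

Code:
import subprocess

VM_NAME = "TestVM"  # placeholder VM name

# List RemoteFX-capable host GPUs, then attach a RemoteFX 3D adapter
# to the VM with a single 1920x1200 monitor.
ps = """
Get-VMRemoteFXPhysicalVideoAdapter
Add-VMRemoteFx3dVideoAdapter -VMName '%s'
Set-VMRemoteFx3dVideoAdapter -VMName '%s' -MonitorCount 1 -MaximumResolution 1920x1200
""" % (VM_NAME, VM_NAME)

subprocess.check_call(["powershell", "-NoProfile", "-Command", ps])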
 

zir_blazer

Golden Member
Jun 6, 2013
1,219
511
136
There are hardware mods to turn a GeForce into a Tesla, but you aren't going to turn a GeForce into a GRID. A GRID card is a very different beast.

[...]

As to your comments about people converting GPUs to GRID GPUs, I have to say that flipping resistors means absolutely nothing. I have read through several posts like these in the past, and all these guys show is benchmarking in Windows, or passing the GPU through in Xen or another Hypervisor to a single dedicated guest. No post that I have seen shows someone hacking apart their GPU and then licensing the incredibly expensive VMWare Horizon View or Xen HDX to allow multi-VM GPU virtualization. Have you seen anyone post about actually doing what you want to do? Because I would be very interested if they accomplished it :) Also, you haven't mentioned whether, if a GRID K1 GPU (about $1,000 used on eBay) is too rich for your blood, you are still going to pay the licensing costs for one of the only two existing GPU virtualization extensions, from VMWare or Citrix. Neither company will let you license those extensions without a full support contract and licensed editions of the Hypervisor. It's all very expensive.
The nVidia GRID K1 and K2 cards are technically four GK208 dies and two GK104 dies on the same PCB, respectively; the first is a very bizarre piece for having four GPUs on one card, while the latter is a standard dual-GPU card akin to the GeForce GTX 690. I don't think they are different from standard GeForces or Quadros on the Hardware side, as the GPUs themselves seem to be the same dies used across the many lines (consumer GeForces, professional Quadros, HPC Teslas, etc.), so most if not all of the differences should be Software or Firmware based.
The modding trick is possible but wasn't fully tested - it was possible to make some GPUs report as a GRID, but there was no further testing to see whether they run with the Software intended for GRID (whatever you need for GPU Virtualization), which would be the main purpose of doing it. Indeed, even if all the GPU Virtualization features of the GRIDs worked, I currently don't know of anyone who could test it. You would need access to that expensive Software first.
Now, due to the lack of info regarding how a GRID (or the paper-launched Radeon Sky, for that matter) works, I can only speculate based on the info provided by nVidia. Because one of their intended uses is cloud gaming (services like OnLive, where you play a game that is processed elsewhere and streamed to you in real time), they SHOULD work with standard games without a lot of hassle. Chances are that to the VM they appear and act as a regular GeForce or compatible card while the extra complexity is thrown at the Hypervisor; I would find it hard to believe if it were harder than that.


Modding GeForces to Quadros is a more common sight among the Xen guys who do VGA Passthrough, because Quadros are supposed to be much easier to work with, due to a Driver or Firmware feature that only works if it detects a Quadro instead of a GeForce. Because I'm on the Radeon side, VGA Passthrough worked easily for me, but as far as I know that solder mod does help the GeForce guys. So yes, flipping resistors to change the DeviceID does accomplish something important for them. Due to the lack of in-depth testing, I don't know if it also enables some of the other features exclusive to Quadros, like higher FP64 performance, which GeForces are capped at. I do know that during the Radeon HD 4xxx generation it was possible to softmod Radeons into FirePros with actual performance increases.
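If someone does attempt the mod, checking what the card reports afterwards is easy on a Linux host; a small sketch of mine that reads the same vendor/device IDs lspci -nn would show (sysfs paths, so Linux only):

Code:
import glob, os

# Print vendor:device IDs for every VGA-class PCI device. After a
# successful GeForce-to-Quadro mod the device ID should match the
# Quadro you targeted (the vendor stays 0x10de for nVidia).
for dev in glob.glob("/sys/bus/pci/devices/*"):
    with open(os.path.join(dev, "class")) as f:
        if not f.read().startswith("0x0300"):
            continue
    with open(os.path.join(dev, "vendor")) as f:
        vendor = f.read().strip()
    with open(os.path.join(dev, "device")) as f:
        device = f.read().strip()
    print("%s  %s:%s" % (os.path.basename(dev), vendor, device))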
 

Kilgore23

Member
Jun 26, 2014
37
0
0
As to your comments about people converting GPUs to GRID GPUs, I have to say that flipping resistors means absolutely nothing. I have read through several posts like these in the past, and all these guys show is benchmarking in Windows, or passing the GPU through in Xen or another Hypervisor to a single dedicated guest. No post that I have seen shows someone hacking apart their GPU and then licensing the incredibly expensive VMWare Horizon View or Xen HDX to allow multi-VM GPU virtualization. Have you seen anyone post about actually doing what you want to do? Because I would be very interested if they accomplished it :) Also, you haven't mentioned whether, if a GRID K1 GPU (about $1,000 used on eBay) is too rich for your blood, you are still going to pay the licensing costs for one of the only two existing GPU virtualization extensions, from VMWare or Citrix. Neither company will let you license those extensions without a full support contract and licensed editions of the Hypervisor. It's all very expensive.

There's a whole chain of things you would have to be "inventive" with here before you could even consider making a random GPU work. :)


Sorry for not responding to this thread any sooner. Yes, the main reason I created this thread in the first place was that I've heard of some individuals giving their consumer-grade graphics cards the properties of a GRID so that the card is shared amongst multiple guests:

Sure it can. Grid K2 is essentially a GTX690 without the video output ports (I have a GTX690 modified into a Grid K2 in the ESXi test box under my desk). How well it will run depends on how many clients you are running off a single GPU.
As a rough comparison, a single Quadro 6000 (essentially a clocked down and shader reduced GTX480) on ESXi vSGA managed about 6 simultaneous 800x600@25fps Borderlands sessions (rendering and encoding into video). Of course, you need a client end that can decompress the desktop video stream coming off the server in realtime.

This post is from a different community called theregister, and I'm not a part of it. Joining is rather tedious: I need at least 5 forum posts approved by the rest of the community before I can start creating my own threads, and I can't seem to send him a private message to ask how he did what he did, or else I'd just ask him directly, lol.


But pretty much as he describes it, he split up a Quadro 6000 (essentially a clocked-down, shader-reduced GTX 480) under ESXi vSGA and managed 6 simultaneous 800x600@25fps Borderlands sessions. This is pretty much exactly what I'm looking to do.
 

Kilgore23

Member
Jun 26, 2014
37
0
0
Also, would anyone know if I could share one Nvidia Quadro 6000 across multiple guests, or is this just limited to the Nvidia GRID K1 and K2 cards?
 

zir_blazer

Golden Member
Jun 6, 2013
1,219
511
136
May want to have some reads:

http://www.vmware.com/files/pdf/techpaper/vmware-horizon-view-graphics-acceleration-deployment.pdf
http://www.vmware.com/resources/compatibility/search.php?deviceCategory=vsga
http://blogs.citrix.com/2013/12/11/true-hardware-gpu-sharing-arrives/
http://blogs.citrix.com/2014/01/08/under-the-hood-of-gpu-sharing-technologies/
http://blogs.citrix.com/2014/01/03/...ay3d-turbine-demo-observations-and-questions/

VMWare vSGA appears functional but it is not true virtualization: the guest is not getting access to a transparent partition of the physical Hardware. It needs custom support in the form of guest Drivers, and the APIs it currently supports are quite outdated (just DirectX 9.0c and OpenGL 2.1). It sounds a bit like Wine: it trails current technologies by a few years and takes a lot of developer effort and testing, because it has to be adapted to specifically support more applications.
The idea is that the virtualization layer takes care of that in a generic way, without the need for specific guest OS or application support; in that case you get close to, if not exactly, 100% compatibility. As it is now, you can achieve that with VGA Passthrough, since it lets you use native Drivers inside the VM. GPU virtualization should be able to do the same, but share the card among many VMs by partitioning Hardware resources, with each VM running native Drivers, which is roughly what the Citrix version seems to offer. I'm curious how XenGT compares to the Citrix solution.

If ESXi can only use that vSGA mode with an nVidia GRID, then even if it is functional it falls short of what I am expecting, because it is not universally compatible. Also, VMWare ESXi seems not to support vDGA/VGA Passthrough with any AMD Video Card at all - which Xen does.
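If you ever test vSGA from a Windows guest, a quick way to confirm which adapter and DirectX level the VM really exposes (rough generic sketch of mine; dxdiag ships with Windows, nothing here is vSGA-specific):

Code:
import os
import subprocess
import tempfile
import time

# Dump the DirectX diagnostic report to a text file, then print the
# adapter name and the DirectX version the guest reports.
report = os.path.join(tempfile.gettempdir(), "dxdiag_report.txt")
subprocess.check_call(["dxdiag", "/t", report])

# dxdiag can return before the report is fully written on slow guests.
for _ in range(30):
    if os.path.exists(report):
        break
    time.sleep(1)

with open(report) as f:
    for line in f:
        if "Card name" in line or "DirectX Version" in line:
            print(line.strip())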
 

Kilgore23

Member
Jun 26, 2014
37
0
0
May want to have some reads:

http://www.vmware.com/files/pdf/techpaper/vmware-horizon-view-graphics-acceleration-deployment.pdf
http://www.vmware.com/resources/compatibility/search.php?deviceCategory=vsga
http://blogs.citrix.com/2013/12/11/true-hardware-gpu-sharing-arrives/
http://blogs.citrix.com/2014/01/08/under-the-hood-of-gpu-sharing-technologies/
http://blogs.citrix.com/2014/01/03/...ay3d-turbine-demo-observations-and-questions/

VMWare vSGA appears functional but it is not true virtualization: the guest is not getting access to a transparent partition of the physical Hardware. It needs custom support in the form of guest Drivers, and the APIs it currently supports are quite outdated (just DirectX 9.0c and OpenGL 2.1). It sounds a bit like Wine: it trails current technologies by a few years and takes a lot of developer effort and testing, because it has to be adapted to specifically support more applications.
The idea is that the virtualization layer takes care of that in a generic way, without the need for specific guest OS or application support; in that case you get close to, if not exactly, 100% compatibility. As it is now, you can achieve that with VGA Passthrough, since it lets you use native Drivers inside the VM. GPU virtualization should be able to do the same, but share the card among many VMs by partitioning Hardware resources, with each VM running native Drivers, which is roughly what the Citrix version seems to offer. I'm curious how XenGT compares to the Citrix solution.

If ESXi can only use that vSGA mode with an nVidia GRID, then even if it is functional it falls short of what I am expecting, because it is not universally compatible. Also, VMWare ESXi seems not to support vDGA/VGA Passthrough with any AMD Video Card at all - which Xen does.

Yeah, I was looking at these two videos comparing vSphere 5.5 and XenServer 6.2:

https://www.youtube.com/watch?v=0QYc1AGCDzw

&

https://www.youtube.com/watch?v=oYefdXMSGn0

So I guess my best bet is to go with the XenServer hypervisor?

If I had about 600 USD to spend on a graphics card, which one should I buy? Could an Nvidia Quadro 6000 be split and utilized among multiple guests?
 

zir_blazer

Golden Member
Jun 6, 2013
1,219
511
136
If I were you, I would sit and wait, research some more, and buy nothing. While technically possible, there is no plug-and-play solution for what you want to do, and there is certainly nothing available that can be deployed in an el-cheapo way.

XenServer seems perfect if you were using it with a GRID and the proper Software support, but those aren't exactly accessible for everyone. VMWare's vSGA documentation doesn't say anywhere that it works with Quadros, just with the GRIDs and some FirePros. Chances are it can work with more cards unofficially (that Quadro 6000, or maybe even a consumer GeForce, which may need to be modded to a Quadro); that should be why that guy got it working. Xen doesn't have an intermediate solution equivalent to vSGA, even if at the top end it has a better one.

Purchasing a Quadro 6000 doesn't make sense; keep in mind it is a GeForce GTX 480 equivalent, a GPU that is 3 generations old. If your love for the Quadro 6000 is based only on the other guy getting it working with vSGA, I would look around for a cheaper, more modern GeForce, mod it to a Quadro, and get better results. Also, on the Xen side, Quadros are much easier to do VGA Passthrough with than GeForces, but a modded one works too. Whether the GRID mod works for Xen vGPU, I don't know.
So basically, if you want to use VMWare vSGA for cheap, you need to figure out whether a modern GeForce-to-Quadro mod works with it, or a GeForce-to-GRID mod for the Xen vGPU solution, if you somehow manage to get the required Software.
 

thecoolnessrune

Diamond Member
Jun 8, 2005
9,673
583
126
Also, to note on Kilgore's last post: again, neither of the above videos shows a free solution. Both, as I noted in my previous posts, rely on a custom VDI model to go along with the vendor's GPU Virtualization technology, which Citrix calls XenDesktop HDX 3D Pro and VMWare calls Horizon View with vDGA. Not only do you need a compatible GPU, but even if you hack one up you still need the licensing for the VDI, which for Citrix is roughly $400 per virtual machine, and for VMWare closer to $500. Then you'll either need to use their app and yet another Windows license to get a view into the VDI, or, to maximize display compatibility and minimize headaches, a device certified to act as a Thin Client for Xen HDX or VMWare Horizon View, like the NComputing N500, which is another $200. Even if you get the $600 card hacked together, you're not doing yourself much good if you have to spend another $600-$700 on a supporting rig just to view the VDI.

As mentioned by zir_blazer and myself previously, vSGA is really not the way to go for games at this point. The technology was designed for modeling systems that will be sticking with DirectX 9 for the foreseeable future. It has steep overhead due to the emulation layer involved, and it will fall further behind as DirectX 10 and 11 games make up the bulk of releases.

For what you want to do, which is full 3D virtualization amongst several VMs, there is no free or cheap way to do it at this time, even if you were to manage to hack a card into a fully functional GRID card (which I still maintain is improbable).

If you just want to pass through a single graphics card to a single virtual machine, I've had the best luck using Arch Linux with a 3.9.x kernel (to support the experimental passthrough features in the 4.4.x branch of the Xen Project). Since Xen piggybacks onto a full Linux install of your choosing, it can accommodate and pass through pretty much any graphics card you want with a bit of work.
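To give a rough idea of what that ends up looking like once Xen is installed (everything here is a placeholder example: the PCI BDF, disk path and VM name are made up, and xl config keys can vary a bit between Xen 4.3 and 4.4):

Code:
import subprocess

# Minimal HVM guest config with one GPU passed through via pciback.
XL_CONFIG = """
name         = "win81-gpu"
builder      = "hvm"
memory       = 8192
vcpus        = 4
disk         = ['phy:/dev/vg0/win81,hda,w']
vif          = ['bridge=xenbr0']
# PCI BDF of the GPU to hand over (check with lspci first).
pci          = ['01:00.0']
gfx_passthru = 1
"""

with open("/etc/xen/win81-gpu.cfg", "w") as f:
    f.write(XL_CONFIG)

subprocess.check_call(["xl", "create", "/etc/xen/win81-gpu.cfg"])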
 

Kilgore23

Member
Jun 26, 2014
37
0
0
If I were you, I would sit and wait, research some more, and buy nothing. While technically possible, there is no plug-and-play solution for what you want to do, and there is certainly nothing available that can be deployed in an el-cheapo way.

XenServer seems perfect if you were using it with a GRID and the proper Software support, but those aren't exactly accessible for everyone. VMWare's vSGA documentation doesn't say anywhere that it works with Quadros, just with the GRIDs and some FirePros. Chances are it can work with more cards unofficially (that Quadro 6000, or maybe even a consumer GeForce, which may need to be modded to a Quadro); that should be why that guy got it working. Xen doesn't have an intermediate solution equivalent to vSGA, even if at the top end it has a better one.

Purchasing a Quadro 6000 doesn't make sense; keep in mind it is a GeForce GTX 480 equivalent, a GPU that is 3 generations old. If your love for the Quadro 6000 is based only on the other guy getting it working with vSGA, I would look around for a cheaper, more modern GeForce, mod it to a Quadro, and get better results. Also, on the Xen side, Quadros are much easier to do VGA Passthrough with than GeForces, but a modded one works too. Whether the GRID mod works for Xen vGPU, I don't know.
So basically, if you want to use VMWare vSGA for cheap, you need to figure out whether a modern GeForce-to-Quadro mod works with it, or a GeForce-to-GRID mod for the Xen vGPU solution, if you somehow manage to get the required Software.

Thank you so much, zir_blazer, for helping me figure this out; you are providing me with a lot of clarity. Out of the 5 different communities I have asked, you are pretty much the only one who has really given me a set path to follow.

As for ESXi vSGA compatibility with certain cards, a member of a different forum claimed the following cards were officially supported:

vSGA
Nvidia GRID K1
Nvidia GRID K2
Nvidia Quadro 4000
Nvidia Quadro 5000
Nvidia Quadro 6000
Nvidia Tesla M2070Q

vDGA
Nvidia GRID K1
Nvidia GRID K2
Nvidia Quadro K2000
Nvidia Quadro K4000
Nvidia Quadro K5000
Nvidia Quadro K6000
Nvidia Quadro 1000M
Nvidia Quadro 2000
Nvidia Quadro 3000M
Nvidia Quadro 4000
Nvidia Quadro 5000
Nvidia Quadro 6000
Nvidia Tesla M2070Q

Is this not the case? I wasn't too excited to purchase a Quadro 6000 for the same reason you mentioned: it's pretty much a GTX 480 or worse and it costs about 600 USD. I would rather buy a stronger GeForce for about the same price and mod it to act as a Quadro (if Quadros are actually supported), but I'm not exactly sure how to do that. Do I have to mod the BIOS and soft straps? At least that's what I was told on a different forum:

You can buy a GTX 465 for cheap and then mod the BIOS and SoftStraps so the Device ID shows up as a Quadro 4000. On that note, I have a GTX465 for sale ;-)

So I thought just buying a Quadro would be easier to deal with, as a plug-and-play solution that comes with the appropriate drivers.

Again, thank you so much for your help, zir_blazer. I've been holding off on this project for over 2 months and this graphics card problem is really starting to frustrate me.
 

Kilgore23

Member
Jun 26, 2014
37
0
0
Also, to note on Kilgore's last post: again, neither of the above videos shows a free solution. Both, as I noted in my previous posts, rely on a custom VDI model to go along with the vendor's GPU Virtualization technology, which Citrix calls XenDesktop HDX 3D Pro and VMWare calls Horizon View with vDGA. Not only do you need a compatible GPU, but even if you hack one up you still need the licensing for the VDI, which for Citrix is roughly $400 per virtual machine, and for VMWare closer to $500. Then you'll either need to use their app and yet another Windows license to get a view into the VDI, or, to maximize display compatibility and minimize headaches, a device certified to act as a Thin Client for Xen HDX or VMWare Horizon View, like the NComputing N500, which is another $200. Even if you get the $600 card hacked together, you're not doing yourself much good if you have to spend another $600-$700 on a supporting rig just to view the VDI.

As mentioned by zir_blazer and myself previously, vSGA is really not the way to go for games at this point. The technology was designed for modeling systems that will be sticking with DirectX 9 for the foreseeable future. It has steep overhead due to the emulation layer involved, and it will fall further behind as DirectX 10 and 11 games make up the bulk of releases.

For what you want to do, which is full 3D virtualization amongst several VMs, there is no free or cheap way to do it at this time, even if you were to manage to hack a card into a fully functional GRID card (which I still maintain is improbable).

If you just want to pass through a single graphics card to a single virtual machine, I've had the best luck using Arch Linux with a 3.9.x kernel (to support the experimental passthrough features in the 4.4.x branch of the Xen Project). Since Xen piggybacks onto a full Linux install of your choosing, it can accommodate and pass through pretty much any graphics card you want with a bit of work.

Thank you too, thecoolnessrune! I didn't realize how much all the licensing would cost; thank you for clarifying that as well. Ideally you guys are correct that View or HDX would be the way to go, but if that's too pricey for me then, as you mentioned, vSGA would be my only option, within the limitation of DirectX 9? For now that isn't a huge problem; I'll only be using DirectX 9 applications for a while anyway. So would using vSGA through ESXi be okay? I don't need high fps per guest/VM; if I can manage even 15-20 fps per guest I'll be happy.

Right now I'm using VMware Workstation to run 8 guests on top of the Windows OS on my home computer. I forget exactly what method is used to emulate the graphics card; all I know is the physical graphics card isn't actually being utilized, and instead it's a soft GPU making the CPU do the work?
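(A quick way I found to check is the VM's .vmx file; rough sketch, the path below is just an example:)

Code:
VMX_PATH = r"C:\VMs\guest\guest.vmx"   # placeholder path

# Workstation stores per-VM settings as key = "value" lines.
# mks.enable3d controls whether the virtual SVGA adapter gets any
# host-GPU accelerated 3D at all; without it everything is software.
settings = {}
with open(VMX_PATH) as f:
    for line in f:
        if "=" in line:
            key, _, value = line.partition("=")
            settings[key.strip()] = value.strip().strip('"')

print("3D acceleration (mks.enable3d): %s" % settings.get("mks.enable3d", "FALSE"))
print("SVGA video RAM (svga.vramSize): %s" % settings.get("svga.vramSize", "(default)"))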

If I were to use ESXi, or say Hyper-V with RemoteFX, wouldn't I be able to run more VMs with similar or better results on the same hardware under a Type 1 hypervisor, as opposed to a Type 2 like Workstation?

I'm sorry for my ignorance :( but any information that helps me get this 2-3 month project in motion is very much appreciated.
 

zir_blazer

Golden Member
Jun 6, 2013
1,219
511
136
As for ESXi vSGA compatibility with certain cards, a member of a different forum claimed the following cards were officially supported:

Is this not the case?
That list may be right - the vSGA one includes a few more cards that the official list on the website doesn't (and lacks the FirePros, BTW).
Keep in mind that I don't know a lot about VMWare's offerings. When I started in the Hypervisor arena, Xen had made the most progress on experimental VGA Passthrough, so I stuck with it. vGPU was just a wet dream at that time.


I would rather buy a stronger GeForce for about the same price and mod it to act as a Quadro (if Quadros are actually supported), but I'm not exactly sure how to do that. Do I have to mod the BIOS and soft straps?
Check this Thread. These guys discovered a solder mod which allows a Video Card to identify as something else: the DeviceID, which is one of the things that identifies what the Video Card is, is set by some resistors which these guys know how to manipulate. I don't recall if you also needed to modify the BIOS.
You would need to make sure that the GeForce-to-Quadro conversion you want to do has already been documented, so you know exactly what you have to do. Purchasing a Video Card that no one has toyed around with wouldn't be a good idea. You may want to ask there for details. Also ask around whether anyone has managed to get vSGA working with newer or unsupported Hardware, otherwise you would be stuck with museum pieces if you want to use it.


So I thought just buying a Quadro would be easier to deal with, as a plug-and-play solution that comes with the appropriate drivers.
Quadros have a feature called "MultiOS" that allows them to properly reset the GPU during a VM reboot cycle. Radeons are weak in that area. In my personal experience with a Radeon 5770 using Xen VGA Passthrough, a fresh VM start always works fine, but restarts of the VM can lead to either BSODs on boot or the Video Card sticking to its highest Power State as if the PowerPlay feature were not working. When that happens, GPU-Z is unable to report the GPU Voltage, and it forces me to reboot the VM a few more times until it gets it right. It's still not perfect, but I'd say it's quite close.


Again, thank you so much for your help, zir_blazer. I've been holding off on this project for over 2 months and this graphics card problem is really starting to frustrate me.
This will not be encouraging: from the time I first heard about VGA Passthrough so I could play games in a VM, to the moment I was actually able to build such a setup, it took me THREE YEARS. The main issue was BIOS support for the VT-d/AMD-Vi feature, check here. VMWare vDGA and Xen PCI/VGA Passthrough require it; at least Hardware support is more reliably available now than when I started my research on this topic.


So far, your vGPU dreams are technically possible but not cheap to implement. What you can currently do easily is VGA Passthrough. Xen is much more flexible in this area than VMWare vDGA, which only works with Quadros and GRIDs, while Xen plays nice with consumer Radeons. Just remember that you need one GPU per VM, and chances are a Monitor and Sound Card too, though those should be workable around in Software.
As far as I know from all the Posts I've read, the order of purchase preference is GeForces at the very bottom, Radeons as the mainstream cheap solution because they can be consistently passed through without a lot of issues, and Quadros (or modded GeForces) on top, thanks to the MultiOS feature that lets them work flawlessly on VM reboots without the issues I described with my Radeon.


BTW, I forgot to mention: up to this point I hadn't brought XenGT back into the discussion. Because I haven't looked into how hard it is to implement or whether it gives good results, I can't really tell, but one of my ideas was to use XenGT (which requires a Haswell) to drive all the VMs except the gaming one, which would get the Radeon 5770 via VGA Passthrough. Assuming it works like that, you could meet your goal. Alternatively, consider a bunch of cheap Video Cards; you could easily fit 2 or 3 cards of around 30 USD each and do VGA Passthrough. The problem is that I haven't researched the remote-view area much: even once it's GPU accelerated, you still need a way to route the video output to your main screen, otherwise you need another Monitor.
 

Kilgore23

Member
Jun 26, 2014
37
0
0
Heya guys, thanks so much for participating in, or even just reading through, this thread. All of you have helped me a lot to understand what to expect and to get this rig up and running; special thanks to zir_blazer and thecoolnessrune for sharing their in-depth experiences.


I guess I'm hitting a hard wall and vGPU is out of the question for me. When the opportunity arises I suppose I will upgrade to a GRID card, but for now it looks like I'm limited to either ESXi vSGA or Hyper-V's RemoteFX.

Honestly, I am okay with either solution, as the games I want to play don't need anywhere over 30 fps; even an average as low as 15-20 fps is okay with me at the lowest possible resolution and settings. So if either ESXi vSGA or Hyper-V RemoteFX can produce that across multiple VMs/guests, then I'm all for it.

Going by what thecoolnessrune had to say, ESXi's vSGA doesn't have any DirectX support past 9, whereas RemoteFX supports 11. I suppose the clear choice here is RemoteFX.

Also, I found this 40-second YouTube video demonstrating vSGA on ESXi with a Quadro 6000:

http://youtu.be/ME3xaLUTZgU


The author wrote this about what he was doing:

Tim Federwitz said (1 year ago):
I will have to answer this in multiple parts because of the character limit per comment. We are giving each VM 512MB of video memory. This reserves 256MB on the GPU and 256MB on the Host in RAM (so this will need to be taken into consideration in host sizing). The video card we are using is an Nvidia Quadro 6000, which has 6GB of VRAM. So in theory, and depending on the application and its GPU load, you could assign 24 VMs to the card with 512MB of video memory each.
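To put numbers on that comment (simple arithmetic using his figures; nothing here is measured by me):

Code:
# Each VM gets 512 MB of video memory: 256 MB reserved on the GPU and
# 256 MB in host RAM. The Quadro 6000 has 6 GB of VRAM.
gpu_vram_mb    = 6 * 1024
per_vm_gpu_mb  = 256
per_vm_host_mb = 256

max_vms = gpu_vram_mb // per_vm_gpu_mb
print("VMs supported by VRAM alone: %d" % max_vms)                        # 24
print("Host RAM set aside for them: %d MB" % (max_vms * per_vm_host_mb))  # 6144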
 