Multi-headed VMware Gaming Setup
Table of Contents:
- Introduction
- Hardware Requirements
- Virtual Machine Setup
- Performance and Impressions
- Conclusion & Use Cases
- Extra: How far is too far?
NVIDIA GRID uses GPU virtualization and virtual machines to stream a virtual desktop with full GPU acceleration to a user. Because GRID is built around streaming the desktop, it requires robust network infrastructure and high-quality thin clients. Even with the best equipment, there is latency, video compression, and high CPU overhead. These drawbacks can be worked around for many applications, but they are all big turn-offs to gamers.
With that in mind, we set out to build a PC that uses virtualization technologies to allow multiple users to game on one PC, but with no streaming and no additional latency because all of the user I/O (video, sound, keyboard, and mouse) is directly connected to the PC. By creating virtual machines and using a mix of shared resources (CPU, RAM, hard drive, and LAN) and dedicated resources (GPU and USB), we were able to create a PC that allows up to four users to game on it at the same time. Since gaming requires minimal input and display lag, we kept the GPUs and USB controllers outside of the shared resource pool and assigned them directly to each virtual OS, which allows the keyboard/mouse input and video output to bypass the virtualization layer. The end result is a single PC running four virtual machines, each of which behaves and feels like any other traditional PC.
We tried to get a number of GeForce cards to work (we tested a GTX 780 Ti, a GTX 660 Ti, and a GTX 560), but no matter what we tried, they always showed up in the virtual machine's Device Manager with a Code 43 error. We scoured the internet and worked directly with NVIDIA, but never found a fix. After a lot of effort, NVIDIA eventually told us that PCI passthrough is simply not supported on GeForce cards and that they have no plans to add it in the immediate future.
We also had trouble with multi-GPU cards like the new AMD Radeon R9 295x2. We could pass it through fine, but the GPU driver simply refused to install properly. Most likely this is an issue with passing through the PCI-E bridge between the two GPUs, but whatever the actual cause, the end result is that multi-GPU cards currently do not work well for PCI passthrough.
While this list is nowhere near complete, we specifically tested the following cards during this project:
[Table: cards that work vs. cards that do not work | Testing Hardware]
We used VMware ESXi 5.5 to host our virtual machines since it is the hypervisor we are most familiar with. We are not going to go through the entire setup in fine detail, but to get our four virtual machines up and running we enabled PCI passthrough for the GPUs and onboard USB controllers on the host, assigned one GPU and one USB controller to each virtual machine, and installed the guest OS and drivers just as we would on a traditional PC.
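To give a rough idea of what the passthrough half of the setup looks like, here is a minimal sketch of the relevant entries from one virtual machine's .vmx configuration file. The PCI addresses below are hypothetical placeholders - the real addresses come from your own hardware (the vSphere Client's passthrough page or 'esxcli hardware pci list' on the host shell will show them) - and keep in mind that ESXi requires a VM's memory to be fully reserved before PCI passthrough will work:

    # One GPU and one USB controller passed through directly to this VM
    # (the PCI addresses are placeholders - use the ones from your own host)
    pciPassthru0.present = "TRUE"
    pciPassthru0.id = "03:00.0"    # the video card
    pciPassthru1.present = "TRUE"
    pciPassthru1.id = "00:1a.0"    # one of the onboard USB controllers
    pciHole.start = "2048"         # common workaround when passing devices to a VM with more than 2GB of RAM
    sched.mem.min = "8192"         # fully reserve the VM's 8GB of memory, which passthrough requires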
Getting four gaming machines running on a single physical system was not quite as easy as we originally expected, but it worked very well once we figured out all the little tricks. The only obstacle we were not able to overcome was that NVIDIA GeForce cards do not support PCI passthrough, but the other issues we ran into were all resolved with small configuration and setup adjustments. With those figured out, it really was not overly difficult to get each virtual OS up and running with its own dedicated GPU and USB controller.
Performance-wise, we are very impressed with how well this configuration worked. Being a custom PC company, we have plenty of employees who enjoy gaming, and none of them noted any input or display lag. But beyond the cool factor, what is the benefit of doing something like this over using four traditional PCs with specs similar to each of the four virtual machines?
- Hardware consolidation - This probably is not terribly important to many users, but having a single physical machine instead of four means less hardware to maintain and a smaller physical footprint.
- Shared resources - For our testing we assigned only four CPU cores to each virtual machine since we knew we would be loading each one equally, but with virtualization you would typically over-allocate resources like CPU cores to a much greater extent. Instead of equally dividing up a CPU, you can assign anything from a single core to every core on the physical CPU to each virtual machine. Modern virtualization is efficient enough to dynamically allocate resources to each virtual machine on the fly, which gives each machine the maximum performance possible at any given time. Especially in non-gaming environments, it is rare to use more than a small percentage of the CPU's available power the majority of the time, so why not let your neighbor use it while they run Photoshop or encode a video? (The first sketch below this list shows what those settings look like in ESXi.)
- Virtual OS management - Since virtual environments are primarily made for use with servers, they have a ton of features that make them very easy to manage. For example, if you want to add a new virtual machine or replace an existing machine that is having problems, you can simply make a copy of one of the other virtual machines. You need to change things like CD keys and the PCI passthrough devices, but the new machine will be up and running in a fraction of the time it would take to install from scratch. In addition, you can use what are called "snapshots" to create images of a virtual machine at any point. These images can be used to revert the virtual machine back to a previous state, which is great for recovering from things like viruses. In fact, you can even set a virtual machine to revert to a specific snapshot whenever it is rebooted (the second sketch below this list shows how). This means you don't have to worry about what a user might install on the machine, since it will automatically revert to the specified snapshot when the machine is shut down.
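On the shared-resource point, the knobs for this in ESXi are per-VM CPU shares and reservations. As a minimal sketch (the values are illustrative, not recommendations), these .vmx entries let a VM soak up idle CPU cycles when they are free without hard-reserving anything:

    sched.cpu.shares = "high"    # this VM gets priority when the host CPUs are contended
    sched.cpu.min = "0"          # no reservation - idle cycles stay in the shared pool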
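And as a rough sketch of the auto-revert trick (the VM id '42' and the snapshot names are placeholders - 'vim-cmd vmsvc/getallvms' on the host shell lists the real ids):

    # From the ESXi host shell: take a baseline snapshot of the virtual machine
    vim-cmd vmsvc/snapshot.create 42 "clean-install" "Baseline image to revert to"

    # Then, in that VM's .vmx file: roll back to the current snapshot at every power-off
    snapshot.action = "autoRevert"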
As for specific use-cases, a multi-headed system could be useful almost any time you have multiple users in one location. Internet and gaming cafes, libraries, schools, LAN parties, or any other place where a user is temporarily using a computer would likely love the snapshot feature of virtual machines. In fact, you could even give all users administrator privileges and let them install whatever they want, since the virtual machine can be set to revert back to a specific snapshot automatically.
In a more professional environment, snapshots might not be as exciting (although they would certainly still be very beneficial), but the ability to share hardware resources to give extra processing power to users when they need it would be very useful. While it varies by profession, most employees spend the majority of their time doing work that requires little processing power, intermixed with periods where they have to wait on the computer to complete a job. By sharing resources between multiple users, you can dramatically increase the amount of processing power available to each user - especially if it is only needed in short bursts.
Overall, a multi-headed system is very interesting but is a bit of a niche technology. The average home user would probably never use something like this, but it definitely has some very intriguing real-world benefits. Would something like this be useful in either your personal or professional life? Let us know how you would use it in the comments below.
[Image: A quad Xeon system could theoretically run 66 video cards and USB controllers at PCI-E x1. (Yes, this picture is photoshopped.)]
For this project we used four video cards to power four gaming virtual machines because that was a very convenient number given the PCI slot layout and the fact that the motherboard had four onboard USB controllers. However, four virtual machines is not at all the limit of this technology. So just how many virtual desktops could you run off a single PC, with each desktop still having direct access to its own video card and USB controller?
The current Intel Xeon E5 CPUs have 32 available lanes that PCI-E cards can use, so a quad Xeon system would give you 128 PCI-E lanes, which you could theoretically divide into 128 individual PCI-E x1 slots using PCI-E expanders and risers. The video cards would likely see a bit of a performance hit unless you are using very low-end cards, but by doing this you could technically get 66 virtual personal computers from a single quad Xeon system (assuming the motherboard has four onboard USB controllers). The math works out because each virtual machine needs two x1 slots - one for its video card and one for a USB controller - except for the four machines that can use the onboard USB controllers: 66 video cards plus 62 USB cards comes out to exactly 128 slots.
Is 66 virtual machines off a single box too far? Honestly: yes. The power requirements, cooling, layout, and overall complexity are pretty ridiculous at that point. Plus, how would you even fit 66 users around one PC (if it could even be called a PC at that point)? USB cables have a maximum length of about 16 feet, so you would very quickly run out of space to put people. Really, at that point you should probably look into virtual desktop streaming instead of the monstrosity above that we mocked up in Photoshop.
What do you think? How many virtual desktops do you think is right to aim for with a setup like this?