@olivierlambert There are a number of ways it can be done, evidently, though some of the variables involved are well beyond my expertise. I've linked a couple of articles/posts below that go into more detail and provide additional resources on the topic. I find it very intriguing, but it can become the deep end of the pool very quickly.
The first link is a thread discussing the possibilities of hiding the hypervisor from a VM; the second provides the most additional resources on "No Pill/Red Pill/Blue Pill" scenarios; and the third is VMware's official resource on detection. According to one response in the first thread, KVM may mask the hypervisor-present CPUID bit as its way of hiding the VM (at least, it starts with that).
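For what it's worth, on a KVM/libvirt host the masking that thread describes would look roughly like the fragment below in the domain XML. This is just a sketch of the approach, not something I've tested here, and it only covers the obvious signatures (a determined guest can still probe timing and other side channels):

```xml
<!-- Hypothetical libvirt domain XML fragment: hide KVM from the guest. -->
<features>
  <kvm>
    <!-- hide the KVM hypervisor signature from CPUID leaves -->
    <hidden state='on'/>
  </kvm>
</features>
<cpu mode='host-passthrough'>
  <!-- clear the CPUID "hypervisor present" bit the guest would otherwise see -->
  <feature policy='disable' name='hypervisor'/>
</cpu>
```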
@apayne Honestly, the gaming side could be Linux or Windows (although, with Steam, Windows is preferred), since the concept is to pair it with something like a Steam box or a mobile client akin to Steam's streaming client. The ideal scenario is still to have some good "oomph" in the GPU on the back-end. NVIDIA's driver implementation essentially bars you from using anything above a 1030 from inside a VM, and I've read in a few places that finding stable older drivers for even their 900 series can be difficult from a VM perspective. AMD is an option, of course, but the idle (and under-load) power consumption and cooling requirements of an AMD card versus an NVIDIA card are quite different, especially in a server setting, and even more so in a cost-sensitive home environment, which is part of my use case. My server runs 24x7 regardless and is battery- and generator-backed. My gaming rig consumes as much electricity on its own as my entire server, switching, and storage infrastructure in a half-rack. If I can use the already-on server infrastructure to provide even half of my gaming needs, the power savings are well worth it.
For the media side, it could still be Linux or Windows, but my use case revolves around time/latency-sensitive media encoding/decoding (think Plex and/or DVR for reference) of very high-quality video, and a GPU would do wonders for this.
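As a rough illustration of what GPU offload buys here, a hardware-accelerated transcode with ffmpeg would look something like this (assuming an NVIDIA card passed through to the VM and an ffmpeg build with CUDA/NVENC support; the filenames and bitrate are placeholders):

```shell
# Hypothetical example: decode on the GPU (NVDEC) and encode with NVENC,
# keeping the heavy lifting off the CPU for latency-sensitive transcodes.
ffmpeg -hwaccel cuda -hwaccel_output_format cuda \
  -i input.mkv \
  -c:v h264_nvenc -b:v 8M \
  -c:a copy \
  output.mp4
```

Plex can do the equivalent internally when hardware transcoding is enabled and it sees a capable GPU in the guest.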