johnarvid Posted December 2, 2017 (edited)

Hello, I wanted to share my story and ask for help regarding what to do next.

Setup:
- Motherboard: ASRock X99X Killer
- GPU1: MSI GTX 980 (PCIe slot 1)
- GPU2: MSI GTX 980 (PCIe slot 2)
- GPU3: XFX 6950 (PCIe slot 3)
- CPU: Intel i7
- RAM: 64GB DDR4
- HDD: 18TB

What I want to do:
Run three Windows 10 virtual machines at the same time, each with its own GPU and at least 4 USB ports. (The USB part has been solved in a way: I have two Logitech G27 wheels so a friend and I can play together, so the USB devices have to be passed through. This is working; I can share details if wanted, but the information is on the forum.)

What I have done/tested:
Unraid uses the first GPU as its head. I can successfully choose GPU2 or GPU3 in the Unraid GUI and start the VM; this works without a problem. But if I choose GPU1 and start the VM, Unraid loses its head, the screen goes black, all the CPU cores spike, and the VM starts. I can then log in via RDP, but there is no screen output, and Device Manager in the VM shows this error for the GPU:

Windows has stopped this device because it has reported problems. (Code 43)

This led me to https://forums.lime-technology.com/topic/43785-solved-gpu-drivers-failing-in-windows-10-vm-code-43/

- Tried disabling Hyper-V: same problem.
- Tried the newest VirtIO drivers: same problem.
- Tried installing a fresh VM (tested with both OVMF and SeaBIOS): if I choose GPU1 and start the VM there is no output to the screen, and the forum posts say I should not use VNC at any time, so I can't really proceed further on this step.

All information points to using a vBIOS for the main GPU. But if I add the vBIOS to the XML and start the VM, all the CPU cores spike to almost 100% and stay that way for however long I wait. The VM does not start, so no RDP. I have tested with a ROM dumped from my own GPU, though I could only "cat rom" on GPU2, not on GPU1 (input/output error). I have also tested with a ROM from TechPowerUp after removing the first part in a hex editor, and I have added vfio_iommu_type1.allow_unsafe_interrupts=1, all with no luck.

I'm sure I am doing something wrong, or missing something. It should not be this difficult to use Unraid, should it?

\JAK

Edited December 2, 2017 by johnarvid
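For readers following along: the "removed the first part in a hex editor" step can be done automatically. Legacy PCI option ROMs start with the signature bytes 0x55AA, and the 16-bit little-endian word at offset 0x18 from that signature points at a PCI data structure beginning with the ASCII string "PCIR". A sketch of a trimming script (file names are made up; this is not Unraid tooling):

```python
def strip_rom_header(data: bytes) -> bytes:
    """Return the option ROM starting at its 0x55AA signature.

    Vendor dumps (e.g. from TechPowerUp) may prepend a proprietary
    header; the real ROM starts at 0x55AA, validated here by checking
    that the pointer at offset 0x18 lands on a "PCIR" structure.
    """
    idx = data.find(b"\x55\xaa")
    while idx != -1:
        ptr = data[idx + 0x18: idx + 0x1a]
        if len(ptr) == 2:
            pcir = idx + int.from_bytes(ptr, "little")
            if data[pcir:pcir + 4] == b"PCIR":
                return data[idx:]  # drop everything before the ROM
        idx = data.find(b"\x55\xaa", idx + 1)
    raise ValueError("no valid option ROM signature found")


# Example usage (hypothetical paths):
# with open("gtx980_techpowerup.rom", "rb") as f:
#     trimmed = strip_rom_header(f.read())
# with open("/mnt/user/vbios/gtx980.rom", "wb") as f:
#     f.write(trimmed)
```

Verifying the "PCIR" pointer avoids trimming at a stray 0x55AA byte pair inside the vendor header.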
johnarvid Posted December 2, 2017 (Author)

2 minutes ago, DZMM said:

Hence, quoting my first post:

"Then if I use the vBIOS and add it to the XML, when I start the VM all the CPU cores spike to almost 100% and stay that way for however long I wait. The VM does not start, so no RDP. I have tested with a ROM from my own GPU, though I could only 'cat rom' on GPU2, not on GPU1 (input/output error). I have tested with a ROM from TechPowerUp after removing the first part in a hex editor. I have also added vfio_iommu_type1.allow_unsafe_interrupts=1, with no luck."
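For anyone trying the vfio_iommu_type1.allow_unsafe_interrupts=1 flag mentioned above: on a stock Unraid install it goes on the append line of /boot/syslinux/syslinux.cfg. A sketch (label names and other append options vary per install):

```
label Unraid OS
  menu default
  kernel /bzimage
  append vfio_iommu_type1.allow_unsafe_interrupts=1 initrd=/bzroot
```

A reboot is needed for the kernel parameter to take effect.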
johnarvid Posted December 2, 2017 (Author)

I have followed these tutorials and tested the different things mentioned in the comments:

- How to pass through an NVIDIA GPU as primary or only gpu in unRAID
- How to easily passthough a Nvidia GPU as primary without dumping your own vbios! in KVM unRAID
- The best way to install and setup a windows 10 vm as a daily driver or a Gaming VM

All videos made by Spaceinvader One.

\JAK
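For context, the vBIOS step in those tutorials boils down to pointing the passed-through hostdev entry in the VM's libvirt XML at the ROM file. A minimal sketch (the PCI address and file path here are examples; match them to your own GPU and dump location):

```xml
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
  <rom file='/mnt/user/vbios/gtx980.rom'/>
</hostdev>
```

The `<rom file=...>` element makes QEMU present the supplied vBIOS image to the guest instead of reading it from the card.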
SpaceInvaderOne Posted December 6, 2017

Hi, I have noticed that behaviour with one of my servers. If I assign a large number of cores to a Windows VM, e.g. 8 cores or more, the CPUs hit 100% when the VM starts. Nothing happens for about 1 minute 30 seconds, then the VM loads, CPU usage goes back to normal, and the VM is fine. Try waiting it out and see if the VM eventually starts up. Also, try fewer cores and see if that speeds up the boot. It happens for me whether I use GPU 1, GPU 2, or VNC graphics, so the vBIOS passthrough isn't what causes this behaviour in my case. I also use an ASRock X99 board, so maybe it is something to do with our boards, as I haven't heard of many people with this issue, but those who have it normally have large amounts of RAM or many CPU cores assigned to the VM.
nox_uk Posted December 10, 2017

Getting the same: ASRock X99 motherboard and an Nvidia 970 (my only GPU). I've been browsing a lot of videos on this; dumping the ROM and editing the XML seemed to make it worse (the VM went into recovery mode, so no RDP). Tried both SeaBIOS and OVMF, not sure which is most likely to work. SeaBIOS shows a black screen for ages, then I see a flash of "boot from DVD" and it goes back into its cycle. I am using their X99E-ITX (mITX board) with a view to purchasing an mATX board if I can get it going on this one.
nox_uk Posted December 10, 2017

Working absolutely 100% fine if I switch it to VNC or use the 970 with an Ubuntu VM.
nox_uk Posted December 10, 2017

Figured it out: don't use the new UEFI 6.4 version. Works fine with 6.3.5.