VMware ESXi and iPXE boot problems
In our normal environment we usually use physical machines, but now we want to use virtual machines to manage our gold images. Our institution has some VMware ESXi servers for storing virtual machines, and we want to use one of them to manage and upload the gold image.
FOG server: 1.5.0 (we are using 184.108.40.206)
VMware ESXi server versions: 5.5 and 6.5
Virtual machine config:
- VM hardware profile 10
- Windows 10 64-bit guest operating system
- EFI boot mode
- Network interfaces switched between E1000, E1000E and VMXNET3
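For reference, the VM configuration above corresponds roughly to these `.vmx` settings (a sketch, not taken from the OP's actual file; the `guestOS` string and the choice of NIC model are assumptions to adapt):

```
firmware = "efi"                 # EFI boot mode, as described above
guestOS = "windows9-64"          # VMware's identifier for Windows 10 64-bit
virtualHW.version = "10"         # VM hardware profile 10
ethernet0.present = "TRUE"
ethernet0.virtualDev = "e1000e"  # also tested: "e1000" and "vmxnet3"
```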
When we try to boot with iPXE in UEFI mode, the virtual machine cannot boot. We captured the network traffic between the FOG server and the virtual machine; the Wireshark capture shows the conversation, and the FOG server does send the ipxe.efi file, but it seems the virtual machine cannot load the iPXE kernel.
Curiously, if we configure the virtual machine to boot in BIOS mode, it boots perfectly, using undionly.kpxe.
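For context on why the two modes pull different files: the DHCP server inspects the client architecture the firmware reports (DHCP option 93, RFC 4578) and hands back a matching boot file. A minimal Python sketch of that selection logic (the filenames are the FOG defaults mentioned in this thread; the exact set of arch codes a given DHCP config matches is an assumption):

```python
# Map PXE client architecture codes (DHCP option 93, RFC 4578)
# to the iPXE binary the server should offer.
BOOTFILES = {
    0x00: "undionly.kpxe",  # legacy BIOS PC/AT
    0x06: "ipxe.efi",       # 32-bit UEFI
    0x07: "ipxe.efi",       # UEFI byte code (commonly x64 clients)
    0x09: "ipxe.efi",       # 64-bit UEFI
}

def bootfile_for(arch_code: int) -> str:
    """Return the boot file name for a PXE client architecture code."""
    return BOOTFILES.get(arch_code, "undionly.kpxe")

print(bootfile_for(0x00))  # BIOS VM -> undionly.kpxe
print(bootfile_for(0x07))  # UEFI VM -> ipxe.efi
```

So a VM that works in BIOS mode but fails in UEFI mode is receiving the right file for each mode; the failure here is in loading ipxe.efi, not in the file selection.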
Does anyone have an ESXi environment? What could the problem be: the virtual machine config file, or the iPXE kernel?
belac last edited by
@george1421 Sorry, I was still researching what might be happening.
For a little more background: I can't get my VM to boot the Win10 1803 ISO using EFI, but I didn't have an issue with the LTSB version last year.
I set up a second ESXi server for testing, running 6.5, to see if it was an issue with 5.5. On that test server I am able to boot Win10 1803 using EFI, but I am not able to PXE boot.
The EFI network comes up, says "downloading NBP file", then continues booting. I don't see any other messages, but I will take a slow-motion video to see if anything flashes up briefly.
On the production server where I can't get the ISO to boot, I was able to get it to PXE boot after I deleted the NIC and re-added it as VMXNET3.
Prod ESXi server 5.5u3 – can't boot the 1803 ISO using EFI.
Test ESXi server 6.5 – able to boot the ISO in EFI, but can't PXE boot in EFI.
I just did a quick Google search that brought me to this thread and was just making a general comment that I was having issues. I can make a new thread if that would be better.
"can't get them to work for the life of me now"
I know what this says, but what do you mean? Where does it fail? Is it FOG PXE booting that's not working, or is it post-image deployment? We need a bit more information on where it doesn't work. Clear pictures taken with a mobile phone help set the context of "can't get them to work" too.
In addition, unless your circumstances are exactly the same as the OP of this thread, you should start your own thread. For a limited time, new threads are free to create. “Don’t delay, open a new thread today…”
belac last edited by
I did not have any issues last year getting EFI VMs to PXE boot, but I can't get them to work for the life of me now. I don't recall exactly what version of ESXi I had last summer, but I think it was 5.5 U1? On U3 now, and the only other change is Win10 1803 vs 1609 LTSB.
We are working on this with our Systems Operation Center (SOC) team, and they are testing something on the ESXi servers.
At this link http://ipxe.org/howto/vmware the iPXE team explains how to create a ROM to support iPXE on VMware network interfaces, but the instructions are for BIOS. They tried it and it works fine, but only for BIOS.
vmx config file:
firmware = "bios"
ethernet0.virtualDev = "e1000"
ethernet0.opromsize = 262144
e1000bios.filename = "/usr/lib/vmware/resources/8086100f.mrom"
e1000ebios.filename = "/usr/lib/vmware/resources/808610d3.mrom"
nbios.filename = "/usr/lib/vmware/resources/10222000.rom"
# nxbios.filename = ""
nx3bios.filename = "/usr/lib/vmware/resources/15ad07b0.rom"
They tried to create the ROM files for UEFI, but it did not work. They don't know how to create the EFI ROM file from the ROM for the VMware network interfaces.
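For what it's worth, iPXE's build system has `.efirom` targets that wrap the EFI driver with a given PCI vendor:device ID, so the build step itself looks something like the sketch below. This is untested as a fix: whether ESXi's EFI firmware will actually load such a ROM, and what the correct `.vmx` keys for an EFI option ROM are, is exactly what's unclear here (the BIOS-style keys shown in the comment are just carried over from the howto as a placeholder).

```shell
# Build an EFI option ROM image for the Intel e1000 (PCI 8086:100f)
git clone https://github.com/ipxe/ipxe.git
cd ipxe/src
make bin-x86_64-efi/8086100f.efirom

# Then point the VM at it in the .vmx, e.g. (unverified for EFI firmware):
#   ethernet0.opromsize = 262144
#   e1000bios.filename = "/vmfs/volumes/datastore1/8086100f.efirom"
```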
I wonder if this is related to something that is currently affecting Hyper-V. https://forums.fogproject.org/topic/11348/hyper-v-and-pxe-boot-to-fog-problems
With that said, I was able to boot into iPXE on a VM in UEFI mode using my UEFI secure boot hack, which kind of makes me think it could be related to the Hyper-V certificate thing. It would be interesting to see if the FOG developers could create a test iPXE kernel with the mentioned modules missing, to see if that would address iPXE booting in VMware UEFI. If I have time this weekend I can/will create one by hand using the rom-o-matic site and test it.
@Sebastian Just FYI.
When you create the VM, classify it as a Windows 7 machine in its settings, even though you're going to put Windows 10 on it. Also, if you're going to use a VM to create your golden image for physical hosts, make sure you don't install VMware Tools on it. It can make your hosts act in ways you don't want, like laptops that will never sleep.
I remember now. I get the same as you: nothing in UEFI mode. I also remember what we did to capture the image.
We created a UEFI image with MDT (our standard process). Then, when we went to capture with FOG, we ran into this issue. So we switched to BIOS mode to capture the image. When we then deployed to a physical machine in UEFI mode, it worked perfectly. So capturing from the VM in BIOS mode was the trick (not the answer, but the trick to keep moving). I have to go to a few meetings now. But I think I was onto something, because I was able to UEFI PXE boot into GRUB; that is what came up when I started the VM. I'll look into that in a bit.
@george1421 We have tried the e1000, e1000e and vmxnet3 interfaces, but nothing :(
I can test this in my lab. I have ESXi 6.5, but FOG 1.4.4 in our production environment. Let me set up a test to see if I can PXE boot a UEFI system. I'm sure I've done that before, because we have a UEFI image we captured.
I can tell you that you should use the E1000 or E1000E network adapter to be safe.