Setting up the right FOG Environment



  • Can you chaps please help with a decision on setting up a preferred environment, based on your experience with FOG.
    Current hardware options available are:

    New HP MicroServer Gen8 1610T, 8GB RAM
    64 or 128GB SSD as boot drive (or boot off the internal 32GB USB stick)
    2x 8TB IronWolf HDDs in RAID 1 (8TB usable)
    Dual gigabit network ports (teamed) (can add another 2 if deemed necessary)
    I was originally going to install FreeNAS/Rockstor and configure it as a NAS for storing both images and backup files, but I have since come to realise that setting up a NAS as an image store is not easy.
    So I thought of installing Xubuntu 16.04 LTS and making the device the FOG server while still being able to export shares for other purposes, i.e. backups, off-server storage of ISOs, etc.

    Existing Dell T620 with available slots for drives (Windows Server 2012 R2 host)
    Dual Xeon 3.6GHz running at an average of 8% load
    Quad-port gigabit network card (teamed, over two switches for redundancy)
    Installing Ubuntu 16.04/17.04 in a VM as the FOG server with 2x vCPU and 2GB/4GB RAM.
    Can add 2x 2TB (RAID 1) SAS drives into the existing Storage Spaces on the host, shared with the FOG server VM as virtual drive(s) for the OS and images.

    The FOG server will not be used very often, but when it is, it will be busy. We have two labs of 35 machines each which need to be reimaged about 3-4 times a year, plus about another 50 admin machines that will need imaging only when necessary for upgrades, reinstalls, etc.

    I last used FOG about 5-6 years ago (0.29 to 0.32), when it ran on a stand-alone machine, and it worked very well until the machine died. I tried setting up Ubuntu 10.04 in a VM and was only partially successful, so we changed over to using WDS. We still have WDS but find it a pain to use, and I miss some of the features in FOG. I was very happy to see FOG is still being actively supported, so I thought I’d give it another bash.

    There are probably 10 or so different machine models to support, so probably 20 to 30 images to be stored.

    Your advice about how to use the available options will be much appreciated.



  • @george1421
    That is gold. Thank you.


  • Moderator

    @sebastian-roth The issue is (only guessing here) that the OP needs to set up a multicast router, or allow multicast traffic to pass through their VLAN router. Most routers have this disabled by default. The other thing is to have IGMP snooping enabled on the switches, so that the switches know who is an IGMP subscriber and who doesn’t care about the data stream (i.e. PIM Sparse Mode vs. Dense Mode).
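
    One quick way to check whether multicast actually makes it between the subnets is to drive the udpcast tools (which FOG uses under the hood for multicast deployments) by hand. A rough sketch; the port number and file names are arbitrary examples, not FOG defaults:

```shell
# On the FOG server (sender): offer a throwaway file over multicast.
# --portbase must match on both ends; wait for at least one receiver.
udp-sender --file /tmp/mcast-test.bin --portbase 9000 --min-receivers 1

# On a client in the other subnet (e.g. booted from a live Linux USB):
udp-receiver --file /tmp/mcast-test.out --portbase 9000
```

    If the receiver never sees the transfer start, multicast is being blocked between the subnets, and the router/IGMP settings are the place to look.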


  • Developer

    @tomcatkzn said in Setting up the right FOG Environment:

    Had some issues with multicast as the labs are on different subnets. Any pointers will be appreciated.
    Have 2x Netgear 7328FS backbone switches.
    Each dept has its own Netgear GS724T or GS748T switch.

    Well, I have never had one of these in use and therefore can’t give specific advice. Probably best if you just give it a go and post a new question in the forums if you hit a wall with it.



  • @george1421 Awesome! Thank you.


  • Moderator

    @tomcatkzn said in Setting up the right FOG Environment:

    So a VHDX of about 250GB for /images should be a good starting point then.

    Yes, that is a good size since you need storage for 30 computer images.


  • Moderator

    @tomcatkzn said in Setting up the right FOG Environment:

    Can you elaborate on how you “use FOG to place the required driver files in a predefined location on the target system during imaging”?

    Sure…
    https://forums.fogproject.org/topic/7740/the-magical-mystical-fog-post-download-script
    https://forums.fogproject.org/topic/7391/deploying-a-single-golden-image-to-different-hardware-with-fog
    https://forums.fogproject.org/topic/4278/utilizing-postscripts-rename-joindomain-drivers-snapins

    IMO a VM is the only proper way to build your reference image. I would hate doing what I do on a real system; it would really slow down our workflow for building new reference images.



  • @george1421 said in Setting up the right FOG Environment:

    @tomcatkzn said in Setting up the right FOG Environment:

    Any idea what the size of a basic Win7 and Win10 image is with the new compression code?

    We’ve been rolling out fat images lately, but I think our thin Win7 image is about 8GB on the target disk.

    When setting up the VM I would create two vmdk files: one for the OS and one for images. If you are comfortable with the Linux installer you can do this when you manually partition the disk. Create an OS disk of about 20GB, and create the image disk as a single disk with a standard partition (not LVM) formatted with ext4, with its mount point set to /images. That way everything is already set up when you install FOG.

    The other option is to install Linux first on that single 20GB disk and then, after the OS is installed, add in the imaging disk and mount it at /images BEFORE FOG is installed. Again, you will want to make it a single vmdk with a standard partition (not LVM) formatted with ext4. The reason I say a standard partition is that, if you need to, you can later expand that vmdk, then the partition, and then the file system. It’s a bit harder if LVM is involved.

    Thank you for this very helpful feedback.
    So a VHDX of about 250GB for /images should be a good starting point then.



  • @george1421 said in Setting up the right FOG Environment:

    @tomcatkzn said in Setting up the right FOG Environment:

    I was thinking that I need, for each operating system/hardware combination:

    A base image with drivers and updates but no other software
    then a pre-sysprepped image of each of the above with other software installed.
    then a final sysprepped image for deployment.

    In our case we have a generic OS image and then use FOG to place the required driver files in a predefined location on the target system during imaging. Granted, we only use Dells in our company, but we have 3 images (Win7x64, Win10x64, Win10x64EFI) covering 15 different hardware models. It takes a while to set this up, but we can add new models without needing to recapture images for each one. Also, with this design we recapture these 3 images each quarter with the latest updates (we may abandon this since Win10 will change the way updates are applied anyway).

    We build our reference images on a VM so that we have capabilities like snapshotting, and hardware independence during reference system creation. We do use Microsoft MDT to automate our reference image build. That way we get a consistently built reference image each quarter.

    Thank you for the feedback. I was led to believe that using a VM for the reference image was not a good idea, according to an article by Mitch Tulloch (the link escapes me at the moment), although we are currently doing it with MDT/WDS.

    Can you elaborate on how you “use FOG to place the required driver files in a predefined location on the target system during imaging”?



  • @wayne-workman said in Setting up the right FOG Environment:

    @tomcatkzn said in Setting up the right FOG Environment:

    Could you offer the reasons why you prefer CentOS over Ubuntu?

    There’s this: https://forums.fogproject.org/topic/10006/ubuntu-is-fog-s-enemy

    Thank you for pointing this out. Definitely going with CentOS.



  • @sebastian-roth said in Setting up the right FOG Environment:

    @TomcatKZN While talking about the “right FOG environment” you might also consider the network side. PXE boot and multicast are the main things that cause issues for FOG users. Possibly you have this all sorted already, as you use WDS right now. It just came to mind, so I thought I’d better mention it as worth considering.

    Thank you. PXE boot is working great. Had some issues with multicast as the labs are on different subnets. Any pointers will be appreciated.
    Have 2x Netgear 7328FS backbone switches.
    Each dept has its own Netgear GS724T or GS748T switch.


  • Developer

    @TomcatKZN While talking about the “right FOG environment” you might also consider the network side. PXE boot and multicast are the main things that cause issues for FOG users. Possibly you have this all sorted already, as you use WDS right now. It just came to mind, so I thought I’d better mention it as worth considering.


  • Moderator

    @tomcatkzn said in Setting up the right FOG Environment:

    Could you offer the reasons why you prefer CentOS over Ubuntu?

    There’s this: https://forums.fogproject.org/topic/10006/ubuntu-is-fog-s-enemy


  • Moderator

    @tomcatkzn said in Setting up the right FOG Environment:

    Any idea what the size of a basic Win7 and Win10 image is with the new compression code?

    We’ve been rolling out fat images lately, but I think our thin Win7 image is about 8GB on the target disk.

    When setting up the VM I would create two vmdk files: one for the OS and one for images. If you are comfortable with the Linux installer you can do this when you manually partition the disk. Create an OS disk of about 20GB, and create the image disk as a single disk with a standard partition (not LVM) formatted with ext4, with its mount point set to /images. That way everything is already set up when you install FOG.

    The other option is to install Linux first on that single 20GB disk and then, after the OS is installed, add in the imaging disk and mount it at /images BEFORE FOG is installed. Again, you will want to make it a single vmdk with a standard partition (not LVM) formatted with ext4. The reason I say a standard partition is that, if you need to, you can later expand that vmdk, then the partition, and then the file system. It’s a bit harder if LVM is involved.
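
    For what it’s worth, that second option might look roughly like this from the shell. This is only a sketch: /dev/sdb is an assumed device name (check lsblk first), and these commands are destructive.

```shell
# One standard (non-LVM) GPT partition spanning the whole second disk.
# /dev/sdb is an example device name -- verify with lsblk before running.
parted -s /dev/sdb mklabel gpt mkpart primary ext4 0% 100%
mkfs.ext4 /dev/sdb1

# Mount it at /images BEFORE running the FOG installer, and make the
# mount persistent across reboots.
mkdir -p /images
echo '/dev/sdb1  /images  ext4  defaults  0 0' >> /etc/fstab
mount /images
```

    Growing the store later is then a matter of enlarging the vmdk, extending the partition, and running resize2fs on /dev/sdb1.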


  • Moderator

    @tomcatkzn said in Setting up the right FOG Environment:

    I was thinking that I need, for each operating system/hardware combination:

    A base image with drivers and updates but no other software
    then a pre-sysprepped image of each of the above with other software installed.
    then a final sysprepped image for deployment.

    In our case we have a generic OS image and then use FOG to place the required driver files in a predefined location on the target system during imaging. Granted, we only use Dells in our company, but we have 3 images (Win7x64, Win10x64, Win10x64EFI) covering 15 different hardware models. It takes a while to set this up, but we can add new models without needing to recapture images for each one. Also, with this design we recapture these 3 images each quarter with the latest updates (we may abandon this since Win10 will change the way updates are applied anyway).

    We build our reference images on a VM so that we have capabilities like snapshotting, and hardware independence during reference system creation. We do use Microsoft MDT to automate our reference image build. That way we get a consistently built reference image each quarter.
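
    The driver placement mentioned above is done with a FOG postdownload script (george1421’s linked tutorials elsewhere in this thread cover the real implementation). As a heavily simplified sketch, assuming drivers are organised under /images/drivers/<model>/ on the server and the target’s Windows partition is mounted at /ntfs (both of those paths are my assumptions, not fixed FOG behaviour):

```shell
#!/bin/sh
# Hypothetical postdownload sketch: copy per-model drivers onto the
# freshly imaged target. Paths are layout assumptions, not FOG API.

# Map a hardware model string to its driver folder on the server.
# Layout assumption: <root>/<model with spaces replaced by underscores>/
driver_dir_for_model() {
    echo "$1/$(echo "$2" | tr ' ' '_')"
}

copy_drivers() {
    # dmidecode reports the model name of the machine being imaged.
    model=$(dmidecode -s system-product-name)
    src=$(driver_dir_for_model /images/drivers "$model")
    # /ntfs is where the Windows partition is assumed to be mounted.
    if [ -d "$src" ]; then
        mkdir -p /ntfs/Windows/DRV
        cp -r "$src"/. /ntfs/Windows/DRV/
    fi
}

# Only perform the copy when explicitly asked, so the helper above can
# be exercised without touching any disks.
if [ "${1:-}" = "--run" ]; then
    copy_drivers
fi
```

    Windows can then be pointed at that folder (for example via the DevicePath registry value or an unattend step) so plug-and-play picks the drivers up at first boot.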


  • Moderator

    @tomcatkzn said in Setting up the right FOG Environment:

    Could you offer the reasons why you prefer CentOS over Ubuntu?

    Based on recent changes to Ubuntu, they seem to alter the code in ways that cause FOG issues. CentOS has always been a stable build, and that is what I run. As Wayne said, CentOS and Debian seem to be our most stable OS platforms. That said, FOG does support and is tested against a large number of Linux distributions. Wayne runs install checks every morning against the supported platforms: http://theworkmans.us:20080/fog_distro_check/installer_dashboard.html



  • In the VM, should I set up a single virtual hard disk, or separate the OS from the image store by adding another virtual hard disk? How would this change the FOG setup?



  • Thank you George and Wayne.

    "In my environment the FOG server is running CentOS 7 on a virtual machine with 2 vCPU and 5GB of RAM. Image push time for a 25GB Win10 image is about 4.5 minutes."
    That sounds like nirvana!!

    "FOG can run comfortably with 4GB of RAM & 2 cores in most use cases."
    Great info, thank you.

    So the virtual machine route seems to make sense then, as it can be switched off when it’s not needed (amongst other things, like migration to another host, snapshots, etc.), and I can go back to setting up the HP MicroServer as a proper NAS.

    Could you offer the reasons why you prefer CentOS over Ubuntu?

    I was thinking that I need, for each operating system/hardware combination:

    1. A base image with drivers and updates but no other software
    2. then a pre-sysprepped image of each of the above with other software installed.
    3. then a final sysprepped image for deployment.

    Any idea what the size of a basic Win7 and Win10 image is with the new compression code?


  • Moderator

    I agree with George, both options are complete overkill. Stay away from Ubuntu; go with CentOS 7 or Debian 9.

    FOG can run comfortably with 4GB of RAM & 2 cores in most use cases.


  • Moderator

    Really, either of the two hardware options will work just fine for FOG. In your environment either would be a bit of an overkill. I would stick with one of the mainstream Linux distributions (RHEL/CentOS, Debian, Ubuntu) and not go with one of the variants unless you have a specific need.

    As long as your network is set up for multicasting, you can image those 35 lab machines in one push, provided you have the same image assigned to all of them.

    For your setup I would work on developing a single image (or group of images) that will deploy to all systems on your campus. This will give you the most flexibility as new models are brought on board.

    In my environment the FOG server is running CentOS 7 on a virtual machine with 2 vCPU and 5GB of RAM. Image push time for a 25GB Win10 image is about 4.5 minutes.

