
    Minimum Server Specs

    Hardware Compatibility
    • nerdstburns

      What are the minimum CPU, RAM, and storage specs that FOG can operate on? What are the recommended specs for handling unicast images on up to 100-200 computers with a queue size of 6-12 PCs at a time?

      • DJslimD1k

        We gave our server 32 GB of RAM, a 6-core Xeon processor, a 10 GbE link, and 1 TB of SSD storage. It transfers at about 15 GB per minute while imaging 15 computers. The server specs are important, but having a good network to transfer the images over is just as important.
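A quick unit conversion shows why the quoted rate sits comfortably inside that 10 GbE link (a sketch; only the 15 GB/min figure comes from the post above):

```python
def gb_per_min_to_gbit_per_s(gb_per_min: float) -> float:
    """Convert gigabytes per minute to gigabits per second."""
    return gb_per_min * 8 / 60

# The aggregate rate quoted above: 15 GB/min across 15 clients.
aggregate = gb_per_min_to_gbit_per_s(15)
print(f"{aggregate:.1f} Gbit/s")  # 2.0 Gbit/s
print(aggregate < 10)             # True: well within a 10 GbE link
```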

        • george1421 (Moderator)

          The recommendations depend on your campus size (up to 200 computers) and whether you have the FOG client installed on the target computers.

          If you are purely imaging, then a minimum of 2 vCPUs with 4 GB of RAM will do if you are using a text-only install of Linux on the FOG host server, or 6 GB if you are running X Windows (a GUI) on the FOG server.

          Ideally you would want a 10 GbE interface if you are planning on imaging 12 systems at a time. With modern target computers you can saturate a 1 GbE link by imaging 3 simultaneous systems. You can set up a multi-link 1 GbE LAG to spread the load across all links; for 12 simultaneous images you will want a 4-link LAG (802.3ad) group.
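The LAG sizing above follows from simple division. A minimal sketch, assuming (as stated) that roughly 3 modern clients saturate one 1 GbE link:

```python
import math

def lag_links_needed(simultaneous_images: int, images_per_gbe_link: int = 3) -> int:
    """Number of 1 GbE links in an 802.3ad LAG needed for a given count of
    simultaneous unicast images, assuming ~3 clients saturate one link."""
    return math.ceil(simultaneous_images / images_per_gbe_link)

print(lag_links_needed(12))  # 4 links, matching the recommendation above
print(lag_links_needed(6))   # 2 links
```

One caveat worth knowing: 802.3ad hashes traffic per flow, so any single client's stream is still limited to one physical link; the LAG only raises aggregate throughput across many clients.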

          Your disk subsystem is going to be the most critical factor in your FOG server design. Consider imaging 5 systems at a time from a single-spindle HDD: that HDD's head is going to be seeking all over the platter to keep feeding those 5 systems with data. Moving to a multi-spindle disk array will help, but if you are really trying to image 12 systems at a time you will need a flash disk array to keep up with the target computers.
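To put rough numbers on the disk requirement (a back-of-the-envelope sketch, derived from the earlier observation that ~3 clients saturate a 1 GbE link, i.e. about 125 MB/s split 3 ways):

```python
# Assumed per-client rate: 1 GbE (~125 MB/s) shared by ~3 image streams.
PER_CLIENT_MB_S = 125 / 3  # ~42 MB/s per unicast image stream

def disk_read_throughput_needed(clients: int) -> float:
    """Sustained disk read throughput (MB/s) the server must deliver."""
    return clients * PER_CLIENT_MB_S

for n in (5, 12):
    print(f"{n} clients: ~{disk_read_throughput_needed(n):.0f} MB/s sustained")
# 5 clients already need ~208 MB/s of seek-heavy reads, which strains a
# single spinning disk; 12 clients need ~500 MB/s, flash-array territory.
```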

          Consider that for imaging alone there is not much workload on the FOG host server. All it's doing during imaging is moving data between the disk subsystem and the network subsystem and monitoring the overall imaging process; all of the heavy lifting is done on the target computers. You don't need a monstrous CPU with 1000 cores, as it simply will not help the imaging process directly. All you should care about is efficient disk and network subsystems.

          Now enter the FOG service running on the client computers. This service checks in with the FOG server every X minutes to see if there are any tasks that need to be processed. This check-in takes CPU time on the FOG server, so the number of target computers running the FOG client will have a greater impact on FOG host resources than imaging does. There is some fine-tuning you can do to recover performance lost to the FOG client service. Understand, I'm not saying the FOG service is evil; you just need to be mindful of it and make a few system tweaks accordingly.
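The check-in load scales with fleet size divided by the check-in interval. A hypothetical estimate (the 5- and 15-minute intervals are illustrative values, not FOG defaults):

```python
def checkins_per_minute(clients: int, interval_minutes: float) -> float:
    """Average check-in requests/minute the FOG web server must answer,
    assuming clients check in independently every `interval_minutes`."""
    return clients / interval_minutes

print(f"{checkins_per_minute(200, 5):.1f}")   # 40.0 req/min at a 5-minute interval
print(f"{checkins_per_minute(200, 15):.1f}")  # 13.3 req/min after lengthening it
```

This is why lengthening the client check-in interval is one of the easier tuning knobs for a larger fleet.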

          Please help us build the FOG community with everyone involved. It's not just about coding; even more, we need people to test things, update documentation and, most importantly, work on uniting the community of people enjoying and working on FOG!

          • DJslimD1k @george1421

            @george1421 That is some good information! Thank you!

            • Sebastian Roth (Moderator)

              @nerdstburns said in Minimum Server Specs:

              unicast images on up to 100-200 computers

              Is there any good reason you don’t want to use multicast? That would take a huge load off the disk subsystem and network of your FOG server!
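The saving is easy to quantify: with unicast the server sends one full copy of the image per client, while a multicast session sends a single shared stream. An illustrative comparison (the 25 GB image size is an assumed figure):

```python
IMAGE_GB = 25   # assumed image size for illustration
CLIENTS = 200   # the upper end of the fleet size quoted above

unicast_total = IMAGE_GB * CLIENTS  # one full copy read and sent per client
multicast_total = IMAGE_GB          # one stream shared by every client

print(unicast_total)    # 5000 GB leaves the server with unicast
print(multicast_total)  # 25 GB with a single multicast session
```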

              Web GUI issue? Please check apache error (debian/ubuntu: /var/log/apache2/error.log, centos/fedora/rhel: /var/log/httpd/error_log) and php-fpm log (/var/log/php*-fpm.log)

              Please support FOG if you like it: https://wiki.fogproject.org/wiki/index.php/Support_FOG

              Copyright © 2012-2024 FOG Project