
    Copy Cloud Storage of /images to Remote Servers Globally

    Solved
    General
    fogapi-psmodule api automation cloud
    • T
      typotony
      last edited by JJ Fullmer

      Hi everyone,

      I was trying to find cases similar to mine and was hoping someone has had a similar experience. I have three servers in different offices around the world. For specific reasons, we cannot have a VPN. What I’ve done is set up a cronjob that copies the images stored in /images to cloud storage (an S3 or GCP bucket); the other two servers then pull the images from there into their own /images dir.

      The issue now is that, aside from the server that did the initial copy to the cloud storage, the other two servers’ images are not showing up when it comes time to deploy, even though they are available in /images. Is there a way to have images that are copied to /images automatically be imported, so we don’t have to run the manual process of exporting and importing CSVs ourselves?

      • S
        Sebastian Roth Moderator
        last edited by

        @typotony said in Copy Cloud Storage of /images to Remote Servers Globally:

        Is there a way to have images that are copied to /images automatically be imported, so we don’t have to run the manual process of exporting and importing CSVs ourselves?

        No, there is no auto detection of images within FOG. You can either do the simple CSV export/import thing or go the manual route - create the image definitions on the other nodes and make sure they exactly match the ones you have on your other server. I’d definitely use CSV if you are transferring those extra large image files around the world anyway. What keeps you from generating the CSV and importing it on the other servers?

        Web GUI issue? Please check apache error (debian/ubuntu: /var/log/apache2/error.log, centos/fedora/rhel: /var/log/httpd/error_log) and php-fpm log (/var/log/php*-fpm.log)

        Please support FOG if you like it: https://wiki.fogproject.org/wiki/index.php/Support_FOG

        • T
          typotony
          last edited by

          We have pretty non-technical people in the other offices and would like them not to have to do the CSV importing/exporting themselves. A small task, but a task they wouldn’t do themselves. I appreciate the response.

          • S
            Sebastian Roth Moderator
            last edited by

            @typotony Uhh, I somehow forgot about the third option, which would be scripting the import via mysql.

            How do the people in the other offices transfer the image files to those FOG servers? Using WinSCP or something similar? Or did you script the download of images from the cloud storage?

            In any case you should be able to automate the import via mysql. Maybe a script could check every minute whether a SQL dump file exists in a certain place and, if it does, do the import and then rename/delete the import.sql.
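A minimal sketch of that watcher idea, meant to be run from cron every minute. The function name, dump location, database name and user below are all assumptions - adjust them for your setup:

```shell
# Sketch of the dump-watcher idea (assumed names/paths, not part of FOG):
# import a SQL dump if one has appeared, then rename it so the same dump
# is never applied twice.
import_fog_dump() {
    dump="$1"
    if [ -f "$dump" ]; then
        # credentials are expected to come from ~/.my.cnf in this sketch
        mysql -u fogmaster fog < "$dump" && mv "$dump" "$dump.done"
    fi
}

# example cron entry (assumed wrapper path):
# * * * * * /usr/local/sbin/import_fog_dump.sh /images/import.sql
```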

            Let us know if you need further help with this.

            • george1421G
              george1421 Moderator @typotony
              last edited by

              @JJ-Fullmer do you know if there is an API for exporting image definitions from fog?

              https://forums.fogproject.org/topic/12026/powershell-api-module

              The other option might be to write some mysql code to export the image def table on the root fog server then to import it on the remote fog server end. This could be added to your cron job as part of the image movements.

              As long as the receiving end fog server doesn’t create its own images, the export and import with mysql should be pretty clean. If the receiving end fog server also creates images, then there is the problem of duplicate image IDs in the database.

              Please help us build the FOG community with everyone involved. It's not just about coding - way more we need people to test things, update documentation and most importantly work on uniting the community of people enjoying and working on FOG!

              • T
                typotony @Sebastian Roth
                last edited by

                @Sebastian-Roth Hey, thanks for responding so quickly. Yes, the script pulls the images from the AWS S3 bucket into /images every night and skips images that already exist.

                The import option sounds like it could work. Would I be exporting a csv/db dump from the root server? So to summarize, it sounds like this would be the optimal workflow.

                On the root server

                • Export mysql to csv/db
                • Place file into dir that will be uploaded to cloud storage

                On satellite servers

                • Cronjob pulls images nightly from cloud storage
                • Have a cronjob check whether a sql dump exists in the /images dir and, if so, import it, which should update the image definitions within FOG.
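The workflow in those bullets could be sketched as crontab entries along these lines; the bucket name, schedule and inline password are illustrative placeholders only (never put real passwords in a crontab):

```shell
# Hypothetical crontab sketch of the workflow above; review the file before
# installing it with `crontab fog-sync.cron`. All values are placeholders.
cat > fog-sync.cron <<'CRON'
# --- root server: export the images table, then push /images to the cloud ---
0 1 * * * mysqldump -u fogmaster -pSECRET fog images > /images/images.sql
30 1 * * * aws s3 sync /images s3://my-fog-bucket/images
# --- satellite servers: pull nightly, then import the dump if present ---
0 3 * * * aws s3 sync s3://my-fog-bucket/images /images
30 3 * * * [ -f /images/images.sql ] && mysql -u fogmaster -pSECRET fog < /images/images.sql
CRON
```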

                Does that sound right?

                • T
                  typotony @george1421
                  last edited by

                  @george1421 Thank you for the feedback. Ideally, yes, we would like them to create their own images and sync them back up to cloud storage as well, so all of the images from each office are available to the others (things like keyboard differences, language, etc.).

                  But yes, maybe having one server create all of the images might be the way to go.

                  • S
                    Sebastian Roth Moderator
                    last edited by

                    @typotony Probably a good idea to not export the whole DB but just the images table (mysqldump -u fogmaster -p fog images > images.sql). Though you need to make sure the IDs of the images do not diverge. For example, if you create new images on the other two servers manually (not via the import) and assign those to hosts it will break as soon as you import the mysql dump which essentially wipes the whole images table and re-creates it.

                    Surely one can come up with some kind of “check and merge” instead of the DROP/CREATE/INSERT done by a plain mysql export/import, but it would take a fair amount of coding skill to make that fail-proof, I reckon.
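One possible middle ground (an untested sketch, not built-in FOG functionality): mysqldump’s --no-create-info option omits the DROP/CREATE statements and --replace emits REPLACE INTO instead of INSERT, so on import rows with matching IDs get overwritten while locally created rows with other IDs survive. The function name and db/user names here are placeholders:

```shell
# Sketch: dump the images table as REPLACE INTO statements, skipping the
# table DDL, so a later import merges rather than wipes the table.
export_images_for_merge() {
    out="$1"
    # credentials are expected to come from ~/.my.cnf in this sketch
    mysqldump -u fogmaster --no-create-info --replace fog images > "$out"
}
```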

                    • george1421G
                      george1421 Moderator @typotony
                      last edited by george1421

                      @typotony said in Copy Cloud Storage of /images to Remote Servers Globally:

                      Ideally, yes, we would like them to create their own images and sync them back up to cloud storage as well, so all of the images from each office are available to the others (things like keyboard differences, language, etc.).

                      One-way replication is the easiest to implement. Having base images created at HQ and replicated to the sites is also possible (your environment makes it challenging but not impossible). Having the sites able to add their own images and merge them with the images sent by HQ isn’t impossible either. Having images move N ways is a bit complicated and not something I think we would want to do, but it’s still not impossible.

                      Here is the idea: When you deploy a new FOG server with a fresh database, it’s blank. When you add a new target it gets the hostid of 1. You add the next target computer and it gets the hostid of 2, and so on. The same goes for images: the first image you add gets an imageid of 1, the second one 2, and so on. Stick with me, I’m almost at the point of this. The first image or host gets #1 because on a blank FOG database the seed value for the imageid or hostid is ( 1 ). We could change the seed value to any number we really wanted (that fits into a 32 bit address space).

                      So let’s say for the HQ server we don’t adjust any settings, so the first host gets the number 1, the second host gets 2 and so on. Now at remote site A we set the seed value for both the hostid and imageid to 10,000. For site B we set the seed value to 20,000. So the first host at site A will be assigned 10001, the second host 10002 and so on. We have now created a banded database where hosts or images coming from HQ will be 1-9,999, site A 10,001-19,999, site B 20,001-29,999. Now you have a platform where you can merge records from HQ with Site A and Site B. (The only problem with this is ‘if’ the FOG developers search for the next record ID by using the sql max command - then instead of the seed counter, the highest number in the database would be used for the next record ID.) That may be overthinking the problem a bit.

                      Changing the column seed value is a mysql thing and will be transparent to FOG so no coding should be required to implement this strategy.
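As a sketch of that mysql prep work, a small helper could set the band for both tables on a given site; the function name and db/user names are placeholders, and the seed values follow the scheme above:

```shell
# Sketch of the per-site mysql prep: raise the auto-increment seed of both
# the hosts and images tables to that site's band. Placeholders throughout;
# credentials are expected to come from ~/.my.cnf in this sketch.
band_site_ids() {
    seed="$1"
    mysql -u root fog -e "ALTER TABLE hosts  AUTO_INCREMENT = $seed;
                          ALTER TABLE images AUTO_INCREMENT = $seed;"
}
# e.g. band_site_ids 10001 on site A, band_site_ids 20001 on site B
```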

                      Understand I have not put a lot of thought into this idea; it’s just something that sounds logical and should work. It will require some bash scripting and mysql prep work on your part.

                      • S
                        Sebastian Roth Moderator
                        last edited by Sebastian Roth

                        @george1421 Nice idea you’ve come up with! Should be fairly straightforward and robust to implement.

                        @typotony Now it’s your turn to come up with a solution that works in your environment. You are welcome to share it as well with the community: https://github.com/FOGProject/fog-community-scripts/

                        • T
                          typotony @george1421
                          last edited by

                          @george1421 Thank you for that. It does make sense and I think it’s pretty logical as well.

                          So to be clear: banding the databases this way shouldn’t cause issues if I export and import the images table from server to server, since each server has its own unique block of IDs that it uses when creating an image.

                          I’ll have to test this out but that sounds like the gist of it.

                          • george1421G
                            george1421 Moderator @typotony
                            last edited by george1421

                            @typotony

                            To give you a few more hints. Here is what the hosts table looks like

                            MariaDB [fog]> describe hosts;
                            +------------------+---------------+------+-----+---------------------+----------------+
                            | Field            | Type          | Null | Key | Default             | Extra          |
                            +------------------+---------------+------+-----+---------------------+----------------+
                            | hostID           | int(11)       | NO   | PRI | NULL                | auto_increment |
                            | hostName         | varchar(16)   | NO   | UNI | NULL                |                |
                            | hostDesc         | longtext      | NO   |     | NULL                |                |
                            | hostIP           | varchar(25)   | NO   | MUL | NULL                |                |
                            | hostImage        | int(11)       | NO   |     | NULL                |                |
                            | hostBuilding     | int(11)       | NO   |     | NULL                |                |
                            | hostCreateDate   | timestamp     | NO   |     | current_timestamp() |                |
                            | hostLastDeploy   | datetime      | NO   |     | NULL                |                |
                            | hostCreateBy     | varchar(50)   | NO   |     | NULL                |                |
                            | hostUseAD        | char(1)       | NO   | MUL | NULL                |                |
                            | hostADDomain     | varchar(250)  | NO   |     | NULL                |                |
                            | hostADOU         | longtext      | NO   |     | NULL                |                |
                            | hostADUser       | varchar(250)  | NO   |     | NULL                |                |
                            | hostADPass       | varchar(250)  | NO   |     | NULL                |                |
                            | hostADPassLegacy | longtext      | NO   |     | NULL                |                |
                            | hostProductKey   | longtext      | YES  |     | NULL                |                |
                            | hostPrinterLevel | varchar(2)    | NO   |     | NULL                |                |
                            | hostKernelArgs   | varchar(250)  | NO   |     | NULL                |                |
                            | hostKernel       | varchar(250)  | NO   |     | NULL                |                |
                            | hostDevice       | varchar(250)  | NO   |     | NULL                |                |
                            | hostInit         | longtext      | YES  |     | NULL                |                |
                            | hostPending      | enum('0','1') | NO   |     | NULL                |                |
                            | hostPubKey       | longtext      | NO   |     | NULL                |                |
                            | hostSecToken     | longtext      | NO   |     | NULL                |                |
                            | hostSecTime      | timestamp     | NO   |     | 0000-00-00 00:00:00 |                |
                            | hostPingCode     | varchar(20)   | YES  |     | NULL                |                |
                            | hostExitBios     | longtext      | YES  |     | NULL                |                |
                            | hostExitEfi      | longtext      | YES  |     | NULL                |                |
                            | hostEnforce      | enum('0','1') | NO   |     | 1                   |                |
                            +------------------+---------------+------+-----+---------------------+----------------+
                            29 rows in set (0.002 sec)
                            
                            

                            You see that extra value of auto increment?

                            This command should change the seed value of that column

                            ALTER TABLE hosts AUTO_INCREMENT = 10000;
                            The next host added will then be assigned 10000 (use a seed of 10001 if you want the first ID in the band to be 10001).

                            ref: https://www.techonthenet.com/mysql/auto_increment_reset.php

                            As long as you don’t have duplicate key values it should work. What I don’t know off the top of my head is how mysql will react when you import a record that already has a hostID, because it’s going to want to create a new host ID for the inserted record. There are a few unknowns at the moment, but I think the concept is still sound.

                            • JJ FullmerJ
                              JJ Fullmer Testers @george1421
                              last edited by JJ Fullmer

                              @george1421 said in Copy Cloud Storage of /images to Remote Servers Globally:

                              @JJ-Fullmer do you know if there is an API for exporting image definitions from fog?

                              https://forums.fogproject.org/topic/12026/powershell-api-module

                              The other option might be to write some mysql code to export the image def table on the root fog server then to import it on the remote fog server end. This could be added to your cron job as part of the image movements.

                              As long as the receiving end fog server doesn’t create its own images, the export and import with mysql should be pretty clean. If the receiving end fog server also creates images, then there is the problem of duplicate image IDs in the database.

                              Sorry, I just saw this.

                              You can 100% get image definitions from the api and then store them in json files. It would be a lot simpler, easier and safer than direct database calls in my opinion, and it handles all the sql column stuff for you.

                              Here’s a quick sample of what you might do

                              On the source server with the original images, install and configure the FogApi powershell module.

                              Open up a powershell console

                              #if you're new to powershell you may need this
                              Set-ExecutionPolicy RemoteSigned;
                              #install and import the module
                              Install-Module FogApi;
                              Import-Module FogApi;
                              # go obtain your fog server and user api tokens from the fog web gui, then put them in this command
                              Set-FogServerSettings -fogApiToken 'your-server-token' -fogUserToken 'your-user-token' -fogServer 'your-fog-server-hostname';
                              

                              You’ll need to do that above setup for each location to connect to the api

                              Now you can get the images and store them as json files. You could do one big file, and you could use other file formats; this example will create a json file for each image you have and store it in \\server\share\fogImages, which you should of course change to some share you have access to.
                              This is a function you could paste into powershell, changing that path via the parameter -exportPath "yourPath\here"

                              function Export-FogImageDefinitions {
                                  [cmdletBinding()]
                                  param (
                                      $exportPath = "\\server\share\fogImages"
                                  )
                                  $images = Get-FogImages;
                                  $images | Foreach-object {
                                      $name = $_.name;
                                      $_ | ConvertTo-Json | Out-File -encoding oem "$exportPath\$name.json";
                                  }
                              }
                              

                              So you can copy and paste that (or save it into a .ps1 you dot-source; I may also just add it as a function in the FogApi module, as it would be handy for other users, especially when migrating servers).

                              You’d run it like this (after having set up your fog api connection): Export-FogImageDefinitions -exportPath "some\path\here"
                              Now you have them all in json format on some share, so you need an import command.

                              Now this one I can’t easily test at the moment and may need a little bit of TLC. I based the fields to pass in on the ones defined in the matching class https://github.com/FOGProject/fogproject/blob/master/packages/web/lib/fog/image.class.php

                              This is the function you’d run on the other fog servers after configuring the powershell module for the destination fog server on that destination network.

                              EDIT: I adjusted this one after some testing. The api will ignore the old id automatically so it doesn’t matter if you pass that and you can pass all the other information that was exported as well, so no need to filter out anything with Select-Object, in fact you can just pass the raw json files

                              Function Import-FogImageDefinitions {
                                  [cmdletBinding()]
                                  param(
                                      $importPath = "/images/imageDefinitions"
                                  )
                                  $curImages = Get-FogImages;
                                  # get the json files you exported into the given path
                                  $imageJsons = Get-ChildItem $importPath\*.json;
                                  $imageJsons | Foreach-Object {
                                      # create the image on the new server if it doesn't already exist;
                                      # otherwise assume the exported definition is the most up to date one and update it
                                      if ($curImages.name -contains $_.baseName) {
                                          Update-FogObject -type object -coreObject image -jsonData (Get-Content $_.Fullname -raw);
                                      } else {
                                          New-FogObject -type object -coreObject image -jsonData (Get-Content $_.Fullname -raw)
                                      }
                                  }
                              }
                              

                              So you copy and paste this function into powershell and run Import-FogImageDefinitions -importPath \\path\to\where\you\exported\images

                              That should do the trick.

                              Also, I believe that all of this will work in the linux version of powershell as well, if you don’t have windows in your environment or if you want to have all of this running on the fog servers themselves. I also didn’t include which hosts have each image assigned, as that would probably differ between servers. That’s also possible to export and import if the hosts are all the same.

                              You could also take this example and expand on it and get things automated via scheduled tasks on a windows system or cron on the fog servers so it essentially auto-replicates these definitions as you add new ones. If you want more information or more guidance on using the api for this let me know

                              Have you tried the FogApi powershell module? It's pretty cool IMHO
                              https://github.com/darksidemilk/FogApi
                              https://fogapi.readthedocs.io/en/latest/
                              https://www.powershellgallery.com/packages/FogApi
                              https://forums.fogproject.org/topic/12026/powershell-api-module

                              • JJ FullmerJ
                                JJ Fullmer Testers @typotony
                                last edited by

                                @typotony I had a chance to test things on linux with the examples I gave and all works swimmingly.

                                You can install pwsh on linux a few different ways; I usually use snap because it works the same on all distros. I used the LTS version. Microsoft’s instructions are found here https://learn.microsoft.com/en-us/powershell/scripting/install/install-other-linux?view=powershell-7.3

                                Here are the commands I use on redhat-based linux to get pwsh installed. I imagine you could simply swap yum for apt/apt-get on debian distros for that first command.
                                Prefix the commands with sudo if you’re not running as root.

                                yum -y install snapd
                                #create this symlink to be able to use "classic" snaps required by the pwsh package
                                ln -s /var/lib/snapd/snap /snap
                                #enable and start snapd services
                                systemctl enable --now snapd.socket
                                systemctl enable --now snapd.service
                                systemctl start snapd.service
                                systemctl start snapd.socket
                                #install LTS pwsh
                                snap install powershell --channel=lts/stable --classic
                                #start pwsh (you may have to logout/login to refresh your path)
                                pwsh
                                

                                Bottom line though, you can automate the export and import operations using the api. My examples here use json, but you could also use Export-Csv to save the definitions as csvs; you’d then have to import them as objects and convert them to json, all of which there are built-in functions for, but json is a bit simpler for this I think.

                                In order to do a multi-direction sync, you’ll probably need to add some extra processing to the export, e.g. check the .deployed property in each image definition to determine whether it has been updated, and only re-export a definition that already exists when it has changed.

                                To automate this: once you’ve configured the fogApi on each fog server, those settings are saved securely for that user, so you can make a script for this that you plop in a cronjob for exporting and importing. i.e. you do an export from each server to the /images/imageDefinitions folder before your sync of the /images directory to the cloud; then, after you pull the latest, you run the import, which will add new definitions and update existing ones.
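A hypothetical sketch of those cron entries; the wrapper-script paths, bucket name and the aws cli are assumptions (each .ps1 would import FogApi and call the export/import functions from the earlier posts):

```shell
# Hypothetical crontab sketch of the api-based flow; review the file, then
# install it with `crontab fog-image-sync.cron`. All paths are placeholders.
cat > fog-image-sync.cron <<'CRON'
# --- every server: export local definitions, then push /images to the cloud ---
0 1 * * * pwsh -File /opt/fog-scripts/Export-Defs.ps1
30 1 * * * aws s3 sync /images s3://my-fog-bucket/images
# --- every server: pull the latest, then import new/updated definitions ---
0 3 * * * aws s3 sync s3://my-fog-bucket/images /images
30 3 * * * pwsh -File /opt/fog-scripts/Import-Defs.ps1
CRON
```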

                                For reference

                                To test this:
                                On my prod server I installed pwsh and the FogApi module, made a new folder at /images/imageDefinitions, and used the Export-FogImageDefinitions function from my earlier comment to export the json definition of each image there.

                                On my dev server I mounted a clone of the /images disk from my prod server at /images (could have also mounted the nfs share and rsynced them; this was just faster for me at the time). Then I installed pwsh and the module and ran the Import-FogImageDefinitions function, specifying the import path of /images/imageDefinitions. It added all the definitions, and all the images show up and are ready to be used on the dev server.

                                I will probably add a general version of the export and import functions to the api as this could be handy for migrating servers in general.

                                • T
                                  typotony @JJ Fullmer
                                  last edited by typotony

                                  @JJ-Fullmer Wow, that is awesome! I just tested it myself and am running into a problem with the initial setup.

                                  I was able to get pwsh installed on Ubuntu and it seems to be working properly. I have installed and imported the FogApi module.

                                  My configuration is below. I am running into a positional parameter problem when using ‘Set-FogServerSettings’.

                                  import-module FogApi
                                  
                                  Set-FogServerSettings -fogApiToken 'mytoken' -fogUserToken 'mytoken' fog-server 'http://192.168.150.25';
                                  
                                  function Export-FogImageDefinitions {
                                      [cmdletBinding()]
                                      param (
                                          $exportPath = "\home\mydir\"
                                      )
                                      $images = Get-FogImages;
                                      $images | Foreach-object {
                                          $name = $_.name;
                                          $_ | ConvertTo-Json | Out-File -encoding oem "$exportPath\$name.json";
                                      }
                                  }
                                  

                                  PS /home/me> Get-InstalledModule -Name FogApi

                                  Version      Name     Repository   Description
                                  -------      ----     ----------   -----------
                                  2302.5.15    FogApi

                                  The error I’m seeing:
                                  | Set-FogServerSettings -fogApiToken 'NmFiMDI4YzAwMmQ3MGUzYTQ3NWEzMDgwN …
                                  | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
                                  | A positional parameter cannot be found that accepts argument ‘fog-server’.

                                  • JJ FullmerJ
                                    JJ Fullmer Testers @typotony
                                    last edited by JJ Fullmer

                                    @typotony You’re missing a - where you’re defining the fog server, and you’ve also added a - into the parameter name: it’s -fogServer, not fog-server (‘fog-server’ is the default value for that setting, not the parameter name). You can also try just running Set-FogServerSettings -interactive without params and it should prompt you for each value. You also don’t want http in there, just the hostname (an ip should work fine too; I haven’t tested that, but it’s just pulled into the url as a string, so there’s no reason it wouldn’t work).

                                    Set-FogServerSettings -fogApiToken 'mytoken' -fogUserToken 'mytoken' -fogServer '192.168.150.25';
                                    

                                    That should do the trick

                                    • T
                                      typotony @JJ Fullmer
                                      last edited by

                                      Hi @JJ-Fullmer! That was, chef’s kiss, beautifully done. That method is much easier than accessing the db directly, and it solves everything I was hoping to achieve. Everything is working as it’s supposed to, and I set it up with cronjobs to export and import the image definitions nightly. Hopefully this improves the ability for companies to have multiple fog servers across their offices without requiring a VPN to sync images. Thanks as well @george1421 and @Sebastian-Roth!

                                      • JJ FullmerJ
                                        JJ Fullmer Testers @typotony
                                        last edited by

                                        @typotony That’s great to hear!
                                        Glad you got it working and that it was as easy as intended.

                                        Copyright © 2012-2024 FOG Project