@Tom-Elliott I will gladly test them!
I’m currently trying the inits that came with the most recent btsync, and they seem to be working too, for the hardware inventory task anyway; I haven’t tried an image yet.
Where are the new inits to test? Oh wait, you posted while I was typing this; I’ll download those and give them a go right now.

Posts
-
RE: HP Z640 - NVME PCI-E Drive
Well it only seems to work when it was uploaded from nvme drive. That’s kind of odd.
-
RE: HP Z640 - NVME PCI-E Drive
@Sebastian-Roth No problem. I’m happy to help.
So I can’t upload a resizable image. It says:
“Problem opening /dev/nvme0n1p for reading”
I’m going to try uploading my base image as a multi-partition image and see if I can download that one.
-
RE: HP Z640 - NVME PCI-E Drive
So downloading a multi-partition image that was uploaded from an nvme drive works on an nvme drive, but I can’t download one that was made before I updated the init.
I’m going to try uploading and downloading a resizable image, and then I’ll also try re-uploading my main image to see if uploading it using the new init makes some sort of difference.
-
RE: HP Z640 - NVME PCI-E Drive
So here’s some good news, image capture works!
I did just a default multiple partition install of windows and it uploaded no problem.
I had still set the host primary disk to /dev/nvme0n1 in the gui.
I’m going to try re-downloading the image to the same computer
Then I’ll try downloading the image I actually need on it, which is single disk resizable.
Thanks for all the help thus far.
-
RE: HP Z640 - NVME PCI-E Drive
Here’s the information from the debug session with the new init. Looks mostly the same. I’m going to try installing windows with default partitions the old fashioned way and I’ll see if image capture works by chance.
-
RE: HP Z640 - NVME PCI-E Drive
@Sebastian-Roth Similar problem with a non-resizable disk, it doesn’t seem to add a 1 at the end of /dev/nvme0n1p
I’m going to do a debug session and see if the lsblk output is any different from before. Let me know if there’s any other information you need; I’m here to help.
On a side note, I did take out the primary hard disk specification and ran a hardware inventory, and it found the hard drive just fine on its own. It didn’t get a hard drive manufacturer, model, or s/n, but it knows it exists now. Yay progress!
-
RE: HP Z640 - NVME PCI-E Drive
@Sebastian-Roth I just tested with a resizable image, to no avail. I got some error messages; it looks like some numbering didn’t quite work as expected, mainly when it searches for /dev/nvme0n1p instead of /dev/nvme0n1p1, and when it tries to find the image file named d1pp1.img instead of d1p1.img.
My apologies that some of these pictures have duplicate information; this happened twice, and I was just ready with the camera the second time since it happened so fast.
I am currently uploading a non-resizable version of that image to test that idea. I should be able to post results on that test in like 10 minutes or so.
-
RE: GIT 5676 Disk Information in web UI is incorrect
@mrdally204 I usually set the owner and group of images to fog
sudo chown -R fog.fog /images
And I set the permissions to 775
sudo chmod -R 775 /images
Whether or not that’s the best security practice is arguable, and since your permissions are already 777 the owner being root probably doesn’t matter. But changing the owner might be worth a shot.
-
RE: HP Z640 - NVME PCI-E Drive
Awesome! I’m giving it a try right now. And my apologies if I seemed impatient in my asking, I was just curious and getting all excited.
I currently only have one image and it is a resizable one, so we’ll see what happens.
Thanks,
-JJ
-
RE: Migrate PM to VM 1.2.0
Assuming that your original is still up and running, you could try doing a manual database dump and import. This is essentially what the gui is doing so that you don’t have to worry about the command line aspects, but it doesn’t always work.
In the command line on the original server, assuming you went with the default blank password…
mysqldump -u root fog > /home/fog/fog.sql
Then open up winscp, filezilla, cyberduck, or whatever file transfer client you prefer (or just use scp on the command line to copy straight to the vm) and download the fog.sql file you created.
Then in the new vm server put the fog.sql file in /home/fog and then run this command
mysql -u root fog < /home/fog/fog.sql
And that should take care of it.
Alternatively, especially if it looks like the gui export is working, you can just use the downloaded file from the gui for the second part there.
Hope that helps
-
RE: HP Z640 - NVME PCI-E Drive
@Sebastian-Roth @Tom-Elliott
I updated to the latest trunk since I noticed one of the commits said it adjusted the way partition numbers are found, but sadly it still isn’t working. I will try making it a raw image to see if that works as a workaround, like it did for the other person who had this problem with the m.2 drive. The debug information, lsblk, and the variable dump all still show the same information.
Thanks,
-JJ
-
RE: Automating Git Updates for FOG
Also, here’s a version for anyone who happens to have btsync set up, which I just gave a try using the guide here: https://wiki.fogproject.org/wiki/index.php/Upgrade_to_trunk (Side note: I didn’t have to untar anything in a downloads folder, and someone should probably fix the typo that says chrmod instead of chmod.)
I set up my btsync to a folder called /home/fog/fogInstalls/btsync/fog, which is stored in a variable in the following script, so change it if you put yours somewhere else.
Another side note/question: my git pulls are around 300 MB or so, but the btsync sync is a little less than 20 MB. Am I getting all the files, or is there something wrong with my config? That size difference is slightly concerning to me is all.
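The difference is probably just history: a git clone carries the full repository history under .git, while btsync only mirrors the current working files. A quick way to sanity-check, as a hedged sketch (the paths are the ones from these posts; adjust if yours differ):

```shell
# Compare the on-disk size of the git checkout vs the btsync folder
# (paths from these posts -- adjust to your setup).
du -sh /home/fog/fogInstalls/git /home/fog/fogInstalls/btsync/fog

# Most of the git folder's size is repository data stored under .git;
# git can report the packed object size directly:
cd /home/fog/fogInstalls/git
git count-objects -vH
```

If the size-pack figure accounts for most of the gap, the btsync copy isn’t missing anything; it just doesn’t carry the revision history.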
Anyway, here’s the btsync version of the update script that you could run whenever you get a btsync update.
#!/bin/bash
clear
# -------------------------------------------
# Fog Git Updater
# -------------------------------------------

# -------------------------------------------
# Script Purpose
# -------------------------------------------
# This script is designed to run an automated update of the latest FOG Git dev build, and it's cron friendly
# -------------------------------------------

# -------------------------------------------
# Some prereqs for this script
# -------------------------------------------
# 1. Already have an existing working install/configuration of FOG 1.0 or later
#
# 2. Have git installed and set up. You can do that by doing....
#      sudo apt-get install git
#      mkdir /home/fog/fogInstalls/git
#      git clone https://github.com/FOGProject/fogproject.git /home/fog/fogInstalls/git
#
# 3. A script to echo the encrypted version of your sudo password. To create one, put your
#    password into the following function in place of your_super_secret_password (leave the
#    quotes), then uncomment it, paste it into a terminal, and run it by typing the function name, pw
# pw(){
#     touch /home/fog/fogInstalls/.~
#     ossl=`echo "your_super_secret_password" | openssl enc -des -a -e -pass pass:PASSWORD`
#     echo 'echo "$(echo '$ossl' | openssl enc -des -a -d -pass pass:PASSWORD)"' >> /home/fog/fogInstalls/.~
#     sudo chown fog.root /home/fog/fogInstalls/.~
#     sudo chmod 700 /home/fog/fogInstalls/.~
# }
# -------------------------------------------

# -------------------------------------------
# Variables
# -------------------------------------------
echo "Creating Script variables..."
fogInstalls='/home/fog/fogInstalls'
btsyncPath="$fogInstalls/btsync/fog"
backup="$fogInstalls/backups"
pw=`sh $fogInstalls/.~`

# -------------------------------------------
# Functions
# -------------------------------------------
perms(){
    sudo chmod -R 775 $1
    sudo chown -R fog.fog $1
}

srvUpdate(){
    # Enter sudo mode and do some quick server maintenance update fun times.
    # First, enter sudo mode by echoing the output of decrypting your encrypted password
    # and piping that into an apt-get update.
    # Don't worry, it doesn't output the password into the terminal.
    # Once the password is in, the terminal will keep it stored for the next bunch of sudo commands.
    echo "Running Server updates!..."
    echo $pw | sudo -S apt-get update -y
    sudo apt-get upgrade -y # install any upgrades you just downloaded
}

backupConfig(){
    # Backup custom config and other files.
    # Copy the latest versions of any files you've changed that will be overwritten by the
    # update, and back up the database just in case. For example you may want to back up...
    # config.php
    #     To be on the safe side, your config file in the /opt folder, which may have a corrected
    #     webroot for Ubuntu 14.04 and may have stored encrypted credentials (i.e. mysql).
    #     I think the installer uses this file and keeps it anyway, but I like to be careful.
    # Exports file
    #     Because this runs the installer with a yes pipe, it ends up telling it that the image
    #     path is "y"; simply backing up and restoring your current one avoids the issue of FOG
    #     not finding your precious images.
    # Custom pxe boot background
    #     If you have a custom background for the pxe menu, the bg.png file.
    # Mysql database dump
    #     It would be rather troublesome if something went horribly wrong in the update and your
    #     database went kaboom. It's unlikely, but backups are a good thing.
    # Just a note: it's good policy to also have backups of these outside of your server, which
    # you could add to this script with an scp command or something like that.
    echo "make sure backup dir exists..."
    if [ ! -d $backup ]; then
        mkdir $backup
    fi
    echo "Dumping the database..."
    mysqldump -u root --all-databases --events > $backup/DatabaseBeforeLastUpdate.sql # backup database
    echo "Backing up config and custom files..."
    echo "config.php..."
    sudo cp /opt/fog/service/etc/config.php $backup/config.php
    echo "fog settings..."
    sudo cp /opt/fog/.fogsettings $backup/.fogsettings
    echo "nfs exports..."
    sudo cp /etc/exports $backup/exports
    echo "custom pxe background..."
    sudo cp /var/www/html/fog/service/ipxe/bg.png $backup/bg.png
}

updateFOG(){
    echo "running FOG installer..."
    perms $btsyncPath
    cd $btsyncPath/bin
    sudo bash installfog.sh -Y
}

restoreConfig(){
    # Restore the backed up files to their proper places.
    echo "restoring custom pxe background..."
    sudo cp $backup/bg.png /var/www/html/fog/service/ipxe # restore custom background
    # I found that I needed to do this in some configurations, but it may no longer be necessary...
    echo "Creating undionly for iPXE boot in ipxe folder, just in case..."
    sudo cp /tftpboot/undionly.kpxe /tftpboot/undionly.0 # backup original then rename undionly
    sudo cp /tftpboot/undionly.0 /var/www/html/fog/service/ipxe/undionly.0
    sudo cp /var/www/html/fog/service/ipxe/undionly.0 /var/www/html/fog/service/ipxe/undionly.kpxe
}

fixPerms(){
    echo "Changing Permissions of webroot..."
    perms '/var/www/html/fog'
    echo "Changing permissions of images..."
    perms '/images'
    echo "Changing permissions of tftpboot..."
    perms '/tftpboot'
}

# -------------------------------------------
# Run the script
# -------------------------------------------
srvUpdate
backupConfig
updateFOG
restoreConfig
fixPerms
echo "Done!"
-
RE: GIT 5676 Disk Information in web UI is incorrect
In the storage management section of the fog gui, is everything set correctly? IP address, interface, user/password and all that?
On the server, what does your /etc/exports look like?
cat /etc/exports
What does fog say about your hard drive in the gui?
https://{fogip}/fog/management/index.php?node=hwinfo&id=1
I’ve seen mine mess up before when one of those configurations was just messed up. Maybe your IP isn’t static and it changed on you?
Other times a quick restart of apache refreshed and fixed the issue:
sudo service apache2 restart
or
sudo /etc/init.d/apache2 restart
Or just a good old fashioned restart of the whole server can fix it too, if it’s an odd bug in your server.
Though it’s most likely a simple configuration issue in the ui, based on the few times this has happened to me, and since your nfs server is probably working given that you were able to upload an image.
How big does the gui say your image is? You will need to enable “FOG_FTP_IMAGE_SIZE” under “General Settings” in the FOG Configuration, then go to image management and list all images.
Hopefully one of those ideas helps.
-
RE: m.2 PCIe SSD not recognised in FOG
@Toby777 I think it might still work on larger drives, and you can manually expand the partition. That’s pretty good for RAW from what I’ve seen. The trouble with RAW is that it images every sector of the drive no matter how much space is taken up. Before multiple and extended linux partitions became supported, I only used it for very specialized images for computers that always had the same size hard drive.
-
RE: m.2 PCIe SSD not recognised in FOG
@Toby777 Awesome! That’s good to know. Hopefully they’ll be able to fix it so that nvme works without using raw but the fact that it works at all is awesome! How fast is it uploading? RAW is typically much slower but those pci based drives are supposed to be crazy fast, just curious how fast.
-
RE: Unable to Register Optiplex 780
How is your FOG server configured for dhcp?
Do you use a dnsmasq dhcp proxy?
Is fog the dhcp server?
Do you have a linux or windows dhcp server externally?
-
RE: m.2 PCIe SSD not recognised in FOG
This is like my problem, except I’m trying to download not upload.
What happens if you set the “Host Primary Disk” for that client in the fog web gui to “/dev/nvme0n1” ?
That should get it past the “cannot find HDD on system” error, but for me it wasn’t entering partclone for a download; who knows, maybe it would work for upload.
Thanks,
-JJ
-
RE: HP Z640 - NVME PCI-E Drive
@Sebastian-Roth @Tom-Elliott Is there a timeline on when you think this issue will be fixed?
Please and thank you
-
Automating Git Updates for FOG
In the past I made a script for automating svn updates. Since sourceforge has been making a habit of crashing lately, I decided to start using the git repo instead and adjusted my script to work with git.
I figured others might benefit from it, so why not share…

#!/bin/bash
clear
# -------------------------------------------
# Fog Git Updater
# -------------------------------------------

# -------------------------------------------
# Script Purpose
# -------------------------------------------
# This script is designed to run an automated update of the latest FOG Git dev build, and it's cron friendly
# -------------------------------------------

# -------------------------------------------
# Some prereqs for this script
# -------------------------------------------
# 1. Already have an existing working install/configuration of FOG 1.0 or later
#
# 2. Have git installed and set up. You can do that by doing....
#      sudo apt-get install git
#      mkdir /home/fog/fogInstalls/git
#      git clone https://github.com/FOGProject/fogproject.git /home/fog/fogInstalls/git
#
# 3. A script to echo the encrypted version of your sudo password. To create one, put your
#    password into the following function in place of your_super_secret_password (leave the
#    quotes), then uncomment it, paste it into a terminal, and run it by typing the function name, pw
# pw(){
#     touch /home/fog/fogInstalls/.~
#     ossl=`echo "your_super_secret_password" | openssl enc -des -a -e -pass pass:PASSWORD`
#     echo 'echo "$(echo '$ossl' | openssl enc -des -a -d -pass pass:PASSWORD)"' >> /home/fog/fogInstalls/.~
#     sudo chown fog.root /home/fog/fogInstalls/.~
#     sudo chmod 700 /home/fog/fogInstalls/.~
# }
# -------------------------------------------

# -------------------------------------------
# Variables
# -------------------------------------------
echo "Creating Script variables..."
fogInstalls='/home/fog/fogInstalls'
gitPath="$fogInstalls/git"
backup="$fogInstalls/backups"
pw=`sh $fogInstalls/.~`

# -------------------------------------------
# Functions
# -------------------------------------------
perms(){
    sudo chmod -R 775 $1
    sudo chown -R fog.fog $1
}

srvUpdate(){
    # Enter sudo mode and do some quick server maintenance update fun times.
    # First, enter sudo mode by echoing the output of decrypting your encrypted password
    # and piping that into an apt-get update.
    # Don't worry, it doesn't output the password into the terminal.
    # Once the password is in, the terminal will keep it stored for the next bunch of sudo commands.
    echo "Running Server updates!..."
    echo $pw | sudo -S apt-get update -y
    sudo apt-get upgrade -y # install any upgrades you just downloaded
}

backupConfig(){
    # Backup custom config and other files.
    # Copy the latest versions of any files you've changed that will be overwritten by the
    # update, and back up the database just in case. For example you may want to back up...
    # config.php
    #     To be on the safe side, your config file in the /opt folder, which may have a corrected
    #     webroot for Ubuntu 14.04 and may have stored encrypted credentials (i.e. mysql).
    #     I think the installer uses this file and keeps it anyway, but I like to be careful.
    # Exports file
    #     Because this runs the installer with a yes pipe, it ends up telling it that the image
    #     path is "y"; simply backing up and restoring your current one avoids the issue of FOG
    #     not finding your precious images.
    # Custom pxe boot background
    #     If you have a custom background for the pxe menu, the bg.png file.
    # Mysql database dump
    #     It would be rather troublesome if something went horribly wrong in the update and your
    #     database went kaboom. It's unlikely, but backups are a good thing.
    # Just a note: it's good policy to also have backups of these outside of your server, which
    # you could add to this script with an scp command or something like that.
    echo "make sure backup dir exists..."
    if [ ! -d $backup ]; then
        mkdir $backup
    fi
    echo "Dumping the database..."
    mysqldump -u root --all-databases --events > $backup/DatabaseBeforeLastUpdate.sql # backup database
    echo "Backing up config and custom files..."
    echo "config.php..."
    sudo cp /opt/fog/service/etc/config.php $backup/config.php
    echo "fog settings..."
    sudo cp /opt/fog/.fogsettings $backup/.fogsettings
    echo "nfs exports..."
    sudo cp /etc/exports $backup/exports
    echo "custom pxe background..."
    sudo cp /var/www/html/fog/service/ipxe/bg.png $backup/bg.png
}

gitP(){
    perms $gitPath
    echo "git pull...."
    cd $gitPath
    git pull
}

updateFOG(){
    echo "running FOG installer..."
    cd $gitPath/bin
    sudo bash installfog.sh -Y
}

restoreConfig(){
    # Restore the backed up files to their proper places.
    echo "restoring custom pxe background..."
    sudo cp $backup/bg.png /var/www/html/fog/service/ipxe # restore custom background
    # I found that I needed to do this in some configurations, but it may no longer be necessary...
    echo "Creating undionly for iPXE boot in ipxe folder, just in case..."
    sudo cp /tftpboot/undionly.kpxe /tftpboot/undionly.0 # backup original then rename undionly
    sudo cp /tftpboot/undionly.0 /var/www/html/fog/service/ipxe/undionly.0
    sudo cp /var/www/html/fog/service/ipxe/undionly.0 /var/www/html/fog/service/ipxe/undionly.kpxe
}

fixPerms(){
    echo "Changing Permissions of webroot..."
    perms '/var/www/html/fog'
    echo "Changing permissions of images..."
    perms '/images'
    echo "Changing permissions of tftpboot..."
    perms '/tftpboot'
}

# -------------------------------------------
# Run the script
# -------------------------------------------
srvUpdate
backupConfig
gitP
updateFOG
restoreConfig
fixPerms
echo "Done!"