Well - I’ve found it’s not the script/code, as I get the same result using:-
“cp * -rbuv”
“cp * -au”
… from a terminal, so it looks like it’s something with the cp command?
If anyone has a clue then please shout up, as I’m at a loss! o_O
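For reference, here is roughly what I’m running from the terminal as a complete sketch - the destination path ~/fogbackup below is just an example stand-in for the real mount point, and I’m showing the /images folder:

# Copy everything under /images to the backup share.
# -r recursive, -b back up destination files that would be overwritten,
# -u only copy when the source is newer, -v verbose.
cd /images || exit 1
cp * -rbuv ~/fogbackup/images/
# ...or the quieter archive-mode variant (-a preserves ownership,
# permissions and timestamps, and is recursive):
cp * -au ~/fogbackup/images/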
Hey folks…
I’m not sure how many folks are using the FOG backup script? I have it set up to back up the following:-
/images
/snapins
/reports
/tftpboot
/utils
/fog-setup (I keep my installers here in case of system boom and no web!)
/var/www/index.html
The above locations are backed up to a network drive (which we also back up other massive data files to without problem); the network drive is mounted in the main FOG server user’s home directory.
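In rough terms the script just copies each of those locations onto the mounted share, something along these lines (a minimal sketch only - the mount point ~/fogbackup is illustrative, not my exact path):

# Rough sketch: copy each backup location to the mounted network share.
# ~/fogbackup stands in for the real mount point in my home directory.
BACKUP_DIR="$HOME/fogbackup"
for SRC in /images /snapins /reports /tftpboot /utils /fog-setup /var/www/index.html; do
    cp -au "$SRC" "$BACKUP_DIR"/
done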
The problem I’m having is that when I run the script it works through backing up the images folder, then at around 89% of the way through the images folder it deletes everything in the backup location and leaves just a couple of images.
If I REM out the images section I get the same problem with the snapins folder.
If I REM out the snapins and the images folders then the script completes as expected.
Is there a maximum file/data/buffer size limitation the shell script can handle which might make it remove everything it’s put in the backup location?
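One thing I can rule out first is the share simply running out of space - a quick size comparison like this would show it (again, ~/fogbackup is just an example path for the mount):

# Total size of the backup sources vs. free space left on the network share.
du -sh /images /snapins /reports /tftpboot /utils /fog-setup /var/www/index.html
df -h ~/fogbackup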