
    dang_that_guy (@dang_that_guy)

    Reputation: 1 · Profile views: 189 · Posts: 4 · Followers: 0 · Following: 0


    Best posts made by dang_that_guy

    • RE: AWK: fatal: cannot open file error

      Thanks for your answers. Due to the nature of our work and the reason we use this FOG server, it’s unfortunately air-gapped from the internet, so updates have been hard to come by regularly. The only root cause that makes sense here is one severely borked imaging operation, and even then I’m grasping at straws for a good answer on what actually triggered the error.

      However, I pushed it through our ITsec group and got the server updated to the current stable release, 1.4.4, this morning, which resolved the error; it’s deploying images normally again.

      Thanks for the info, guys; feel free to revoke my nerd card for this being resolved by a standard update.

      posted in FOG Problems

    Latest posts made by dang_that_guy

    • RE: NVMe Imaging Problems

      @x23piracy

      Fantastic advice, and right on the money. The Windows 10 hybrid shutdown is what caused the “unclean file system” error, and it explains why the Windows 10 DaRT disk never picked up on it: it saw that state as “normal”. With that fixed, I successfully captured and deployed the image on the NVMe.
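
      For anyone who lands here with the same symptom: the usual way to rule out hybrid shutdown before a capture is to disable fast startup, or just force one full shutdown first. This is a generic Windows sketch (run from an elevated PowerShell on the source machine), not anything FOG-specific:

      ```shell
      # Disabling hibernation also disables Windows 10 fast startup
      # (hybrid shutdown), so NTFS is dismounted cleanly at shutdown.
      powercfg /h off

      # Alternatively, leave fast startup enabled and force a single
      # full shutdown right before capturing the image:
      shutdown /s /t 0
      ```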

      @Tom-Elliott

      For full documentation: FOG version 1.4.4 – bzImage 4.12.3 / bzImage32 4.12.3

      posted in Hardware Compatibility
    • NVMe Imaging Problems

      Hello awesome FOG Community,
      On my source machine I have a 512GB Samsung EVO M.2 NVMe SSD with a 475GB NTFS partition (the partition has been shrunk to that size in an attempt to work around the issue below; the drive only has about 200GB in use). This is just the data drive of a build machine whose OS runs off a traditional SATA SSD, so it doesn’t have any other partitions or special configurations.

      It fails to capture as a “Single Disk Resizeable” image (it gives the error “disk contains an unclean file system”, even though I’ve run every test possible and it IS good/clean). I currently have it captured as a “Multi-Partition Single Disk” image (it also captures successfully as RAW). When I attempt to deploy the image to an identical M.2 NVMe drive, it fails immediately with: “Target partition size(500107 MB) is smaller than source(511060 MB). Use option -C to disable size checking (Dangerous).”

      TL;DR: I’m rather stuck. Although I’m using identical drives, the image won’t deploy once captured, and I can’t capture it as a resizable partition.

      My thought is that I either need to capture it as resizable or figure out how to shrink the image ON the FOG server itself. I’d really appreciate any insight you guys might have.
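
      In case it helps with suggestions, here is roughly how I’ve been comparing the numbers from a Linux debug boot. ntfsresize has a read-only info mode that reports how small the filesystem could shrink, and blockdev reports raw device sizes (the /dev/nvme0n1 paths are just an assumption about the layout; yours may differ):

      ```shell
      # Read-only check: smallest size the NTFS filesystem could shrink to.
      # Makes no changes on disk. Partition path is an assumption.
      sudo ntfsresize --info /dev/nvme0n1p1

      # Raw size of the whole disk, converted to the MB figures FOG prints:
      sudo blockdev --getsize64 /dev/nvme0n1 | awk '{ printf "%d MB\n", $1 / 1000000 }'
      ```

      A pair of 512GB-class drives reported as 511060 MB versus 500107 MB suggests the two “identical” disks don’t actually expose the same sector count, which would explain why the fixed-layout multi-partition capture refuses to deploy.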

      posted in Hardware Compatibility
    • AWK: fatal: cannot open file error
      Server
      • FOG Version: 1.3.0-RC-20
      • OS: Ubuntu 14.04 LTS
      Description

      I started seeing new behavior on deploying an image today. First, the deployment changes the UUIDs for the disks (previously it didn’t do this); then, after “Resetting swap systems”, I get the error:

      awk: fatal: cannot open file ‘/images/XXXXX/d2.original.swapuuids’ for reading (no such file or directory)

      This error appears four times in a row on-screen, and then the system reboots. The FOG server itself seems fine: permissions and file ownership are standard. I have tried different target hardware and different image files, and I ran the update script to refresh init.xz and bzImage (and their 32-bit versions), but so far I have not had any success.
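
      For completeness, this is the sanity check I ran on the server. The message points at a per-image bookkeeping file, so the first thing to confirm is whether it exists at all (paths below assume the stock /images layout):

      ```shell
      # List the swap-UUID bookkeeping files for every image; an image
      # directory with no match is the one that triggers the awk error.
      find /images -maxdepth 2 -name '*.original.swapuuids' -ls

      # The message itself is just awk's standard behavior when an input
      # file is missing (it aborts and exits non-zero):
      awk '{ print }' /tmp/no-such-file.swapuuids
      ```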

      Any suggestions on why my FOG server just stopped working?

      Thank you,
      Joseph Donovan

      posted in FOG Problems