How do you create your images?
I'd just like to hear how everyone on here creates their images and pushes them out to the different machines in their environment.
In my environment, I've got a mix of HP laptops, Surfaces and even some desktops. We typically standardise on the same model, and everyone is on a three-year rolling upgrade where we try to refresh a third of the business every year, which means we get the latest of 'that' model each year (so Surface 4/5/6 over the last three years).
I've then also got some users that require program x (finance) and others that require program y, with the rest just having a 'generic' image.
What I find myself doing is setting up, say, a Surface 4 'general' image which I push out for most of the staff, then adding my finance programs on top and capturing that as a separate finance image. I do the same for the HP models, where I have a 2017 version, a 2018 version, and so on.
All in all, this works fairly well for me, but when I image something later in the year I obviously have to run updates and add any extra changes that have come up during the year. It also means that when I get a new model of laptop, I'm installing everything from scratch to get it ready.
Again, this isn't really a problem for me, but I've been asking myself whether this is the best way to be doing it. Should I be keeping a machine in a VM environment where I can apply these updates and then take an image of that? What do I do about the additional programs? Just have a few images based on those? Surely there are driver issues with a 'universal image' when going between Surfaces and different models of laptop?
I've seen sysprep mentioned a little, but mostly in the context of Windows 7. I've never used it before, so I'm not sure whether it's the 'common' practice or whether most people approach this another way. My method works; I'm just questioning whether there's a better approach, especially since when I'm waiting for a new model to arrive, I can't really prepare anything until the device is in my hands.
Wayne Workman replied:
@dylan You would probably be really interested in this post:
Though, I haven't used FOG at work for about two years now. I use it at home to create images of my Linux systems, and I do a lot of automated testing with FOG. But I'm not doing much imaging anymore as I have no reason to.
Our workflow is similar to @Quazz's in that we build a golden image, sysprep it and capture/deploy it with FOG.
To expand a bit on the workflow: we build the golden image in a VM using MDT. MDT allows us to create a standard, repeatable golden image using the lite-touch process (i.e. start the MDT deployment, walk away and come back when it's done). During deployment, MDT installs the OS, our custom applications and the current Windows updates into the golden image. We have a custom unattend.xml that gets dropped onto the golden image just before we sysprep it. The golden image is sysprep'd and then captured with FOG, and FOG is then used to deploy it to the final target computers. We also use FOG to copy the target computer's drivers over just after the image is laid down on the target hard drive. This makes our single golden image in FOG work across multiple hardware models (we only use Dell and recycle every 4 years, so that gives about 12 models we have to support).
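For reference, the generalize-and-capture step usually boils down to a single sysprep invocation run inside the golden-image VM. This is only a sketch: the unattend.xml path is an example, so point it at wherever your answer file actually lives.

```bat
REM Sketch of the pre-capture step on the golden image:
REM /generalize strips machine-specific state so the image is hardware-neutral,
REM /oobe makes the deployed copy boot into Out-of-Box Experience,
REM /shutdown powers the VM off cleanly so FOG can capture the disk,
REM /unattend points at the answer file (example path shown).
C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown /unattend:C:\Windows\Panther\unattend.xml
```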
Post-install we deploy applications using PDQ Deploy and set up application policies with PDQ Deploy and PDQ Inventory. We don't use FOG snapins for this because we have a well-established process with the PDQ tools.
Using sysprep you could create a hardware-neutral 'golden image' that will install on any device. If you use an unattend file you can push it even further and have it configure the computer for you (create accounts, set up networking, etc.).
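To make the unattend idea concrete, here's a minimal fragment showing the kind of thing it can automate. The account name and password are placeholders, not a recommendation; note that a plaintext password in the file should be changed after first logon.

```xml
<?xml version="1.0" encoding="utf-8"?>
<unattend xmlns="urn:schemas-microsoft-com:unattend">
  <settings pass="oobeSystem">
    <component name="Microsoft-Windows-Shell-Setup"
               processorArchitecture="amd64"
               publicKeyToken="31bf3856ad364e35"
               language="neutral" versionScope="nonSxS"
               xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State">
      <!-- Skip a couple of the OOBE screens on first boot -->
      <OOBE>
        <HideEULAPage>true</HideEULAPage>
        <HideLocalAccountScreen>true</HideLocalAccountScreen>
      </OOBE>
      <!-- Create a local admin account (placeholder name/password) -->
      <UserAccounts>
        <LocalAccounts>
          <LocalAccount wcm:action="add">
            <Name>ITAdmin</Name>
            <Group>Administrators</Group>
            <Password>
              <Value>ChangeMe123!</Value>
              <PlainText>true</PlainText>
            </Password>
          </LocalAccount>
        </LocalAccounts>
      </UserAccounts>
    </component>
  </settings>
</unattend>
```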
The software packages could instead be delivered post-install through snapins. That also makes it easier to keep them all up to date.
Your approach isn't bad, of course; it just takes time to set up a more universal system.