How big can Snapins be now?
-
It’s been a number of years since I used FOG, and back then there was a limit on the size of Snapins (2GB). Has this changed now?
I’m using SCCM for OS deployment at the moment, and it is great when it works, but it has a habit of breaking and then I have to spend hours trawling through logs to fix! The only reason I stopped using FOG was the 2GB Snapin problem. Would love to start using it again.
-
OK, I do have to ask, what the hell are you deploying where the application is bigger than 2GB?
The way I would approach this is to place the software on a network share, then have the snapin mount the share and launch the installer from there.
While I don’t use snapins, that is how we install Autocad Inventor with PDQ Deploy. There is no reason to copy 35GB worth of installer files (I know I just contradicted my previous question) to the client just to run the install and then delete the install files afterwards.
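For what it’s worth, a minimal .bat sketch of that approach - the server name, share, installer path, and silent switches below are placeholders and will vary by product, and you may need a /user: credential on net use if the account the snapin runs under cannot reach the share:

```
@echo off
rem Map the software share (server and share names are placeholders)
rem Add /user:DOMAIN\account if the snapin runs as an account without share access
net use S: \\fileserver\software /persistent:no

rem Launch the installer straight from the share - silent switches vary by installer
S:\BigApp\setup.exe /quiet /norestart

rem Drop the mapping once the install finishes
net use S: /delete
```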
-
Back in the day, it was the Adobe suite IIRC - it was a 3.5GB installer. There are a few packages like it now though - SMART Notebook as an example.
Copying, then installing and deleting is more efficient in Windows - the installation happens faster, as the OS doesn’t have to check the installer over the network connection - it is checked locally.
You do give me an idea though - just host the file on a server, and have the snapin run a script to copy it to local Temp and run it instead.
Though it complicates things, of course.
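A rough sketch of that copy-to-Temp idea as a .bat snapin - the share path, folder names, and installer switches are all placeholders:

```
@echo off
rem Stage the installer locally so the install reads from disk rather than the network (paths are placeholders)
set STAGE=%WINDIR%\Temp\BigApp
robocopy "\\fileserver\software\BigApp" "%STAGE%" /E /R:2 /W:5

rem Run the installer from the local copy - silent switches vary by installer
"%STAGE%\setup.exe" /quiet /norestart

rem Remove the staged files once the install finishes
rmdir /s /q "%STAGE%"
```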
-
@Tony-Ayre If it’s a large snapin as such, I would do either as @george1421 suggested (store the installer on a share, use snapins to install from the share) or install them in the base image itself.
Theoretically, however, you should be good to go with whatever file size you want. Timeouts are still going to be an issue, but you can adjust your own configuration for this. In particular you would need to find your relevant php.ini file and raise the upload_max_filesize and post_max_size directives to at least the largest file you think you’ll need to upload (I currently set mine to 3000M, roughly 3GB). You will also need to raise max_execution_time.
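For reference, the relevant php.ini entries look something like this - the file’s location varies by distribution (e.g. /etc/php.ini or under /etc/php/), and 3000M is just the example figure above:

```
; Allow large snapin uploads through the FOG web UI
upload_max_filesize = 3000M
post_max_size = 3000M
max_execution_time = 300
max_input_time = 300
```

Restart your web server (or php-fpm) afterwards for the changes to take effect.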
-
@Tony-Ayre said in How big can Snapins be now?:
Copying, then installing and deleting is more efficient in Windows - the installation happens faster, as the OS doesn’t have to check the installer over the network connection - it is checked locally.
Understand I’m not poking you here, but it would be interesting to know the actual time-to-deployment-completion difference between downloading and installing vs having the snapin launch the install over the network via a share. Taking snapin issuance as the start, I’m thinking deploying from a share will finish faster. But that is only speculation until it’s measured.
-
@george1421 said in How big can Snapins be now?:
OK, I do have to ask, what the hell are you deploying where the application is bigger than 2GB?
Autocad
Adobe Creative Cloud
Smart Notebook
-
@Tony-Ayre said in How big can Snapins be now?:
It’s been a number of years since I used FOG, and back then there was a limit on the size of Snapins (2GB). Has this changed now?
When you get into these huge snapins (like I have), lower your max clients number. If you don’t, your snapins will go belly up due to timing out, the FOG server will get slammed and become unresponsive, and things just stop working for a long, long time.
Inside of the web interface in Storage Management -> Storage Node -> Max Clients, I would recommend setting this to 2, absolutely no more than 3, for these massive snapins.
-
@Wayne-Workman Why is that? 2+GB files aren’t really that big.
-
@Wayne-Workman said in How big can Snapins be now?:
Autocad
Adobe Creative Cloud
Smart Notebook
Which version of Smart Notebook are you using? It is already installed on our teacher image. It’s not too bad; you just have to log in and repair the license. Another option is to use the web-based installer, though I don’t know if you can do that silently. As of 16.2, the web installer is only 26 MB.
I guess I am missing the point; you guys want to use snapins. I understand that, but in my environment we have model/scenario-specific images with programs included in the images (we are pretty model-standardized). If all else fails, you can do that.
-
Let’s put it this way. I am now running IT for 6 schools and 3 nurseries. Each of those schools has a dozen PC types, with about 3 different roles for each. This number is likely to grow as more schools join our trust.
So, to create a full image type for each machine with each role would be rather a lot of unnecessary work.
Instead, a single general image (containing the base level of software), combined with Snapins will reduce that work tremendously, along with the storage needs of images.
-
@Tony-Ayre While I understand what you’re saying, I think you’re overthinking it.
What you do is up to you, of course, but if you have 6 schools and 3 nurseries, (for now) wouldn’t it still make sense to put large software on the general image? I mean, unless licensing is that much of a mess, having “extra” software in the general image shouldn’t be a big issue. It allows you to take ANY system from ANY location and use it ANYwhere you might need to (after ensuring inventories are updated and whatnot).
The way I handled things like this was:
1 “super base” image. This contained all of the systems’ NIC and chipset drivers and was sysprepped, and it served as the basis for every other image I had. Office would be installed, Smart Notebook, Adobe Reader/Flash/Shockwave, etc., plus all Windows updates.
As we got into the specific needs of each school, I created one image for each level of school. We had an elementary school image, a middle school image, and a high school image. The only reason I separated things out in such a fashion was due to licensing differences between the three. So in total I had 4 base images that were properly defined and labelled so our techs could image as/where appropriate.
I could’ve done snap-ins, but that actually would have ended up using more bandwidth and time. Not to mention troubleshooting if something had gone wrong. So I took the time to build the base images as needed.
I limited my snap in usages to simple updates and small software installations. Any “custom” needs would be handled per system through snapins.
-
@Tony-Ayre said in How big can Snapins be now?:
Let’s put it this way. I am now running IT for 6 schools and 3 nurseries. Each of those schools has a dozen PC types, with about 3 different roles for each. This number is likely to grow as more schools join our trust.
So, to create a full image type for each machine with each role would be rather a lot of unnecessary work.
Instead, a single general image (containing the base level of software), combined with Snapins will reduce that work tremendously, along with the storage needs of images.
I know not everyone’s environment is the same; I figured I’d throw it out there just in case. After some thought, I also assume that you use an .msi for deployment of Smart Notebook, so the web-based installer is probably out. As much as I figure you’d like to avoid it, .bat files may be your only bet (rough example below). What are your network shares like?
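Something along these lines, as a minimal sketch - the UNC path and .msi name are placeholders, and the product may need additional msiexec properties per its own documentation:

```
@echo off
rem Silent install of an .msi straight from a share (path and file name are placeholders)
msiexec /i "\\fileserver\software\SMARTNotebook\setup.msi" /qn /norestart /l*v "%WINDIR%\Temp\notebook_install.log"
```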
I want to add one more thing: The way we make different images is to create a base or “student” image. We upload that and then base our other images off of it. For example, the “teacher” image is made by deploying the already uploaded student image to a pc, then simply installing the Smart Notebook software, and boom, re-upload as the teacher image. It’s a little time consuming at first, but don’t think you must build from scratch, though the non-standard models may make it a little more complex. If you don’t have sufficient storage, I guess that won’t work. I use 500GB of space for my images. Just a thought.
Like @Tom-Elliott said, if you have a Smart Notebook site license, throwing it on a general image won’t hurt. It could get messy, but you may be able to control icons with GPO’s.
-
@Tom-Elliott Licensing in schools is done by site license usually, meaning each school has its own key for things. There are then limits on how many installs of various packages can be performed. So, yes, licensing is a major reason why we can’t do this.
We also don’t really want to put everything on every computer; that would be absurd, especially when we have plenty of computers with 128GB SSDs.
All of this is somewhat beside the point though - we want to do things in a certain way, so we need snapins to be able to handle it.
-
@Tony-Ayre I thought site licenses are typically unlimited while volume licensing is the limited type?
Listen, I have 4 elementary schools, 2 middle schools, a high school, a community center, an administrative building, etc., all containing different software. I have at most 15 commonly used images that take up under 500GB of space. I use snapins for anything that falls under the size threshold. Anything bigger I make an image for. Do what you see fit, but in my opinion, snapins are not meant for large pieces of software. If you don’t agree, perhaps your current solution fits your needs better.
-
Like I said, I understand why you might want to do things this way; I’m just saying how I managed it. That said, if the snapins are “huge” (2GB is still a lot of data going across a network), I don’t see making images that contain what’s needed as “unnecessary” work. Using snapins will do what you want, but the method you take to handle the installation is still very important.
Here’s my thoughts on why:
Let’s say you have a lab of 30 hosts. All of those 30 hosts need the same snapin. This snapin is small, relatively speaking, at just 2GB.
Using snapins means:
The 2GB snap-in needs to be downloaded on EVERY host (30 x 2 = 60GB total data transferred). After this you still need to configure it (granted, this might happen during the installation). As for error handling, you have 30 chances for a problem to happen during the installation.
Using images means:
The snap-in is already installed and configured. You already know it’s going to work. The image is stored in a compressed form, so less data is transferred. Is it really that much “more” time?
I hope you understand, I fully get what you’re saying. I disagree, however, that creating separate images is performing “unnecessary” work. You already have the general image, so use that as the “basis” and simply install the program(s) you need and upload.
-
@Tony-Ayre said in How big can Snapins be now?:
@Wayne-Workman Why is that? 2+GB files aren’t really that big.
If you’re deploying to just one host, then it isn’t big. Deploy snapins to 200 computers and that’s 400GB. Deploy them to 550 computers and that’s over 1TB - I’ve done bigger than this. And the best way to handle it is to limit max clients.
-
@fry_p Licensing depends on the software. However, all the education site licenses I’ve come across in the last 10 years have said the same thing - it applies to a single “site”. So, in our trust that’d be a single school.
-
@Tony-Ayre Thinking a bit more: if licensing is the reason for “needing” snap-ins, might I suggest another approach? While snap-ins can do some pretty amazing things, as far as I understand them they were not intended to be a software installation mechanism. Granted, they can do these things, but it does require a lot more thought.
I wonder if it would be “simpler” to keep track of which devices/systems should NOT have a specific software/license, and use a snap-in to remove that item from those systems (see the sketch below)? I know it would mean a lot more management (you can simplify this somewhat by registering hosts and adding them to a group), but it would also achieve the same results you’re after without having to transfer large amounts of data. The level of work would ultimately be much less, as you would only need to write a snap-in script to uninstall the software once. While it would still have to be downloaded X number of times (X being the number of hosts the software needs to be uninstalled from), it’s much less taxing on server IO and network bandwidth.
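A minimal sketch of such an uninstall snap-in as a .bat file - the MSI product code below is only a placeholder; look the real one up under HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall, or use the software’s own silent-uninstall command if it isn’t MSI-based:

```
@echo off
rem Silently remove an MSI-based package - replace the placeholder GUID with the real product code
msiexec /x {00000000-0000-0000-0000-000000000000} /qn /norestart
```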
Any approach you take should be fine in either case. Just giving thoughts on how I might think to do things.
-
Hi,
ever thought about a local sync feature for snapins?
This could save a lot of traffic in big environments. Look at Dropbox - they use a local sync feature for client synchronisation.
This would need to be implemented, and I am sure it’s a lot of work, but it could be a solution for this problem. FOG clients would have to talk to each other, and they would need to know which snapin should be deployed to whom.
Regards X23
-
Wow this thread has made a big loop. (I admit I didn’t read the entire thread so this may have been already solved)
The way I see it, there are 2 (maybe 4) options.
- Open up the php settings (size and timeout) to allow bigger snapins to be uploaded to the FOG server, with all the negative impacts.
- Deploy a snapin that calls the installer from a common share. (my preference)
- Deploy a snapin that runs a script to copy the install files locally to the target computer (such as C:\Windows\Temp), launches the installer from there, and then cleans up (deletes) the install files afterwards.
- Use a third party tool like PDQ Deploy to deploy the applications using one of the three above methods. The advantage of PDQ Deploy is that you can use a manual list or an AD OU as a selection source to deploy applications. Actually you could call a PDQ Deploy package from a snapin.