Windows 10 Imaging over low BW Internet/VPN
-
There are a few questions you need to answer.
- How many users are you talking about?
- How are the remote users connected to HQ? Is it via a VPN or some other tech?
- Is this to upgrade a remote site (small office) or just a single user?
- If it's a single user, how tech savvy is the user?
- How low of bandwidth are you talking about?
- How big is your reference image? (GB)
-
Hi George,
Appreciate your inputs. Please find my responses to the 6 points below.
1. How many users are you talking about?
Approx. 6500 users scattered across the US and Canada.
2. How are the remote users connected to HQ? Is it via a VPN or some other tech?
It's a mixed network environment. Users work from the office on:
- High bandwidth: DSL network
- Medium bandwidth: 10 Mbps
- Low bandwidth: 1.5 to 3 Mbps
- Work from home via VPN (3-5 Mbps tunnel)
3. Is this to upgrade a remote site (small office) or just a single user?
- Upgrade from Win 10 to Win 10: 15%
- Upgrade from Win 7 to Win 10: 45%
- Hardware refresh: 40% (planned via FOG at a depot location and shipped to the user)
4. If it's a single user, how tech savvy is the user?
Users are semi-technical; I have a team of 6 who can assist.
5. How low of bandwidth are you talking about?
The low bandwidth range is 1.5 to 3 Mbps.
6. How big is your reference image? (GB)
The image size is approx. 20 GB.
-
@Suhel OK, a few more questions based on your answers.
- You have about 6500 users, how many sites are you talking about?
- Based on sites, what is the smallest site (in computer count) and what is the largest site (in computer count)?
- Do you have one base image for all 6500 computers or do you have multiple base images (i.e. one for office, one for labs, one for plant floor, etc)?
-
Hi George, please find my responses to the 3 questions.
1. You have about 6500 users, how many sites are you talking about?
490 sites in total.
2. Based on sites, what is the smallest site (in computer count) and what is the largest site (in computer count)?
A correction on the user count: it's 7600.
- 6 sites with more than 200 users, contributing 2500 computers
- 6 sites with more than 100 computers, contributing 780 computers
- 14 sites with more than 50 computers, contributing 900 computers
- 40 sites with more than 20 computers, contributing 1100 computers
- 90 sites with more than 10 computers, contributing 1000 computers
- 334 sites with fewer than 10 computers, contributing 1320 computers
3. Do you have one base image for all 6500 computers or do you have multiple base images (i.e. one for office, one for labs, one for plant floor, etc)?
The base image depends on the user profile and differs from location to location. If needed, we can exclude the heavy applications to reduce the image size.
-
@Suhel Wow, 490 sites and 7600 computers! That is an enormous environment.
FOG wasn't made for such a job. Don't get me wrong, I am not saying it's impossible to do, but it needs thorough consideration! George is very good at asking the right questions. Just to make sure, I would like to ask a few more:
I need help in deploying the image over the internet for the low internet BW
Do you intend to really image over the internet for all your sites? Imaging 20 GB over a 1.5 Mbps connection will take about 30 hours straight, and that's in an ideal situation where the link can be used exclusively. If the connection breaks at any point in between, you have to start from the beginning. I don't see this working! Even with 3 and 5 Mbps links this is not feasible! With 10 Mbps it takes about 4.5 hours, which might sound OK, but this is per computer because you cannot send a multicast stream over the internet (unless you have a proper VPN setup for all sites). That makes 45 hours for only 10 computers, and again this is in an ideal world with full speed and no connection loss.
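Just to make the arithmetic transparent, here is a rough back-of-the-envelope sketch in Python (treating "GB" as 10^9 bytes and ignoring protocol overhead, so real transfers will be slower still):

```python
# Rough transfer-time estimates for a 20 GB image over the quoted link speeds.
# Assumes GB = 10^9 bytes and ignores protocol overhead, so real-world times
# will be somewhat longer than these best-case figures.

IMAGE_GB = 20
IMAGE_BITS = IMAGE_GB * 1e9 * 8  # image size in bits

for mbps in (1.5, 3, 5, 10):
    seconds = IMAGE_BITS / (mbps * 1e6)  # link speed in bits per second
    print(f"{mbps:>5} Mbps -> {seconds / 3600:5.1f} hours per computer")
```

Running this gives roughly 30 hours at 1.5 Mbps and about 4.5 hours at 10 Mbps, per computer, which is where the numbers above come from.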
High Network Bandwidth DSL Network
How many sites (with how many computers) have this connection and how fast is it?
Usually when bandwidth is low we suggest building a FOG server including all the images needed and shipping it to the remote site for local deployment. But in your case that would mean you need to build 490 FOG servers. That's probably not going to work either. But maybe this strategy can be used for some of your low bandwidth sites (depending on how many you have).
Base Image is as per the user profile and it differs from location to location
Does that mean you have 490 different base images? Probably not. Can you please explain further what images you plan to use for the computers? A user profile is usually not part of a base image.
-
@Suhel Understand that this is only my opinion of how you could implement FOG in your organization.
As Sebastian said, FOG isn't really targeted at an organization of your size. Its main target is organizations with 500 or fewer computers (this point can cause an argument). I know of several universities that have over 2500 computers on their campus and FOG works just fine, but they have high bandwidth networks and are not distributed across multiple (many) sites like you. If you are a Microsoft shop, you may be better suited to SCCM than FOG.
With that said, FOG will work for your organization even if you have a diversified structure. It will require more than one FOG server / FOG storage node.
So let's define some things:
- FOG servers come in two styles. The first is a full FOG server. This is a complete capture/deployment/management system; it's a standalone server for imaging. The second style is a FOG storage node. Storage nodes are deploy-only FOG servers. These storage nodes don't have local databases, so they must work together with full FOG servers. In a typical multi-campus network you would have one full FOG server and one or more FOG storage node servers. These storage node servers would be located at each remote location so you would not deploy across the site-to-site link. Images created/captured on the full FOG server (master node) are replicated to all FOG storage nodes in each storage group (a collection of a master node and one or more storage nodes).
- You cannot capture images to storage nodes; only the master node can capture images. Each storage group can only have one master node.
- In a FOG imaging process the target computer does all of the work; the FOG server/storage node has very little load during imaging. The FOG server just needs to move the image to the network adapter. The target computer does all of the data compression and writing to media. For small deployments I've heard of people using a Raspberry Pi 3 as their FOG server.
- You can install the FOG client or not, depending on what you want from FOG. If you are working as a system rebuilder, where you load the OS and then never see the target computer again, there is little value in installing the FOG client on the target computer. The FOG client is a service that is (or could be) used for target computer management post imaging. Hardware resellers use a process I call "load and go": they PXE boot into the FOG menu, pick deploy image, and the image is transferred to the target computer. There is no need to register the computer with FOG since FOG will not manage it post installation. The FOG server just loads the OS and then forgets about the target computer. In this load-and-go process the target computer will need to name itself and connect to AD, if needed, via the unattend.xml file, since that would otherwise be a role for the FOG client that isn't installed (a sketch of that unattend.xml fragment follows this list). One thing to remember if you use the FOG client: it must speak to the FOG server acting as the master node to look for tasks. This is also true if the target computers are on a remote campus with a FOG storage node; all clients need to talk back to the master node.
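For the load-and-go naming/domain-join piece, here is a rough sketch of how you might template that part of an unattend.xml. This is illustrative only, not something FOG generates for you; the component names follow the standard Windows unattend schema, but verify them against the unattend reference for your Windows 10 build, and the machine name, domain, and account values below are placeholders:

```python
# Illustrative only: renders the naming / domain-join portion of an unattend.xml
# for a "load and go" deployment where no FOG client is installed. Verify the
# component names against the Windows unattend reference for your build.

UNATTEND_SPECIALIZE = """\
<settings pass="specialize">
  <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="amd64"
             publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
    <ComputerName>{computer_name}</ComputerName>
  </component>
  <component name="Microsoft-Windows-UnattendedJoin" processorArchitecture="amd64"
             publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
    <Identification>
      <Credentials>
        <Domain>{domain}</Domain>
        <Username>{join_user}</Username>
        <Password>{join_password}</Password>
      </Credentials>
      <JoinDomain>{domain}</JoinDomain>
    </Identification>
  </component>
</settings>
"""

def render(computer_name: str, domain: str, join_user: str, join_password: str) -> str:
    """Fill the specialize-pass fragment with site-specific values (all placeholders)."""
    return UNATTEND_SPECIALIZE.format(computer_name=computer_name, domain=domain,
                                      join_user=join_user, join_password=join_password)

if __name__ == "__main__":
    print(render("SITE042-PC01", "corp.example.com", "svc-domainjoin", "REPLACE-ME"))
```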
So based on how your organization is structured, here are my recommendations.
For sites with 200 or more computers, install a full FOG server at each location. These FOG servers will be full capture/deploy nodes, and each will be managed independently. For sites with 100-200 computers, install either a full FOG server or a FOG storage node. Realize that the FOG client at the sites with storage nodes will need to talk to the FOG master node across your site-to-site links, but imaging will be done locally at each site. For these sites (100 computers or fewer) you could use a modern desktop computer with an SSD or NVMe drive as your FOG server / storage node.
For the sites with around 50 users, depending on how you plan to deploy and how frequently, you might be able to use an Intel NUC or Raspberry Pi 3 at these locations. A lot would depend on whether you use the FOG client (service) there: if not, the smaller Pi 3 would work; if so, you would want an i3 or better NUC.
For the sites with 10 or fewer, again depending on how often you deploy to them, Clonezilla and a USB hard drive (or everything on a big USB flash drive) may be the best choice. No hardware or FOG server is needed at the remote location, and image updates would be managed by just shipping a new USB flash drive to the remote sites.
Now for a single image for all: if possible, that would be the best way to go. You will have enough variation in your computer hardware that having multiple images would be difficult to manage. If you did use the FOG client, you could use a single image, and depending on what FOG group the computer was in, it could have its applications installed based on its profile. For example, on my campus I say all computers will have Adobe Acrobat, 7-Zip, and a few other common applications like Office and SAP. Those applications are installed in my base image, which is 25 GB in size. While I don't use the FOG client, I could create an application deployment group for the CAD computers where, if the computer was dropped into the CAD group, FOG would install AutoCAD on the computer after imaging. Not all computers need AutoCAD, only computers with the CAD profile. This way I have one image for our entire organization by using application profiles that are called after imaging is complete.
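To make the tiering above concrete, here is a small sketch that maps a site's computer count onto these recommendations. The thresholds mirror the tiers above and the per-tier averages are rough numbers derived from your earlier site breakdown; both are illustrative and should be adjusted to your environment:

```python
# Rough classifier mapping a site's computer count to the deployment approach
# recommended above. Thresholds mirror the tiers in this post; adjust to taste.

def deployment_for(site_computers: int) -> str:
    if site_computers >= 200:
        return "full FOG server (local capture + deploy)"
    if site_computers >= 100:
        return "full FOG server or FOG storage node (desktop-class box, SSD/NVMe)"
    if site_computers >= 50:
        return "Intel NUC storage node (or Raspberry Pi 3 if no FOG client)"
    if site_computers >= 10:
        return "small storage node or Clonezilla on USB, depending on deploy frequency"
    return "Clonezilla image on a shipped USB drive, no local FOG hardware"

# (site count, rough average computers per site) derived from the earlier breakdown.
tiers = [(6, 417), (6, 130), (14, 64), (40, 28), (90, 11), (334, 4)]
for sites, avg_computers in tiers:
    print(f"{sites:>3} sites @ ~{avg_computers:>3} computers -> {deployment_for(avg_computers)}")
```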
-
@george1421
Thanks George and Sebastian for helping with the detailed plan for the rollout, much appreciated.
Plan for the high and medium bandwidth sites:
1. Deploying via a local master FOG server will be ideal for the environment. My plan is to set up a standalone FOG server on a high-end laptop in Phase 1, ship it to the larger sites for the rollout, and use the same laptop for the other medium-sized user locations in Phase 2.
Instead of using the storage node replication for image changes, I am planning to copy the image via SFTP to the local FOG server (chopping the image into a multi-part 7-Zip archive and merging it back at the destination).
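Roughly what I have in mind for the chop-and-merge step is sketched below (a minimal version using plain fixed-size chunks with checksums instead of 7-Zip, so an interrupted SFTP copy can resume by re-sending only the missing or corrupted chunks; paths and chunk size are placeholders, and the actual SFTP transfer happens separately with whatever tool we prefer):

```python
# Minimal sketch of the "chop, ship, merge" idea: split a large image file into
# fixed-size chunks with per-chunk checksums so an interrupted SFTP copy can be
# resumed by re-sending only the missing or corrupted chunks. Paths and the
# chunk size are placeholders, not FOG defaults.

import hashlib
from pathlib import Path

CHUNK_BYTES = 512 * 1024 * 1024  # 512 MB per chunk (illustrative)

def split(image: Path, out_dir: Path) -> None:
    """Write the image as numbered .part files plus a .sha256 manifest."""
    out_dir.mkdir(parents=True, exist_ok=True)
    manifest = []
    with image.open("rb") as src:
        for index in range(10**6):          # loose upper bound on chunk count
            chunk = src.read(CHUNK_BYTES)
            if not chunk:
                break
            part = out_dir / f"{image.name}.{index:04d}.part"
            part.write_bytes(chunk)
            manifest.append(f"{hashlib.sha256(chunk).hexdigest()}  {part.name}")
    (out_dir / f"{image.name}.sha256").write_text("\n".join(manifest) + "\n")

def merge(parts_dir: Path, image_name: str, out_file: Path) -> None:
    """Verify each chunk against the manifest, then concatenate into out_file."""
    manifest = (parts_dir / f"{image_name}.sha256").read_text().splitlines()
    with out_file.open("wb") as dst:
        for line in manifest:
            digest, name = line.split()
            data = (parts_dir / name).read_bytes()
            assert hashlib.sha256(data).hexdigest() == digest, f"corrupt chunk: {name}"
            dst.write(data)
```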
Question: Can we merge the FOG databases back into the master server once the migration is done at a site, for future reference or to push application updates?
I need your suggestions for the remote and low bandwidth users:
1. Can we build a live Ubuntu FOG server with the image included, which can USB boot on any other hardware for FOG imaging, carrying only the base image and with the FOG service running?
2. Is there any way to make Clonezilla zero-touch for the image deployment? Most of the remote users are not technical, and a few-click deployment will save admin time and effort.
Does domain join work over the VPN via FOG, considering all the necessary AD ports are opened? The plan is to deploy the image via either option 1 or option 2 (mostly option 2), and then do the domain join and application push over the VPN.
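To sanity-check the port question from my side, I am thinking of a quick reachability test across the tunnel before the rollout, something like the sketch below (the domain controller hostname and port list are placeholders to confirm with our AD team, and a simple check like this does not cover the dynamic RPC port range):

```python
# Quick reachability check for the AD ports a domain join typically needs,
# run from a machine on the VPN. The DC hostname and port list here are
# illustrative; the dynamic RPC range is not covered by this simple test.

import socket

DC_HOST = "dc01.corp.example.com"  # placeholder domain controller
AD_PORTS = {
    53: "DNS",
    88: "Kerberos",
    135: "RPC endpoint mapper",
    389: "LDAP",
    445: "SMB",
    464: "Kerberos password change",
    3268: "Global Catalog",
}

for port, service in AD_PORTS.items():
    try:
        with socket.create_connection((DC_HOST, port), timeout=3):
            print(f"{port:>5} ({service}): reachable")
    except OSError as exc:
        print(f"{port:>5} ({service}): FAILED ({exc})")
```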
The option to create application groups is fantastic; this will help avoid multiple FOG images. I have never tried that, do we have any reference document on how to build the applications and groups?
Also, any thoughts on taking user backups via FOG? FOG takes image-level backups (not recommended, it would need a lot of space), but can we do any customization that takes a folder-level backup only and restores it after imaging? I understand that at this stage it's too much for FOG, but it is just a thought on backup.
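For the folder-level idea, something as simple as a pre/post-imaging script that copies the profile folders we care about to a network share might be enough, entirely outside of FOG. A rough sketch (the share path and folder list are placeholders, and this is not FOG functionality):

```python
# Minimal pre-imaging backup sketch: copy selected profile folders to a share,
# then restore them after the machine is re-imaged. The share path and folder
# list are placeholders; this is an external script, not a FOG feature.

import shutil
from pathlib import Path

BACKUP_ROOT = Path(r"\\fileserver\backups")    # placeholder UNC share
FOLDERS = ["Documents", "Desktop", "Pictures"]  # folders worth keeping

def backup(user_profile: Path, machine_name: str) -> None:
    dest = BACKUP_ROOT / machine_name / user_profile.name
    for folder in FOLDERS:
        src = user_profile / folder
        if src.is_dir():
            shutil.copytree(src, dest / folder, dirs_exist_ok=True)

def restore(user_profile: Path, machine_name: str) -> None:
    src_root = BACKUP_ROOT / machine_name / user_profile.name
    for folder in FOLDERS:
        src = src_root / folder
        if src.is_dir():
            shutil.copytree(src, user_profile / folder, dirs_exist_ok=True)

if __name__ == "__main__":
    backup(Path(r"C:\Users\jdoe"), "SITE042-PC01")
```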
Please let me know your inputs and suggestions; that will guide us to make the right decision. My company leadership is proposing SCCM with Azure Intune.
-
@Suhel I sent an answer yesterday, but due to a server issue we had, my answer was unfortunately lost. Are you still around and wondering about the same questions? I might find some time to answer today or tomorrow.
-
@MichMurray I cannot honestly tell if you're trying to sell a different product or genuinely want to help the user in regards to VPN.
While FOG uses different open source tools, we use those tools for a specific reason. Adding VPN to the mix is a bit on the advanced side of things and would require a highly technical understanding of what one is doing from a security and bandwidth perspective.
We typically ban spam accounts, and this feels like one, but it also feels like it could be a genuine interest in assisting others.
Maybe that's the point?
Sorry if I'm too harsh, but you fit the bill of spammer ideology: bumping old posts with seemingly nice replies and then the bait. Going to remove the link. Please be cautious moving forward.
-
@george1421 Hi George - My name is Michael and I came across this thread while doing research on imaging corporate laptops over the internet. I find myself in a similar situation to the original poster's predicament.
Whatever became of this thread, and to your knowledge did deployments ever take place?
Thanks
Michael
-
@michael-steiner said in Windows 10 Imaging over low BW Internet/VPN:
Whatever became of this thread, and to your knowledge did deployments ever take place?
I don’t know if this deployment ever took place. From my perspective the OP of this thread has two different issues.
- Imaging over a low bandwidth link
- Imaging over the internet.
For imaging over a low bandwidth link, the real solution with FOG is a mobile deployment server running on a Raspberry Pi.
For imaging over the internet, I've been thinking about testing this with OpenVPN. The FOS Linux client would run the OpenVPN client and the FOG server would run the OpenVPN server, and all of the communications would be tunneled over the OpenVPN link. The issue is remote PXE booting; that can't/shouldn't be done over the internet. But if we switch to USB booting on the remote end, then it might be possible. There are still several hurdles to overcome doing this, and I wonder if a different imaging solution would be a better choice, like Clonezilla on a USB device.
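If anyone wants to experiment with that idea, a small pre-flight check on the USB-booted remote end could confirm the tunnel is up and the FOG server answers on the ports imaging needs before kicking off a deploy. A rough sketch follows (the tunnel interface name, server address, and port list are assumptions, and this is not part of FOG or FOS):

```python
# Pre-flight check for the "image over an OpenVPN tunnel" idea: confirm the
# tunnel interface is up and the FOG server answers on the ports imaging
# typically uses. Interface name, server address, and ports are assumptions.

import socket
import subprocess

TUNNEL_IF = "tun0"       # typical OpenVPN interface name
FOG_SERVER = "10.8.0.1"  # placeholder address reachable via the tunnel
PORTS = {80: "HTTP (FOG web)", 2049: "NFS (image store)", 21: "FTP"}

def tunnel_up() -> bool:
    """Return True if the tunnel interface exists and has an IPv4 address."""
    result = subprocess.run(["ip", "addr", "show", TUNNEL_IF],
                            capture_output=True, text=True)
    return result.returncode == 0 and "inet " in result.stdout

def server_reachable() -> bool:
    ok = True
    for port, label in PORTS.items():
        try:
            with socket.create_connection((FOG_SERVER, port), timeout=5):
                print(f"{label}: reachable on {port}")
        except OSError as exc:
            print(f"{label}: FAILED on {port} ({exc})")
            ok = False
    return ok

if __name__ == "__main__":
    if tunnel_up() and server_reachable():
        print("Tunnel looks usable; safe to start the deploy task.")
    else:
        print("Fix the VPN/ports before attempting to image over this link.")
```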