Image Compression Association
It would be nice to have an option to associate different compression levels with hosts, groups, or images. That way, if you need to image a system that has crap processing performance, you can associate it with a lower compression level, letting it finish the task more quickly at the cost of disk space on the storage node. Yes, you can currently adjust the system-wide compression level, but that affects every running task that uses compression.
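As a rough illustration of the trade-off being described, here is a sketch using Python's `zlib`, which uses the same 1–9 level scale as gzip/pigz (the sample data and sizes are made up, not FOG measurements):

```python
import time
import zlib

# Sample payload: repetitive data standing in for compressible disk-image content.
data = b"some fairly repetitive disk sector data " * 500000  # ~20 MB

for level in (1, 6, 9):
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    ratio = len(compressed) / len(data)
    print(f"level {level}: {elapsed:.2f}s, ratio {ratio:.3f}")
```

Lower levels finish sooner but leave a larger file on the storage node; higher levels cost CPU time up front, which is exactly what hurts on a weak processor.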
I am not sure about the feasibility of this, and ideally it wouldn't be a long-term issue, since we should all be throwing systems with Atom (and similar) processors down a flight of stairs.
Alright, that answers that. I appreciate the consideration.
Tom Elliott:
Adding the capability isn't an issue by itself, but it would only work for images, as that's the item that gets compressed. It would also only apply to the upload. For this reason, I'd recommend adjusting the global setting unless you're uploading multiple images at the same time, which I doubt is the case. So, in my opinion, adjusting this value to quicken the upload from a problematic machine isn't that difficult. Besides, once it's uploaded, the deploy should be faster: better compression on upload means less data to transfer during download.
Fair enough. The one problem machine was going to take 4+ hours for about 15 GB, and I know this is an odd case.
My uploads run about an hour on full compression on any of my machines, but it then allows for 11-13 minute deployments.
It’s worth the wait, so I’d leave the compression maxed unless there is some reason you need to upload faster.
You only upload once in a great while, you deploy regularly.
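The upload-once/deploy-often point can be made with some back-of-the-envelope arithmetic. The figures below are assumptions loosely based on the numbers in this thread (~1 hour uploads at full compression, 11–13 minute deploys), not measurements:

```python
# Hypothetical scenario: lower compression speeds up the one-time upload
# but produces a larger image, so every repeated deploy takes longer.
def total_minutes(upload_min, deploy_min, deploys):
    """One-time upload cost plus repeated deploy cost."""
    return upload_min + deploy_min * deploys

deploys = 20  # assumed number of deployments of this image
max_comp = total_minutes(upload_min=60, deploy_min=12, deploys=deploys)
low_comp = total_minutes(upload_min=20, deploy_min=18, deploys=deploys)
print(f"max compression: {max_comp} min, low compression: {low_comp} min")
# → max compression: 300 min, low compression: 380 min
```

With any regular deployment schedule, the one-time upload saving is quickly outweighed, which is why leaving compression maxed is the usual advice.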
The problem I can see is that unless you can change the compression level on the server after upload, you would need to upload multiple times to suit hosts with different compression levels.
How long is your ‘crap’ machine taking to upload?
Yes, typically you would want a decent machine to be your original image host, so that the whole process gets done faster for you. But there are situations, like doing a pre-service backup with FOG on a crap machine, where a better processor isn't an option.
Clearly, this is not a dire need, but it would speed things up in a few situations. It also shouldn’t affect the default workflow for users if the default setting is to use the system-wide compression level setting.
Could you just use a powerful PC for uploading?
I’ve not had problems with the download being extremely slow…