loading and compressing files #10266
I use Bicep to create some VMs, and then I also use it to run commands on them. Run commands seem fairly limited in that you can only send a single script file to execute, or a single command. In my case I am using a run command that executes a script, and the script accepts a single file as an argument. I am able to send that file as an argument to the run command.

But now I need the ability to send multiple files to this script. The script should just take the file(s) it's given, loop through them, and do stuff with them. So how do I transmit multiple files to the run command efficiently?

I considered mediating these files via a third party, e.g. a shared disk or Azure Blob Storage, but that comes with additional complexity and resource cost, and I prefer to keep this simple. So my thought is to define a custom JSON object, like a dictionary, with the keys being the file paths and the values being the base64-encoded file data.
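As a sketch of that envelope idea (the file paths and contents here are made up purely for illustration, not taken from the actual project):

```python
import base64
import json

# Hypothetical envelope: keys are target file paths, values are the
# base64-encoded contents of each file.
files = {
    "conf/app.yml": b"retries: 3\n",
    "conf/app.json": b'{"retries": 3}\n',
}
envelope = json.dumps(
    {path: base64.b64encode(data).decode("ascii") for path, data in files.items()}
)

# The receiving script can decode each entry back to its original bytes.
decoded = {
    path: base64.b64decode(b64) for path, b64 in json.loads(envelope).items()
}
```

The receiving bash script would do the equivalent with `jq` and `base64 -d`, looping over the keys.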
I haven't tried this yet, but I suspect sending a JSON string as an argument to a run command may cause issues on the other side, where the argument is not passed correctly to the bash script. In that case I may have to take an approach where I load each file in plain text into the dictionary, then base64-encode the entire JSON dictionary object and send that as the parameter. I want to avoid base64-encoding each file and then base64-encoding the JSON envelope, because that's double base64 encoding on the file data! This can remain a side issue I am wary about for now. So this "envelope" would be sent as an argument to the run command.

This approach works, but it has me questioning whether there is room for improvement, especially when it comes to efficient use of bandwidth. So I am asking whether the following may be worth considering: add a new function to Bicep that is able to load (multiple?) files in some sort of compressed wrapper / envelope. For example, in my case I am converting each file (or the entire JSON envelope containing each file in plain text) to base64 for transmission, because in Bicep there is no way to load and compress the file data first, prior to the base64 encoding operation. Likewise, I don't want to have to "pre-compress" these files before each Bicep deployment. The files themselves are things like config files in YAML or JSON, and are maintained within the project, so having to compress them as a build step each time a change is made is not great.

Ideally, there would be some sort of Bicep function to load file data as base64, but with the option of compressing the file data before the base64 encoding operation. Not sure if it makes sense to have the ability to load multiple files at once with compression and get back some sort of envelope object?
Or just one file at a time?
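To illustrate the bandwidth argument (this is not Bicep, just a sketch of the "compress, then base64" transformation such a function could perform, on an invented config-like payload):

```python
import base64
import gzip
import json

# A repetitive, config-like payload; real YAML/JSON config files tend to
# compress similarly well because of repeated keys and structure.
text = json.dumps({"endpoints": ["https://example.invalid/api"] * 100})
raw = text.encode("utf-8")

plain_b64 = base64.b64encode(raw).decode("ascii")
gzip_b64 = base64.b64encode(gzip.compress(raw)).decode("ascii")

# The receiver can still recover the exact original bytes.
restored = gzip.decompress(base64.b64decode(gzip_b64))

print(len(plain_b64), len(gzip_b64))
```

For repetitive text like this, the compressed-then-encoded string is dramatically shorter than the plain base64 string, which is the saving the proposal is after.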
However, it should be possible for the caller to determine what the original file size was, and perhaps other metadata about the file, so returning just the raw base64 data does not seem sufficient; ideally it would be an object with the data plus additional metadata. This proposal is premised on the assumption that there is a worthwhile bandwidth saving to be had in some situations, enough to make it useful / warranted.
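One possible shape for that returned object, with invented field names purely to illustrate the metadata idea (nothing here is an existing Bicep API):

```python
import base64
import gzip
import hashlib


def load_file_compressed(path: str, data: bytes) -> dict:
    """Hypothetical return shape for the proposed function: the compressed,
    base64-encoded payload plus enough metadata for the caller to reason
    about sizes and verify integrity."""
    compressed = gzip.compress(data)
    return {
        "path": path,
        "originalSize": len(data),
        "compressedSize": len(compressed),
        "sha256": hashlib.sha256(data).hexdigest(),
        "data": base64.b64encode(compressed).decode("ascii"),
    }


result = load_file_compressed("conf/app.yml", b"retries: 3\n" * 20)
```

A multi-file variant could simply return a list of such objects, giving the caller the envelope and the per-file metadata in one value.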
Replies: 1 comment 4 replies
Can I ask what you are trying to configure? We just had a similar question on how to do something similar.