Move core dependencies to a function that templates can call #1319
base: main
Conversation
I like the idea, but if we were to consider this, I think it should either move into the update_os function, or we keep it as is.
There's no reason why the entire preamble that's in every script couldn't be moved to one function (call it setup or whatever); authors of scripts would just call setup, add their own package dependencies and any custom scripting for things like GitHub fetches, and then call one function like cleanup to do all the cleanup. I might go so far as to say a templated script could look like:
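A template along those lines might be sketched as follows; the function names setup and cleanup are the ones proposed in the comment, and the stub bodies are purely illustrative, not existing helpers:

```shell
#!/usr/bin/env bash
# Purely illustrative: "setup" and "cleanup" are the proposed function
# names; the echo bodies stand in for the real shared preamble/teardown.
setup() {   # would hold colors, error traps, update_os, core packages
  echo "setup done"
}
cleanup() { # would hold apt clean, autoremove, completion messages
  echo "cleanup done"
}

setup
# script author's own package dependencies and custom steps go here
cleanup
```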
There is a balance to be found between making everyone copy-paste everything needed to get a container up, and hiding so much that no one knows what's happening in the background; I suppose it depends who the audience for creating the shell scripts is. Is it people with zero knowledge of scripting (and yet somehow know how to use git and GitHub?), or people who are familiar with bash scripts and would be comfortable reading the source? As for core deps in every script: sure, but everyone is already copy-pasting a bunch of other functions without necessarily knowing what they do (they could be installing packages, and indeed for Alpine there is a package installation of bash even before the install script is downloaded).
I'm in no way against your idea here, don't get me wrong. I actually like it. That could and should absolutely be only one function call, in my opinion. But I don't think there should be any more functions than the necessary things we need as boilerplate.
Wouldn't it be possible to completely invert the relationship between the 'helper functions' in build.func and install.func? For example:

```bash
#!/usr/bin/env bash
source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/installer.sh)
...
APP_NAME="my-app"
APT_DEPENDENCIES=(curl gcc)

function app_pre_install() {
  # Do things before the actual app install, optional
  mkdir /opt/$app
}

function app_install() {
  # Do all commands to install the app itself, not optional
  version=$(curl github.com/repo/releases | ...)
  curl -sL github.com/repo/releases/app-$version.tar.gz
  ...
}

function app_post_install() {
  # Clean up after the installation or something, optional
  rm $version.tar.gz
}

function app_pre_update() {
  # Do things before the actual app update, optional
  app_pre_install
}

function app_update() {
  # Do all commands to update the app itself, optional
  # If this function is missing, one can assume the app is
  # non-updateable and show a message with that info
  mv /download /opt/app
}

function app_post_update() {
  # Clean up after the update or something, optional
  app_post_install
}

# Call the kick-off function
cs_run
```

The function cs_run could then look like:

```bash
function cs_run() {
  header_info "$APP"
  base_settings
  variables
  color
  catch_errors
  if command -v pveversion >/dev/null 2>&1; then
    show_menus_etc
    setting_up_container
    network_check
    update_os
    install_core_packages
    apt install "${APT_DEPENDENCIES[@]}"
    command -v app_pre_install >/dev/null 2>&1 && app_pre_install
    # Do stuff?
    app_install
    # Do stuff?
    command -v app_post_install >/dev/null 2>&1 && app_post_install
  else
    if command -v app_update >/dev/null 2>&1; then
      ask_for_confirmation
      command -v app_pre_update >/dev/null 2>&1 && app_pre_update
      # Do stuff?
      check_container_storage
      check_container_resources
      app_update
      # Do stuff?
      command -v app_post_update >/dev/null 2>&1 && app_post_update
    else
      echo "The app cannot be updated because the app_update function does not exist."
    fi
  fi
}
```
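The optional-hook pattern used in that sketch (run a function only if the app script defined it) can be isolated into a small helper; the names here are illustrative, not actual ProxmoxVE helpers:

```shell
#!/usr/bin/env bash
# Run a hook function only if the calling script defined it; return 0
# silently otherwise. (Illustrative names, not repo helpers.)
run_hook() {
  local hook="$1"
  if command -v "$hook" >/dev/null 2>&1; then
    "$hook"
  fi
}

app_pre_install() {        # defined by the app script
  echo "pre-install ran"
}

run_hook app_pre_install   # prints "pre-install ran"
run_hook app_post_install  # silently skipped: not defined
```

Note that `command -v` finds shell functions as well as binaries, which is why the sketch above uses it for the optional app_* hooks.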
Yes :) I was thinking the same thing, but figured I'd start small by removing a bunch of duplicated code. To me, that's a good end goal, and one that can be reached in stages (or in a big bang, but there's enough churn on the repository that stages make sense to me) with incremental PRs that do things like "add the pre-install function and use it directly in the script". So, to take a simple example:

```bash
# shbang
APP="app"
PACKAGE_DEPENDENCIES=(tzdata)

function app_pre_install() {
  mkdir /opt/${APP}/data -p
}

color
verb_ip6
catch_errors
setting_up_container
network_check
update_os
install_core_packages
app_pre_install
# everything else inline
RELEASE=$(curl... | awk '/tag_name/ {...}')
...
# (or apt install, whatever)
# config file setup
# systemd setup
```

then becomes

```bash
# shbang
APP="app"
PACKAGE_DEPENDENCIES=(tzdata)
INSTALL_DIR="/opt/${APP}"
DATA_DIR="${INSTALL_DIR}/data"

function app_pre_install() {
  mkdir "${DATA_DIR}" -p
}

function app_install() {
  # Downloads using the tagname extraction, unpacks, moves to install dir?
  cs_download_github https://github/repo/releases/latest "${INSTALL_DIR}"
}

color
verb_ip6
catch_errors
setting_up_container
network_check
update_os
cs_install_core_packages
app_pre_install
app_install
# everything else inline
# config file setup
# systemd setup
```

After several iterations the file is basically "define the functions, call them directly", and the final iteration is "define the functions, call cs_run", where cs_run has been set up similar to your proposal. It's more work in some respects, but it's also incremental progress that's easier to test, I think.
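A helper like the cs_install_core_packages used above might be sketched roughly as follows; the core package list and the function body are assumptions for illustration, not the repo's actual implementation:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: install a shared core set plus the script's own
# PACKAGE_DEPENDENCIES, picking the package manager by what's on PATH.
# Package list and function body are assumptions, not the real helper.
cs_install_core_packages() {
  local core=(curl sudo mc)
  local pkgs=("${core[@]}" "${PACKAGE_DEPENDENCIES[@]}")
  if command -v apk >/dev/null 2>&1; then
    apk add --no-cache "${pkgs[@]}"
  else
    apt-get install -y "${pkgs[@]}"
  fi
}

# Usage (requires root inside the container):
#   PACKAGE_DEPENDENCIES=(tzdata)
#   cs_install_core_packages
```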
Exactly :)
Well, you folks are the stewards of this project. I'm up for doing iterations of work to start encapsulating individual pieces of work in the scripts into functions, with the install files calling the functions declared in the same file; it's work that can be picked up by other folks if I vanish (not saying I will, but $dayjob can consume a lot of my mental energy), so long as there's an agreed path/plan for how the project wants to go from scripts that inline "everything" to scripts that declare components of work in functions and use a runner to invoke those functions in the correct order.
Thank you for your work, but it will take some time until all contributors/maintainers have had a chance to take a look and discuss it.
Oh, I'm not saying I need an answer "now"; happy to leave this here, and when you've all had a chance to discuss the idea, let me know. I haven't done more on this stack since posting the initial PR. If you want all contributors to the project to have a read through, well, it'll be a while, but I'll wait :)
Fine for me. One thing: we need to update all 220 scripts too :D
@cricalix Can you perhaps come up with some code and one or two scripts updated to incorporate this system? I'd create a branch where you could put these changes for testing.
I don't mind doing the work in stages rather than in one fell swoop; consider it a slowish refactor where, at each point, everything still works, and by the end everything, including documentation, is migrated (assuming the contributing guide lands at some point soon). As @michelroegl-brunner suggests, I can come up with a few scripts that get the total treatment in a branch, and if it looks good, I can plan out an iterative approach to update all the scripts. In terms of merge conflicts and the ability to split the load, perhaps a PR per script; that will be a lot of PRs, but each one is isolated and easy to revert if things break. I think I'd want the overrideable URL in place too before starting down this path, because it will make testing so much easier: I can spin up a Proxmox VM in QEMU, clone the repository, and test from there.
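The "overrideable URL" idea mentioned here could take the shape of an environment-variable default; the variable name CS_REPO_URL is made up for illustration and is not an existing repo convention:

```shell
#!/usr/bin/env bash
# Let an environment variable redirect helper downloads to a fork or a
# local clone, falling back to upstream. CS_REPO_URL is a hypothetical
# name, not an existing ProxmoxVE convention.
: "${CS_REPO_URL:=https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main}"

echo "Fetching helpers from: ${CS_REPO_URL}"
# e.g. source <(curl -fsSL "${CS_REPO_URL}/misc/install.func")
```

With this pattern, `CS_REPO_URL=file:///root/ProxmoxVE ./ct/alpine.sh` would point every fetch at a local checkout, while an unset variable keeps the upstream behaviour.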
Fine by me. I suggest making some small changes like you have in this PR, but with the bigger goal in mind, including your URLs and one or two changed install scripts for testing, then further iteration. When it works, we should be able to merge it relatively easily thanks to the overridable URLs. I'll create a develop branch for this tomorrow. And the docs should land soon (hopefully next week).
@cricalix: core_dependencies_to_functions branch for you. |
✍️ Description
Every single install script installs a consistent set of additional packages, and every author of the scripts has to copy/paste the relevant `apt` or `apk` lines into their script. For example, across all three base distros (Alpine, Debian, Ubuntu), `sudo`, `mc`, and `curl` are installed, and on every Alpine image, `openssh`, `newt`, and `nano` are also installed.

I think it makes sense to consolidate these manual installations into a single function for several reasons:

In this PR, I'm adding a new function, `install_core_packages`, to both `install.func` and `install-alpine.func`, and replacing the manual dependency installation across a handful of scripts with the function for testing purposes. If this PR is acceptable, I'll do the work to update the other 200+ installation scripts.
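A minimal sketch of what such a function might look like in each .func file, using only the package lists named in the description above; the function bodies are assumptions for illustration, not the actual PR diff:

```shell
#!/usr/bin/env bash
# Sketch only: core package installation per base distro, mirroring the
# package lists described above. Not the actual PR implementation.

# Debian/Ubuntu variant (install.func)
install_core_packages() {
  apt-get install -y curl sudo mc
}

# Alpine variant (install-alpine.func): the same core set plus the
# packages every Alpine image additionally installs
install_core_packages_alpine() {
  apk add --no-cache curl sudo mc openssh newt nano
}
```

Each install script would then replace its copy/pasted package lines with a single call to the appropriate function.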
🛠️ Type of Change
Please check the relevant options:
✅ Prerequisites
The following steps must be completed for the pull request to be considered:
Documentation updated (I have updated any relevant documentation)

📋 Additional Information (optional)
I ran a locally modified checkout that replaced the URLs in ct/alpine.sh and misc/build.func with pointers to my repository, and ran the Alpine installer script.