
Move core dependencies to a function that templates can call #1319

Open
wants to merge 3 commits into main

Conversation


@cricalix cricalix commented Jan 7, 2025

✍️ Description

Every single install script installs a consistent set of additional packages, and every author of the scripts has to copy/paste the relevant apt or apk lines into their script. For example, across all three base distros (Alpine, Debian, Ubuntu), sudo, mc, and curl are installed, and on every Alpine image, openssh, newt, and nano are also installed.

I think it makes sense to consolidate these manual installations into a single function for several reasons:

  1. Adding a new default/core package becomes a case of updating a maximum of two files (for instance, making gnupg a default/core package)
  2. Reduction in boilerplate across all scripts - complete removal in some cases, replaced by a single line

In this PR, I'm adding a new function - install_core_packages - to both install.func and install-alpine.func, and replacing the manual dependency installation across a handful of scripts with the function for testing purposes.

If this PR is acceptable, I'll do the work to update the other 200+ installation scripts.
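For illustration, a minimal sketch of what such a function could look like. The function name comes from this PR; the package lists are taken from the description above, and branching on `apk` availability is an assumption, not necessarily how the actual diff is structured:

```shell
#!/usr/bin/env bash
# Hypothetical sketch only; the real install.func / install-alpine.func
# implementations may differ.

core_packages() {
  # Base set installed on all three distros (Alpine, Debian, Ubuntu)
  local pkgs=(sudo mc curl)
  # Alpine images additionally get openssh, newt, and nano
  if command -v apk >/dev/null 2>&1; then
    pkgs+=(openssh newt nano)
  fi
  printf '%s\n' "${pkgs[@]}"
}

install_core_packages() {
  if command -v apk >/dev/null 2>&1; then
    apk add --no-cache $(core_packages)
  else
    apt-get install -y $(core_packages)
  fi
}
```

With this split, adding a new default package is a one-line change to `core_packages` (or its per-distro equivalent) rather than an edit to every install script.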


🛠️ Type of Change

Please check the relevant options:

  • Bug fix (non-breaking change that resolves an issue)
  • New feature (non-breaking change that adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change unexpectedly)
  • New script (a fully functional and thoroughly tested script or set of scripts)

✅ Prerequisites

The following steps must be completed for the pull request to be considered:

  • Self-review performed (I have reviewed my code to ensure it follows established patterns and conventions.)
  • Testing performed (I have thoroughly tested my changes and verified expected functionality.)
  • Documentation updated (I have updated any relevant documentation)

📋 Additional Information (optional)

I ran a locally modified checkout that replaced the URLs with pointers to my repository in ct/alpine.sh, misc/build.func, and ran the Alpine installer script.

    ___    __      _          
   /   |  / /___  (_)___  ___ 
  / /| | / / __ \/ / __ \/ _ \
 / ___ |/ / /_/ / / / / /  __/
/_/  |_/_/ .___/_/_/ /_/\___/ 
        /_/                   
  ⚙️  Using Default Settings on node pve
  🖥️  Operating System: alpine
  🌟  Version: 3.20
  📦  Container Type: Unprivileged
  💾  Disk Size: 0.1GB
  🧠  CPU Cores: 1
  🛠️  RAM Size: 512MiB
  🆔  Container ID: 112
  🚀  Creating a Alpine LXC using the above default settings
  
  ✔️  Using local for Template Storage.
  ✔️  Using local-lvm for Container Storage.
  ✔️  Updated LXC Template List
  ✔️  LXC Container 112 was successfully created.
  ✔️  Started LXC Container
  ✔️  Set up Container OS
  ✔️  Network Connected: 192.168.0.129
  ✔️  Internet Connected
  ✔️  DNS Resolved github.com to 4.208.26.197
  ✔️  Updated Container OS
  ✔️  Installed Core Packages
  ✔️  Customized Container
  ✔️  Completed Successfully!
alpine:~# apk list | grep -E '^(sudo|mc|newt|nano|curl|openssh)-[0-9]'
curl-8.11.1-r0 x86_64 {curl} (curl) [installed]
mc-4.8.32-r0 x86_64 {mc} (GPL-3.0-or-later) [installed]
nano-8.2-r0 x86_64 {nano} (GPL-3.0-or-later) [installed]
newt-0.52.24-r1 x86_64 {newt} (LGPL-2.0-only) [installed]
openssh-9.9_p1-r2 x86_64 {openssh} (SSH-OpenSSH) [installed]
sudo-1.9.16_p2-r0 x86_64 {sudo} (custom ISC) [installed]

@cricalix cricalix requested a review from a team as a code owner January 7, 2025 19:07
@github-actions github-actions bot added the labels update script (a change that updates a script) and high risk (a change that can affect many scripts) Jan 7, 2025
@cricalix cricalix changed the title Move core dependencies Move core dependencies to a function that templates can call Jan 7, 2025
@michelroegl-brunner
Member

I like the idea, but if we consider this, I think it should either move into the update_os function, or we keep it as is.
I see the upside of this; it is a neat idea, but when there are too many function calls it will get cluttered and confusing for new scripters, I think.
Also, with the core deps in every script, everyone knows which software gets installed without looking up the function.

@cricalix
Author

cricalix commented Jan 7, 2025

I like the idea, but if we consider this, I think it should either move into the update_os function, or we keep it as is. I see the upside of this; it is a neat idea, but when there are too many function calls it will get cluttered and confusing for new scripters, I think. Also, with the core deps in every script, everyone knows which software gets installed without looking up the function.

There's no reason why update_os couldn't call install_core_packages, separating out the logic just that tiny bit.
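A sketch of that delegation, with stand-in function bodies (the real apt-get calls appear only in comments, since these are illustrative stubs, not the project's actual code):

```shell
#!/usr/bin/env bash
# Illustrative only: update_os delegating to install_core_packages.

install_core_packages() {
  # Real version would run e.g.: apt-get install -y sudo mc curl
  echo "installing core packages"
}

update_os() {
  # Real version would first run: apt-get update && apt-get -y dist-upgrade
  echo "updating OS"
  install_core_packages
}

update_os
```

Scripts would keep calling update_os exactly as before; the core-package installation just becomes part of that step.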

I think the entire preamble that's in every script could be moved to one function (call it setup or whatever); script authors would just call "setup", add their own package dependencies, then any custom scripting for things like GitHub fetches, and finally one function like "cleanup" to do all the cleanup. I might go so far as to say a templated script could look like:

#!/usr/bin/env bash
...
PACKAGED_DEPENDENCIES=(package1 package2 package3 app-package-name)
# cs prefix to indicate it's supplied by the community scripts project
cs_setup
cs_add_package_repository http://something.org/debian/appname
cs_install_packaged_dependencies  # a function that uses the dependencies array
# -- start custom install logic --
version=$(curl github.com/repo/releases | ...)
curl -sL github.com/repo/releases/app-$version.tar.gz
# unpack
# set up configs etc
# -- end custom install logic --
cs_cleanup

There is a balance to be found between making everyone copy-paste everything needed to get a container up, and hiding so much that no one knows what's happening in the background. I suppose it depends who the audience for creating the shell scripts is. Is it people with zero knowledge of scripting (who somehow know how to use git and GitHub?), or people who are familiar with bash scripts and would be comfortable reading the setup function in build.func to see that it does

color
verb_ip6
catch_errors
setting_up_container
network_check
update_os
install_core_packages

?

As for core deps in every script - sure, but everyone is already copy-pasting a bunch of other functions without necessarily knowing what they do (they could be installing packages, and indeed for Alpine there is a package installation of bash even before the install script is downloaded).

@michelroegl-brunner
Member

I'm in no way against your idea here, don't get me wrong. I actually like it.
This:

color
verb_ip6
catch_errors
setting_up_container
network_check
update_os
install_core_packages

could and should absolutely be only one function call, in my opinion. But I don't think there should be any more functions than the necessary boilerplate.
About this part (cs_add_package_repository http://something.org/debian/appname and cs_install_packaged_dependencies, the function that uses the dependencies array):
There is already a PR/discussion which focuses on this: LINK

@se-bastiaan
Contributor

se-bastiaan commented Jan 8, 2025

Wouldn't it be possible to completely invert the relationship between the 'helper functions' in build.func and install.func?
That way the script that's contributed doesn't have to manage the installation lifecycle, only override hooks that are provided.

For example:

#!/usr/bin/env bash
source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/installer.sh)
...
APP_NAME="my-app"
APT_DEPENDENCIES=(curl gcc)

function app_pre_install() {
    # Do things before actual app install, optional
    mkdir -p "/opt/$APP_NAME"
}

function app_install() {
    # Do all commands to install the app itself, not optional
    version=$(curl github.com/repo/releases | ...)
    curl -sL github.com/repo/releases/app-$version.tar.gz
    ...
}

function app_post_install() {
    # Cleanup after the installation or something, optional
    rm $version.tar.gz
}

function app_pre_update() {
    # Do things before actual app update, optional
    app_pre_install
}

function app_update() {
    # Do all commands to update the app itself, optional
    # If this function is missing one can assume that the app is non-updateable and a message with that info can be shown
    mv /download /opt/app
}

function app_post_update() {
    # Cleanup after the update or something, optional
    app_post_install
}

# Call the kick-off function
cs_run

The cs_run is the one thing that then kicks off the installation. That function could contain the lifecycle management and it is outside of a user-script. In basic:

function cs_run() {
  header_info "$APP"
  base_settings
  variables
  color
  catch_errors

  if command -v pveversion >/dev/null 2>&1; then
    show_menus_etc
    setting_up_container
    network_check
    update_os
    install_core_packages
    apt-get install -y "${APT_DEPENDENCIES[@]}"
    command -v app_pre_install >/dev/null 2>&1 && app_pre_install
    # Do stuff?
    app_install
    # Do stuff?
    command -v app_post_install >/dev/null 2>&1 && app_post_install
  else
    if command -v app_update >/dev/null 2>&1; then
      ask_for_confirmation
      command -v app_pre_update >/dev/null 2>&1 && app_pre_update
      # Do stuff?
      check_container_storage
      check_container_resources
      app_update
      # Do stuff?
      command -v app_post_update >/dev/null 2>&1 && app_post_update
    else
      echo "The app cannot be updated because the app_update function does not exist."
    fi
  fi
}
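The optional-hook checks in cs_run above could also be factored into a small helper. This is a standalone illustration of the pattern; run_hook is a hypothetical name, not something proposed in the PR:

```shell
#!/usr/bin/env bash
# Run a lifecycle hook only if the sourcing script defined it.
run_hook() {
  if command -v "$1" >/dev/null 2>&1; then
    "$1"
  fi
}

# A script that defines only the pre-install hook:
app_pre_install() { echo "pre-install ran"; }

run_hook app_pre_install   # function exists, so it runs
run_hook app_post_install  # not defined, silently skipped
```

`command -v` finds shell functions as well as executables, which is why it works as an "is this hook defined?" test here.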

@cricalix
Author

cricalix commented Jan 9, 2025

Yes :) I was thinking the same thing, but figured I'd start small by removing a bunch of duplicated code. To me, it's a good end goal, and one that can be reached in stages (or in a big bang, but there's enough churn on the repository that stages make sense to me) with incremental PRs that do things like "add the pre-install function and use it directly in the script". To take a simple example:

# shebang
APP="app"
PACKAGE_DEPENDENCIES=(tzdata)

function app_pre_install() {
  mkdir /opt/${APP}/data -p
}

color
verb_ip6
catch_errors
setting_up_container
network_check
update_os
install_core_packages
app_pre_install

# everything else inline
RELEASE=$(curl... | awk '/tag_name/ {...}')
...
# (or apt install, whatever)
# config file setup
# systemd setup

then becomes

# shebang
APP="app"
PACKAGE_DEPENDENCIES=(tzdata)
INSTALL_DIR="/opt/${APP}"
DATA_DIR="${INSTALL_DIR}/data"

function app_pre_install() {
  mkdir "${DATA_DIR}" -p
}

function app_install() {
  # Downloads using the tagname extraction, unpacks, moves to install dir?
  cs_download_github https://github/repo/releases/latest "${INSTALL_DIR}"
}

color
verb_ip6
catch_errors
setting_up_container
network_check
update_os
cs_install_core_packages
app_pre_install
app_install

# everything else inline
# config file setup
# systemd setup

And after several iterations the file is basically "define the functions, call them directly", and the final iteration is "define the functions, call cs_run" where cs_run has been set up similar to your proposal. It's more work in some respects, but it's also incremental progress that's easier to test I think.

@se-bastiaan
Contributor

It's more work in some respects, but it's also incremental progress that's easier to test I think.

Exactly :)

@cricalix
Author

cricalix commented Jan 9, 2025

Well, you folks are the stewards of this project - I'm up for doing iterations of work to start encapsulating individual pieces of work in the scripts into functions, with the install files calling the functions declared in the same file; it's work that can be picked up by other folks if I vanish (not saying I will, but $dayjob can consume a lot of my mental energy), so long as there's an agreed path/plan for how the project wants to go from scripts that inline "everything" to scripts that declare components of work in functions and use a runner to invoke the functions in the correct order.

@michelroegl-brunner
Copy link
Member

Thank you for your work, but it will take some time until all contributors/maintainers have had a chance to take a look and discuss it.
I think we'll keep it open for further discussion and probably merge it to a dev branch to test it if it gets approved by all :)

@cricalix
Author

cricalix commented Jan 9, 2025

Thank you for your work, but it will take some time until all contributors/maintainers have had a chance to take a look and discuss it. I think we'll keep it open for further discussion and probably merge it to a dev branch to test it if it gets approved by all :)

Oh, not saying I need an answer "now" - happy to leave this here, and when you've all had a chance to discuss the idea etc, let me know. I haven't done more on this stack since posting this initial PR.

If you want all contributors to the project to have a read through, well, it'll be a while, but I'll wait :)

@MickLesk
Member

MickLesk commented Jan 9, 2025

Fine for me. One thing: we need to update all 220 scripts too :D

@michelroegl-brunner
Member

@cricalix Can you perhaps come up with some code and one or two scripts updated to incorporate this system? I would create a branch for you to put these changes against for testing.

@cricalix
Author

cricalix commented Jan 9, 2025

Fine for me. One thing: we need to update all 220 scripts too :D

I don't mind doing the work in stages, rather than in one fell swoop - consider it a slowish refactor where at each point everything still works, and by the end everything including documentation is migrated (assuming the contributing guide lands at some point soon).

As @michelroegl-brunner suggests, I can come up with a few scripts that get the total treatment in a branch, and if it looks good, I can plan out an iterative approach to update all the scripts. In terms of merge conflicts and ability to split the load, perhaps a PR per script. Will be a lot of PRs, but each one is isolated and easy to revert if things break.

I think I'd want the overridable URL in place too before starting down this path, because it'll make testing so much easier: I can spin up a Proxmox VM in QEMU, clone the repository, and run CSCRIPTS_BASE_URL=file:///path/to/repo ct/script.sh, and it'll run entirely from the local repository. If things aren't right, I can make changes and re-test without the push/pull dance.
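The override mechanism could be as simple as a default-with-fallback expansion at the top of build.func. This is a sketch: CSCRIPTS_BASE_URL is the variable name floated above, the default URL is the repository's raw path as used elsewhere in this thread, and fetch_helper is a hypothetical helper, not an existing function:

```shell
#!/usr/bin/env bash
# If CSCRIPTS_BASE_URL is set in the environment (e.g. file:///path/to/repo),
# use it; otherwise fall back to the GitHub raw URL.
CSCRIPTS_BASE_URL="${CSCRIPTS_BASE_URL:-https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main}"

fetch_helper() {
  # Fetch a helper file from wherever the base URL points
  curl -fsSL "${CSCRIPTS_BASE_URL}/misc/$1"
}
```

The `${var:-default}` expansion means nothing changes for normal users, while a developer can point every fetch at a local checkout with a single environment variable.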

@michelroegl-brunner
Member

michelroegl-brunner commented Jan 9, 2025

Fine by me. I suggest making some small changes like you have in this PR, but with the bigger goal in mind, including your URLs and one or two changed install scripts for testing and further iteration. When it works, we should be able to merge it relatively easily thanks to the overridable URLs.

I'll create a develop branch for this tomorrow.

And the docs should land soon (hopefully next week). We can iterate there as well.

@michelroegl-brunner
Member

@cricalix: core_dependencies_to_functions branch for you.
