Create pipeline to push zip file with dependencies to an S3 bucket #683
Comments
Hey Constança, thanks for opening this issue. Some comments below.
I don't think there is any need to create a Buildkite pipeline, since ESF does not need to be released as part of the Elastic Stack. So feel free to keep using GitHub Actions as we already do, unless you find some benefit in moving to Buildkite.
We currently track releases with git tags, so the workflow could be triggered by the creation of a new git tag. We also track the version in version.py, which is currently updated manually. There is already a related issue about how to automate updates on this file and how to handle version bumps in general #540. I'd consider it as a preliminary task for this issue. I would also make sure that the solution is extensible enough to be able to add automated deployment to SAR as well, in a future release.
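Since releases are tracked with git tags, a tag-triggered workflow could look like the sketch below (the workflow name, tag pattern, and job layout are assumptions for illustration, not a decided design):

```yaml
# Hypothetical sketch: run the release job when a version tag is pushed
name: release-esf-zip
on:
  push:
    tags:
      - 'lambda-v*'   # assumed tag pattern; adjust to the repo's actual tagging scheme
jobs:
  build-and-upload:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # build-zip and upload-to-S3 steps would go here
```

Keeping the trigger on tags (rather than on every commit) would also leave room to add an automated SAR publish job to the same workflow later.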
This is more of an option: the current AWS Lambda Terraform module
You can use the same account where we store SAR artifacts.
Technically no, the AWS Lambda Python runtime already includes some of them (e.g. boto3). However, we should stick to what is on
Thank you @girodav for such a detailed answer. I am working on setting up a workflow on GitHub Actions like you mentioned. It seems a bit tricky to test, so I will do it in a private repository first, and then I will open a PR and link it to this issue as well as to #540. It won't be taking care of the SAR currently, but it seems easy to adapt the workflow by setting the right trigger:

```yaml
on:
  push:
    branches:
      - 'main'
    paths:
      - 'version.py'
```
Description
This issue comes from this comment thread of a PR to use Terraform to install ESF.
The current approach for the Terraform files:
The desired approach: have all dependencies in a zip file and push this to an S3 bucket.
Steps
Step 1
Create a new Buildkite pipeline in this directory.
Each version release (or commit?) triggers the creation of a new zip file with all the dependencies. This zip file needs to be pushed to an S3 bucket that will be used by customers. The S3 bucket needs to be read-only.
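As a rough sketch of this step (the bucket name, file names, and credential handling below are placeholders, not decided values), a GitHub Actions job could build and upload the zip like this:

```yaml
jobs:
  build-and-upload:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Build dependency package
        run: |
          # Install dependencies into ./package, then zip its contents
          pip install --target ./package -r requirements.txt
          cd package && zip -r ../esf-dependencies.zip .
      - name: Upload to S3
        # <BUCKET> is a placeholder; the real bucket is not decided in this issue
        run: aws s3 cp esf-dependencies.zip "s3://<BUCKET>/esf-dependencies-${GITHUB_REF_NAME}.zip"
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```

The bucket itself would be made publicly readable but not writable (e.g. via a bucket policy allowing only `s3:GetObject` to anonymous principals), so customers can download but not modify the artifacts.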
The zip file will have the structure produced by:

```shell
pip install --target ./package <REQUIREMENT>
```

Reference: https://docs.aws.amazon.com/lambda/latest/dg/python-package.html#python-package-create-dependencies
Step 2
Refactor the Terraform files so that the `aws_lambda_function` resource reads from the S3 bucket with the zip file.
Tasks
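A minimal sketch of what that refactored resource could look like (the bucket, key, handler, runtime, and role names here are assumptions for illustration, not the actual module code):

```hcl
resource "aws_lambda_function" "esf" {
  function_name = "elastic-serverless-forwarder"
  s3_bucket     = "esf-artifacts"                 # hypothetical read-only bucket name
  s3_key        = "releases/esf-dependencies.zip" # hypothetical key layout per release
  handler       = "main_aws.handler"              # placeholder handler path
  runtime       = "python3.9"
  role          = aws_iam_role.esf_lambda.arn     # assumed IAM role resource
}
```

Pointing `s3_bucket`/`s3_key` at the published artifact replaces the current in-module packaging of dependencies.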