Adding svn validations basic flow #1

Closed · wants to merge 25 commits

Changes from 6 commits

Commits (25):
10d2269 adding svn validations basic flow (gopidesupavan, Dec 1, 2024)
f77a114 update names (gopidesupavan, Dec 1, 2024)
164a1d1 remove echo (gopidesupavan, Dec 1, 2024)
ca41ee7 update tests (gopidesupavan, Dec 1, 2024)
add8003 adding signature check (gopidesupavan, Dec 1, 2024)
9c8703a fix workspace name (gopidesupavan, Dec 1, 2024)
bb946f5 adding subsection actions (gopidesupavan, Dec 7, 2024)
0779890 add unit tests for svn check (gopidesupavan, Dec 7, 2024)
94b2215 add unit tests to check-sum (gopidesupavan, Dec 7, 2024)
7eea894 add pypi publish and tests (gopidesupavan, Dec 8, 2024)
bf17e47 update variables (gopidesupavan, Dec 8, 2024)
9db2dc5 update paths (gopidesupavan, Dec 8, 2024)
c43752f update repo path (gopidesupavan, Dec 8, 2024)
b9d3465 add initial readme file (gopidesupavan, Dec 8, 2024)
cce1d2f add tests to signature check (gopidesupavan, Dec 9, 2024)
d180e37 add python-gnupg module to tests (gopidesupavan, Dec 9, 2024)
45e96b9 add requests module to tests (gopidesupavan, Dec 9, 2024)
03e48d6 use pytest-unordered (gopidesupavan, Dec 9, 2024)
8025fa5 use pytest-unordered (gopidesupavan, Dec 9, 2024)
6e1ec93 log updates (gopidesupavan, Dec 9, 2024)
481a1b6 adding doc string and fixing lints (gopidesupavan, Dec 9, 2024)
1edb0d4 remove fromjson for non json outputs (gopidesupavan, Dec 10, 2024)
6883637 update sample workflow usage as per best practice suggestion (gopidesupavan, Dec 10, 2024)
0259bcf update sample workflow usage as per best practice suggestion (gopidesupavan, Dec 10, 2024)
5af7723 rename action (gopidesupavan, Dec 12, 2024)
21 changes: 21 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,21 @@
name: Test Gh publish
on:
  workflow_dispatch:
  pull_request:
    branches:
      - main
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v3

      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.9'
      - name: "Run tests"
        run: |
          python3 -m pip install pytest
          pytest ./tests
2 changes: 2 additions & 0 deletions .gitignore
@@ -0,0 +1,2 @@
__pycache__/
*./__pycache__/.*
72 changes: 71 additions & 1 deletion README.md
@@ -1 +1,71 @@
# gh-svn-pypi-publisher
# GH SVN PyPI Publisher

## Description
`Gh Publish` is a GitHub Action designed to validate release artifacts and publish them to PyPI. It includes steps for setting up Python, parsing the configuration file, checking out the SVN repository, performing SVN, checksum, and signature checks, and publishing to PyPI.

## Inputs
- `publish-config` (required): Path to the publish config file. Default is `publish-config.yml`.
- `temp-dir` (optional): Temporary directory to checkout SVN repo. Default is `temp-svn-repo`.
- `mode` (optional): Mode to run the action. Default is `verify`.

## Usage
To use this action, include it in your workflow YAML file:

```yaml
name: Publish to PyPI

on:
  workflow_dispatch:

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Gh Publish
        uses: ./gopidesupavan/gh-pub@main
        with:
          publish-config: 'path/to/your/publish-config.yml'
          temp-dir: 'temp-svn-repo'
          mode: 'publish'
```
## About publish-config.yml

The `publish-config.yml` file is composed of multiple rules used to validate the artifacts and publish them to PyPI. The configuration file is structured as follows:

```yaml
project:
  Name: example-pub
  Description: Example project for publishing to PyPI

publishers:
  name: providers
  url: https://dist.apache.org/repos/dist/dev/airflow/
  path: "airflow/providers/pypi"
  version_pattern: '^(.*?)-(rc\d+-)?\d+'
  extensions:
    - .tar.gz
    - .tar.gz.asc
    - .tar.gz.sha512
    - -py3-none-any.whl
    - -py3-none-any.whl.asc
    - -py3-none-any.whl.sha512
  rules:
    svn-check:
      name: "SVN Check"
      type: "svn"
      enabled: "false"
    checksum-check:
      name: "SHA512 Check"
      type: "512"
      enabled: "true"
      script: "/home/runner/work/example-pub/example-pub/scripts/checksum_check.sh"
```

Owner Author: This config file is required to be configured in the repos that use the action.

Owner Author: The plan is to check out the project and use the path field to work out the validations.
svn-check: This rule validates the package file extensions. It checks whether each package has all of the required extensions, e.g. .tar.gz, .tar.gz.asc, .tar.gz.sha512, -py3-none-any.whl, -py3-none-any.whl.asc, -py3-none-any.whl.sha512; six extensions are required in total.

checksum-check: This rule validates the checksum of each package against the configured checksum type, e.g. a SHA512 checksum is required for each package.

script: The script used to validate a rule. If no script is provided in the publish-config.yml file, the default script is used; the default scripts live under the src/scripts directory.
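For illustration only (not part of this PR's diff), here is a minimal Python sketch of the kind of verification the checksum-check rule describes, assuming each artifact has a `<name>.sha512` companion in `sha512sum` format (hex digest followed by the file name); the function name is hypothetical:

```python
import hashlib
import pathlib
import sys


def verify_sha512(directory: str = ".") -> int:
    """Return 0 if every *.sha512 companion matches its artifact, else 1."""
    exit_code = 0
    for sha_file in pathlib.Path(directory).glob("*.sha512"):
        artifact = sha_file.with_suffix("")  # e.g. pkg.tar.gz.sha512 -> pkg.tar.gz
        expected = sha_file.read_text().split()[0]  # first field is the hex digest
        actual = hashlib.sha512(artifact.read_bytes()).hexdigest()
        if actual != expected:
            print(f"Checksum mismatch for {artifact.name}")
            exit_code = 1
        else:
            print(f"Checksum OK for {artifact.name}")
    return exit_code


if __name__ == "__main__":
    sys.exit(verify_sha512())
```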
84 changes: 83 additions & 1 deletion action.yml
@@ -1,6 +1,18 @@

name: 'GH SVN PyPI Publisher'
description: 'Publishes artifacts to pypi'
inputs:
  publish-config:
    description: 'Path to the publish config file'
    required: true
    default: 'publish-config.yml'
  temp-dir:
    description: 'Temporary directory to checkout svn repo'
    required: false
    default: 'temp-svn-repo'
  mode:
    description: 'Mode to operate publish or verify'
    required: false
    default: 'verify'

runs:
  using: "composite"
@@ -9,3 +21,73 @@ runs:
      uses: actions/setup-python@v4
      with:
        python-version: '3.9'

- name: "Config parser"
shell: bash
id: config-parser
env:
PUBLISH_CONFIG: ${{ inputs.publish-config }}
GITHUB_ACTION_PATH: ${{ github.action_path }}
run: |
mkdir -p ${{ inputs.temp-dir }}
python3 -m pip install pyyaml
python3 $GITHUB_ACTION_PATH/src/scripts/config_parser.py "${PUBLISH_CONFIG}"

- name: Checkout SVN
shell: bash
env:
repo_url: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.url }}
run: |
echo "Checking out SVN repo at $repo_url"
svn co $repo_url
echo "SVN repo checked out"
working-directory: "./${{ inputs.temp-dir }}"

- name: SVN check
shell: bash
if: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.svn-check.enabled == 'true' }}
env:
FILE_EXTENSIONS: ${{ toJson(fromJSON(steps.config-parser.outputs.pub_config).publishers.extensions) }}
PATH: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.path }}
NAME: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.svn-check.name }}
VERSION_FORMAT: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.version_pattern }}
SCRIPT: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.svn-check.script }}
working-directory: ./${{ inputs.temp-dir }}/${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.path }}
run: |
echo "Verifying $NAME"
python3 $SCRIPT

- name: Checksum check
shell: bash
if: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.checksum-check.enabled == 'true' }}
env:
NAME: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.checksum-check.name }}
type: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.checksum-check.type }}
SCRIPT: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.checksum-check.script }}
working-directory: ./${{ inputs.temp-dir }}/${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.path }}
run: |
echo "Verifying $NAME"
chmod +x $SCRIPT
$SCRIPT "$type"
Comment: I am not sure if I like this approach with $SCRIPT. I think a much better (and more composable) approach would be to have separate, very small actions for each of those scripts and compose this action out of smaller composable ones. This way we will not have to configure "scripts" in the configuration; instead we would configure which particular "actions" are enabled. While it's less flexible, it would be a better abstraction IMHO. I somehow feel that being able to provide a "script" to run is almost the same as writing your own action, whereas what we want is to provide the user a set of predefined "actions" they can use. We do not want them to write new scripts.

Owner Author: Ah, I see what you mean. That makes sense, using small, composable actions. So I can split these steps into multiple actions within the same repository. What do you think?

As for the scripts, what I initially had in mind: the default script we provide would validate the rule. If someone wants to override it and use their own custom script, they can specify the script location as a parameter in the config file. For example, in this workflow the script is sourced from the workflow repository itself: https://github.com/gopidesupavan/example-workspace/blob/main/publish-config.yml#L26.

On the other hand, for svn-check no script is explicitly defined in the workflow (https://github.com/gopidesupavan/example-workspace/blob/main/publish-config.yml#L18), so it falls back to the default script available in the action repository: https://github.com/gopidesupavan/gh-svn-pypi-publisher/blob/basic-feat/src/scripts/svn_checker.py.

Is this behavior okay to have?

@potiuk (Dec 1, 2024):

> Ah, I see what you mean. That makes sense, using small, composable actions. So I can split these steps into multiple actions within the same repository. What do you think?

Yep.

> As for the scripts, what I initially had in mind: the default script we provide would validate the rule. If someone wants to override it and use their own custom script, they can specify the script location as a parameter in the config file.

I see, but if we have small composable sub-actions, then it's even less effort to replace such a "sub-action" from "our" repo with your own "sub-action" you write, following the same pattern and writing your own hard-coded script. So in this case users get the "customizability" at the "sub-action" level, not the "script" level, which I think is much better.

@potiuk (Dec 1, 2024): Say (conceptually), our "composed" action:

Action "Publish to PyPI"
    Sub action read configuration
    Sub action verify SVN
    Sub action verify Checksum
    Sub action verify Signature
    Sub action Push To PyPI

And someone who would like to customize it would copy our action to their repo and do this:

Action "Publish to PyPI (my own)"
    Sub action read configuration
    Sub action verify SVN (my own)
    Sub action verify Checksum
    Sub action verify Signature
    Sub action Push To PyPI

Comment: Ideally the "composed" action will be very simple and easy to copy and replace parts of.

Owner Author: Yes, small composable actions will definitely help here; a better way to handle these subsections and a lot easier for me to add as well :) Great suggestion, small composable actions 🚀


    - name: Signature check
      shell: bash
      if: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.signature-check.enabled == 'true' }}
      env:
        NAME: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.signature-check.name }}
        SCRIPT: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.signature-check.script }}
        KEYS: ${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.rules.signature-check.keys }}
        KEYS_FILE_LOCATION: ${{ github.workspace }}/${{ inputs.temp-dir }}/KEYS
      working-directory: ./${{ inputs.temp-dir }}/${{ fromJSON(steps.config-parser.outputs.pub_config).publishers.path }}
      run: |
        curl -o $KEYS_FILE_LOCATION $KEYS
        gpg --import $KEYS_FILE_LOCATION
        echo "Verifying $NAME"
        chmod +x $SCRIPT
        $SCRIPT

Comment: We can download KEYS from SVN.

Comment: Ah ... I see we do :) .

    - name: Publish to PyPI
      shell: bash
      if: ${{ inputs.mode == 'publish' }}
      run: |
        echo "TBD Publishing to PyPI"
30 changes: 30 additions & 0 deletions publish-config.yml
@@ -0,0 +1,30 @@
project:
  Name: gh-svn-pypi-publisher
  Description: Example project for publishing to PyPI

publishers:
  name: providers
  url: https://gh-svn-pypi-publisher/
  path: "airflow/providers/"
  version_pattern: '^(.*?)-(rc\d+-)?\d+'
  extensions:
    - .tar.gz
    - .tar.gz.asc
    - .tar.gz.sha512
    - -py3-none-any.whl
Comment: We likely need to make this a bit more flexible and allow a regexp here. While Airflow only publishes one wheel, many other projects that have binary components will have multiple wheel files for many architectures/ABIs - https://peps.python.org/pep-0491/#file-name-convention

Owner Author: This is not hardcoded; any file extension can be provided from the config file. Sure, agree, can update with regex support. :)

    - -py3-none-any.whl.asc
    - -py3-none-any.whl.sha512
  rules:
    svn-check:
      name: "SVN Check"
      type: "svn"
      enabled: "false"
    checksum-check:
      name: "SHA512 Check"
      type: "512"
      enabled: "false"
    signature-check:
      name: "Signature Check"
      enabled: "false"
      script: "/home/runner/work/gh-svn-pypi-publisher/gh-svn-pypi-publisher/scripts/signature_check.sh"
      keys: "https://dist.apache.org/repos/dist/release/airflow/KEYS"
Empty file added src/__init__.py
Empty file.
Empty file added src/scripts/__init__.py
Empty file.
15 changes: 15 additions & 0 deletions src/scripts/checksum_check.sh
@@ -0,0 +1,15 @@
#!/bin/bash
Comment: Also, one experience from the past: while it is seemingly easier to write bash scripts in simple cases, they very quickly become too complex and difficult to understand and maintain. I'd rather rewrite all those scripts in Python and use uv to run them in GitHub Actions, with the nice support for standalone scripts that uv already has:

https://docs.astral.sh/uv/guides/scripts/#declaring-script-dependencies

This is PEP 723 - https://peps.python.org/pep-0723/ - which uv already supports, and sooner or later it will be natively supported in the Python environment (because that PEP is already approved). I think it's the right time to start using it in our automation.

Comment: The lack of these "inline dependency declarations" was for me the one reason in the past to prefer bash over Python, but now that it is there, and we have uv support for running such scripts and automatically creating and using the venv needed to do it, the last obstacle where Python scripts were not as good as bash scripts is gone. With this, I have absolutely no reason to use bash for scripting.

Comment: (One thing it allows, for example, is adding colors by declaring rich as a dependency in such an inline declaration.)

Owner Author: Yes, sure, I like Python scripts for these tasks. As this is an initial draft, I tried the existing process here. :) Agree, bash is a bit complex to understand when it grows bigger 😄


EXIT=0

for i in *.asc
do
  echo -e "Checking $i\n"; gpg --verify $i
done

if [ $EXIT -eq 1 ]; then
  echo "One or more checksums did not match."
  exit 1
else
  echo "All checksums match."
fi

Comment: Nice to have... add colors.
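As a concrete illustration of the uv / PEP 723 suggestion in the review thread above (a sketch, not part of this PR), a standalone Python script with inline dependency metadata that `uv run` can execute directly; the file name and the use of rich are hypothetical:

```python
# /// script
# requires-python = ">=3.9"
# dependencies = [
#   "rich",
# ]
# ///
"""Tiny standalone script: list *.asc files in the current directory with colored output."""
import pathlib

from rich.console import Console

console = Console()

for sig in sorted(pathlib.Path(".").glob("*.asc")):
    console.print(f"[green]Found signature file:[/green] {sig.name}")
```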
44 changes: 44 additions & 0 deletions src/scripts/config_parser.py
@@ -0,0 +1,44 @@
import os
Comment: How about declaring the config structure via a JSON schema (and adding inline requirements to the script to do so)?

I think in this case, rather than parsing the file with a dedicated parser, we could add validation in the schema and convert the .yml file into outputs in a generic way:

x:
   y:
     z

=> outputs.x.y.z

Owner Author: Sure, that's possible, we can do that, and that also makes it easy to extract the outputs in other steps :)

Owner Author (Dec 2, 2024): I was scratching my head to settle on a config structure. With the subsection approach we don't need to bother about script handling, which is a really good thing and simplifies it :) How about this config structure? Anyone who uses these actions would have to send an input config like this to the action. Any suggestions :) ?

project:
    Name: some-project
    Description: Example project for publishing to PyPI

publisher:
    name: providers
    url: https://dist.apache.org/repos/dist/release/airflow/
    path: "airflow/providers/"

checks:
  svn-check:
    - id: "extension-check"
      description: "Extension check"
      identifiers:
        - type: "regex"
          pattern: '.*'

    - id: "package-name-check"
      description: "Package name check"
      identifiers:
          - type: "regex"
            pattern: 'apache-airflow-providers-(.*?)-python3-none-any.whl'

  checksum-check:
    - id: "checksum-check"
      description: "SHA512 Check"
      algorithm: "512"

  signature-check:
    - id: "signature-check"
      description: "Signature Check"
      method: "gpg"
      keys: "https://dist.apache.org/repos/dist/release/airflow/KEYS"

  publish:
    - id: "publish"
      description: "Publish to PyPI"
      exclude:
        - ".*\\.asc"
        - ".*\\.sha512"

Owner Author: I believe we need one more check in svn to validate the file count per package? E.g. our repo has:

The following files should be present (6 files):

.tar.gz + .asc + .sha512 (one set of files per provider)
-py3-none-any.whl + .asc + .sha512 (one set of files per provider)

Comment:
  • Config looks good!
  • I'd say we should just check that each file has .asc and .sha512 - we do not need to fully verify the number of files; just making sure that all of them are signed and checksummed, and that they are right, should be enough, so likely what we have in the "checksum" and "signature" checks will be enough.

Owner Author: Cool :) Will use the same config then and use JSON Schema for validation.

import sys
import json
import yaml

DEFAULT_SVN_CHECKER_SCRIPT = "{github_action_path}/src/scripts/svn_checker.py"
DEFAULT_CHECK_SUM_SCRIPT = "{github_action_path}/src/scripts/checksum_check.sh"
DEFAULT_SIGNATURE_CHECK_SCRIPT = "{github_action_path}/src/scripts/signature_check.sh"


def set_default_config(config_data: dict):
    svn_checker_script = config_data.get("publishers", {}).get("rules", {}).get("svn-check", {}).get("script")
    if not svn_checker_script:
        config_data["publishers"]["rules"]["svn-check"]["script"] = DEFAULT_SVN_CHECKER_SCRIPT.format(github_action_path=os.environ.get("GITHUB_ACTION_PATH"))

    check_sum_script = config_data.get("publishers", {}).get("rules", {}).get("checksum-check", {}).get("script")
    if not check_sum_script:
        config_data["publishers"]["rules"]["checksum-check"]["script"] = DEFAULT_CHECK_SUM_SCRIPT.format(github_action_path=os.environ.get("GITHUB_ACTION_PATH"))

    signature_check_script = config_data.get("publishers", {}).get("rules", {}).get("signature-check", {}).get("script")
    if not signature_check_script:
        config_data["publishers"]["rules"]["signature-check"]["script"] = DEFAULT_SIGNATURE_CHECK_SCRIPT.format(github_action_path=os.environ.get("GITHUB_ACTION_PATH"))

    return config_data


def parse_config(path: str):
    with open(path, 'r') as file:
        config_data = yaml.safe_load(file)

    updated_config_data = set_default_config(config_data)

    def set_multiline_output(name, updated_data):
        with open(os.environ['GITHUB_OUTPUT'], 'a') as f:
            value = json.dumps(updated_data)
            f.write(f'{name}={value}')

    set_multiline_output("pub_config", updated_config_data)


if __name__ == '__main__':
    config_path = sys.argv[1]
    parse_config(config_path)
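For illustration of the review suggestion above (validate the config with a JSON Schema and expose it generically as outputs), a hedged Python sketch; the schema fragment is made up, jsonschema and pyyaml are assumed to be installed, and whether GITHUB_OUTPUT accepts dotted output names is left aside here:

```python
import json
import os
import sys

import jsonschema
import yaml

# Hypothetical, deliberately minimal schema fragment - a real one would mirror publish-config.yml.
SCHEMA = {
    "type": "object",
    "required": ["project", "publishers"],
    "properties": {
        "project": {"type": "object", "required": ["Name"]},
        "publishers": {"type": "object", "required": ["url", "path"]},
    },
}


def flatten(data, prefix=""):
    """Yield (dotted_name, value) pairs, e.g. x.y.z -> ('x.y.z', value)."""
    if isinstance(data, dict):
        for key, value in data.items():
            new_prefix = f"{prefix}.{key}" if prefix else key
            yield from flatten(value, new_prefix)
    else:
        yield prefix, data


def main(path: str) -> None:
    with open(path) as f:
        config = yaml.safe_load(f)
    jsonschema.validate(config, SCHEMA)  # raises ValidationError on a bad config
    # Dots used for readability; a real implementation might need another separator for output names.
    with open(os.environ["GITHUB_OUTPUT"], "a") as out:
        for name, value in flatten(config):
            out.write(f"{name}={json.dumps(value)}\n")


if __name__ == "__main__":
    main(sys.argv[1])
```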
19 changes: 19 additions & 0 deletions src/scripts/signature_check.sh
@@ -0,0 +1,19 @@
#!/bin/bash

EXIT=0

for i in *.asc
do
  echo -e "Checking $i\n"
  if ! gpg --verify $i 2>&1 | grep -q "Good signature"; then
    echo "Signature check failed for $i"
    EXIT=1
  fi
done

if [ $EXIT -eq 1 ]; then
  echo "One or more signature checks did not match."
  exit 1
else
  echo "All signature checks match."
fi
73 changes: 73 additions & 0 deletions src/scripts/svn_checker.py
@@ -0,0 +1,73 @@
import ast
import os
import re
import sys
from collections import Counter

unknown_files = []
unknown_file_extensions = []
valid_files = []
failed_count_check = []


def check_extension(file, extensions):
    for extension in extensions:
        if file.endswith(extension):
            print(f"File extension {extension} matched with {file}")
            return True
    return False


def extract_name(file_name_pattern, file):
    match = re.match(file_name_pattern, file)
    if match:
        name_before_version = match.group(1)
        return name_before_version
    return None


def validate_package_name_count(file_with_count, extensions):
    for package_name, count in file_with_count.items():
        if not (count == len(extensions)):
            failed_count_check.append(f"package name: {package_name}, count: {count}, expected count: {len(extensions)}")


def check_files(version_pattern: str, extensions: list[str]):
    exit_code = 0
    files = os.listdir()
    print(f"Found total files in {os.getcwd()}: ", len(files))
    for file in files:
        if not check_extension(file, extensions):
            unknown_file_extensions.append(file)
            continue

        package_name = extract_name(version_pattern, file)
        if not package_name:
            unknown_files.append(file)
            continue

        # Normalize so that we count against the same package name, e.g. apache-airflow and apache_airflow in the dist folder
        valid_files.append(package_name.replace("-", "_"))

    file_with_count = Counter(valid_files)
    validate_package_name_count(file_with_count, extensions)

    if failed_count_check:
        print(f"Following packages are not matching the count: {failed_count_check}")
        exit_code = 1

    if unknown_files:
        exit_code = 1
        print(f"Following files are not matching the pattern: {unknown_files}")

    if unknown_file_extensions:
        exit_code = 1
        print(f"Following files are not matching the extensions: {unknown_file_extensions}")

    if exit_code == 0:
        print("SVN check passed successfully.")

    sys.exit(exit_code)


if __name__ == "__main__":
    file_extensions = ast.literal_eval(os.environ.get("FILE_EXTENSIONS"))
    version_format = os.environ.get("VERSION_FORMAT")
    check_files(version_format, file_extensions)
Empty file added tests/__init__.py
Empty file.