SnAr Benchmark Tests (#60)
marcosfelt authored Jul 26, 2020
1 parent db72c83 commit 5e4539d
Showing 37 changed files with 32,602 additions and 12,187 deletions.
146 changes: 146 additions & 0 deletions .dockerignore
@@ -0,0 +1,146 @@
# Summit specific
tmp_files/
*.png
*.xlsx
*.csv
.git
data/

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/
6 changes: 3 additions & 3 deletions .github/workflows/ci.yml
@@ -24,11 +24,11 @@ jobs:
uses: abatilo/[email protected]
with:
python_version: 3.7.8
poetry_version: 1.0.3
args: install -E neptune
poetry_version: 1.0.10
args: install -E experiments
- name: Run pytest
uses: abatilo/[email protected]
with:
python_version: 3.7.8
poetry_version: 1.0.3
poetry_version: 1.0.10
args: run python -m pytest --doctest-modules --ignore=case_studies --ignore=experiments
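
For reference, a rough local equivalent of this CI step (a sketch, assuming Poetry ~1.0.10 and Python 3.7.8 are available) would be:
```
# Install summit with the "experiments" extra, then run the same pytest invocation as CI
poetry install -E experiments
poetry run python -m pytest --doctest-modules --ignore=case_studies --ignore=experiments
```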
3 changes: 3 additions & 0 deletions .gitignore
@@ -116,3 +116,6 @@ venv.bak/
.TSEMO_DATA
tmp_files
Pytest*

# Snar benchmark
.snar_benchmark
10 changes: 10 additions & 0 deletions Dockerfile
@@ -0,0 +1,10 @@
FROM python:3.7

WORKDIR /summit_user
COPY setup.py requirements.txt ./
# Have to install numpy first due to Gryffin
RUN pip install numpy==1.18.0 && pip install -r requirements.txt
COPY summit summit/
RUN pip install .
ENTRYPOINT ["python"]
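
Since the image's entrypoint is `python`, anything passed after the image name on `docker run` becomes the Python command line. A minimal usage sketch (the tag `summit:latest` and the script name are illustrative placeholders, not part of this commit):
```
docker build -t summit:latest .
# arguments after the image name are handed to the python entrypoint
docker run --rm -v `pwd`:/summit_user summit:latest my_script.py
```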

6 changes: 0 additions & 6 deletions Dockerfile.in_silico

This file was deleted.

18 changes: 0 additions & 18 deletions Dockerfile.main

This file was deleted.

23 changes: 23 additions & 0 deletions README.md
@@ -49,6 +49,29 @@ The documentation for summit can be found on the [wiki](https://github.com/susta
- All pull requests need one review.
- Tests will be run automatically when a pull request is created, and all tests need to pass before the pull request is merged.

### Docker
Sometimes it is easier to run tests inside a Docker container (e.g., on compute clusters). Here are the commands to build and run the Docker containers using the included Dockerfile. The container entrypoint is `python`, so you just need to specify the file name.

To build the container and upload it to Docker Hub:
```
docker build . -t marcosfelt/summit:latest
docker push marcosfelt/summit:latest
```
You can change the tag from `latest` to whatever is most appropriate (e.g., the branch name). I have found that building the image takes up a lot of disk space, so I have been running these commands on our private servers.
Then, to run a container, here is an example with the SnAr experiment code. The container's working directory is `/summit_user`, so we mount the current working directory into that folder. We remove the container upon finishing using `--rm` and make it interactive using `-it` (remove this flag if you just want the container to run in the background). [Neptune.ai](https://neptune.ai/) is used to track the experiments, so the API token is passed in as an environment variable. Finally, I specify the image name and tag before the Python file I want to run.
```
export token= #place your neptune token here
sudo docker run -v `pwd`/:/summit_user --rm -it --env NEPTUNE_API_TOKEN=$token summit:snar_benchmark snar_experiment_2.py
```
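For instance, a detached (background) variant of the same command might look like this (a sketch: `-d` replaces `-it`, everything else is unchanged):
```
sudo docker run -d --rm -v `pwd`/:/summit_user --env NEPTUNE_API_TOKEN=$token summit:snar_benchmark snar_experiment_2.py
```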
Singularity (for running Docker containers on the HPC):
```
export NEPTUNE_API_TOKEN=
singularity exec -B `pwd`/:/summit_user docker://marcosfelt/summit:snar_benchmark snar_experiment.py
```
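If the token is not forwarded from the host environment automatically, Singularity also exposes variables prefixed with `SINGULARITYENV_` inside the container; a sketch using the same image and script:
```
export SINGULARITYENV_NEPTUNE_API_TOKEN=$NEPTUNE_API_TOKEN
singularity exec -B `pwd`/:/summit_user docker://marcosfelt/summit:snar_benchmark snar_experiment.py
```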
### Releases
Below is the old process for building a release. In the future, we will have this automated using GitHub Actions.
20 changes: 20 additions & 0 deletions README.rst
@@ -64,6 +64,26 @@ Commit Workflow
* All pull requests need one review.
* Tests will be run automatically when a pull request is created, and all tests need to pass before the pull request is merged.

Docker
^^^^^^

Sometimes it is easier to run tests inside a Docker container (e.g., on compute clusters). Here are the commands to build and run the Docker containers using the included Dockerfile. The container entrypoint is ``python``, so you just need to specify the file name.

To build the container:

.. code-block::

   docker build . -t summit:latest

You can change the tag from ``latest`` to whatever is most appropriate (e.g., the branch name).

Then, to run a container, here is an example with the SnAr experiment code. The container's working directory is ``/summit_user``, so we mount the current working directory into that folder. We remove the container upon finishing using ``--rm`` and make it interactive using ``-it`` (remove this flag if you just want the container to run in the background). `Neptune.ai <https://neptune.ai/>`_ is used to track the experiments, so the API token is passed in as an environment variable. Finally, I specify the image name and tag before the Python file I want to run.

.. code-block::

   export token= #place your neptune token here
   sudo docker run -v `pwd`/:/summit_user --rm -it --env NEPTUNE_API_TOKEN=$token summit:snar_benchmark snar_experiment.py

Releases
^^^^^^^^

19 changes: 0 additions & 19 deletions build.yml

This file was deleted.

Empty file.