At the moment, the Python packages required by Davai (`ial_build`, `ial_expertise`, `vortex`, `epygram`, `ecbundle`, plus `davai_taskutil`) are links to pre-installed versions on given user accounts.
Would it be better to git-clone them, or to set up a virtualenv for each experiment (and, in that case, publish these packages on PyPI)?
Summary of a discussion with @AlexandreMary during the Davai WW 2024.
Currently, the `davai-run_tests` command is expected to be run with the current working directory being an "experiment" directory, as created by the `davai-new_xp` command.
The `DAVAI-tests` directory is a clone of the repository, performed on the fly by the `davai-new_xp` command.
One option is to work with one Python virtual environment per experiment, instead of a directory containing a clone of `DAVAI-tests` and links to existing installations of the various dependencies. This virtual environment would be created by the `davai-new_xp` command:
```console
$ davai-new_xp
$ ls
dv-0002-atos_bologna@sos
$ ls dv-0002-atos_bologna@sos
bin  include  lib  lib64  pyvenv.cfg
```
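As a rough sketch of what `davai-new_xp` could do internally to set this up, the standard-library `venv` module can create such an environment programmatically (the function name and layout below are illustrative assumptions, not actual Davai code):

```python
# Sketch: per-experiment virtualenv creation, as davai-new_xp might do it.
# `create_experiment_venv` is a hypothetical helper, not part of Davai.
import venv
from pathlib import Path


def create_experiment_venv(xp_id: str, root: str = ".", with_pip: bool = True) -> Path:
    """Create a virtualenv named after the experiment id and return its path."""
    xp_path = Path(root) / xp_id
    # with_pip=True bootstraps pip inside the venv, so the Davai
    # dependencies can then be pip-installed into it; clear=True
    # recreates the environment if the directory already exists.
    venv.EnvBuilder(with_pip=with_pip, clear=True).create(xp_path)
    return xp_path
```

This would produce exactly the `bin`/`lib`/`pyvenv.cfg` layout shown above.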
If we assume the dependencies (`epygram`, `ial_build`, `ial_expertise` and `mkjob`) can be installed via pip, they would be installed within this virtual environment.
Now, instead of cloning the `DAVAI-tests` repo, we could instead pip install the `tasks` and `davai_taskutil` packages it contains, potentially merging them into one. With this package installed, a job generated by `mkjob` would be able to import the task module from the `tasks` package, e.g.
```python
import tasks.<module> as todo
```
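Since the module name is only known at job-generation time, a generated job would in practice resolve it dynamically. A minimal sketch of that resolution, assuming a pip-installed `tasks` package (the `load_task` helper is hypothetical):

```python
# Sketch: dynamic import of tasks.<module>, as a generated job could do it.
# `load_task` is an illustrative helper, not actual Davai/mkjob code.
import importlib


def load_task(module_name: str):
    """Import tasks.<module_name> and return the module object."""
    # Equivalent to `import tasks.<module_name> as todo` in a generated job,
    # but with the module name supplied as a runtime string.
    return importlib.import_module(f"tasks.{module_name}")
```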
Running a Davai experiment would then be a matter of calling `davai-run_exp` with an experiment id, or possibly the path to an experiment virtualenv:
```console
$ davai-run_exp dv-0002-atos_bologna@sos
```
This would run the `mkjob` entry point located in the virtualenv of the `dv-0002-atos_bologna@sos` experiment.
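A sketch of how `davai-run_exp` could resolve its argument and invoke that entry point, assuming `mkjob` is pip-installed inside the experiment's venv (the function and the experiments-root convention are illustrative assumptions):

```python
# Sketch: resolving an experiment id (or venv path) and running the
# mkjob entry point installed in that venv. `run_experiment` and the
# `root` convention are hypothetical, not actual davai-run_exp code.
import subprocess
from pathlib import Path


def run_experiment(xp: str, root: str = ".") -> None:
    """Run the mkjob entry point of the given experiment's virtualenv."""
    xp_path = Path(xp)
    if not xp_path.is_dir():
        # Argument is an experiment id, not a path: look it up
        # under the (assumed) experiments root directory.
        xp_path = Path(root) / xp
    # Entry points installed by pip land in the venv's bin/ directory.
    mkjob = xp_path / "bin" / "mkjob"
    subprocess.run([str(mkjob)], check=True)
```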
The above assumes that both `mkjob` and `vortex` are available as regular pip-installable packages, which is not yet the case.
Status of sub-packages on PyPI:
- `davai_taskutil`
(This issue is a duplicate of ACCORD-NWP/DAVAI-env#9)