Add a CPU check and CI #15
Conversation
Similar to #13, this adds a CPU GitHub Action that runs `pytest` on the repo. Currently there is only one test: the Llama test in `torch_xla_models`. Running that test today requires an `HF_TOKEN`, so I created a personal read-only token. #14 tracks removing the need for `HF_TOKEN`, after which I'll need to remember to invalidate the token.
```yaml
- name: Run PyTest
  run: |
    # TODO(https://github.com/AI-Hypercomputer/torchprime/issues/14): Remove and burn the token.
    export HF_TOKEN=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
```
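For context, a minimal workflow this step might live in could look like the sketch below. The file layout, trigger, runner, and checkout/setup/install steps are assumptions for illustration, not taken from this PR's diff.

```yaml
# Hypothetical surrounding workflow for the "Run PyTest" step; triggers,
# runner image, Python version, and install command are all assumptions.
name: CPU tests
on:
  pull_request:
  push:
    branches: [main]
jobs:
  cpu-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.10'
      - name: Install dependencies
        run: pip install -e .  # assumed install command
      - name: Run PyTest
        run: pytest  # the HF_TOKEN export shown in the diff excerpt would precede this
```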
I think GitHub also has a way to store secrets; we do that for the pytorch/xla repo. But it's probably not worth spending time on here, given that we are going to remove this soon.
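The secrets approach mentioned here would look roughly like the following: the token is stored as a repository secret and referenced through the `secrets` context instead of being hard-coded in the workflow file. The secret name `HF_TOKEN` and the bare `pytest` invocation are assumptions for illustration.

```yaml
# Hypothetical version of the step using a repository secret instead of a
# hard-coded token; assumes a secret named HF_TOKEN has been created under
# the repo's Settings -> Secrets and variables -> Actions.
- name: Run PyTest
  env:
    HF_TOKEN: ${{ secrets.HF_TOKEN }}
  run: pytest
```

One advantage of this over the hard-coded token is that secrets are masked in workflow logs and are not exposed in the repository history, so there would be nothing to remember to invalidate later.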
Got it
This should fix #8