
Issue 102 - Allow setting runtimeClassName #291

Merged
merged 1 commit into main on Mar 19, 2024
Conversation

nvvfedorov (Collaborator)

No changes to values.yaml, other than comments added for better documentation.
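A chart template can wire such a value through with a conditional block so the field is only rendered when set. The following is a hypothetical sketch, not the actual template from this PR:

```yaml
# Hypothetical pod-spec fragment in the chart template (sketch):
# render runtimeClassName only when the value is provided.
spec:
  {{- if .Values.runtimeClassName }}
  runtimeClassName: {{ .Values.runtimeClassName }}
  {{- end }}
```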

Test steps:

1. Install K3s:
   curl -sfL https://get.k3s.io | K3S_KUBECONFIG_MODE="644" sh -s -
2. View the available runtime classes:
   kubectl get runtimeclasses

You should expect to see:

NAME                  HANDLER               AGE
crun                  crun                  4h
lunatic               lunatic               4h
nvidia-experimental   nvidia-experimental   4h
slight                slight                4h
spin                  spin                  4h
wasmedge              wasmedge              4h
wasmer                wasmer                4h
wasmtime              wasmtime              4h
wws                   wws                   4h
nvidia                nvidia                4h
3. Deploy the DCGM exporter:
   helm install --generate-name ./deployment/ --set runtimeClassName=nvidia --set serviceMonitor.enabled=false
4. Check the pod configuration:
   kubectl get pods deployment-1710791202-dcgm-exporter-jrgc7 -o yaml

You should see runtimeClassName: nvidia in the pod spec, and the pod should be up and running.
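Concretely, the relevant fragment of the rendered pod spec should look like the following (pod name taken from the test run above; surrounding fields elided):

```yaml
# Expected fragment of the pod spec after the helm install step
spec:
  runtimeClassName: nvidia
```

A quicker check than reading the full YAML is a jsonpath query, e.g. kubectl get pod deployment-1710791202-dcgm-exporter-jrgc7 -o jsonpath='{.spec.runtimeClassName}', which should print nvidia.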

@nvvfedorov nvvfedorov self-assigned this Mar 18, 2024
@nvvfedorov nvvfedorov merged commit 9cfb2a2 into main Mar 19, 2024
1 check passed
@nvvfedorov nvvfedorov deleted the issue-102 branch March 19, 2024 15:02