
Add vLLM inference provider for OpenAI compatible vLLM server #35

Triggered via: pull request, October 11, 2024 18:38
Status: Success
Total duration: 25s
Artifacts: none

Workflow: pre-commit.yml
on: pull_request
Job: pre-commit (16s)
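The run page doesn't include the workflow file itself, but a minimal pre-commit.yml triggered on pull_request would plausibly look like the sketch below. The runner image, action versions, and step list are assumptions for illustration, not the repository's actual configuration.

```yaml
# Hypothetical sketch of a pre-commit workflow; runner and action
# versions are assumptions, not taken from the actual repository.
name: pre-commit

on: pull_request

jobs:
  pre-commit:
    runs-on: ubuntu-latest            # assumed runner image
    steps:
      - uses: actions/checkout@v4     # check out the PR's code
      - uses: actions/setup-python@v5 # pre-commit needs a Python runtime
      - uses: pre-commit/action@v3.0.1 # run the hooks from .pre-commit-config.yaml
```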