StableLM-API
This is a very simple little API you can run locally and query from your app or Postman. It builds on the example from StableLM: https://github.com/Stability-AI/StableLM

Make sure CUDA is installed.

If you don't have PyTorch, install it from here:
https://pytorch.org/get-started/locally/

Using Anaconda or Miniconda is recommended.

Install the dependencies:

pip install -U pip
pip install accelerate bitsandbytes torch transformers
pip install flask

The API will be available on port 5555.
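The README does not show the server code itself, but a minimal sketch of such a Flask app might look like the following. The `generate` function here is a placeholder standing in for the actual StableLM call (which would use `transformers` to load and run the model); the route path and port match what the README describes.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Placeholder for the real model call -- in the actual server this would
# run the StableLM model loaded via the transformers library.
def generate(prompt: str) -> str:
    return f"(model output for: {prompt})"

@app.route("/llm", methods=["POST"])
def llm():
    # Parse the JSON body and pull out the "prompt" field.
    data = request.get_json(force=True)
    prompt = data.get("prompt", "")
    if not prompt:
        return jsonify({"error": "missing 'prompt'"}), 400
    return jsonify({"completion": generate(prompt)})

if __name__ == "__main__":
    # Serve on port 5555, as stated above.
    app.run(port=5555)
```

Run it with `python app.py` and the `/llm` endpoint will accept POST requests on port 5555.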

POST endpoints

/llm

Requires this JSON body:

{
    "prompt": "What is life?"
}

About

A simple Flask API wrapper for StableLM
