# App by Build Fast with AI
A lightweight, offline-capable chatbot powered by Deepseek R1 1.5B and Ollama, running completely on your local machine. No internet required!

## Features
- 🏃‍♂️ Runs 100% locally - no internet or cloud services needed
- 💨 Streaming responses like ChatGPT
- 🧠 Shows AI thinking process in expandable sections
- 🎯 Uses the tiny but mighty Deepseek R1 1.5B model
- 🚀 Built with Streamlit for a clean, modern UI

## Prerequisites
- Python 3.10+
- Ollama installed on your system
- Deepseek R1 1.5B model pulled in Ollama
## Installation

- Install Ollama and pull the Deepseek model:
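Assuming the model is published under the standard `deepseek-r1:1.5b` tag in the Ollama library (check `ollama list` / the Ollama site for the exact tag on your system), the pull looks like:

```shell
# Download the Deepseek R1 1.5B distilled model from the Ollama registry
ollama pull deepseek-r1:1.5b
```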
- Clone this repository:

  ```bash
  git clone [your-repo-url]
  cd [your-repo-name]
  ```
- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- Run the app:

  ```bash
  streamlit run deepseek-r1-streamlit.py
  ```
## How It Works

The app uses:
- Ollama for running the Deepseek R1 1.5B model locally
- LangChain for model interaction
- Streamlit for the web interface
- Streaming responses for real-time interaction
- Expandable "thinking process" sections to see the AI's reasoning
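A minimal sketch of how these pieces fit together (illustrative only, not the exact contents of `deepseek-r1-streamlit.py`; assumes the Ollama server is running locally with the `deepseek-r1:1.5b` model pulled):

```python
import streamlit as st
from langchain_community.llms import Ollama  # LangChain wrapper for a local Ollama server

# Connects to the Ollama server on its default port (localhost:11434)
llm = Ollama(model="deepseek-r1:1.5b")

st.title("Deepseek R1 Chatbot")

if prompt := st.chat_input("Type your question"):
    with st.chat_message("user"):
        st.write(prompt)
    with st.chat_message("assistant"):
        # llm.stream() yields tokens as they arrive;
        # st.write_stream renders them incrementally, like ChatGPT
        st.write_stream(llm.stream(prompt))
```

The real app additionally separates the model's reasoning from its final answer so the reasoning can sit in an expandable section.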
## Dependencies

The project uses three main Python packages:

- streamlit
- langchain
- langchain-community
## Usage

After starting the app:

- Open your browser at http://localhost:8501
- Type your question in the chat input
- Watch the AI respond in real-time
- Click "Show AI thinking process" to see the reasoning
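Deepseek R1 wraps its chain of thought in `<think>...</think>` tags, so the expandable "thinking process" section can be built by splitting those tags out of the response. A sketch (the helper name `split_thinking` is illustrative):

```python
import re

def split_thinking(text: str) -> tuple[str, str]:
    """Split a Deepseek R1 response into (thinking, answer).

    The model emits its reasoning between <think> and </think>;
    everything outside those tags is the user-facing answer.
    """
    match = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if match is None:
        # No thinking tags: the whole response is the answer
        return "", text.strip()
    thinking = match.group(1).strip()
    answer = (text[:match.start()] + text[match.end():]).strip()
    return thinking, answer
```

In the Streamlit UI, the `thinking` part would be rendered inside `st.expander("Show AI thinking process")` and the `answer` in the chat message itself.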
## Offline Usage

This chatbot is designed to work completely offline. Once you have:
- Installed Ollama
- Downloaded the Deepseek model
- Installed Python dependencies
You can disconnect from the internet and continue using the chatbot!
## Acknowledgments

- Ollama for the local model serving
- Deepseek for the amazing 1.5B model
- LangChain for the framework
- Streamlit for the web interface
Made with ❤️ for the local AI community