
Restructuring (#124)
* refactor: move packages into crates folder

* feat: parameters should be send

* refactor: move submodule

* refactor: fix submodule

* refactor: flatten crates structure

* refactor: set correct version of llama

* fmt: lint fixes

* add cargo-toml

* better crate docs

* chore: add release plz
williamhogman authored May 10, 2023
1 parent 73cdd99 commit 52bc6ff
Showing 110 changed files with 145 additions and 17 deletions.
4 changes: 4 additions & 0 deletions .github/workflows/cicd.yaml
@@ -29,6 +29,10 @@ jobs:
with:
command: fmt
args: --all -- --check
- uses: EmbarkStudios/cargo-deny-action@v1
with:
log-level: warn
command: check

build_and_test:
strategy:
27 changes: 27 additions & 0 deletions .github/workflows/release-plz.yaml
@@ -0,0 +1,27 @@
name: Release-plz

permissions:
pull-requests: write
contents: write

on:
push:
branches:
- main

jobs:
release-plz:
name: Release-plz
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v3
with:
fetch-depth: 0
- name: Install Rust toolchain
uses: dtolnay/rust-toolchain@stable
- name: Run release-plz
uses: MarcoIeni/[email protected]
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
CARGO_REGISTRY_TOKEN: ${{ secrets.CARGO_REGISTRY_TOKEN }}
4 changes: 2 additions & 2 deletions .gitmodules
@@ -1,3 +1,3 @@
[submodule "llm-chain-llama/sys/llama.cpp"]
path = llm-chain-llama/sys/llama.cpp
[submodule "crates/llm-chain-llama/sys/llama.cpp"]
path = crates/llm-chain-llama-sys/llama.cpp
url = https://github.com/ggerganov/llama.cpp.git
9 changes: 1 addition & 8 deletions Cargo.toml
@@ -1,12 +1,5 @@
[workspace]
members = [
"llm-chain",
"llm-chain-openai",
"llm-chain-llama",
"llm-chain-llama/sys",
"llm-chain-local",
"llm-chain-qdrant",
]
members = ["crates/*"]

[workspace.metadata.release]
shared-version = true
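The glob pattern above makes Cargo treat every subdirectory of `crates/` that contains a `Cargo.toml` as a workspace member, so new crates are picked up without touching the root manifest again. A minimal sketch of the resulting layout (directory names illustrative):

```toml
# Root Cargo.toml: every crate under crates/ joins the workspace.
[workspace]
members = ["crates/*"]

# The glob matches member manifests such as:
#   crates/llm-chain/Cargo.toml
#   crates/llm-chain-openai/Cargo.toml
#   crates/llm-chain-llama-sys/Cargo.toml
```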
File renamed without changes.
@@ -16,7 +16,7 @@ repository = "https://github.com/sobelio/llm-chain/"
[dependencies]
anyhow = "1.0.71"
async-trait = "0.1.68"
llm-chain-llama-sys = { path = "./sys", version = "0.9" }
llm-chain-llama-sys = { path = "../llm-chain-llama-sys", version = "0.9" }
llm-chain = { path = "../llm-chain", version = "0.9.1" }
serde = { version = "1.0.160", features = ["derive"] }
thiserror = "1.0.40"
File renamed without changes.
@@ -189,7 +189,7 @@ impl ExecutorTrait for Executor {
&self,
options: Option<&Self::PerInvocationOptions>,
prompt: &Prompt,
is_streaming: Option<bool>,
_is_streaming: Option<bool>,
) -> Result<Self::Output, Self::Error> {
let config = match options {
Some(options) => options.clone(),
File renamed without changes.
@@ -86,7 +86,7 @@ impl llm_chain::traits::Executor for Executor {
&self,
options: Option<&Self::PerInvocationOptions>,
prompt: &Prompt,
is_streaming: Option<bool>,
_is_streaming: Option<bool>,
) -> Result<Self::Output, Self::Error> {
let parameters = match options {
None => Default::default(),
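The `is_streaming` → `_is_streaming` renames in the two executor hunks above follow the Rust convention that a leading underscore marks an intentionally unused binding, silencing the `unused_variables` lint while keeping the trait-mandated signature intact. A minimal sketch (function and names illustrative, not from the source):

```rust
// A leading underscore marks the parameter as intentionally unused, so the
// compiler suppresses the `unused_variables` warning without requiring the
// argument to be dropped from the signature.
fn execute(prompt: &str, _is_streaming: Option<bool>) -> String {
    // `_is_streaming` is accepted for signature compatibility but ignored.
    format!("ran: {}", prompt)
}

fn main() {
    println!("{}", execute("hello", Some(true)));
}
```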
File renamed without changes.
1 change: 0 additions & 1 deletion llm-chain/Cargo.toml → crates/llm-chain/Cargo.toml
@@ -7,7 +7,6 @@ license = "MIT"
keywords = ["llm", "langchain", "chatgpt", "chain"]
categories = ["science"]
authors = ["William Rudenmalm <[email protected]>"]
readme = "../docs/README.md"
repository = "https://github.com/sobelio/llm-chain/"

[features]
69 changes: 69 additions & 0 deletions crates/llm-chain/README.md
@@ -0,0 +1,69 @@
# llm-chain 🚀

`llm-chain` is a collection of Rust crates designed to help you create advanced LLM applications such as chatbots, agents, and more. As a comprehensive LLM-Ops platform, we provide strong support for both cloud-hosted and locally hosted LLMs, along with robust prompt templates and the ability to chain prompts into multi-step sequences for complex tasks that LLMs can't handle in a single step. Our vector store integrations make it easy to give your model long-term memory and subject-matter knowledge, empowering you to build sophisticated applications.

This is the main crate for `llm-chain`. To connect to an actual model, you will also need a driver crate such as `llm-chain-openai` or `llm-chain-local`.

[![Discord](https://dcbadge.vercel.app/api/server/kewN9Gtjt2?style=for-the-badge)](https://discord.gg/kewN9Gtjt2)
[![Crates.io](https://img.shields.io/crates/v/llm-chain?style=for-the-badge)](https://crates.io/crates/llm-chain)
![License](https://img.shields.io/github/license/sobelio/llm-chain?style=for-the-badge)
[![Docs: Tutorial](https://img.shields.io/badge/docs-tutorial-success?style=for-the-badge&logo=appveyor)](https://sobelio.github.io/llm-chain/docs/getting-started-tutorial/index)

## Examples 💡

To help you get started, here is an example demonstrating how to use `llm-chain`. You can find more examples in the [examples folder](/llm-chain-openai/examples) in the repository.

```rust
let exec = executor!()?;
let res = prompt!(
"You are a robot assistant for making personalized greetings",
"Make a personalized greeting for Joe"
)
.run(&parameters!(), &exec)
.await?;
println!("{}", res);
```

[➡️ **tutorial: get started with llm-chain**](https://sobelio.github.io/llm-chain/docs/getting-started-tutorial/index)
[➡️ **quick-start**: Create a project based on our template](https://github.com/sobelio/llm-chain-template/generate)

## Features 🌟

- **Prompt templates**: Create reusable and easily customizable prompt templates for consistent and structured interactions with LLMs.
- **Chains**: Build powerful chains of prompts that allow you to execute more complex tasks, step by step, leveraging the full potential of LLMs.
- **ChatGPT support**: Supports ChatGPT models, with plans to add OpenAI's other models in the future.
- **LLaMa support**: Provides seamless integration with LLaMa models, enabling natural language understanding and generation tasks with Facebook's research models.
- **Alpaca support**: Incorporates support for Stanford's Alpaca models, expanding the range of available language models for advanced AI applications.
- **Tools**: Enhance your AI agents' capabilities by giving them access to various tools, such as running Bash commands, executing Python scripts, or performing web searches, enabling more complex and powerful interactions.
- **Extensibility**: Designed with extensibility in mind, making it easy to integrate additional LLMs as the ecosystem grows.
- **Community-driven**: We welcome and encourage contributions from the community to help improve and expand the capabilities of `llm-chain`.

## Getting Started 🚀

To start using `llm-chain`, add it and a driver crate as dependencies in your `Cargo.toml`:

```bash
cargo add llm-chain llm-chain-openai
```

The examples for `llm-chain-openai` require the `OPENAI_API_KEY` environment variable to be set, which you can do like this:

```bash
export OPENAI_API_KEY="sk-YOUR_OPEN_AI_KEY_HERE"
```

Then, refer to the [documentation](https://docs.rs/llm-chain) and [examples](/llm-chain-openai/examples) to learn how to create prompt templates, chains, and more.

## Contributing 🤝

**We warmly welcome contributions from everyone!** If you're interested in helping improve `llm-chain`, please check out our [`CONTRIBUTING.md`](/docs/CONTRIBUTING.md) file for guidelines and best practices.

## License 📄

`llm-chain` is licensed under the [MIT License](/LICENSE).

## Connect with Us 🌐

If you have any questions, suggestions, or feedback, feel free to open an issue or join our [community discord](https://discord.gg/kewN9Gtjt2). We're always excited to hear from our users and learn about your experiences with `llm-chain`.

We hope you enjoy using `llm-chain` to unlock the full potential of Large Language Models in your projects. Happy coding! 🎉
File renamed without changes.
@@ -64,12 +64,12 @@ pub trait Param: Send + Sync {
#[doc(hidden)]
pub trait ParamFull: Param + Debug + Send + Sync {
#[doc(hidden)]
fn boxed_clone(&self) -> Box<dyn ParamFull>;
fn boxed_clone(&self) -> Box<dyn ParamFull + Send>;
}

impl<T: Param + Debug + Clone + 'static> ParamFull for T {
#[doc(hidden)]
fn boxed_clone(&self) -> Box<dyn ParamFull> {
fn boxed_clone(&self) -> Box<dyn ParamFull + Send> {
Box::new(self.clone())
}
}
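The `+ Send` bound added to the boxed clone above lets parameter clones be moved across thread boundaries, in line with the "parameters should be send" commit bullet. A minimal self-contained sketch of the pattern (trait bodies and the `Temperature` type are simplified illustrations, not the source):

```rust
use std::fmt::Debug;

// Simplified versions of the traits in the hunk above.
trait Param: Send + Sync {}

trait ParamFull: Param + Debug + Send + Sync {
    // The clone is boxed as `dyn ParamFull + Send`, so it can be moved
    // across threads (e.g. into a spawned async task).
    fn boxed_clone(&self) -> Box<dyn ParamFull + Send>;
}

impl<T: Param + Debug + Clone + 'static> ParamFull for T {
    fn boxed_clone(&self) -> Box<dyn ParamFull + Send> {
        Box::new(self.clone())
    }
}

// A hypothetical parameter type for illustration.
#[derive(Debug, Clone)]
struct Temperature(f32);
impl Param for Temperature {}

// Clone a parameter and format it on another thread; this compiles only
// because the boxed trait object is `Send`.
fn clone_to_thread() -> String {
    let boxed = Temperature(0.7).boxed_clone();
    std::thread::spawn(move || format!("{:?}", boxed))
        .join()
        .unwrap()
}

fn main() {
    println!("{}", clone_to_thread());
}
```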
File renamed without changes.
35 changes: 35 additions & 0 deletions deny.toml
@@ -0,0 +1,35 @@
all-features = false
no-default-features = false
feature-depth = 1


[advisories]
db-path = "~/.cargo/advisory-db"
db-urls = ["https://github.com/rustsec/advisory-db"]
vulnerability = "deny"
unmaintained = "warn"
yanked = "warn"
notice = "warn"

[licenses]

unlicensed = "deny"
allow = [
"MIT",
"Apache-2.0",
"ISC",
"Unicode-DFS-2016",
"BSD-3-Clause",
"OpenSSL",
]
copyleft = "deny"
allow-osi-fsf-free = "neither"
default = "deny"
confidence-threshold = 0.8


[[licenses.clarify]]
name = "ring"
version = "*"
expression = "MIT AND ISC AND OpenSSL"
license-files = [{ path = "LICENSE", hash = 0xbd0eed23 }]
3 changes: 2 additions & 1 deletion docs/README.md
@@ -1,6 +1,6 @@
# llm-chain 🚀

`llm-chain` is a collection of Rust crates designed to help you work with Large Language Models (LLMs) more effectively. Our primary focus is on providing robust support for prompt templates and chaining together prompts in multi-step chains, enabling complex tasks that LLMs can't handle in a single step. This includes, but is not limited to, summarizing lengthy texts or performing advanced data processing tasks.
`llm-chain` is a collection of Rust crates designed to help you create advanced LLM applications such as chatbots, agents, and more. As a comprehensive LLM-Ops platform, we provide strong support for both cloud-hosted and locally hosted LLMs, along with robust prompt templates and the ability to chain prompts into multi-step sequences for complex tasks that LLMs can't handle in a single step. Our vector store integrations make it easy to give your model long-term memory and subject-matter knowledge, empowering you to build sophisticated applications.

[![Discord](https://dcbadge.vercel.app/api/server/kewN9Gtjt2?style=for-the-badge)](https://discord.gg/kewN9Gtjt2)
[![Crates.io](https://img.shields.io/crates/v/llm-chain?style=for-the-badge)](https://crates.io/crates/llm-chain)
@@ -32,6 +32,7 @@ println!("{}", res);
- **ChatGPT support**: Supports ChatGPT models, with plans to add OpenAI's other models in the future.
- **LLaMa support**: Provides seamless integration with LLaMa models, enabling natural language understanding and generation tasks with Facebook's research models.
- **Alpaca support**: Incorporates support for Stanford's Alpaca models, expanding the range of available language models for advanced AI applications.
- **`llm.rs` support**: Use LLMs in Rust without dependencies on C++ code through our support for `llm.rs`.
- **Tools**: Enhance your AI agents' capabilities by giving them access to various tools, such as running Bash commands, executing Python scripts, or performing web searches, enabling more complex and powerful interactions.
- **Extensibility**: Designed with extensibility in mind, making it easy to integrate additional LLMs as the ecosystem grows.
- **Community-driven**: We welcome and encourage contributions from the community to help improve and expand the capabilities of `llm-chain`.
