docs: Add a quickstart guide for Llama (#91)
## Description
As discussed internally, this adds a quickstart for running LLMs to the
README - #17

### Type of change
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing
functionality to not work as expected)
- [x] Documentation update (improves or adds clarity to existing
documentation)

### Tested on
- [x] iOS
- [x] Android

### Testing instructions
<!-- Provide step-by-step instructions on how to test your changes.
Include setup details if necessary. -->

### Screenshots
<!-- Add screenshots here, if applicable -->

### Related issues
<!-- Link related issues here using #issue-number -->

### Checklist
- [x] I have performed a self-review of my code
- [x] I have commented my code, particularly in hard-to-understand areas
- [x] I have updated the documentation accordingly
- [x] My changes generate no new warnings

### Additional notes
<!-- Include any additional information, assumptions, or context that
reviewers might need to understand this PR. -->
chmjkb authored Jan 30, 2025
1 parent cbe2b55 commit 2a98ffa
Showing 1 changed file with 46 additions and 0 deletions.
46 changes: 46 additions & 0 deletions README.md
@@ -15,6 +15,52 @@ To run any AI model in ExecuTorch, you need to export it to a `.pte` format. If
Take a look at how our library can help you build your React Native AI features in our docs:
https://docs.swmansion.com/react-native-executorch


# 🦙 **Quickstart - Running Llama**

**Get started with AI-powered text generation in 3 easy steps!**

### 1️⃣ **Installation**
```bash
# Install the package
yarn add react-native-executorch
cd ios && pod install && cd ..
```
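If you use npm instead of Yarn, the equivalent install is below (a minimal sketch; the CocoaPods step for iOS is unchanged):
```bash
# Same package installed with npm; iOS still needs the native pods
npm install react-native-executorch
cd ios && pod install && cd ..
```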

---

### 2️⃣ **Setup & Initialization**
Add this to your component file:
```tsx
import {
  LLAMA3_2_1B_QLORA,
  LLAMA3_2_1B_TOKENIZER,
  useLLM,
} from 'react-native-executorch';

function MyComponent() {
  // Initialize the model 🚀
  const llama = useLLM({
    modelSource: LLAMA3_2_1B_QLORA,
    tokenizerSource: LLAMA3_2_1B_TOKENIZER,
  });
  // ... rest of your component
}
```

---

### 3️⃣ **Run the model!**
```tsx
const handleGenerate = async () => {
  const prompt = "The meaning of life is";

  // Generate text based on your desired prompt
  const response = await llama.generate(prompt);
  console.log("Llama says:", response);
};
```
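Putting the three steps together, here is a minimal sketch of a complete component. It reuses only the hook, constants, and `generate` call shown above; the `LlamaDemo` name, the fixed prompt, and the plain `Button`/`Text` UI are illustrative assumptions, not part of the library.
```tsx
import React, { useState } from 'react';
import { Button, Text, View } from 'react-native';
import {
  LLAMA3_2_1B_QLORA,
  LLAMA3_2_1B_TOKENIZER,
  useLLM,
} from 'react-native-executorch';

// Illustrative component wiring the quickstart snippets into a tiny UI
function LlamaDemo() {
  const llama = useLLM({
    modelSource: LLAMA3_2_1B_QLORA,
    tokenizerSource: LLAMA3_2_1B_TOKENIZER,
  });
  const [answer, setAnswer] = useState('');

  const handleGenerate = async () => {
    // Same call as in step 3, but the result is rendered instead of logged
    const response = await llama.generate('The meaning of life is');
    setAnswer(response);
  };

  return (
    <View>
      <Button title="Ask Llama" onPress={handleGenerate} />
      <Text>{answer}</Text>
    </View>
  );
}

export default LlamaDemo;
```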

## Minimal supported versions
The minimum supported versions are iOS 17.0 and Android 13.
