TextGrad: Automatic "Differentiation" via Text -- using large language models to backpropagate textual gradients.
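As a rough illustration of the idea, here is a minimal sketch in the style of TextGrad's quickstart: a natural-language evaluation acts as the "loss", and the backward pass produces textual feedback that a textual-gradient-descent optimizer uses to revise a variable. The question and evaluation instruction are illustrative placeholders, and an OpenAI API key is assumed to be configured.

```python
import textgrad as tg

# The backward engine is the LLM that writes the textual "gradients" (critiques).
tg.set_backward_engine("gpt-4o", override=True)

# Wrap the model we want to query.
model = tg.BlackboxLLM("gpt-4o")

question = tg.Variable(
    "If it takes 1 hour to dry 25 shirts under the sun, how long for 30 shirts?",
    role_description="question to the LLM",
    requires_grad=False,
)

answer = model(question)
answer.set_role_description("concise and accurate answer to the question")

# The "loss" is itself a natural-language evaluation instruction.
loss_fn = tg.TextLoss("Evaluate the given answer; be logical and very critical.")
loss = loss_fn(answer)

# Backpropagate textual feedback, then update the answer via textual gradient descent.
loss.backward()
optimizer = tg.TGD(parameters=[answer])
optimizer.step()

print(answer.value)
```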
End-to-end Generative Optimization for AI Agents
Unified Go interface for Large Language Model (LLM) providers. Simplifies LLM integration with flexible prompt management and common task functions.
Awesome-LLM-Prompt-Optimization: a curated list of advanced prompt optimization and tuning methods in Large Language Models
A very fast, very minimal prompt optimizer
PromptCraft is a prompt perturbation toolkit operating at the character, word, and sentence levels for prompt robustness analysis. PyPI Package: pypi.org/project/promptcraft
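To show what level-wise perturbation means in practice, here is a hypothetical sketch of character- and word-level perturbations for robustness testing. It is not PromptCraft's actual API; the function names and rates are illustrative.

```python
import random

def char_swap(text: str, rate: float = 0.05, seed: int = 0) -> str:
    """Character level: swap adjacent characters at a given rate to simulate typos."""
    rng = random.Random(seed)
    chars = list(text)
    for i in range(len(chars) - 1):
        if rng.random() < rate:
            chars[i], chars[i + 1] = chars[i + 1], chars[i]
    return "".join(chars)

def word_drop(text: str, rate: float = 0.1, seed: int = 0) -> str:
    """Word level: randomly drop words to test sensitivity to missing tokens."""
    rng = random.Random(seed)
    words = text.split()
    kept = [w for w in words if rng.random() >= rate]
    return " ".join(kept) if kept else text

original = "Summarize the following article in three sentences."
print(char_swap(original))
print(word_drop(original))
```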
Official implementation for "GLaPE: Gold Label-agnostic Prompt Evaluation and Optimization for Large Language Models" (stay tuned; more will be added)
This repository serves as a central hub for discovering tools and services focused on automated prompt engineering. Whether you're looking to optimize your prompts for generative AI models or enhance the capabilities of your agents, you'll find a wide range of resources here.
🎨 NovelAI Muse: Your pocket-sized creative companion! This Telegram bot serves up a delightful mix of random NovelAI artists and inspiring prompts, igniting your artistic spark. Discover new styles, overcome creative blocks, and generate stunning AI art with ease. ✨
Inspired by the paper "Searching for Best Practices in Retrieval-Augmented Generation" by Wang et al., this repository is dedicated to searching for the best RAG strategy.
Work in progress; will probably never be finished.
Unofficial langchain implementation of "Large Language Models as Optimizers".
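The core loop of "Large Language Models as Optimizers" (OPRO) can be summarized as a hedged sketch: the optimizer LLM is shown prior prompts with their scores and asked to propose a better one. The `call_llm` and `score_prompt` helpers below are hypothetical placeholders, not part of the linked repository or of LangChain.

```python
from typing import Callable

def optimize_prompt(
    call_llm: Callable[[str], str],        # optimizer LLM: meta-prompt -> candidate prompt
    score_prompt: Callable[[str], float],  # evaluator: prompt -> score on a dev set
    seed_prompt: str,
    steps: int = 10,
) -> str:
    history = [(seed_prompt, score_prompt(seed_prompt))]
    for _ in range(steps):
        # Present prior prompts with scores, worst first, best last.
        trajectory = "\n".join(
            f"prompt: {p}\nscore: {s:.3f}"
            for p, s in sorted(history, key=lambda x: x[1])
        )
        meta_prompt = (
            "Below are prompts and their scores on a task. "
            "Write a new prompt that is different from all of them and scores higher.\n\n"
            f"{trajectory}\n\nNew prompt:"
        )
        candidate = call_llm(meta_prompt).strip()
        history.append((candidate, score_prompt(candidate)))
    # Return the best-scoring prompt seen during the search.
    return max(history, key=lambda x: x[1])[0]
```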
An LLM-RAG powered application where you can talk with your own PDF.
Used DSPy and TextGrad to optimize prompts on the GPQA evaluation benchmark. Tested on GPT-3.5, Gemini Pro, and Llama-3.