# Patch for Gaudi Text-Generation Pipeline (#690)

Commit `8841452` (1 parent: `44be555`), showing 2 changed files with 52 additions and 1 deletion.
```diff
@@ -31,6 +31,11 @@ If you plan to use [DeepSpeed-inference](https://docs.habana.ai/en/latest/PyTorc
 pip install git+https://github.com/HabanaAI/[email protected]
 ```
 
+If you would like to use the pipeline with LangChain classes, you can install LangChain as follows:
+```bash
+pip install langchain==0.0.191
+```
+
 ## Usage
 
 To run generation with DeepSpeed-inference, you must launch the script as follows:
```
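The hunk above pins `langchain==0.0.191`, and the patch elsewhere notes the pipeline is validated only against that release. A runtime check can confirm the pinned version is actually the one installed. Below is a minimal sketch using only the standard library; the helper name `require_version` is illustrative and not part of the repository:

```python
from importlib import metadata

def require_version(package: str, required: str) -> bool:
    """Return True if the installed version of `package` exactly matches
    `required`; False if it differs or the package is not installed."""
    try:
        return metadata.version(package) == required
    except metadata.PackageNotFoundError:
        return False

# The pipeline in this commit was validated against langchain==0.0.191
# (see the note at the end of the patch).
if not require_version("langchain", "0.0.191"):
    print("warning: langchain 0.0.191 not found; the pipeline may not work as validated")
```

Later LangChain releases reorganized the `langchain.llms` / `langchain.chains` import paths used in this patch, so an exact-match guard like this fails loudly rather than at import time.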
```diff
@@ -125,3 +130,40 @@ python ../../gaudi_spawn.py --use_deepspeed --world_size 8 run_pipeline.py \
 --top_p 0.95 \
 --prompt "Hello world" "How are you?" "Here is my prompt" "Once upon a time"
 ```
+
+### Usage with LangChain
+
+The text-generation pipeline can be fed as input to LangChain classes via the `use_with_langchain` constructor argument. Here is a sample snippet that shows how the pipeline class can be used with LangChain.
+
+```python
+from langchain.llms import HuggingFacePipeline
+from langchain.prompts import PromptTemplate
+from langchain.chains import LLMChain
+
+# Initialize the pipeline
+pipe = GaudiTextGenerationPipeline(args, logger, use_with_langchain=True)
+
+# Create LangChain object
+llm = HuggingFacePipeline(pipeline=pipe)
+
+template = """Use the following pieces of context to answer the question at the end. If you don't know the answer,\
+just say that you don't know, don't try to make up an answer.
+
+Context: Large Language Models (LLMs) are the latest models used in NLP.
+Their superior performance over smaller models has made them incredibly
+useful for developers building NLP enabled applications. These models
+can be accessed via Hugging Face's `transformers` library, via OpenAI
+using the `openai` library, and via Cohere using the `cohere` library.
+
+Question: {question}
+Answer: """
+
+prompt = PromptTemplate(input_variables=["question"], template=template)
+llm_chain = LLMChain(prompt=prompt, llm=llm)
+
+# Use LangChain object
+question = "Which libraries and model providers offer LLMs?"
+response = llm_chain(prompt.format(question=question))
+print(f"Question: {question}")
+print(f"Response: {response['text']}")
+```
+
+> The pipeline class has been validated for LangChain version 0.0.191 and may not work with other versions of the package.
```
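For context on why the `use_with_langchain` flag exists: LangChain's `HuggingFacePipeline` wrapper (in the 0.0.191 line used here) calls the wrapped object like a `transformers` text-generation pipeline, i.e. `pipeline(prompt)` returning `[{"generated_text": ...}]`, and inspects its `task` attribute. The toy stand-in below only illustrates that assumed call contract; `ToyTextGenerationPipeline` and its canned completion are purely illustrative and are not the repository's `GaudiTextGenerationPipeline`, whose real implementation runs the model on Gaudi hardware.

```python
class ToyTextGenerationPipeline:
    """Toy stand-in mimicking the call contract a text-generation
    pipeline is expected to satisfy for LangChain's HuggingFacePipeline:
    pipeline(prompt) -> [{"generated_text": prompt + completion}]."""

    # HuggingFacePipeline dispatches on the task name.
    task = "text-generation"

    def __call__(self, prompt: str):
        # A real pipeline would tokenize, run the model, and decode here;
        # we append a canned completion for illustration only.
        return [{"generated_text": prompt + " ...generated continuation"}]

pipe = ToyTextGenerationPipeline()
out = pipe("Question: Which libraries and model providers offer LLMs?")
print(out[0]["generated_text"])
```

Because the wrapper strips the echoed prompt from `generated_text` before returning the completion, a pipeline that does not echo the prompt back would produce truncated output, which is plausibly part of what the `use_with_langchain=True` code path accounts for.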