Add `from_file` class method to the `Prompt` object #1355
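For context, here is a rough sketch of the kind of usage the new class method is meant to enable. The import path, file name, and calling convention below are assumptions made for illustration; the actual API is defined by the PR diff, not by this snippet.

```python
# Hypothetical usage sketch of the new `from_file` class method; the import
# path, file name, and template variables are assumptions, not taken from the PR.
from outlines.prompts import Prompt

# Load a prompt template stored in a file instead of writing it inline.
prompt = Prompt.from_file("prompt.jinja2")

# Calling the resulting Prompt renders the template with the given variables.
text = prompt(question="What is the Earth's diameter?")
print(text)
```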
Conversation
Force-pushed from b2605b3 to df0eb2c.
Yes, please open an issue 😄
Force-pushed from f11f10c to 98cef53.
Added a couple of comments, although since this is a draft PR you are probably addressing them already.
Force-pushed from f75319f to 2954197.
Force-pushed from 2954197 to ee5c4ef.
Working on this PR made me wonder: why do we integrate Jinja2 (or any specific templating engine) into Outlines? I mean, we could have just provided a `Prompt` interface that takes an already rendered `str`.
I am not sure what you mean by "taking an already rendered `str`".
TL;DR: I mean passing the output/result of the templating process to the `Prompt` object:

```python
import outlines

# Using Jinja2 ...
import jinja2

env = jinja2.Environment(loader=jinja2.FileSystemLoader("."))
template = env.get_template("prompt.jinja2")

# ... or, using Mako ...
import mako.template

template = mako.template.Template(filename="prompt.mako")


@outlines.prompt
def fancy(examples, question):
    return template.render(examples=examples, question=question)


examples = [
    {"question": "What is the capital of France?", "answer": "Paris"},
    {"question": "What is 2 + 2?", "answer": "4"},
]
question = "What is the Earth's diameter?"

prompt = fancy(examples, question)

# ... but why not just?
rendered = template.render(examples=examples, question=question)
prompt = outlines.Prompt(rendered)

# EDIT: Nevermind, while writing this down, I realized that this would
# essentially mean that arguments are evaluated too early, leaving the prompt
# without any arguments and simply storing a `str`...
```
You can already use any templating engine you want with Outlines, by rendering the template and passing the returned string to a generator:

```python
import outlines

model = outlines.models.openai("gpt-4o")

# Input
examples = [
    {"question": "What is the capital of France?", "answer": "Paris"},
    {"question": "What is 2 + 2?", "answer": "4"},
]
question = "What is the Earth's diameter?"

# Using Jinja2 ...
import jinja2

env = jinja2.Environment(loader=jinja2.FileSystemLoader("."))
template = env.get_template("prompt.jinja2")
rendered = template.render(examples=examples, question=question)

generator = outlines.generate.text(model)
results = generator(rendered)

# ... or, using Mako ...
import mako.template

template = mako.template.Template(filename="prompt.mako")
rendered = template.render(examples=examples, question=question)

generator = outlines.generate.text(model)
results = generator(rendered)
```

The role of the `Prompt` object is different. The confusion may come from poor naming on my part; this object should probably be called `Template`.
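For readers following along, this is roughly how the decorator-based workflow mentioned above works in the Outlines versions this PR targets, where the decorated function's docstring is treated as a Jinja2 template. The example values are the same ones used in the snippets above; the function name is illustrative.

```python
import outlines


@outlines.prompt
def few_shot(examples, question):
    """
    {% for example in examples %}
    Q: {{ example.question }}
    A: {{ example.answer }}
    {% endfor %}
    Q: {{ question }}
    A:
    """


examples = [
    {"question": "What is the capital of France?", "answer": "Paris"},
    {"question": "What is 2 + 2?", "answer": "4"},
]

# The arguments are only substituted into the template when the prompt is
# called, so the same prompt can be reused with different questions.
prompt = few_shot(examples, "What is the Earth's diameter?")
```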
Thank you very much @rlouf for the clarification. I'm not sure about the naming either, but if it's extensible, that already sounds like a great design! Apologies for sharing ideas maybe a bit too quickly as they pop into my mind, I probably should have checked the documentation first!
Force-pushed from 5aad80d to d318ac2.
(Do not merge this PR yet, I will rewrite the commit history a bit once we've settled on what to deprecate or remove.)
Force-pushed from 2b87ecb to 32587ee.
Help! I don't understand the failed test coverage check. It says that only line 90 of the file is not covered. It seems that the tests don't cover that part of the flow, where the condition is True. This is one of the cases, I suppose, where one of the old code paths isn't exercised by the existing tests.
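One way to investigate this locally is to ask pytest for a per-line report of missed lines. This is a sketch that assumes `pytest-cov` is installed and that the CI check uses comparable flags, which may not be exactly the case:

```python
# Run the test suite with coverage and print which lines are missed.
# Assumes pytest and pytest-cov are installed; the test path and flags are
# assumptions and may differ from what CI actually runs.
import pytest

pytest.main([
    "tests/",
    "--cov=outlines",
    "--cov-report=term-missing",
])
```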
Force-pushed from 9f1dca8 to 38d920f.
Turned out nicely, looking good! 🚀
Force-pushed from 38d920f to 6286db7.
Use `is` and `is not` for type comparisons, or `isinstance()` for isinstance checks
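For anyone unfamiliar with this lint rule, here is a small illustration of the patterns it distinguishes, using generic values rather than the actual code from the PR:

```python
value = "hello"

# Flagged: comparing types with ==.
if type(value) == str:
    pass

# Accepted: identity comparison between type objects ...
if type(value) is str:
    pass

# ... or, usually preferable, an isinstance() check, which also accepts
# subclasses of str.
if isinstance(value, str):
    pass
```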
Force-pushed from 6286db7 to 066912f.
I've cleaned up the git history, so it's ready to be merged now :)
Great job! 🎉
Fix #1345: For now, I'm just trying to make sure I'm editing the right part of the codebase. I haven't managed to run the tests yet…
EDIT: I opened a separate issue about the failing tests 🙂
I noticed that the project uses `pytest`, so I assumed that running `pip install ".[test]"` would install all the dependencies needed to run the full test suite. However, it seems that's not the case (it's currently complaining that I don't have `vllm`). Should I open an issue (I can add the missing dependency under the `test` section of `pyproject.toml`)? #1356