Cid/prompt #160

Merged
merged 3 commits into main from cid/prompt on Dec 13, 2023
Conversation

gustavocidornelas (Contributor)
Summary

  • Refactored the schemas. We now have clearly separated schemas for development datasets, reference datasets, and production data.
  • Added prompt (a List[Dict[str, str]]) as a valid field in the LLM production data config.
  • The LLM monitor now uploads a prompt and formats the uploaded data using input variables. This allows multiple rounds of interaction with an LLM to be displayed on the platform (a sketch of this reconstruction follows the example below). For instance, the two rounds of interaction represented as:
[{"role": "system", "content": "You are a helpful assistant."},
 {"role": "user", "content": "How are you doing today?"},
 {"role": "assistant", "content": "I'm doing ok, what about you?"},
{"role": "user", "content": "I asked first!"}]

>> Response: "Apologies for that. As an AI, I don't have feelings."

is uploaded as:

# in the data config:
...
prompt: [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "{{ message_1 }}"},
    {"role": "assistant", "content": "I'm doing ok, what about you?"},
    {"role": "user", "content": "{{ message_3 }}"}
],
inputVariableNames: ["message_1", "message_3"],
...
# data (stream) uploaded:
[{
    "message_1": "How are you doing today?",
    "message_3": "I asked first!",
    "output": "Apologies for that. As an AI, I don't have feelings.",
    "tokens": ...
}]
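
For illustration, here is a minimal sketch of how the prompt template and a streamed row could be combined to reconstruct the full conversation for display. The render_conversation helper is hypothetical and not part of the openlayer API; it only assumes the {{ variable }} placeholder syntax shown above.

# Hypothetical helper, not part of the openlayer API: substitutes
# {{ variable }} placeholders in the prompt with values from a streamed row
# and appends the model output as the final assistant turn.
from typing import Dict, List


def render_conversation(
    prompt: List[Dict[str, str]], row: Dict[str, str]
) -> List[Dict[str, str]]:
    rendered = []
    for message in prompt:
        content = message["content"]
        for name, value in row.items():
            content = content.replace("{{ " + name + " }}", str(value))
        rendered.append({"role": message["role"], "content": content})
    rendered.append({"role": "assistant", "content": row["output"]})
    return rendered


prompt = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "{{ message_1 }}"},
    {"role": "assistant", "content": "I'm doing ok, what about you?"},
    {"role": "user", "content": "{{ message_3 }}"},
]
row = {
    "message_1": "How are you doing today?",
    "message_3": "I asked first!",
    "output": "Apologies for that. As an AI, I don't have feelings.",
}
print(render_conversation(prompt, row))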

Important

https://github.com/openlayer-ai/openlayer/pull/1419 must be tested and merged first.

Testing

Re-ran several notebook examples to ensure the refactor did not introduce bugs. Tested the LLM monitor locally, but couldn't stream data yet because the backend PR has not been merged.

@whoseoyster whoseoyster merged commit fe583ce into main Dec 13, 2023
2 checks passed
@whoseoyster whoseoyster deleted the cid/prompt branch December 13, 2023 22:26