Merge pull request #146 from miurla/gpt-4o
Update default model to gpt-4o
miurla authored May 13, 2024
2 parents 88f485b + b59e7a6 commit 5500bb7
Showing 5 changed files with 6 additions and 6 deletions.
4 changes: 2 additions & 2 deletions .env.local.example
@@ -20,8 +20,8 @@ UPSTASH_REDIS_REST_TOKEN=[YOUR_UPSTASH_REDIS_REST_TOKEN]
# OPENAI_API_BASE=

# Used to set the model for OpenAI API requests.
-# If not set, the default is gpt-4-turbo.
-# OPENAI_API_MODEL='gpt-4-turbo'
+# If not set, the default is gpt-4o.
+# OPENAI_API_MODEL='gpt-4o'

# Only writers can set a specific model. It must be compatible with the OpenAI API.
# USE_SPECIFIC_API_FOR_WRITER=true
2 changes: 1 addition & 1 deletion lib/agents/inquire.tsx
@@ -18,7 +18,7 @@ export async function inquire(

let finalInquiry: PartialInquiry = {}
await streamObject({
-model: openai.chat(process.env.OPENAI_API_MODEL || 'gpt-4-turbo'),
+model: openai.chat(process.env.OPENAI_API_MODEL || 'gpt-4o'),
system: `As a professional web researcher, your role is to deepen your understanding of the user's input by conducting further inquiries when necessary.
After receiving an initial response from the user, carefully assess whether additional questions are absolutely essential to provide a comprehensive and accurate answer. Only proceed with further inquiries if the available information is insufficient or ambiguous.
2 changes: 1 addition & 1 deletion lib/agents/query-suggestor.tsx
@@ -23,7 +23,7 @@ export async function querySuggestor(

let finalRelatedQueries: PartialRelated = {}
await streamObject({
-model: openai.chat(process.env.OPENAI_API_MODEL || 'gpt-4-turbo'),
+model: openai.chat(process.env.OPENAI_API_MODEL || 'gpt-4o'),
system: `As a professional web researcher, your task is to generate a set of three queries that explore the subject matter more deeply, building upon the initial query and the information uncovered in its search results.
For instance, if the original query was "Starship's third test flight key milestones", your output should follow this format:
2 changes: 1 addition & 1 deletion lib/agents/researcher.tsx
@@ -37,7 +37,7 @@ export async function researcher(

let isFirstToolResponse = true
const result = await nonexperimental_streamText({
-model: openai.chat(process.env.OPENAI_API_MODEL || 'gpt-4-turbo'),
+model: openai.chat(process.env.OPENAI_API_MODEL || 'gpt-4o'),
maxTokens: 2500,
system: `As a professional search expert, you possess the ability to search for any information on the web.
For each user query, utilize the search results to their fullest potential to provide additional information and assistance in your response.
2 changes: 1 addition & 1 deletion lib/agents/task-manager.tsx
@@ -12,7 +12,7 @@ export async function taskManager(messages: CoreMessage[]) {

try {
const result = await generateObject({
-model: openai.chat(process.env.OPENAI_API_MODEL || 'gpt-4-turbo'),
+model: openai.chat(process.env.OPENAI_API_MODEL || 'gpt-4o'),
system: `As a professional web researcher, your primary objective is to fully comprehend the user's query, conduct thorough web searches to gather the necessary information, and provide an appropriate response.
To achieve this, you must first analyze the user's input and determine the optimal course of action. You have two options at your disposal:
1. "proceed": If the provided information is sufficient to address the query effectively, choose this option to proceed with the research and formulate a response.
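
Note: all four agent call sites touched by this commit share the same fallback expression: the model id is read from the OPENAI_API_MODEL environment variable, with 'gpt-4o' used when it is unset. A minimal TypeScript sketch of that pattern, factored into a hypothetical helper (illustrative only, not part of this commit):

// Hypothetical helper mirroring the inline fallback each agent repeats.
// 'gpt-4o' is the new default introduced by this PR.
function resolveOpenAIModelId(): string {
  return process.env.OPENAI_API_MODEL || 'gpt-4o'
}

// Each agent would then read, for example:
//   model: openai.chat(resolveOpenAIModelId()),
console.log(resolveOpenAIModelId()) // prints 'gpt-4o' unless OPENAI_API_MODEL is set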
