Commit

More documentation.
ioquatix committed Jul 5, 2024
1 parent 23631d3 commit 03410d1
Showing 2 changed files with 9 additions and 0 deletions.
lib/async/ollama/client.rb (3 additions, 0 deletions)
@@ -10,8 +10,11 @@ module Async
 	module Ollama
 		# Represents a connection to the Ollama service.
 		class Client < Async::REST::Resource
+			# The default endpoint to connect to.
 			ENDPOINT = Async::HTTP::Endpoint.parse('http://localhost:11434')
 
+			# Generate a response from the given prompt.
+			# @parameter prompt [String] The prompt to generate a response from.
 			def generate(prompt, **options, &block)
				options[:prompt] = prompt
				options[:model] ||= 'llama3'
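
For context, a minimal usage sketch of the documented Client#generate method. This is not part of the commit: it assumes Client.open follows the usual Async::REST::Resource conventions, that an Ollama server is listening on the default ENDPOINT (http://localhost:11434), and that the llama3 model is available locally.

	require 'async'
	require 'async/ollama'

	Async do
		# Connect to the default endpoint defined by Client::ENDPOINT.
		client = Async::Ollama::Client.open

		# Returns a Generate representation wrapping the service response.
		generation = client.generate("Why is the sky blue?")
		puts generation.response
	ensure
		client&.close
	end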
lib/async/ollama/generate.rb (6 additions, 0 deletions)
@@ -9,18 +9,24 @@
 module Async
 	module Ollama
 		class Generate < Async::REST::Representation[Wrapper]
+			# The response to the prompt.
 			def response
 				self.value[:response]
 			end
 
+			# The conversation context. Used to maintain state between prompts.
 			def context
 				self.value[:context]
 			end
 
+			# The model used to generate the response.
 			def model
 				self.value[:model]
 			end
 
+			# Generate a new response from the given prompt.
+			# @parameter prompt [String] The prompt to generate a response from.
+			# @yields {|response| ...} Optional streaming response.
 			def generate(prompt, &block)
 				self.class.post(self.resource, prompt: prompt, context: self.context, model: self.model, &block)
 			end
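
The documented context accessor is what allows a follow-up call to pick up where the previous one left off: Generate#generate posts the new prompt together with the stored context and model, returning a fresh representation. A hypothetical continuation of the sketch above, under the same assumptions:

	Async do
		client = Async::Ollama::Client.open

		# The first generation establishes a conversation context...
		first = client.generate("My favourite colour is green. Please remember that.")

		# ...which the next call sends back, so the model retains the earlier exchange.
		second = first.generate("What is my favourite colour?")
		puts second.response
	ensure
		client&.close
	end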
