Hello,

I have trouble when I start using llama-cpp-agent. I ran the server, then sent a request with the following code and got the expected output:
root@ubuntu:/app/inside_container/llama_python_demo# python3 demo.py
ChatCompletion(id='chatcmpl-64ea1c64-a6fb-46b2-ba36-8942e9a17540', choices=[Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(content='Hello! How can I assist you today?', refusal=None, role='assistant', function_call=None, tool_calls=None))], created=1726667242, model='qwen2', object='chat.completion', service_tier=None, system_fingerprint=None, usage=CompletionUsage(completion_tokens=9, prompt_tokens=20, total_tokens=29, completion_tokens_details=None))
Then, when I used the following code to try to start using llama-cpp-agent:
from llama_cpp_agent import LlamaCppAgent
from llama_cpp_agent import MessagesFormatterType
from llama_cpp_agent.providers import LlamaCppServerProvider

provider = LlamaCppServerProvider("http://127.0.0.1:8000", llama_cpp_python_server=True)

agent = LlamaCppAgent(
    provider,
    system_prompt="You are a helpful assistant.",
    predefined_messages_formatter_type=MessagesFormatterType.CHATML,
)

settings = provider.get_provider_default_settings()
settings.n_predict = 512
settings.temperature = 0.65

while True:
    user_input = input(">")
    if user_input == "exit":
        break
    agent_output = agent.get_chat_response(user_input, llm_sampling_settings=settings)
    print(f"Agent: {agent_output.strip()}")
I see this error:
root@ubuntu:/app/inside_container/llama_python_demo# python3 demo.py
>hello
Traceback (most recent call last):
File "/app/inside_container/llama_python_demo/demo.py", line 21, in <module>
agent_output = agent.get_chat_response(user_input, llm_sampling_settings=settings)
File "/usr/local/lib/python3.10/dist-packages/llama_cpp_agent/llm_agent.py", line 334, in get_chat_response
for out in completion:
File "/usr/local/lib/python3.10/dist-packages/llama_cpp_agent/providers/llama_cpp_server.py", line 279, in generate_text_chunks
new_data = json.loads(decoded_chunk.replace("data:", ""))
File "/usr/lib/python3.10/json/__init__.py", line 346, in loads
return _default_decoder.decode(s)
File "/usr/lib/python3.10/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python3.10/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 3 (char 2)
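For context on where this might come from: the failing line in llama_cpp_server.py strips the literal prefix "data:" from each streamed chunk and parses the remainder as JSON. A streaming OpenAI-compatible endpoint (which llama-cpp-python exposes) delivers Server-Sent Events, and a chunk can contain non-JSON lines such as blank keep-alive lines or a final "data: [DONE]" sentinel; parsing any of those as JSON would raise exactly this JSONDecodeError. A more defensive parse might look like the sketch below (an illustration under those assumptions, not the library's actual code):

```python
import json

def parse_sse_chunk(decoded_chunk):
    """Parse one decoded SSE chunk into JSON payloads, skipping
    non-JSON lines such as blank keep-alives and a '[DONE]' sentinel.
    Illustrative sketch only, not the llama-cpp-agent implementation."""
    payloads = []
    for line in decoded_chunk.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue  # ignore blank lines and SSE comments
        data = line[len("data:"):].strip()
        if not data or data == "[DONE]":
            continue  # end-of-stream sentinel is not JSON
        payloads.append(json.loads(data))
    return payloads
```

With input like 'data: {"choices": []}\n\ndata: [DONE]\n' this yields only the JSON payloads and silently skips the sentinel, instead of raising.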
Could you help me look into it?
Best wishes