The examples mostly don’t work and are hard to understand. #101

Closed
JoseGuilherme1904 opened this issue Oct 29, 2024 · 3 comments

@JoseGuilherme1904

🚀 The feature, motivation and pitch

I keep trying to learn and study the llama-stack API and its examples, but they are complex. There's no documentation, or I'm unable to find it.

I watched all the course videos:
https://learn.deeplearning.ai/courses/introducing-multimodal-llama-3-2/lesson/7/tool-calling

While I understood the lessons, I feel lost applying them locally. The course seemed to end before diving into llama-stack.

How can I use the following API calls?

Serving API telemetry
GET /telemetry/get_trace
POST /telemetry/log_event

Serving API models
GET /models/get
GET /models/list
POST /models/register

Serving API scoring
POST /scoring/score
POST /scoring/score_batch

Serving API datasets
GET /datasets/get
GET /datasets/list
POST /datasets/register

Serving API inference
POST /inference/chat_completion
POST /inference/completion
POST /inference/embeddings

Serving API agents
POST /agents/create
POST /agents/session/create
POST /agents/turn/create
POST /agents/delete
POST /agents/session/delete
POST /agents/session/get
POST /agents/step/get
POST /agents/turn/get

Serving API inspect
GET /health
GET /providers/list
GET /routes/list

Serving API scoring_functions
GET /scoring_functions/get
GET /scoring_functions/list
POST /scoring_functions/register

Serving API memory
POST /memory/insert
POST /memory/query

Serving API eval
POST /eval/evaluate
POST /eval/evaluate_batch
POST /eval/job/cancel
GET /eval/job/result
GET /eval/job/status

Serving API memory_banks
GET /memory_banks/get
GET /memory_banks/list
POST /memory_banks/register

Serving API shields
GET /shields/get
GET /shields/list
POST /shields/register

Serving API datasetio
GET /datasetio/get_rows_paginated

I’ve been trying to study this Meta project for three months, but without clear documentation, it’s frustrating. If anyone here works with Meta or can offer guidance, I would greatly appreciate the help.

Alternatives

python examples/memory/client.py localhost 5000

Traceback (most recent call last):
  File "/home/guilherme/angular/llama/llama-stack-apps/examples/memory/client.py", line 131, in <module>
    fire.Fire(main)
  File "/home/guilherme/.local/lib/python3.10/site-packages/fire/core.py", line 135, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/home/guilherme/.local/lib/python3.10/site-packages/fire/core.py", line 468, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/home/guilherme/.local/lib/python3.10/site-packages/fire/core.py", line 684, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/home/guilherme/angular/llama/llama-stack-apps/examples/memory/client.py", line 127, in main
    asyncio.run(run_main(host, port, stream))
  File "/usr/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/home/guilherme/angular/llama/llama-stack-apps/examples/memory/client.py", line 40, in run_main
    providers = client.providers.list()
AttributeError: 'LlamaStackClient' object has no attribute 'providers'

Additional context

python -m examples.agents.vacation localhost 5000

created agents with agent_id=cbe4fa59-0601-484b-b47d-20225256a130
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/guilherme/angular/llama/llama-stack-apps/examples/agents/vacation.py", line 58, in <module>
    fire.Fire(main)
  File "/home/guilherme/.local/lib/python3.10/site-packages/fire/core.py", line 135, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/home/guilherme/.local/lib/python3.10/site-packages/fire/core.py", line 468, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/home/guilherme/.local/lib/python3.10/site-packages/fire/core.py", line 684, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/home/guilherme/angular/llama/llama-stack-apps/examples/agents/vacation.py", line 54, in main
    asyncio.run(run_main(host, port, disable_safety))
  File "/usr/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/home/guilherme/angular/llama/llama-stack-apps/examples/agents/vacation.py", line 37, in run_main
    await execute_turns(
  File "/home/guilherme/angular/llama/llama-stack-apps/examples/agents/multi_turn.py", line 40, in execute_turns
    agent = await get_agent_with_custom_tools(
  File "/home/guilherme/angular/llama/llama-stack-apps/common/client_utils.py", line 190, in get_agent_with_custom_tools
    session_response = client.agents.session.create(
AttributeError: 'AgentsResource' object has no attribute 'session'. Did you mean: 'sessions'?

@ashwinb (Contributor) commented Oct 29, 2024

I am so sorry! We hear your frustration @JoseGuilherme1904. Well first off, thank you so much for giving us a try. We are working on ironing out these issues as soon as possible and updating our documentation pages. Hopefully this will land in the next couple of weeks. Please stay tuned.

@dineshyv (Contributor) commented Oct 31, 2024

@JoseGuilherme1904 I am assuming you already have the llama stack distribution running at localhost 5000. If not, please follow the instructions at https://github.com/meta-llama/llama-stack/tree/main/distributions to get started running a distribution.
Once you have the llama stack distribution running, there are multiple ways to access the API.

You can always directly use the REST API you mentioned above. You can also use the python SDK to start building apps: https://github.com/meta-llama/llama-stack-client-python.
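
For instance, a minimal sketch with the Python SDK looks roughly like this. Parameter names (for example model vs. model_id on chat_completion) have shifted between SDK releases, so treat it as a starting point and check it against the version you have installed:

from llama_stack_client import LlamaStackClient

# Point the client at the running distribution.
client = LlamaStackClient(base_url="http://localhost:5000")

# See which models the distribution is serving.
for model in client.models.list():
    print(model)

# Basic chat completion. The model keyword (model vs. model_id) and the model
# identifier itself depend on the SDK version and on your distribution's config.
response = client.inference.chat_completion(
    model="Llama3.2-3B-Instruct",
    messages=[{"role": "user", "content": "Plan a weekend trip to Lisbon."}],
)
print(response)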

Coming to your specific examples, these are Python apps built on the Python SDK linked above. The errors seem to indicate that the installed SDK objects might be out of date relative to the examples. Can you try reinstalling this repo's requirements.txt modules once again and see if that helps?
Something like this should do it:

pip install --upgrade --force-reinstall -r requirements.txt
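
If the agents example still fails with the same AttributeError after reinstalling, the error message itself points at the likely fix: the agents resource appears to expose sessions (plural), so the call in common/client_utils.py would need to look roughly like this (a sketch only; the arguments are whatever the example already builds at that point):

# common/client_utils.py -- following the interpreter's own suggestion
# ("Did you mean: 'sessions'?"), use the plural attribute.
session_response = client.agents.sessions.create(
    agent_id=agent_id,          # same arguments the example already passes
    session_name=session_name,
)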

dineshyv added the acked label Oct 31, 2024
@heyjustinai (Member) commented

No response for a long time, so closing this issue. Feel free to reopen or create a new one if needed, thanks!
