
No client found for model (Anthropic) #408

Open
britannio opened this issue Dec 17, 2024 · 2 comments
Comments

@britannio

Setup

```shell
uv init && uv venv && source .venv/bin/activate && uv add -U "ell-ai[all]"
```

Code

```python
import ell
print(ell.__version__)
@ell.simple("claude-3-5-sonnet-latest", max_tokens=1000)
def language_comparison():
    return "Python vs Mojo?"
print(language_comparison())
```

Logs

Output of `python main.py`:

0.0.15
WARNING: Model `claude-3-5-sonnet-latest` is used by LMP `language_comparison` but no client could be found that supports `claude-3-5-sonnet-latest`. Defaulting to use the OpenAI client `None` for `claude-3-5-sonnet-latest`. This is likely because you've spelled the model name incorrectly or are using a newer model from a provider added after this ell version was released. 
                            
* If this is a mistake, either specify a client explicitly in the decorator:
```python
import ell

@ell.simple(model, client=my_client)
def language_comparison(...):
    ...
```
or explicitly specify the client when calling the LMP:

```python
ell.simple(model, client=my_client)(...)
```

```
Traceback (most recent call last):
  File "/private/tmp/ell-repro/main.py", line 6, in <module>
    print(language_comparison())
          ^^^^^^^^^^^^^^^^^^^^^
  File "/private/tmp/ell-repro/.venv/lib/python3.12/site-packages/ell/lmp/_track.py", line 70, in tracked_func
    res = func_to_track(
          ^^^^^^^^^^^^^^
  File "/private/tmp/ell-repro/.venv/lib/python3.12/site-packages/ell/lmp/complex.py", line 53, in model_call
    merged_client = _client_for_model(model, client or default_client_from_decorator)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/private/tmp/ell-repro/.venv/lib/python3.12/site-packages/ell/lmp/complex.py", line 126, in _client_for_model
    raise ValueError(f"No client found for model '{model}'. Ensure the model is registered using 'register_model' in 'config.py' or specify a client directly using the 'client' argument in the decorator or function call.")
ValueError: No client found for model 'claude-3-5-sonnet-latest'. Ensure the model is registered using 'register_model' in 'config.py' or specify a client directly using the 'client' argument in the decorator or function call.
```
@apandy02
Contributor

I can reproduce this problem with the setup specified, but not on master... hmm

@apandy02
Contributor

apandy02 commented Dec 20, 2024

After logging the exception raised during model registration:
```
Failed to create default Anthropic client: Client.__init__() got an unexpected keyword argument 'proxies'
```
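The mechanism is easy to see in isolation: httpx 0.28 removed the long-deprecated `proxies` keyword from `httpx.Client`, so any caller still passing it fails at client construction time. A minimal stdlib sketch of that failure mode (the `Client` class here is a stand-in, not httpx itself, and the construction path is assumed):

```python
# Stand-in for httpx.Client after 0.28: the deprecated `proxies`
# keyword is gone, only `proxy` remains.
class Client:
    def __init__(self, *, proxy=None):
        self.proxy = proxy


def make_default_client():
    """Mimics an older library code path that still passes `proxies`."""
    try:
        return Client(proxies={"all://": "http://localhost:8080"})
    except TypeError as exc:
        # This is the shape of the error ell logs during model registration.
        print(f"Failed to create default Anthropic client: {exc}")
        return None


make_default_client()
```

Because ell swallows this exception during model registration, the Anthropic client is never registered, and the "No client found" error surfaces later at call time.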

I did some digging, and the issue comes from httpx. If you downgrade to httpx < 0.28, this problem should be resolved. I think we should close this issue for now since it is not ell-related. @alex-dixon
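For anyone hitting this, one way to apply the suggested workaround (assuming the uv-based setup from the report) is to pin httpx as a dependency constraint until ell ships a fix:

```
httpx<0.28
```

With uv that would be `uv add "httpx<0.28"`; with pip, `pip install "httpx<0.28"`.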
