Setup
```
uv init && uv venv && source .venv/bin/activate && uv add -U "ell-ai[all]"
```
Code
```python
import ell
print(ell.__version__)
@ell.simple("claude-3-5-sonnet-latest", max_tokens=1000)
def language_comparison():
    return "Python vs Mojo?"
print(language_comparison())
```
Logs
Using `python main.py`
0.0.15
WARNING: Model `claude-3-5-sonnet-latest` is used by LMP `language_comparison` but no client could be found that supports `claude-3-5-sonnet-latest`. Defaulting to use the OpenAI client `None` for `claude-3-5-sonnet-latest`. This is likely because you've spelled the model name incorrectly or are using a newer model from a provider added after this ell version was released.
* If this is a mistake either specify a client explicitly in the decorator:
```python
import ell
ell.simple(model, client=my_client)
def language_comparison(...):
...
```
or explicitly specify the client when the calling the LMP:
```python
ell.simple(model, client=my_client)(...)
```
Traceback (most recent call last):
File "/private/tmp/ell-repro/main.py", line 6, in <module>
print(language_comparison())
^^^^^^^^^^^^^^^^^^^^^
File "/private/tmp/ell-repro/.venv/lib/python3.12/site-packages/ell/lmp/_track.py", line 70, in tracked_func
res = func_to_track(
^^^^^^^^^^^^^^
File "/private/tmp/ell-repro/.venv/lib/python3.12/site-packages/ell/lmp/complex.py", line 53, in model_call
merged_client = _client_for_model(model, client or default_client_from_decorator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/private/tmp/ell-repro/.venv/lib/python3.12/site-packages/ell/lmp/complex.py", line 126, in _client_for_model
raise ValueError(f"No client found for model '{model}'. Ensure the model is registered using 'register_model' in 'config.py' or specify a client directly using the 'client' argument in the decorator or function call.")
ValueError: No client found for model 'claude-3-5-sonnet-latest'. Ensure the model is registered using 'register_model' in 'config.py' or specify a client directly using the 'client' argument in the decorator or function call.
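For reference, the explicit-client workaround the warning suggests would look roughly like the sketch below. It is only a sketch: it assumes an `anthropic.Anthropic` client with `ANTHROPIC_API_KEY` set, and in this particular environment the client itself cannot be constructed until the httpx incompatibility discussed below is resolved. The ValueError also mentions `register_model` in ell's config as an alternative way to register the model.

```python
import anthropic
import ell

# Sketch of the warning's suggestion: bypass ell's default client lookup
# by passing a client explicitly to the decorator.
client = anthropic.Anthropic()  # assumes ANTHROPIC_API_KEY is set

@ell.simple("claude-3-5-sonnet-latest", client=client, max_tokens=1000)
def language_comparison():
    return "Python vs Mojo?"

print(language_comparison())
```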
After adding logging for the exception raised during model registration, the underlying error is: `Failed to create default Anthropic client: Client.__init__() got an unexpected keyword argument 'proxies'`
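The same error can be reproduced outside of ell, which suggests the failure is in the client construction itself rather than in ell's registration code. A hypothetical standalone repro, assuming httpx >= 0.28 together with an anthropic release that still forwards the removed `proxies` argument:

```python
import anthropic

# Constructing the client is enough to hit the hidden error; no request is
# made, and the api_key value here is just a placeholder.
client = anthropic.Anthropic(api_key="sk-ant-placeholder")
# -> TypeError: Client.__init__() got an unexpected keyword argument 'proxies'
```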
I did some digging, and the issue comes from httpx. If you downgrade to httpx < 0.28, this problem should be resolved. I think we should close this issue for now since it is not ell-related. @alex-dixon
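For this repro's uv setup, that downgrade amounts to pinning httpx until the anthropic package in the environment supports httpx 0.28 (httpx 0.28.0 removed the deprecated `proxies` argument, which is what triggers the TypeError above):

```
uv add "httpx<0.28"
```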