```
Traceback (most recent call last):
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
    response.raise_for_status()
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/requests/models.py", line 1024, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2/resolve/main/adapter_config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/transformers/utils/hub.py", line 403, in cached_file
    resolved_file = hf_hub_download(
                    ^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/utils/_deprecation.py", line 101, in inner_f
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1240, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1347, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1854, in _raise_on_head_call_error
    raise head_call_error
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1751, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
               ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1673, in get_hf_file_metadata
    r = _request_wrapper(
        ^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 376, in _request_wrapper
    response = _request_wrapper(
               ^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 400, in _request_wrapper
    hf_raise_for_status(response)
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/huggingface_hub/utils/_errors.py", line 352, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-67a16fe0-6c8d26b009103a2c16bf76bc;c908df8a-fbac-445e-8a38-3fb67a7efc86)
Repository Not Found for url: https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2/resolve/main/adapter_config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid credentials in Authorization header

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/nazrak/Documents/Work/repos/ml-studio/modules/src/ml_platform/mlp_sentence_transformer_finetune/included/scripts/sandbox_load.py", line 5, in <module>
    model = SentenceTransformer(
            ^^^^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/sentence_transformers/SentenceTransformer.py", line 308, in __init__
    modules, self.module_kwargs = self._load_sbert_model(
                                  ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/sentence_transformers/SentenceTransformer.py", line 1739, in _load_sbert_model
    module = module_class(model_name_or_path, cache_dir=cache_folder, backend=self.backend, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/sentence_transformers/models/Transformer.py", line 80, in __init__
    config, is_peft_model = self._load_config(model_name_or_path, cache_dir, backend, config_args)
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/sentence_transformers/models/Transformer.py", line 121, in _load_config
    find_adapter_config_file(
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/transformers/utils/peft_utils.py", line 88, in find_adapter_config_file
    adapter_cached_filename = cached_file(
                              ^^^^^^^^^^^^
  File "/Users/nazrak/opt/miniconda3/envs/cross-embed/lib/python3.12/site-packages/transformers/utils/hub.py", line 426, in cached_file
    raise EnvironmentError(
OSError: sentence-transformers/all-MiniLM-L6-v2 is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`
```
Bizarrely, this is fully resolved if I set the `HF_TOKEN` environment variable or pass the `token` argument, despite `sentence-transformers/all-MiniLM-L6-v2` being an ungated model.
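A workaround sketch, assuming (as discussed below) that an empty `HF_TOKEN` left in the environment is what triggers the failure:

```python
import os

# Remove the empty variable entirely rather than leaving it blank --
# an unset variable behaves differently from an empty one:
os.environ.pop("HF_TOKEN", None)
assert "HF_TOKEN" not in os.environ

# Alternatively, pass a real token explicitly (placeholder shown, not a
# real token):
# model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2",
#                             token="hf_your_real_token")
```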
In addition, I have no issue loading it with `AutoModel`:
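(The original snippet was not preserved in this page; a minimal sketch of the comparison:)

```python
from transformers import AutoModel

# The same public, ungated checkpoint loads fine through transformers
# directly, without any token:
model = AutoModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
```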
The SentenceTransformer load fails. In the past, an invalid or empty token did not cause problems with public models.
(I found similar reports in other projects as well: instructlab/instructlab#3075)
Part of the problem is that certain CI environments set an empty environment variable for the token instead of leaving it unset.
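The distinction matters because a set-but-empty variable is observable where an unset one is not; a minimal sketch of the two states:

```python
import os

# Some CI templates export the variable with an empty value instead of
# leaving it unset -- the two states are distinguishable:
os.environ["HF_TOKEN"] = ""

assert "HF_TOKEN" in os.environ        # the variable is set...
assert os.environ["HF_TOKEN"] == ""    # ...but empty
# huggingface_hub apparently forwards this as a blank credential, which
# the Hub then rejects ("Invalid credentials in Authorization header").
```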
This issue is likely not specific to Sentence Transformers, but rather due to changes in Hugging Face Hub.
@tomaarsen It would be great if you could notify the maintainers of other related HF projects.
If this change is intentional, it should be documented accordingly. (I am encountering the same problem with transformers as well; maybe it can be addressed in huggingface_hub.)
Code to reproduce:
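(The original snippet was not preserved in this page; a minimal sketch matching the traceback above, assuming an empty `HF_TOKEN` is present, e.g. from CI:)

```python
import os

# Assumption: an empty token variable is in the environment.
os.environ["HF_TOKEN"] = ""

from sentence_transformers import SentenceTransformer

# Public, ungated checkpoint -- yet this raises RepositoryNotFoundError,
# surfaced as OSError, when the blank credential is sent:
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
```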
The above throws an error:
Some versioning info: