It seems to be possible to use the built-in tokeniser of Huggingface transformers to do word-to-token mapping: https://discuss.huggingface.co/t/generate-raw-word-embeddings-using-transformer-models-like-bert-for-downstream-process/2958/2

The question is whether this would be useful. spaCy also offers a lot of other information that might be interesting...
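As a rough sketch of what that mapping looks like (assuming a fast tokenizer; `bert-base-uncased` is just a stand-in model here), the `word_ids()` method on Huggingface fast tokenizers maps each subword token back to the index of the word it came from:

```python
from transformers import AutoTokenizer

# Fast tokenizers expose word_ids(), which maps each subword token
# back to the index of the pre-split input word it came from.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Pre-tokenised input, e.g. the words of a spaCy Doc
words = ["spaCy", "tokenisation", "is", "different"]
enc = tokenizer(words, is_split_into_words=True)

for token, word_id in zip(enc.tokens(), enc.word_ids()):
    # word_id is None for special tokens like [CLS] and [SEP]
    print(f"{token!r} -> word {word_id}")
```

This would let us feed spaCy's word segmentation straight into the transformer tokenizer and recover the alignment, without reimplementing the mapping ourselves.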