
[Fix] Bug caused by bfloat16 tensors in STS #47

Merged
merged 1 commit into dev from fix/sts_bfloat16_tensor
Jul 31, 2024

Conversation

lsz05
Collaborator

@lsz05 lsz05 commented Jul 31, 2024

Related Issue / PR

N/A

Behavior change after merging this PR

The STS evaluator uses a scipy function that internally calls `tensor.numpy()`; when the tensor's dtype is `torch.bfloat16`, this fails:

[rank2]:   File "~/JMTEB/src/jmteb/evaluators/sts/evaluator.py", line 74, in __call__
[rank2]:     val_results[sim_name], _ = self._compute_similarity(
[rank2]:   File "~/JMTEB/src/jmteb/evaluators/sts/evaluator.py", line 112, in _compute_similarity
[rank2]:     pearson = pearsonr(golden_scores, sim_scores)[0]
[rank2]:   File "/usr/local/lib/python3.10/dist-packages/scipy/stats/_stats_py.py", line 4727, in pearsonr
[rank2]:     y = np.asarray(y)
[rank2]:   File "/usr/local/lib/python3.10/dist-packages/torch/_tensor.py", line 1087, in __array__
[rank2]:     return self.numpy()
[rank2]: TypeError: Got unsupported ScalarType BFloat16
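A minimal reproduction sketch (not taken from the repository; the score values are illustrative). scipy converts its inputs with `np.asarray`, which invokes `Tensor.__array__` and hence `Tensor.numpy()`, and that call does not support bfloat16:

```python
import torch
from scipy.stats import pearsonr

golden_scores = [3.0, 1.5, 4.2]  # illustrative gold similarity scores
sim_scores = torch.tensor([0.8, 0.2, 0.9], dtype=torch.bfloat16)  # model scores in bfloat16

# np.asarray(sim_scores) ends up calling Tensor.numpy(), which raises
# "TypeError: Got unsupported ScalarType BFloat16"
pearsonr(golden_scores, sim_scores)
```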

What was done to achieve the behavior change

When `sim_scores` is `torch.bfloat16`, change its dtype with `.float()` (see the sketch below).
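A minimal sketch of the cast, assuming `sim_scores` is a `torch.Tensor` of similarity scores (variable names follow the description above; the actual code in `evaluator.py` may differ):

```python
import torch
from scipy.stats import pearsonr

def compute_pearson(golden_scores, sim_scores):
    # Cast bfloat16 tensors to float32 so scipy's implicit .numpy() conversion succeeds.
    if isinstance(sim_scores, torch.Tensor) and sim_scores.dtype is torch.bfloat16:
        sim_scores = sim_scores.float()
    return pearsonr(golden_scores, sim_scores)[0]
```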

Verification

  • Confirmed that the tests pass
  • Confirmed that the merge target is the dev branch

@lsz05 lsz05 changed the base branch from main to dev July 31, 2024 05:50
@lsz05 lsz05 marked this pull request as ready for review July 31, 2024 05:50
@lsz05 lsz05 requested a review from akiFQC July 31, 2024 05:50
@akiFQC
Collaborator

akiFQC commented Jul 31, 2024

LGTM

@lsz05 lsz05 merged commit 7881bd7 into dev Jul 31, 2024
3 checks passed
@lsz05 lsz05 mentioned this pull request Aug 19, 2024
1 task
@lsz05 lsz05 deleted the fix/sts_bfloat16_tensor branch August 29, 2024 16:00