Fix wrong all_gather for Mixtral finetune (huggingface#965)
Co-authored-by: ccrhx4 <[email protected]>
ccrhx4 authored May 8, 2024
1 parent aa175ea commit 3a14236
Showing 1 changed file with 1 addition and 1 deletion.
@@ -367,7 +367,7 @@ def gaudi_mixtral_block_sparse_moe_forward(self, hidden_states: torch.Tensor) ->
     # router_logits: (batch * sequence_length, n_experts)
     router_logits = self.gate(hidden_states)
 
-    if is_deepspeed_available():
+    if is_deepspeed_available() and (not self.training):
         from deepspeed import comm as dist
 
         if dist.is_initialized():
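For context, the guarded block (whose body is elided below the hunk) gathers router_logits across DeepSpeed ranks; the fix restricts that gather to inference, presumably because all_gather is not differentiable, so running it during finetuning would detach the router logits that feed training-time losses. Below is a minimal sketch of the pattern under those assumptions: the helper name _maybe_gather_router_logits, the local is_deepspeed_available stand-in, the gather body, and the concatenation dim are all illustrative assumptions, not the file's verbatim code.

    # Minimal sketch of the pattern around this one-line fix; names and the
    # gather layout are assumptions for illustration, not the file's code.
    import importlib.util

    import torch


    def is_deepspeed_available() -> bool:
        # Stand-in for the helper used in the diff (its import lies outside the hunk).
        return importlib.util.find_spec("deepspeed") is not None


    def _maybe_gather_router_logits(router_logits: torch.Tensor, training: bool) -> torch.Tensor:
        # router_logits: (batch * sequence_length, n_experts), as noted in the diff.
        # The commit's guard: only gather at inference time. all_gather is not
        # differentiable, so gathering during finetuning would hand back router
        # logits that are detached from the local autograd graph.
        if is_deepspeed_available() and (not training):
            from deepspeed import comm as dist

            if dist.is_initialized():
                # Assumed body of the elided hunk: collect every rank's logits
                # and concatenate them (the concatenation dim is an assumption).
                gathered = [torch.empty_like(router_logits) for _ in range(dist.get_world_size())]
                dist.all_gather(gathered, router_logits)
                router_logits = torch.cat(gathered, dim=0)
        return router_logits

With the added `and (not self.training)`, training keeps each rank's local, differentiable logits, while evaluation still sees the gathered view.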
