Fix potential off-by-one error in attention mask generation
dibyaghosh authored Apr 15, 2024
1 parent bd930f9 commit c4c222a
Showing 1 changed file with 1 addition and 1 deletion.
octo/model/components/block_transformer.py
@@ -290,7 +290,7 @@ def generate_attention_mask(
         self.verify_causality(prefix_groups, timestep_groups)

         def _get_position(i, tokens_per_elem):
-            return np.searchsorted(np.cumsum(tokens_per_elem), i)
+            return np.searchsorted(np.cumsum(tokens_per_elem), i, side='right')

         horizon = timestep_groups[0].tokens.shape[1]
         tokens_per_prefix_group = [group.tokens.shape[1] for group in prefix_groups]
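Why side='right' matters: np.searchsorted defaults to side='left', so a token index that falls exactly on a cumulative-sum boundary is assigned to the group before the boundary rather than the group that actually owns the token. A minimal sketch of that boundary case (the group sizes below are illustrative, not taken from the repo):

    import numpy as np

    # Hypothetical group sizes: element 0 owns token indices 0-2,
    # element 1 owns token indices 3-4.
    tokens_per_elem = [3, 2]
    boundaries = np.cumsum(tokens_per_elem)  # array([3, 5])

    i = 3  # first token of element 1

    np.searchsorted(boundaries, i)                  # side='left' (default) -> 0, wrong group
    np.searchsorted(boundaries, i, side='right')    # -> 1, the correct group

With the default side='left', the first token of every group after the first would be attributed to the preceding group, which is presumably the "potential off-by-one error" the commit message refers to.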
