Commit

too many layernorms
lucidrains committed Nov 9, 2021
1 parent 01f75d7 commit c8d1ede
Showing 1 changed file with 1 addition and 4 deletions.

@@ -194,10 +194,7 @@ def __init__(
             norm_out = True
         )

-        self.to_logits = nn.Sequential(
-            nn.LayerNorm(dim),
-            nn.Linear(dim, num_tokens)
-        )
+        self.to_logits = nn.Linear(dim, num_tokens)

     def forward(self, x):
         device = x.device
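
Context for the change, per the commit message: the inner attention stack just above this hunk is constructed with norm_out = True, so its output already passes through a final LayerNorm. Keeping a second nn.LayerNorm inside to_logits therefore normalized the same activations twice, and the commit drops it, leaving a plain linear projection to the vocabulary. The sketch below is a minimal illustration of that idea with hypothetical names (TinyWrapper and the placeholder attn_layers are stand-ins, not the repository's actual classes):

    import torch
    from torch import nn

    class TinyWrapper(nn.Module):
        def __init__(self, dim = 512, num_tokens = 20000):
            super().__init__()
            self.token_emb = nn.Embedding(num_tokens, dim)

            # stand-in for the real attention stack; building it with
            # norm_out = True means the stack already ends in LayerNorm(dim)
            self.attn_layers = nn.Sequential(
                nn.Linear(dim, dim),   # placeholder for attention / feedforward blocks
                nn.LayerNorm(dim)      # the "norm_out" layer
            )

            # after this commit: plain linear head, no second LayerNorm in front
            self.to_logits = nn.Linear(dim, num_tokens)

        def forward(self, x):
            x = self.token_emb(x)
            x = self.attn_layers(x)
            return self.to_logits(x)

    tokens = torch.randint(0, 20000, (1, 16))
    logits = TinyWrapper()(tokens)   # shape (1, 16, 20000)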
