
Commit

Append epoch rather than best val. loss to val_loss (#744)
init27 authored Oct 24, 2024
2 parents d8b0eba + 2a94bff commit e2342c2
Showing 1 changed file with 1 addition and 1 deletion.
src/llama_recipes/utils/train_utils.py (1 addition, 1 deletion)
@@ -288,7 +288,7 @@ def train(model, train_dataloader,eval_dataloader, tokenizer, optimizer, lr_sche
             print(f"best eval loss on epoch {epoch+1} is {best_val_loss}")
         else:
             print(f"best eval loss on epoch {epoch+1} is {best_val_loss}")
-        val_loss.append(float(best_val_loss))
+        val_loss.append(float(eval_epoch_loss))
         val_prep.append(float(eval_ppl))
         if train_config.enable_fsdp:
             if rank==0:
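
The effect of the one-line change can be illustrated with a minimal sketch. Before the fix, `val_loss` received the running best (minimum) loss each epoch; after the fix, it receives the actual per-epoch evaluation loss. The loop structure and loss values below are hypothetical, not taken from the repository:

```python
# Hypothetical per-epoch evaluation losses.
epoch_losses = [0.9, 0.7, 0.8, 0.6]

best_val_loss = float("inf")
val_loss_old = []  # pre-fix behavior: running best loss
val_loss_new = []  # post-fix behavior: per-epoch loss

for eval_epoch_loss in epoch_losses:
    if eval_epoch_loss < best_val_loss:
        best_val_loss = eval_epoch_loss
    val_loss_old.append(float(best_val_loss))    # old: val_loss.append(float(best_val_loss))
    val_loss_new.append(float(eval_epoch_loss))  # new: val_loss.append(float(eval_epoch_loss))

print(val_loss_old)  # [0.9, 0.7, 0.7, 0.6] -- masks the epoch-3 regression
print(val_loss_new)  # [0.9, 0.7, 0.8, 0.6] -- shows the true loss curve
```

With the old behavior, the recorded curve is monotonically non-increasing by construction, so a regression like epoch 3 (0.8) is invisible in the logged history.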
