Rework all tutorials #500

Merged
michaeldeistler merged 2 commits into main from tutorial-fixups on Nov 14, 2024
Conversation

michaeldeistler (Contributor):

  • I removed tutorial 03_setting_parameters, as it had very little content and I want to keep the number of tutorials small; its main content now lives in tutorials 01_ and 02_.
  • I fixed a bug in the dataloader: it only ran for one epoch (see the sketch after this list).
  • I made sure that all tutorials run and behave as expected.
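For concreteness, here is a minimal sketch of the dataloader bug and its fix. The Dataset class below is a hypothetical stand-in for the tutorials' dummy dataloader (assumed names and shapes, not the actual tutorial code). The point: a .batch(...) generator is exhausted after one pass, so building it once before the training loop means only the first epoch sees any data; re-creating it at the top of every epoch fixes this.

import numpy as np

class Dataset:
    # Hypothetical stand-in for the tutorials' dummy dataloader.
    def __init__(self, inputs, labels):
        self.inputs = np.asarray(inputs)
        self.labels = np.asarray(labels)

    def shuffle(self, seed):
        perm = np.random.default_rng(seed).permutation(len(self.inputs))
        return Dataset(self.inputs[perm], self.labels[perm])

    def batch(self, batch_size):
        for i in range(0, len(self.inputs), batch_size):
            yield self.inputs[i : i + batch_size], self.labels[i : i + batch_size]

inputs, labels = np.arange(8.0), np.arange(8.0)
batch_size = 4

# Buggy version: the generator is exhausted after epoch 0,
# so all later epochs iterate over nothing.
# dataloader = Dataset(inputs, labels).shuffle(seed=0).batch(batch_size)

dataset = Dataset(inputs, labels)
for epoch in range(10):
    # Fix: re-shuffle and re-create the batch iterator every epoch.
    dataloader = dataset.shuffle(seed=epoch).batch(batch_size)
    for batch_ind, batch in enumerate(dataloader):
        current_batch, label_batch = batch  # training step goes here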

jnsbck (Contributor) left a comment:

Left two minor comments, but it looks good to me. Feel free to merge.

" for batch_ind, batch in enumerate(dataloader):\n",
" current_batch = batch[0].numpy()\n",
jnsbck (Contributor):

Could also be changed to:

current_batch, label_batch = batch

Comment on lines +833 to +834
" dataloader = Dataset(inputs, labels)\n",
" dataloader = dataloader.shuffle(seed=epoch).batch(batch_size)\n",
jnsbck (Contributor):

Can we change this to:

dataset = Dataset(inputs, labels)
for epoch in range(10):
    epoch_loss = 0.0
    
    # Our simple dummy dataloader must be shuffled at every epoch.
    dataloader = dataset.shuffle(seed=epoch).batch(batch_size)

jnsbck (Contributor):

Alternatively, you could also increment the seed after the last yield or something; then you would not have to re-initialize the dataloader at every epoch. A hypothetical sketch of this idea follows below.
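A sketch of what that alternative could look like (assumed class and names, not code from this PR): the batch iterator bumps its own seed after the last yield, so each full pass over the data is freshly shuffled without re-initializing the dataloader.

import numpy as np

class SelfShufflingDataset:
    # Hypothetical: re-shuffles itself after each complete pass.
    def __init__(self, inputs, labels, batch_size, seed=0):
        self.inputs = np.asarray(inputs)
        self.labels = np.asarray(labels)
        self.batch_size = batch_size
        self.seed = seed

    def __iter__(self):
        perm = np.random.default_rng(self.seed).permutation(len(self.inputs))
        for i in range(0, len(perm), self.batch_size):
            idx = perm[i : i + self.batch_size]
            yield self.inputs[idx], self.labels[idx]
        # After the last yield: increment the seed so the next epoch
        # sees a new order, with no re-initialization needed.
        self.seed += 1

dataloader = SelfShufflingDataset(np.arange(8.0), np.arange(8.0), batch_size=4)
for epoch in range(10):
    for current_batch, label_batch in dataloader:
        pass  # training step goes here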

michaeldeistler (Contributor, Author):

The above does not work. I will leave it as it is for now; feel free to change it later.

michaeldeistler merged commit 20dd48e into main on Nov 14, 2024 (1 check passed).
michaeldeistler deleted the tutorial-fixups branch on November 14, 2024 at 14:54.