nan in the training loss #62

Open
Hanyi11 opened this issue Mar 4, 2024 · 0 comments
Comments


Hanyi11 commented Mar 4, 2024

  • Python version: 3.9
  • Operating System: Linux

Description

When training the membrain-seg model, the training loss and the training surface dice sometimes become NaN.

# Installation (run in a notebook / Colab session):
git clone https://github.com/teamtomo/membrain-seg.git
%cd membrain-seg
!pip install .
%cd membrain-seg

# Check that the CLI is available:
!membrain
!membrain train

# Training runs (from the memBrain conda environment):
conda activate memBrain
membrain train --data-dir /home/icb/hanyi.zhang/534_withnewSyntheticAndCollaborators
membrain train_advanced --num-workers 16 --data-dir /home/icb/hanyi.zhang/534_withnewSyntheticAndCollaborators
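
One plausible first check (not part of the original report) is whether any of the training volumes or labels themselves contain NaN or infinite voxels, since a single bad volume can poison the loss. The sketch below is a hypothetical sanity check, assuming the data directory holds NIfTI files and that nibabel is installed; the path is taken from the commands above.

# Hypothetical sanity check: scan the training data for NaN/Inf voxels,
# one common cause of a NaN loss. Assumes NIfTI volumes under --data-dir
# and that nibabel is available.
from pathlib import Path

import nibabel as nib
import numpy as np

data_dir = Path("/home/icb/hanyi.zhang/534_withnewSyntheticAndCollaborators")

for nifti_path in sorted(data_dir.rglob("*.nii*")):
    volume = np.asarray(nib.load(nifti_path).dataobj, dtype=np.float32)
    n_nan = np.count_nonzero(np.isnan(volume))
    n_inf = np.count_nonzero(np.isinf(volume))
    if n_nan or n_inf:
        print(f"{nifti_path}: {n_nan} NaN voxels, {n_inf} Inf voxels")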

Epoch 21: 100%
EPOCH Training loss nan
EPOCH Training acc 0.9495545029640198
EPOCH Training surface dice nan
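
If the data looks clean, a next step is to find the exact training step at which the loss turns non-finite. Dice-style losses can also blow up when a sampled patch has an empty ground truth and no smoothing term keeps the denominator away from zero. The helper below is a hedged debugging aid for a generic PyTorch training loop, not membrain-seg's actual training code; the function name and call site are hypothetical.

# Hypothetical debugging helper for a generic PyTorch training loop:
# flag the first step at which the loss or the network output is non-finite.
import torch

def check_finite(step: int, loss: torch.Tensor, logits: torch.Tensor) -> None:
    if not torch.isfinite(loss):
        raise RuntimeError(f"non-finite loss at step {step}: {loss.item()}")
    if not torch.isfinite(logits).all():
        bad = (~torch.isfinite(logits)).sum().item()
        raise RuntimeError(f"{bad} non-finite logits at step {step}")

# torch.autograd.set_detect_anomaly(True) can additionally point at the
# backward op that first produces NaN gradients, at a noticeable slowdown.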