The loss curve does not fall #27

Open
luoshuifeiyang opened this issue Aug 16, 2020 · 1 comment

I didn't change the network, but the loss does not fall. I thought the learning rate was too big or too small, but even when I set it bigger or smaller, the loss curve still does not fall and the model does not converge. Could you please tell me the reason? Thanks a lot.

Ugness (Owner) commented Aug 16, 2020

1. If you train the network from the pretrained weights, it is possible that the loss does not fall further.
2. If you train the network from random initialization, please check that your data input/output is correct (image files with the same name in each of the images / masks folders); see the sketch after this list.
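
For point 2, a quick way to verify the image/mask pairing could look like the minimal sketch below. The `images` / `masks` folder names and the script itself are assumptions about your dataset layout, not code from this repository, so adapt them as needed:

```python
# Minimal sketch: check that every image has a mask with the same base name.
# Folder names and extensions are assumptions; adjust to your dataset layout.
import os

def check_pairs(image_dir="images", mask_dir="masks"):
    # Compare file stems (names without extension) in both folders.
    image_stems = {os.path.splitext(f)[0] for f in os.listdir(image_dir)}
    mask_stems = {os.path.splitext(f)[0] for f in os.listdir(mask_dir)}

    missing_masks = sorted(image_stems - mask_stems)
    missing_images = sorted(mask_stems - image_stems)

    if missing_masks:
        print("Images without a matching mask:", missing_masks)
    if missing_images:
        print("Masks without a matching image:", missing_images)
    if not missing_masks and not missing_images:
        print(f"OK: {len(image_stems)} image/mask pairs match by name.")

if __name__ == "__main__":
    check_pairs()
```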

[Attached image: training loss graph]

This is my loss graph (training from random initialization); you can see that the loss does not look like it is converging for about the first 10,000 steps. In my opinion, please wait for the model to converge, and look carefully at whether the saliency prediction masks seem reasonable. If the masks do not look reasonable, please attach an image of your saliency mask and your loss graph so I can help you more.
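
If you log the per-step loss to a file, a smoothed view can make the trend easier to judge than the raw curve. A minimal sketch, assuming a hypothetical `loss_log.txt` with one loss value per line (not a file produced by this repository):

```python
# Minimal sketch: plot a raw loss log together with a moving average.
# "loss_log.txt" (one loss value per line) is an assumed format.
import numpy as np
import matplotlib.pyplot as plt

losses = np.loadtxt("loss_log.txt")

# Moving average to smooth out step-to-step noise.
window = 100
smoothed = np.convolve(losses, np.ones(window) / window, mode="valid")

plt.plot(losses, alpha=0.3, label="raw loss")
plt.plot(np.arange(window - 1, len(losses)), smoothed, label=f"moving avg ({window})")
plt.xlabel("step")
plt.ylabel("loss")
plt.legend()
plt.show()
```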
