
Loss diverges when using ConvolutionDepthwise layer #10

Open
krraush opened this issue Oct 27, 2017 · 1 comment

Comments


krraush commented Oct 27, 2017

Hi,
I am fine-tuning an ImageNet pre-trained model for classification.
If I use the ConvolutionDepthwise layer, the loss first decreases and then starts to increase within 100 iterations.

However, with everything else kept the same as in that fine-tuning setup (i.e. base LR, solver, etc.), if I replace the ConvolutionDepthwise layers with standard Convolution layers, the loss converges.
I tried fiddling with learning rates and solvers, but it didn't help.

I want to use ConvolutionDepthwise, as it is ~4 times faster than Convolution.
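For reference, the swap being described can be expressed in prototxt: a depthwise convolution is mathematically a grouped convolution with `group` equal to the number of input channels, so stock Caffe's Convolution layer can stand in for the custom one. A minimal sketch of the two alternatives, assuming the custom layer accepts the standard `convolution_param` fields; the layer name and channel count (32) are illustrative, not taken from the original model:

```
# Option A: custom depthwise layer (one 3x3 filter per input channel, 32 in -> 32 out)
layer {
  name: "conv_dw"
  type: "ConvolutionDepthwise"
  bottom: "data"
  top: "conv_dw"
  convolution_param {
    num_output: 32
    kernel_size: 3
    stride: 1
    pad: 1
    weight_filler { type: "msra" }
  }
}

# Option B: equivalent stock Convolution, with group == num_output == input channels
layer {
  name: "conv_dw"
  type: "Convolution"
  bottom: "data"
  top: "conv_dw"
  convolution_param {
    num_output: 32
    group: 32
    kernel_size: 3
    stride: 1
    pad: 1
    weight_filler { type: "msra" }
  }
}
```

Since the two forms compute the same operation, pre-trained weights saved under one layer type should load into the other as long as the layer names and blob shapes match; a divergence that appears only with Option A would then point at the custom layer's forward/backward implementation rather than the training setup.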


SuSuXia commented Jan 16, 2018

Can you share the ConvolutionDepthwise layer? Thanks.
