'Optimizer' not initialized #1
It does run when we use 'Adam' to initialize the optimizer. Is that OK?
Hey Travis, I'm sorry I couldn't respond earlier. I was out of town for a few days; that was bad on my part. I'll update the patch as soon as possible.
I believe the Adam optimizer would work without any issues.
Hey @TravisTianqing, I updated the code. If you face an issue again, please let me know. You can also check out the Kaggle notebook.
Traceback (most recent call last):
  File "main.py", line 67, in <module>
    main()
  File "main.py", line 42, in main
    loss, acc = train_step(protonet, trainx, trainy, 5, 60, 5)
  File "/home/serversys005/wangtianqing/mnt/prototypical/prototypicalNet.py", line 158, in train_step
    optimizer.zero_grad()
NameError: name 'optimizer' is not defined