saving the trained model for future use - #4
Comments
I also need to save the trained model. I tried to save "NTWK" but it gives a serialization error: maximum recursion depth exceeded. Has anyone experienced this? Thanks.
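A minimal sketch of a common workaround for this error, assuming `ntwk` is the trained network object (pickle recurses through the object graph of nested layers, which can exceed Python's default recursion limit):

```python
import sys
import pickle

# The "maximum recursion depth exceeded" error usually comes from pickle
# walking a deeply nested object graph (layers holding references to each
# other). Raising the limit is a blunt but common workaround.
# `ntwk` is assumed to be the trained network object from train.py.
sys.setrecursionlimit(50000)

with open('trained_model.pkl', 'wb') as f:
    pickle.dump(ntwk, f, protocol=pickle.HIGHEST_PROTOCOL)

# Later, to reload the same object:
with open('trained_model.pkl', 'rb') as f:
    ntwk = pickle.load(f)
```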
Hello, have you managed to do that, i.e. save the model and use it for prediction? @vijaymanikandan @lucianaqueiroz098
I don't plan to, as it is not meant to be a full-fledged product, only a proof of concept. It is a very bare-bones implementation, not an end-to-end product. Maybe someone has it on a fork?
I think saving the outputs requires the "tester" function: layer 2 and layer 1 work. @rakeshvar, can you confirm this?
For prediction I think we can use the tester function. I found this repo: https://github.com/mosessoh/CNN-LSTM-Caption-Generator @rakeshvar, can this repo handle this digit recognition task?
Thanks for posting your code, I played with it to understand the LSTM implementation. I just have a quick question about saving the models for later use. Right now, when I run train.py, I can see the model getting trained and I see some outputs. But is there a way to save the model to a file and later use it to retrain or predict on future data? I tried using pickle but I get an error saying the maximum recursion depth was reached. Please post your thoughts.
Thanks!
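An alternative sketch that avoids pickling the network object altogether: save only the raw weight arrays, then rebuild the network and load them back. This assumes the network keeps its weights as a list of Theano shared variables in a hypothetical `ntwk.params` attribute (not confirmed for this repo):

```python
import numpy as np

# Hypothetical: assume the trained network `ntwk` exposes its weights as
# Theano shared variables in `ntwk.params`. Saving only the raw arrays
# sidesteps the pickle recursion problem entirely.
weights = [p.get_value() for p in ntwk.params]
np.savez('trained_weights.npz', *weights)   # stored as arr_0, arr_1, ...

# To reuse: rebuild the network with the same architecture (rerun the
# construction code in train.py), then restore the saved arrays.
loaded = np.load('trained_weights.npz')
keys = sorted(loaded.files, key=lambda k: int(k.split('_')[1]))
for p, key in zip(ntwk.params, keys):
    p.set_value(loaded[key])
```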