
LSTM-MATLAB

LSTM-MATLAB is Long Short-term Memory (LSTM) in MATLAB, which is meant to be succinct, illustrative and for research purpose only. It is accompanied with a paper for reference: Revisit Long Short-Term Memory: An Optimization Perspective, NIPS deep learning workshop, 2014.

Creator & Maintainer: Qi Lyu

# FEATURES

  • original Long Short-Term Memory
  • peephole connections to all gates (see the sketch after this list)
  • optimization methods such as L-BFGS and CG
  • CPU or GPU acceleration
  • MapReduce parallelization
  • gradient checking
  • easy configuration
  • baseline experiment
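
To make the peephole feature concrete, here is a minimal MATLAB sketch of one LSTM forward step with peephole connections from the cell state into every gate. The parameter names (Wi, Ri, pi_, bi, and so on) and the function name lstm_step are hypothetical placeholders chosen for illustration; they are not the variable names used in this repository.

```matlab
% Sketch of one LSTM step with peephole connections to all gates.
% All weight/bias names below are hypothetical placeholders.
function [h, c] = lstm_step(x, h_prev, c_prev, p)
    i = sigm(p.Wi*x + p.Ri*h_prev + p.pi_ .* c_prev + p.bi);  % input gate (peeps at old cell)
    f = sigm(p.Wf*x + p.Rf*h_prev + p.pf_ .* c_prev + p.bf);  % forget gate (peeps at old cell)
    g = tanh(p.Wc*x + p.Rc*h_prev + p.bc);                    % candidate cell update
    c = f .* c_prev + i .* g;                                 % new cell state
    o = sigm(p.Wo*x + p.Ro*h_prev + p.po_ .* c + p.bo);       % output gate (peeps at new cell)
    h = o .* tanh(c);                                         % hidden state / output
end

function y = sigm(z)
    y = 1 ./ (1 + exp(-z));  % logistic sigmoid
end
```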

# ACKNOWLEDGEMENTS

The included minFunc code is provided by Mark Schmidt (http://www.cs.ubc.ca/~schmidtm). The MATLAB MapReduce code is provided by Quoc V. Le (http://cs.stanford.edu/~quocle/optimizationWeb/index.html).
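
As an illustration of how the bundled minFunc optimizer is driven, the sketch below minimizes a toy quadratic with L-BFGS; switching options.Method to 'cg' selects conjugate gradient. The objective handle and the toy problem are assumptions for demonstration only, not the repository's actual training entry point.

```matlab
% Toy example of calling the bundled minFunc with L-BFGS.
% In the LSTM code the objective would return the network loss and the
% gradient w.r.t. the flattened parameter vector instead of this quadratic.
addpath(genpath('minFunc'));                     % assumed folder name

options = struct();
options.Method  = 'lbfgs';                       % or 'cg' for conjugate gradient
options.MaxIter = 100;
options.Display = 'iter';

A = [3 1; 1 2];  b = [1; -1];                    % toy quadratic 0.5*x'Ax - b'x
quadObj = @(x) deal(0.5*x'*A*x - b'*x, A*x - b); % returns [loss, gradient]

x0   = zeros(2, 1);
xOpt = minFunc(quadObj, x0, options);
```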

# USAGE

To run the code, start from aStart.m. Data is generated on the fly by scripts in the data directory. For a faster LSTM implementation with complete features, see the 'LSTMLayer' defined in the C++ version. The dataset and labels follow the original 1997 LSTM paper.
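
The gradient checking feature listed above compares the analytic gradient against central finite differences. A minimal sketch of that idea is shown below; lossGrad is a hypothetical handle returning [loss, gradient] for a flattened parameter vector, and the repository's own checking routine may be organized differently.

```matlab
% Sketch of gradient checking by central finite differences.
% lossGrad is a hypothetical [loss, grad] handle over a parameter vector.
function maxDiff = checkGradient(lossGrad, theta)
    epsilon = 1e-5;
    [~, gAnalytic] = lossGrad(theta);            % analytic gradient
    gNumeric = zeros(size(theta));
    for k = 1:numel(theta)
        e = zeros(size(theta));  e(k) = epsilon;
        fPlus  = lossGrad(theta + e);
        fMinus = lossGrad(theta - e);
        gNumeric(k) = (fPlus - fMinus) / (2 * epsilon);
    end
    maxDiff = max(abs(gAnalytic - gNumeric));    % should be on the order of 1e-8
end
```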

# LICENSE

MIT
