This repository has been archived by the owner on Aug 23, 2021. It is now read-only.

MATLAB implementation of several neural network models


tjr16/neural-networks


neural-networks

single-layer neural network

Dataset: CIFAR-10

Optimization method: mini-batch gradient descent

Loss functions: cross-entropy (with softmax) and multi-class SVM

Regularization: L2

New features: e.g. learning-rate decay, Xavier initialization

Test accuracy (highest): 40.66%
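The repository's code is MATLAB and not shown here. As an illustration of the single-layer setup described above (softmax scores, cross-entropy loss, L2 regularization, mini-batch gradient descent), here is a minimal NumPy sketch; all function names and hyperparameters are my assumptions, not taken from the repo.

```python
import numpy as np

def softmax(s):
    # Numerically stable softmax over each row of scores
    e = np.exp(s - s.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train_single_layer(X, y, n_classes, lr=0.5, lam=1e-4,
                       batch=16, epochs=100, seed=0):
    # Mini-batch gradient descent on cross-entropy + L2 for scores W x + b.
    # Hyperparameters are illustrative, not the repository's settings.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = 0.01 * rng.standard_normal((d, n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]              # one-hot targets
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch):
            j = idx[start:start + batch]
            P = softmax(X[j] @ W + b)
            G = (P - Y[j]) / len(j)       # dL/dscores for cross-entropy
            W -= lr * (X[j].T @ G + 2 * lam * W)  # L2 contributes 2*lam*W
            b -= lr * G.sum(axis=0)
    return W, b
```

On CIFAR-10 this single linear layer tops out around 40% test accuracy, consistent with the figure reported above.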

double-layer neural network

Dataset: CIFAR-10

Optimization method: mini-batch gradient descent

Loss function: cross-entropy

Regularization: L2

New features: e.g. cyclical learning rate, ensemble learning, dropout

Test accuracy (highest): 54.84%
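Of the features listed above, the cyclical learning rate is easy to sketch in isolation. One common form is the triangular schedule: the learning rate ramps linearly from a minimum to a maximum over a fixed number of update steps, then back down, repeating. The function below is an illustrative Python version; the parameter names and default values are assumptions, not the repository's.

```python
def cyclical_lr(t, eta_min=1e-5, eta_max=1e-1, step_size=500):
    # Triangular cyclical schedule: eta rises from eta_min to eta_max
    # over step_size updates, then falls back, with period 2*step_size.
    cycle_pos = t % (2 * step_size)
    if cycle_pos < step_size:
        return eta_min + (eta_max - eta_min) * cycle_pos / step_size
    return eta_max - (eta_max - eta_min) * (cycle_pos - step_size) / step_size
```

The schedule is queried once per update step (not per epoch), so the returned value would replace a fixed `lr` inside the mini-batch loop.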

multi-layer neural network

Dataset: CIFAR-10

Optimization method: mini-batch gradient descent

Loss function: cross-entropy

Regularization: L2

New features: e.g. batch normalization, He initialization, data augmentation

Test accuracy (highest): 58.66%
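Two of the multi-layer features above, batch normalization and He initialization, can be sketched compactly. This is an illustrative NumPy version (the original is MATLAB): the batch-norm forward pass normalizes each feature over the batch and then scales and shifts it, and He initialization draws weights with standard deviation sqrt(2 / fan_in), which suits ReLU layers. Names and signatures are my assumptions.

```python
import numpy as np

def he_init(fan_in, fan_out, rng):
    # He initialization: std = sqrt(2 / fan_in), suited to ReLU activations
    return rng.standard_normal((fan_in, fan_out)) * np.sqrt(2.0 / fan_in)

def batchnorm_forward(S, gamma, beta, eps=1e-8):
    # Normalize each column (feature) of the batch scores S to zero mean
    # and unit variance, then apply the learned scale gamma and shift beta.
    mu = S.mean(axis=0)
    var = S.var(axis=0)
    S_hat = (S - mu) / np.sqrt(var + eps)
    return gamma * S_hat + beta
```

In training, `gamma` and `beta` are learned per layer, and running averages of `mu` and `var` are kept for use at test time; both are omitted here for brevity.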

recurrent neural network

Dataset: Text from Harry Potter and the Goblet of Fire

Optimization method: AdaGrad

Loss function: cross-entropy

Goal: synthesize text
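The RNN above is trained with AdaGrad rather than plain gradient descent: each parameter accumulates the sum of its squared gradients and scales its own step size by the inverse square root of that sum. A minimal NumPy sketch of the update (the original is MATLAB; the dictionary layout, names, and defaults are my assumptions):

```python
import numpy as np

def adagrad_step(params, grads, memory, eta=0.1, eps=1e-8):
    # AdaGrad: accumulate squared gradients per parameter, then scale
    # each update by 1 / sqrt(accumulated sum); eps avoids division by zero.
    for k in params:
        memory[k] += grads[k] ** 2
        params[k] -= eta * grads[k] / np.sqrt(memory[k] + eps)
    return params, memory
```

Here `params`, `grads`, and `memory` are dictionaries keyed by parameter name (e.g. the RNN's weight matrices and biases), with `memory` initialized to zeros and carried across update steps.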
