
# neural-networks

## single-layer neural network

- Dataset: CIFAR-10
- Optimization method: mini-batch gradient descent
- Loss functions: cross-entropy (softmax classifier) and multi-class SVM
- Regularization: L2
- New features: e.g. learning rate decay, Xavier initialization (see the sketch below)
- Test accuracy (highest): 40.66%
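A minimal sketch of the single-layer setup, assuming a NumPy implementation: a softmax classifier with cross-entropy loss, L2 regularization, mini-batch gradient descent, learning rate decay, and Xavier initialization. Function and variable names (`train`, `loss_and_grads`, `W`, `b`) are illustrative and not taken from the repository.

```python
import numpy as np

def softmax(scores):
    # Numerically stable softmax over class scores (columns = samples).
    e = np.exp(scores - scores.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def loss_and_grads(X, Y, W, b, lam):
    # Cross-entropy loss for one-hot labels Y (K, n) plus an L2 penalty on W.
    n = X.shape[1]
    P = softmax(W @ X + b)
    loss = -np.sum(Y * np.log(P + 1e-12)) / n + lam * np.sum(W ** 2)
    G = P - Y                                  # gradient w.r.t. the scores
    grad_W = G @ X.T / n + 2 * lam * W
    grad_b = G.sum(axis=1, keepdims=True) / n
    return loss, grad_W, grad_b

def train(X, Y, W, b, lam=0.01, eta=0.01, n_batch=100, n_epochs=40, decay=0.9):
    # Mini-batch gradient descent with simple per-epoch learning rate decay.
    n = X.shape[1]
    for _ in range(n_epochs):
        idx = np.random.permutation(n)
        for start in range(0, n, n_batch):
            batch = idx[start:start + n_batch]
            _, gW, gb = loss_and_grads(X[:, batch], Y[:, batch], W, b, lam)
            W -= eta * gW
            b -= eta * gb
        eta *= decay                           # learning rate decay
    return W, b

# Xavier initialization: weight variance scaled by the number of inputs.
d, K = 3072, 10                                # CIFAR-10: 32x32x3 inputs, 10 classes
W = np.random.randn(K, d) / np.sqrt(d)
b = np.zeros((K, 1))
```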

## double-layer neural network

- Dataset: CIFAR-10
- Optimization method: mini-batch gradient descent
- Loss function: cross-entropy
- Regularization: L2
- New features: e.g. cyclical learning rate (sketched below), ensemble learning, dropout
- Test accuracy (highest): 54.84%
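One of the listed additions, the cyclical learning rate, can be sketched as a triangular schedule that sweeps the learning rate between a minimum and a maximum over a fixed number of update steps. The parameter names (`eta_min`, `eta_max`, `step_size`) are assumptions, not the repository's own.

```python
def cyclical_lr(t, eta_min=1e-5, eta_max=1e-1, step_size=500):
    # Triangular cyclical learning rate: rises linearly from eta_min to eta_max
    # over step_size updates, then falls back to eta_min, and repeats.
    cycle_pos = t % (2 * step_size)
    if cycle_pos < step_size:
        return eta_min + (eta_max - eta_min) * cycle_pos / step_size
    return eta_max - (eta_max - eta_min) * (cycle_pos - step_size) / step_size

# Example use inside the mini-batch loop (illustrative):
# for t, batch in enumerate(batches):
#     eta = cyclical_lr(t)
#     W -= eta * grad_W
```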

## multi-layer neural network

- Dataset: CIFAR-10
- Optimization method: mini-batch gradient descent
- Loss function: cross-entropy
- Regularization: L2
- New features: e.g. batch normalization (sketched below), He initialization, data augmentation
- Test accuracy (highest): 58.66%
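A minimal sketch of a batch-normalization forward pass and He initialization for a hidden layer, again assuming a NumPy implementation; the names `batch_norm_forward` and `he_init` are illustrative.

```python
import numpy as np

def batch_norm_forward(S, gamma, beta, eps=1e-8):
    # Normalize pre-activations S (m, n) per feature over the mini-batch,
    # then apply a learned scale (gamma) and shift (beta).
    mu = S.mean(axis=1, keepdims=True)
    var = S.var(axis=1, keepdims=True)
    S_hat = (S - mu) / np.sqrt(var + eps)
    return gamma * S_hat + beta

def he_init(m_out, m_in):
    # He initialization: weight variance 2 / fan-in, suited to ReLU layers.
    return np.random.randn(m_out, m_in) * np.sqrt(2.0 / m_in)
```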

## recurrent neural network

- Dataset: text from Harry Potter and the Goblet of Fire
- Optimization method: AdaGrad (sketched below)
- Loss function: cross-entropy
- Goal: synthesize text
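A minimal sketch of the AdaGrad update used to train a character-level RNN, assuming a NumPy implementation with parameters stored in dictionaries; the names `adagrad_step`, `params`, `grads`, and `memory` are illustrative.

```python
import numpy as np

def adagrad_step(params, grads, memory, eta=0.1, eps=1e-8):
    # AdaGrad: accumulate squared gradients per parameter and shrink the
    # effective learning rate for parameters that have seen large updates.
    for key in params:
        memory[key] += grads[key] ** 2
        params[key] -= eta * grads[key] / np.sqrt(memory[key] + eps)
    return params, memory

# Example (illustrative): params = {"U": ..., "W": ..., "V": ..., "b": ..., "c": ...},
# with memory initialized to zero arrays of the same shapes as params.
```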