# Neural Networks From Scratch

🌟 Implementation of Neural Networks from Scratch Using Python & NumPy 🌟

Uses Python 3.7.4

## Activation Functions

Activation functions live inside each neuron of a network layer, transforming the data the neuron receives before it is passed to the next layer. They give neural networks their power by introducing non-linearity, allowing them to model complex relationships. The functions covered here are listed below; a short NumPy sketch follows the list.

Let $x$ denote the neuron's input.

  • Identity: $f(x) = x$

  • BinaryStep: $f(x) = 0$ if $x < 0$, otherwise $1$

  • Linear: $f(x) = ax$, where $a$ is a constant slope

  • Sigmoid: $f(x) = \dfrac{1}{1 + e^{-x}}$

  • Hyperbolic Tangent (tanh): $f(x) = \dfrac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$

  • Rectified Linear Units (ReLU): $f(x) = \max(0, x)$

  • Leaky Rectified Linear Units (LeakyReLU): $f(x) = x$ if $x > 0$, otherwise $bx$

    where $b$ is a small constant

  • Softmax: $f(x)_i = \dfrac{e^{x_i}}{\sum_j e^{x_j}}$

  • Gaussian Error Linear Units (GeLU): $f(x) = \dfrac{x}{2}\left(1 + \operatorname{erf}\!\left(\dfrac{x}{\sqrt{2}}\right)\right)$

    where $\operatorname{erf}$ is the Gaussian error function
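
The snippet below is a minimal NumPy sketch of the functions listed above, not this repo's actual implementation. The function names and the `a`/`b` default parameter values are illustrative assumptions, and `erf` is taken from SciPy since NumPy itself does not provide it.

```python
import numpy as np
from scipy.special import erf  # NumPy has no erf; SciPy is assumed here

def identity(x):
    return x

def binary_step(x):
    # 0 for negative inputs, 1 otherwise
    return np.where(x < 0, 0.0, 1.0)

def linear(x, a=1.0):
    # a is the constant slope
    return a * x

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, b=0.01):
    # b is the small constant applied to negative inputs
    return np.where(x > 0, x, b * x)

def softmax(x):
    # Subtract the max before exponentiating for numerical stability
    e = np.exp(x - np.max(x))
    return e / np.sum(e)

def gelu(x):
    # Exact GeLU via the Gaussian error function
    return 0.5 * x * (1.0 + erf(x / np.sqrt(2.0)))

if __name__ == "__main__":
    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(sigmoid(x))  # values squashed into (0, 1)
    print(softmax(x))  # non-negative values that sum to 1
```

All of these operate element-wise on an input array except softmax, which normalizes across the whole vector so its outputs can be read as probabilities.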