The purpose of this repository is to let me review and practice basic deep learning principles. In it, I implement a neural network from scratch in Python (the MNIST_Vanilla notebook). The initial implementation uses a sigmoid activation function and a Mean Squared Error (MSE) cost function. In the MNIST_Enhancements notebook, I switch to the more suitable Cross Entropy cost function and the ReLU activation function, and experiment with regularization techniques. Finally, I begin practicing with TensorFlow properly in MNIST_Tensorflow.
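For reference, here is a minimal NumPy sketch of the activations and cost functions mentioned above. This is an illustrative sketch rather than the exact code in the notebooks; the function names are my own.

```python
import numpy as np

def sigmoid(z):
    # Squashes pre-activations into (0, 1); used in the vanilla network.
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Rectified linear unit explored in MNIST_Enhancements.
    return np.maximum(0.0, z)

def mse_cost(a, y):
    # Mean Squared Error over a batch of output activations `a`
    # and one-hot targets `y`.
    return 0.5 * np.mean(np.sum((a - y) ** 2, axis=1))

def cross_entropy_cost(a, y):
    # Cross entropy cost. Its gradient with respect to the output
    # pre-activation drops the sigmoid-prime factor, avoiding the
    # learning slowdown MSE suffers when output neurons saturate.
    eps = 1e-12  # numerical safety when a is exactly 0 or 1
    return -np.mean(
        np.sum(y * np.log(a + eps) + (1 - y) * np.log(1 - a + eps), axis=1)
    )
```

A hypothetical Keras model along the lines of what MNIST_Tensorflow works toward (the notebook's actual architecture and hyperparameters may differ) looks like this:

```python
import tensorflow as tf

# Minimal fully connected classifier for 28x28 MNIST images.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```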