This project implements an automatic differentiator in `autograd.hs`, which can differentiate any function built from the basic mathematical operations (*, +, /, sin, exp, ...).
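For a sense of how this works, here is a minimal sketch of forward-mode AD via dual numbers: overload Haskell's numeric type classes so every operation propagates a derivative alongside its value. The `Dual` type and the `diff` helper are illustrative names, not necessarily what `autograd.hs` uses (it may, for instance, be reverse-mode instead).

```haskell
-- Sketch only: a dual number carries (value, derivative).
data Dual = Dual Double Double

instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)  -- product rule
  negate (Dual x dx)    = Dual (negate x) (negate dx)
  abs    (Dual x dx)    = Dual (abs x) (dx * signum x)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0

instance Fractional Dual where
  Dual x dx / Dual y dy = Dual (x / y) ((dx * y - x * dy) / (y * y))  -- quotient rule
  fromRational r        = Dual (fromRational r) 0

instance Floating Dual where
  pi                = Dual pi 0
  exp  (Dual x dx)  = Dual (exp x)  (dx * exp x)
  log  (Dual x dx)  = Dual (log x)  (dx / x)
  sin  (Dual x dx)  = Dual (sin x)  (dx * cos x)
  cos  (Dual x dx)  = Dual (cos x)  (negate (dx * sin x))
  asin (Dual x dx)  = Dual (asin x) (dx / sqrt (1 - x * x))
  acos (Dual x dx)  = Dual (acos x) (negate (dx / sqrt (1 - x * x)))
  atan (Dual x dx)  = Dual (atan x) (dx / (1 + x * x))
  sinh (Dual x dx)  = Dual (sinh x) (dx * cosh x)
  cosh (Dual x dx)  = Dual (cosh x) (dx * sinh x)
  asinh (Dual x dx) = Dual (asinh x) (dx / sqrt (x * x + 1))
  acosh (Dual x dx) = Dual (acosh x) (dx / sqrt (x * x - 1))
  atanh (Dual x dx) = Dual (atanh x) (dx / (1 - x * x))

-- Differentiate f at x by seeding the derivative slot with 1.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = let Dual _ dx = f (Dual x 1) in dx
```

With this setup, `diff (\x -> sin x * exp x) 0` evaluates to `1.0`, matching cos x * e^x + sin x * e^x at x = 0.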
It also implements Newton's method and stochastic gradient descent (SGD) in `optimizer.hs` to find the minimum of a function.
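As a rough sketch of what such optimizers look like in Haskell, here are 1-D versions of both updates, assuming the caller supplies the derivatives (for example, computed with the autograd above). The function names and signatures are illustrative; `optimizer.hs` may structure this differently.

```haskell
-- Newton's method for 1-D minimization: x <- x - f'(x) / f''(x).
newton :: (Double -> Double) -> (Double -> Double) -> Double -> Int -> Double
newton f' f'' x0 steps = iterate step x0 !! steps
  where step x = x - f' x / f'' x

-- Plain gradient descent with a fixed learning rate eta; SGD is the same
-- update applied to gradients estimated from random mini-batches of data.
gradientDescent :: (Double -> Double) -> Double -> Double -> Int -> Double
gradientDescent f' eta x0 steps = iterate step x0 !! steps
  where step x = x - eta * f' x
```

For f(x) = (x - 3)^2, `newton (\x -> 2 * (x - 3)) (const 2) 0 1` jumps straight to the minimum at 3 in one step (the function is quadratic), while `gradientDescent (\x -> 2 * (x - 3)) 0.1 0 50` converges there gradually.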
The goal is to eventually implement a full neural network in Haskell.