Different gradient descent algorithms written from scratch in Python.
Gradient Descent is a Streamlit project that builds and trains a logistic regression model on the MNIST dataset in pure Python (no PyTorch). The model is trained on the training set with stochastic gradient descent and should reach between 90% and 93% accuracy on the test set.
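As a rough sketch of the core idea, here is a minimal from-scratch multinomial logistic regression trained with per-sample SGD. It uses a small synthetic dataset as a stand-in for MNIST (the data and hyperparameters here are illustrative assumptions, not the project's actual configuration):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def sgd_logistic_regression(X, y, n_classes, lr=0.1, epochs=5, seed=0):
    """Multinomial logistic regression trained with plain SGD (one sample per step)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = np.zeros((d, n_classes))
    b = np.zeros(n_classes)
    for _ in range(epochs):
        for i in rng.permutation(n):          # shuffle each epoch
            p = softmax(X[i:i + 1] @ W + b)[0]  # predicted class probabilities
            p[y[i]] -= 1.0                      # cross-entropy gradient w.r.t. logits
            W -= lr * np.outer(X[i], p)
            b -= lr * p
    return W, b

# Tiny synthetic stand-in for MNIST (real data loading is project-specific)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X @ rng.normal(size=(10, 3))).argmax(axis=1)  # linearly recoverable labels
W, b = sgd_logistic_regression(X, y, n_classes=3)
acc = ((X @ W + b).argmax(axis=1) == y).mean()
```

The single-sample update is what distinguishes stochastic gradient descent from batch gradient descent: each step uses the gradient of one example's loss rather than the full training set, which is what makes training tractable on a dataset the size of MNIST.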
To get a local copy up and running, follow these steps.
- Clone the repo
git clone https://github.com/tuhinmallick/gradient_descent.git
- Go to the project directory
cd gradient_descent
- Install the environment (Conda) or the dependencies (pip)
conda env create -n ENVNAME --file docs/environment.yml
or
pip install -r requirements.txt
- Run the setup script
setup.sh
- Run the streamlit app
streamlit run app/dashboard.py
For more examples, please refer to the Notebook
- Granger causality test
- Augmented Dickey-Fuller (ADF) test
See the open issues for a full list of proposed features (and known issues).
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (git checkout -b feature/AmazingFeature)
- Commit your Changes (git commit -m 'Add some AmazingFeature')
- Push to the Branch (git push origin feature/AmazingFeature)
- Open a Pull Request
This project is licensed under the terms of the MIT
license. See LICENSE for more details.
Your Name - @tuhinmallick - [email protected]
Project Link: https://github.com/tuhinmallick/gradient_descent