A Bishõ autograd engine in Python, along with a lightweight deep neural-network library! (inspired by micrograd by Andrej Karpathy)

微小 - Bishõ is the Japanese word for 'tiny', since my implementation is very tinyyyyy ^_^ compared to PyTorch/TensorFlow.

箱 - Hako is the Japanese word for 'box'; here Hako signifies the neurons in our network ;D
- Install via pip:

```shell
pip install bishograd==0.1.0
```
Wonder how this works? Check out the examples.
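At its core, an autograd engine like this wraps scalar values in objects that record the operations producing them, then backpropagates gradients through the resulting graph. Here is a minimal illustrative sketch of that idea (not Bishograd's actual code; the `Value` name and API are assumptions borrowed from micrograd's style):

```python
class Value:
    """A scalar that records the ops producing it, so gradients
    can be computed by reverse-mode automatic differentiation."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(-3.0)
c = a * b + b          # c = -9.0
c.backward()
print(a.grad, b.grad)  # -3.0 3.0  (dc/da = b, dc/db = a + 1)
```

The key design choice (shared with micrograd) is that each operation closes over its inputs and stores a tiny `_backward` function, so the whole chain rule falls out of one reverse pass over the graph.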
- ReLU activation function
- Add `MLP.training()` to automate the whole training loop
- Add Sigmoid, LeakyReLU & other activation functions
- Add loss functions: categorical cross-entropy, mean-squared error, etc.
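For contributors eyeing those targets, here is a plain-Python sketch of what the listed activations and a mean-squared-error loss compute (illustrative only, not Bishograd's implementation; function names and the `alpha` parameter are assumptions):

```python
import math

def relu(x):
    # max(0, x): passes positives through, zeroes out negatives
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # like ReLU, but lets a small slope through for negative inputs
    return x if x > 0 else alpha * x

def sigmoid(x):
    # squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def mse_loss(preds, targets):
    # mean of squared differences between predictions and targets
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

print(relu(-2.0))                 # 0.0
print(leaky_relu(-2.0))           # -0.02
print(sigmoid(0.0))               # 0.5
print(mse_loss([1.0, 2.0], [0.0, 2.0]))  # 0.5
```

In the engine itself, each of these would be an operation on the autograd value type so its gradient is tracked, rather than a bare float function like here.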
This project is open for contribution!
- Clone this repository:

```shell
git clone https://github.com/AK3847/Bishograd.git
```
The primary way to contribute is to raise an issue or open a pull request with a proper description and consistent code formatting. You can pick up any of the Targets above or suggest new features as well.
Drop a star if this project helped you in any way! ( ノ ゚ー゚)ノ