Neural-Network-backpropagation

Implementation of a neural network from scratch, using the Sigmoid, tanh, and ReLU activation functions.

Coded a neural network (NN) with two hidden layers in addition to the input and output layers. Implemented the Sigmoid, tanh, and ReLU activation functions, and the backpropagation algorithm for training the network.
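A minimal NumPy sketch of this setup, assuming a fully connected architecture with a sigmoid output layer and mean-squared-error loss; the class name `NeuralNetwork`, its layer-size list, and all method names are illustrative, not the repository's actual code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Derivatives below are expressed in terms of the activation output a,
# which is what the backward pass has on hand.
def sigmoid_grad(a):
    return a * (1.0 - a)

def tanh_grad(a):
    return 1.0 - a ** 2

def relu(z):
    return np.maximum(0.0, z)

def relu_grad(a):
    return (a > 0).astype(a.dtype)

ACTIVATIONS = {
    "sigmoid": (sigmoid, sigmoid_grad),
    "tanh": (np.tanh, tanh_grad),
    "relu": (relu, relu_grad),
}

class NeuralNetwork:
    def __init__(self, sizes, activation="sigmoid", lr=0.1, seed=0):
        # sizes = [n_in, n_hidden1, n_hidden2, n_out]
        rng = np.random.default_rng(seed)
        self.W = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
        self.b = [np.zeros(n) for n in sizes[1:]]
        self.act, self.act_grad = ACTIVATIONS[activation]
        self.lr = lr

    def forward(self, X):
        # Hidden layers use the chosen activation; the output layer uses
        # sigmoid so targets can be one-hot vectors in [0, 1].
        a = [X]
        for W, b in zip(self.W[:-1], self.b[:-1]):
            a.append(self.act(a[-1] @ W + b))
        a.append(sigmoid(a[-1] @ self.W[-1] + self.b[-1]))
        return a

    def backward(self, a, y):
        # Squared-error loss: output delta = (a_out - y) * sigma'(a_out).
        delta = (a[-1] - y) * sigmoid_grad(a[-1])
        for i in reversed(range(len(self.W))):
            grad_W = a[i].T @ delta / len(y)
            grad_b = delta.mean(axis=0)
            if i > 0:
                # Propagate delta with the pre-update weights.
                delta = (delta @ self.W[i].T) * self.act_grad(a[i])
            self.W[i] -= self.lr * grad_W
            self.b[i] -= self.lr * grad_b

    def fit(self, X, y, epochs=1000):
        for _ in range(epochs):
            self.backward(self.forward(X), y)

    def predict(self, X):
        return self.forward(X)[-1]
```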

This network was then used to make predictions on the three datasets below (a usage sketch follows the list):

  1. Car Evaluation Dataset: https://archive.ics.uci.edu/ml/datasets/Car+Evaluation
  2. Iris Dataset: https://archive.ics.uci.edu/ml/datasets/Iris
  3. Adult Census Income Dataset: https://archive.ics.uci.edu/ml/datasets/Census+Income
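As a hypothetical usage example, the sketch above could be trained on the Iris dataset (item 2). scikit-learn's bundled copy of Iris is used here only to keep the example self-contained; the repository presumably reads the raw UCI files. The layer sizes, learning rate, and epoch count are illustrative choices, not values from the repository.

```python
import numpy as np
from sklearn.datasets import load_iris

iris = load_iris()
X = iris.data
# Min-max scale features so sigmoid/tanh units do not saturate.
X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
# One-hot encode the three class labels for the output layer.
y = np.eye(3)[iris.target]

# 4 input features, two hidden layers of 8 units, 3 output classes.
net = NeuralNetwork(sizes=[4, 8, 8, 3], activation="tanh", lr=0.5)
net.fit(X, y, epochs=2000)
preds = net.predict(X).argmax(axis=1)
print("training accuracy:", (preds == iris.target).mean())
```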
