Sigmoid Neuron

Parveen Khurana
13 min read · Jan 2, 2020

This article covers the content discussed in the Sigmoid Neuron module of the Deep Learning course and all the images are taken from the same module.

In this article, we discuss the six jars of Machine Learning with respect to the Sigmoid model, but before that, let's look at a drawback of the Perceptron model.

Sigmoid Model and a drawback of the Perceptron Model:

The limitation of the perceptron model is that it uses a harsh thresholding function (boundary) to separate the classes on the two sides, as depicted below.
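To make the harshness of this boundary concrete, here is a minimal sketch of a perceptron's decision rule (the function name, NumPy usage, and the example weight and bias values are illustrative assumptions, not from the module):

```python
import numpy as np

def perceptron(x, w, b):
    # Hard threshold: the output flips abruptly from 0 to 1 at w.x + b = 0
    return 1 if np.dot(w, x) + b >= 0 else 0

# Two inputs sitting just on either side of the boundary get completely different labels
w, b = np.array([1.0]), -0.5
print(perceptron(np.array([0.49]), w, b))  # 0
print(perceptron(np.array([0.51]), w, b))  # 1
```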

We would instead like a smoother transition curve, which is closer to the way humans make decisions: the output does not change drastically at a single point but changes gradually over a range of values. So we would like something like the S-shaped function (shown in red in the image below).

Deep Learning gives us the Sigmoid family of functions, many of which are S-shaped. One such function is the logistic function, a smooth, continuous function defined by the equation below, where w is the weight, x is the input, and b is the bias:

y = 1 / (1 + e^(-(wx + b)))
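The sketch below shows this logistic function in code and how its output changes gradually near the boundary rather than jumping, in contrast to the perceptron above (again, the function name, NumPy usage, and the sample weight, bias, and input values are assumptions made for illustration):

```python
import numpy as np

def sigmoid(x, w, b):
    # Logistic function: 1 / (1 + e^-(w.x + b)); varies smoothly between 0 and 1
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

# Near the boundary (w.x + b = 0 at x = 0.5) the output changes gradually
w, b = np.array([1.0]), -0.5
for x in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(x, round(float(sigmoid(np.array([x]), w, b)), 3))
```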
