Perceptrons: The Basics of Neural Networks

SHUBHAM YADAV
6 min read · Jan 21, 2021

Just as atoms are the building blocks of matter and microprocessors are the building blocks of computers, perceptrons are the building blocks of neural networks.

The word Perceptron combines two words:

  • Perception (noun): the ability to sense something
  • Neuron (noun): a nerve cell in the human brain that turns sensory input into meaningful information

Now we can see that a perceptron is an artificial neuron that simulates the behavior of a biological neuron to solve a problem.

Let’s understand the concept of the perceptron with the example of a self-driving car. If the car is moving and an obstacle appears on its left side, the car has to steer to the right, and vice versa.

In this example, the obstacles are the inputs and the decision to turn left or right is the output.

A perceptron also corrects itself based on its results, so it makes better predictions in the future.

Representation of Perceptron

A perceptron consists of three main components:

  • Inputs: Each input corresponds to a feature. For example, in the case of a person, features could be age, height, weight etc.
  • Weights: Each input also has a weight, which assigns a certain amount of importance to that input. If an input’s weight is large, the input plays a bigger role in determining the output.
  • Output: Finally, the perceptron uses the inputs and weights to produce an output. The type of the output varies depending on the nature of the problem. For example, to predict whether or not it’s going to rain, the output has to be binary — 1 for Yes and 0 for No. However, to predict the temperature for the next day, the range of the output has to be larger — say a number from -5 to 50.
class Perceptron:
    def __init__(self, num_inputs=2, weights=[1, 1]):
        self.num_inputs = num_inputs
        self.weights = weights

cool_perceptron = Perceptron()

In the code above we created a Perceptron class with two arguments, num_inputs and weights. num_inputs specifies the number of inputs and weights specifies the weight of each input.

Step 1: Weighted Sum

Now that we have the inputs and the weight of each input, how is the output calculated from these two things? Calculating the output takes two steps. The first is to calculate the weighted sum of the inputs.

weighted sum = x₁w₁ + x₂w₂ + … + xₙwₙ

Here, the x’s are the inputs and the w’s are the weights of each input.

To calculate the weighted sum:

  1. Start with a weighted sum of 0. Let’s call it weighted_sum.
  2. Start with the first input and multiply it by its corresponding weight. Add this result to weighted_sum.
  3. Go to the next input and multiply it by its corresponding weight. Add this result to weighted_sum.
  4. Repeat this process for all inputs.
class Perceptron:
    def __init__(self, num_inputs=2, weights=[2, 1]):
        self.num_inputs = num_inputs
        self.weights = weights

    def weighted_sum(self, inputs):
        # create variable to store weighted sum
        weighted_sum = 0
        for i in range(self.num_inputs):
            weighted_sum += inputs[i] * self.weights[i]
        return weighted_sum

cool_perceptron = Perceptron()
print(cool_perceptron.weighted_sum([24, 55]))
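
As an aside (this is an addition, not code from the article), the same loop can be written more compactly with Python’s built-in zip and sum:

```python
# Equivalent, more compact weighted sum using zip and sum
def weighted_sum(inputs, weights):
    # pair each input with its weight and accumulate the products
    return sum(x * w for x, w in zip(inputs, weights))

print(weighted_sum([24, 55], [2, 1]))  # 2*24 + 1*55 = 103
```

This version also stops automatically at the shorter of the two lists, so it never indexes past the available inputs.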

Step 2: Activation Function

After calculating the weighted sum, the second step is to constrain the weighted sum to produce a desired output.

Imagine the perceptron has inputs in the range 100–1000, but our goal is to predict whether or not something will occur, i.e. 1 for ‘Yes’ or 0 for ‘No’. The raw weighted sum would be far too large.

This is where activation functions come in. These are special functions that constrain the weighted sum to the desired output.

For example, suppose you want to train a perceptron to detect whether a point is above or below a line, with an output label of +1 or -1. For this we use the sign activation function to help the perceptron make the decision:

  • If the weighted sum is positive (or zero), return +1
  • If the weighted sum is negative, return -1
class Perceptron:
    def __init__(self, num_inputs=2, weights=[1, 1]):
        self.num_inputs = num_inputs
        self.weights = weights

    def weighted_sum(self, inputs):
        weighted_sum = 0
        for i in range(self.num_inputs):
            weighted_sum += self.weights[i] * inputs[i]
        return weighted_sum

    def activation(self, weighted_sum):
        if weighted_sum >= 0:
            return 1
        if weighted_sum < 0:
            return -1

cool_perceptron = Perceptron()
print(cool_perceptron.weighted_sum([24, 55]))
print(cool_perceptron.activation(52))

Our perceptron can now make a prediction from the inputs, but how do we know whether the prediction is right?

Our perceptron currently has arbitrary weights, so its predictions are very bad. We haven’t taught it anything yet, so we can’t expect correct classifications.

To produce good predictions, we have to train our perceptron on a training set.

In this example we will create our own training set using random numbers.

import random

def generate_training_set(num_points):
    x_coordinates = [random.randint(0, 50) for i in range(num_points)]
    y_coordinates = [random.randint(0, 50) for i in range(num_points)]
    training_set = dict()
    for x, y in zip(x_coordinates, y_coordinates):
        if x <= 45 - y:
            training_set[(x, y)] = 1
        elif x > 45 - y:
            training_set[(x, y)] = -1
    return training_set

training_set = generate_training_set(5)
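
To see what the generated labels look like, we can seed the random module so the sample is reproducible (the seed is an addition for illustration, not part of the article’s code) and print the points with their labels:

```python
import random

def generate_training_set(num_points):
    # random points in a 50x50 grid, labeled by the line x + y = 45
    x_coordinates = [random.randint(0, 50) for i in range(num_points)]
    y_coordinates = [random.randint(0, 50) for i in range(num_points)]
    training_set = dict()
    for x, y in zip(x_coordinates, y_coordinates):
        if x <= 45 - y:
            training_set[(x, y)] = 1   # on or below the line
        elif x > 45 - y:
            training_set[(x, y)] = -1  # above the line
    return training_set

random.seed(42)  # seeded only so the sample is reproducible
sample = generate_training_set(5)
for point, label in sample.items():
    print(point, label)
```

Each key is a point (x, y) and each value is its label: +1 when x + y ≤ 45, otherwise -1.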

Now that we have a training set, we can use it to train the perceptron. For training, we loop over all the points in the training set, make a prediction for each input, and compare it with its actual label.

class Perceptron:
    def __init__(self, num_inputs=2, weights=[1, 1]):
        self.num_inputs = num_inputs
        self.weights = weights

    def weighted_sum(self, inputs):
        weighted_sum = 0
        for i in range(self.num_inputs):
            weighted_sum += self.weights[i] * inputs[i]
        return weighted_sum

    def activation(self, weighted_sum):
        if weighted_sum >= 0:
            return 1
        if weighted_sum < 0:
            return -1

    def training(self, training_set):
        for inputs in training_set:
            prediction = self.activation(self.weighted_sum(inputs))
            actual = training_set[inputs]
            # the error is computed here but not used yet; the next
            # step uses it to update the weights
            error = actual - prediction

cool_perceptron = Perceptron()
print(cool_perceptron.weighted_sum([24, 55]))
print(cool_perceptron.activation(52))

Now we have the errors on the training set, so what do we do with them? We slowly nudge the perceptron toward a better version of itself that eventually has zero error.

The only way to do that is to change the parameters that define the perceptron. We can’t change the inputs, so the only things that can be tweaked are the weights. As we change the weights, the outputs change too.

The goal is to find the optimal combination of weights that produces the correct output.

To tweak the weights we will use the Perceptron Algorithm. We aren’t going to cover the math behind the algorithm here; we are simply going to implement it to optimally tweak the weights and nudge the perceptron toward zero error.

The important part of the algorithm is the weight update rule:

weight = weight + (error * input)
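
To see the update rule in action, here is a single hand-worked step with illustrative numbers (not from the article): with weights [1, 1], the point (3, 0) has weighted sum 3, so the sign activation predicts +1; if the actual label is -1, the error is -2, and each weight moves by error * input:

```python
# One manual application of: weight = weight + (error * input)
weights = [1, 1]
inputs = (3, 0)
actual = -1

weighted = sum(w * x for w, x in zip(weights, inputs))  # 1*3 + 1*0 = 3
prediction = 1 if weighted >= 0 else -1                 # sign activation -> +1
error = actual - prediction                             # -1 - 1 = -2

new_weights = [w + error * x for w, x in zip(weights, inputs)]
print(new_weights)  # [1 + (-2)*3, 1 + (-2)*0] = [-5, 1]
```

Notice that the second weight doesn’t move because its input is 0: the update only shifts weights in proportion to the inputs that contributed to the mistake.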

We keep tweaking the weights until we find the optimal weights for better predictions.

class Perceptron:
    def __init__(self, num_inputs=2, weights=[1, 1]):
        self.num_inputs = num_inputs
        self.weights = weights

    def weighted_sum(self, inputs):
        weighted_sum = 0
        for i in range(self.num_inputs):
            weighted_sum += self.weights[i] * inputs[i]
        return weighted_sum

    def activation(self, weighted_sum):
        if weighted_sum >= 0:
            return 1
        if weighted_sum < 0:
            return -1

    def training(self, training_set):
        foundLine = False
        while not foundLine:
            total_error = 0
            for inputs in training_set:
                prediction = self.activation(self.weighted_sum(inputs))
                actual = training_set[inputs]
                error = actual - prediction
                total_error += abs(error)
                for i in range(self.num_inputs):
                    self.weights[i] += error * inputs[i]
            if total_error == 0:
                foundLine = True

cool_perceptron = Perceptron()
small_training_set = {(0, 3): 1, (3, 0): -1, (0, -3): -1, (-3, 0): 1}
cool_perceptron.training(small_training_set)
print(cool_perceptron.weights)
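
As a sanity check, here is a standalone sketch of the same training loop written with plain functions instead of a class (an addition for illustration); once training terminates, every point in the small training set is classified correctly:

```python
# Standalone re-statement of the perceptron training loop for verification
def weighted_sum(weights, inputs):
    return sum(w * x for w, x in zip(weights, inputs))

def activation(value):
    # sign activation: +1 for zero or positive, -1 for negative
    return 1 if value >= 0 else -1

def train(weights, training_set):
    # loop until every training point is classified correctly
    found_line = False
    while not found_line:
        total_error = 0
        for inputs, actual in training_set.items():
            prediction = activation(weighted_sum(weights, inputs))
            error = actual - prediction
            total_error += abs(error)
            weights = [w + error * x for w, x in zip(weights, inputs)]
        if total_error == 0:
            found_line = True
    return weights

training_set = {(0, 3): 1, (3, 0): -1, (0, -3): -1, (-3, 0): 1}
weights = train([1, 1], training_set)

# every point should now be classified correctly (all print True)
for inputs, actual in training_set.items():
    print(inputs, activation(weighted_sum(weights, inputs)) == actual)
```

The loop only terminates because this training set is linearly separable through the origin; on non-separable data the while loop would never end, which is a known limitation of the basic perceptron algorithm.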

Now we know how a perceptron can be trained to produce correct outputs by tweaking its weights.

However, there are many times when our perceptron needs a minor adjustment. For this we take the help of a bias weight, which plays a supporting role for our perceptron. Its default input value is 1.

So our new equation for the weighted sum is:

weighted sum = x₁w₁ + x₂w₂ + … + xₙwₙ + 1·wb

And now our code will be updated to this form:

class Perceptron:
    def __init__(self, num_inputs=3, weights=[1, 1, 1]):
        self.num_inputs = num_inputs
        self.weights = weights

    def weighted_sum(self, inputs):
        weighted_sum = 0
        for i in range(self.num_inputs):
            weighted_sum += self.weights[i] * inputs[i]
        return weighted_sum

    def activation(self, weighted_sum):
        if weighted_sum >= 0:
            return 1
        if weighted_sum < 0:
            return -1

    def training(self, training_set):
        foundLine = False
        while not foundLine:
            total_error = 0
            for inputs in training_set:
                prediction = self.activation(self.weighted_sum(inputs))
                actual = training_set[inputs]
                error = actual - prediction
                total_error += abs(error)
                for i in range(self.num_inputs):
                    self.weights[i] += error * inputs[i]
            if total_error == 0:
                foundLine = True

cool_perceptron = Perceptron()
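
The class now expects three inputs, so each point would be passed with the bias’s constant input 1 appended. The article doesn’t show this call, so the following usage is an assumption about the calling convention:

```python
# Hypothetical usage: the third input is the bias's constant 1
inputs_with_bias = [24, 55, 1]
weights = [1, 1, 1]

# the bias term contributes 1 * wb on top of the inputs' weighted sum
result = sum(w * x for w, x in zip(weights, inputs_with_bias))
print(result)  # 24 + 55 + 1 = 80
```

The bias lets the decision boundary shift away from the origin, which the two-input perceptron above could not do.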
