Implementation of Perceptron Learning Algorithm In Python: Code, Concept, Advantages, Disadvantages, and Applications



What is Perceptron Learning?
  1. A perceptron is an artificial neuron, the basic unit of a neural network.
  2. It was introduced by Frank Rosenblatt in 1957.
  3. Rosenblatt proposed the perceptron learning rule based on the original MCP (McCulloch-Pitts) neuron.
  4. It performs certain computations to detect features or business intelligence in the input data.
  5. The algorithm is used for supervised learning of binary classifiers.
The two types of perceptron are listed below (a minimal sketch of the single-layer update rule follows this list):
1) Single-layer
2) Multilayer
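
For intuition, the single-layer learning rule adjusts the weight vector only when the predicted sign disagrees with the desired output. The short sketch below shows one such update step; the function name perceptron_step and the sample values are illustrative assumptions, not part of any library:

def sgn(net):
    # bipolar step activation: +1 for positive net input, -1 otherwise
    return 1 if net > 0 else -1

def perceptron_step(w, x, d, c=0.1):
    # One perceptron update: w <- w + c * (d - sgn(w . x)) * x
    net = sum(wi * xi for wi, xi in zip(w, x))
    if sgn(net) != d:  # misclassified sample, so correct the weights
        w = [wi + c * (d - sgn(net)) * xi for wi, xi in zip(w, x)]
    return w

# Example: one step on a single 2-input sample with desired output +1
print(perceptron_step([0.0, 0.0], [1.0, 0.5], 1))  # -> [0.2, 0.1]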

Code:

def sgn(x):
    # Bipolar step activation: +1 for positive net input, -1 otherwise
    return 1 if x > 0 else -1


def add(a, b):
    # Element-wise addition of two vectors
    return [round(ai + bi, 2) for ai, bi in zip(a, b)]


def var_mul(x, a):
    # Multiply every component of vector a by the scalar x
    return [round(x * ai, 2) for ai in a]


def mult(a, b):
    # Dot product of two vectors
    return sum(ai * bi for ai, bi in zip(a, b))


if __name__ == '__main__':
    c = 0.1  # learning rate

    n = int(input('Enter no of inputs: '))

    x, d = [], []
    for i in range(n):
        xi = list(map(float, input(f'Enter x{i}: ').strip().split()))
        di = int(input('Enter desired output: '))
        x.append(xi)
        d.append(di)

    w = list(map(float, input('Enter initial weights: ').split()))

    for xi, di in zip(x, d):
        net = mult(xi, w)          # net input = w . x
        if sgn(net) != sgn(di):
            # Misclassified sample: apply w <- w + c * (d - sgn(net)) * x
            print("Correction Needed...")
            w = add(w, var_mul(c * (di - sgn(net)), xi))
        else:
            print("Correction Not Needed..")
        print("X: ", xi)
        print(f"W: {w}\n")
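
As a hypothetical usage example (the AND-gate samples, bipolar targets, and zero starting weights below are illustrative, and the helper functions sgn, mult, add, and var_mul from the code above are assumed to be in scope), the same training pass can be driven without interactive input:

c = 0.1
x = [[1, -1, -1], [1, -1, 1], [1, 1, -1], [1, 1, 1]]  # first component feeds the bias weight
d = [-1, -1, -1, 1]                                   # AND gate with bipolar targets
w = [0.0, 0.0, 0.0]

for xi, di in zip(x, d):
    net = mult(xi, w)
    if sgn(net) != sgn(di):
        w = add(w, var_mul(c * (di - sgn(net)), xi))

print(w)  # weights after one pass; in practice, repeat passes until no correction is needed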

Advantages:
  1. It is simple to implement.
  2. It is computationally efficient.
  3. It can learn any linearly separable problem.
Disadvantages:
  1. It can only learn linearly separable problems (see the sketch after this list).
  2. A single-layer perceptron is limited in the features it can represent and cannot capture non-linear relationships.
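A minimal sketch of that limitation, using a hard-coded XOR truth table with bipolar values (the data, epoch limit, and learning rate are illustrative assumptions):

def sgn(x):
    return 1 if x > 0 else -1

# XOR with bipolar inputs and targets is not linearly separable,
# so the error count never reaches zero no matter how many epochs run.
data = [([1, -1, -1], -1), ([1, -1, 1], 1), ([1, 1, -1], 1), ([1, 1, 1], -1)]
w, c = [0.0, 0.0, 0.0], 0.1

for epoch in range(100):
    errors = 0
    for xi, di in data:
        net = sum(wi * xij for wi, xij in zip(w, xi))
        if sgn(net) != di:
            errors += 1
            w = [wi + c * (di - sgn(net)) * xij for wi, xij in zip(w, xi)]
    if errors == 0:
        break

print(errors)  # stays above 0: a single perceptron cannot separate XOR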
Applications:
  1. Image Recognition.
  2. Speech Recognition.
  3. Cyber Security.
  4. Machine Translation.



