
Perceptron Algorithm

Step through the perceptron learning algorithm as it iteratively discovers a linear decision boundary that separates two classes. Based on my SC2300 coursework implementation.

[Interactive plot: data points in the (x₁, x₂) plane, colored by class (+1 / −1), with the current decision boundary]
Step: 0 / 59

Green ring = correct prediction, Red ring = misclassified (weight update)

Training Stats

Epoch: 1
Sample: 1 / 30
Updates: 1
Accuracy: 60.0%

Weights (θ)

θ₁ (x₁): 0.8299
θ₂ (x₂): 0.3974
θ₃ (bias): 1.0000

Current Step

Label (y): +1
Prediction: Wrong
Action: θ += xy


The Algorithm

The perceptron is one of the simplest binary classifiers. It learns a linear decision boundary by iterating through the training data and updating weights whenever it makes a misclassification.

# Perceptron update rule
for each sample (x, y):
    if y * dot(x, θ) <= 0:   # misclassified (or on the boundary)
        θ += x * y           # update weights
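The pseudocode above can be fleshed out into a runnable NumPy sketch. The function name `perceptron_train` and the toy dataset are illustrative, not taken from the demo:

```python
import numpy as np

def perceptron_train(X, y, max_epochs=100):
    """Train a perceptron on features X (n, d) with labels y in {-1, +1}.

    Appends a constant-1 column so the bias is learned as the last weight.
    Returns the weight vector theta (d + 1,) and the total update count.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # absorb bias into theta
    theta = np.zeros(Xb.shape[1])
    updates = 0
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            if yi * xi.dot(theta) <= 0:   # misclassified (or on boundary)
                theta += yi * xi          # update rule: theta += x * y
                mistakes += 1
        updates += mistakes
        if mistakes == 0:                 # one clean pass means convergence
            break
    return theta, updates

# Tiny linearly separable example
X = np.array([[2.0, 1.0], [1.0, 3.0], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
theta, updates = perceptron_train(X, y)
```

After convergence, every training point satisfies y·(x·θ) > 0, which is exactly the stopping condition of the loop.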

Key Concepts

Convergence Guarantee

If the data is linearly separable, the perceptron is guaranteed to converge to a separating hyperplane in a finite number of updates. The number of updates is bounded by (R/γ)² where R is the data radius and γ is the margin.
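To make the bound concrete, here is a small sketch that computes R, γ, and (R/γ)² for a toy dataset. The separating direction `w_star` is supplied by hand for illustration, and the classical bound as stated applies to a perceptron through the origin (no bias):

```python
import numpy as np

# Toy linearly separable data (illustrative, not from the demo)
X = np.array([[2.0, 1.0], [1.0, 3.0], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])

R = np.max(np.linalg.norm(X, axis=1))        # data radius: max ||x||
w_star = np.array([1.0, 1.0])                # a known separating direction
gamma = np.min(y * X.dot(w_star)) / np.linalg.norm(w_star)  # margin
bound = (R / gamma) ** 2                     # max number of updates
```

For this data R = √10 and γ = 3/√2, so the perceptron makes at most ⌊20/9⌋ = 2 updates regardless of the order in which samples arrive.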

Averaged Perceptron

The averaged variant keeps a running average of the weight vector across every training step and returns that average instead of the final weights. This typically generalizes better to unseen data by reducing variance in the learned boundary.
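A minimal sketch of the averaged variant, accumulating a running sum of the weights instead of storing every intermediate copy (function name and toy data are illustrative):

```python
import numpy as np

def averaged_perceptron(X, y, max_epochs=100):
    """Averaged perceptron: returns the mean of the weight vector taken
    after every sample visit (a running sum avoids storing each copy)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # absorb bias
    theta = np.zeros(Xb.shape[1])
    theta_sum = np.zeros_like(theta)
    count = 0
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            if yi * xi.dot(theta) <= 0:
                theta += yi * xi
                mistakes += 1
            theta_sum += theta   # accumulate after each visit, not just updates
            count += 1
        if mistakes == 0:
            break
    return theta_sum / count

X = np.array([[2.0, 1.0], [1.0, 3.0], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w_avg = averaged_perceptron(X, y)
```

Weight vectors that survived many steps without causing an error dominate the average, which is why the averaged boundary tends to be more stable than the last-mistake boundary.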

Bias Term

By appending a 1 to each feature vector, the bias is absorbed into the weight vector. This allows the decision boundary to be offset from the origin: θ₁x₁ + θ₂x₂ + θ₃ = 0.
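A quick check that the two formulations agree, using the demo's displayed weights (0.8299, 0.3974, bias 1.0) as example values; the input point is arbitrary:

```python
import numpy as np

x = np.array([0.5, -0.25])        # an arbitrary input point
w = np.array([0.8299, 0.3974])    # theta_1, theta_2 from the demo panel
b = 1.0                            # theta_3 (bias) from the demo panel

x_aug = np.append(x, 1.0)          # [x1, x2, 1]
theta = np.append(w, b)            # [w1, w2, b]

# w·x + b and theta·x_aug are the same score.
assert np.isclose(w.dot(x) + b, theta.dot(x_aug))
```

The trick matters because the update rule θ += xy then adjusts the bias for free: the appended 1 in x means θ₃ moves by ±1 on every mistake.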

Online Learning

The perceptron processes one sample at a time and updates immediately when it makes an error. This makes it an online learning algorithm, suitable for streaming data or when the full dataset doesn't fit in memory.
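The same update rule can be driven from a stream: a sketch that consumes (x, y) pairs from a generator and never materializes the dataset (names and sample data are illustrative):

```python
import numpy as np

def online_perceptron(stream, dim):
    """Consume (x, y) pairs one at a time; never stores the dataset.
    `stream` is any iterable, e.g. a generator reading from disk."""
    theta = np.zeros(dim + 1)
    for x, y in stream:
        xb = np.append(x, 1.0)          # absorb bias
        if y * xb.dot(theta) <= 0:
            theta += y * xb             # immediate update on error
    return theta

def sample_stream():
    """Simulates a data stream by yielding the same four points three times."""
    data = [([2.0, 1.0], 1), ([1.0, 3.0], 1),
            ([-2.0, -1.0], -1), ([-1.0, -2.0], -1)]
    for x, y in data * 3:
        yield np.array(x), y

theta = online_peceptron = online_perceptron(sample_stream(), dim=2)
```

Because only θ is kept in memory, the cost per sample is O(d) regardless of how long the stream runs.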

Implementation Details

Classification: Binary (+1 / −1)
Features: 2D + bias
Update rule: θ += xy (when y·(x·θ) ≤ 0)
Max epochs: 100
Course: SC2300 Introduction to ML