The Perceptron Algorithm is a supervised learning algorithm for binary linear classifiers. Given input features x = (x₁, …, xₙ), weights w = (w₁, …, wₙ), and a bias b, the perceptron computes the weighted sum z = w · x + b.
The activation function applies a step function to this sum to produce a binary output: ŷ = 1 if z ≥ 0, and ŷ = 0 otherwise.
The perceptron geometrically defines a decision boundary (hyperplane) that separates the input space. Points on one side are classified as class 1, points on the other as class 0. The equation w · x + b = 0 defines this hyperplane.
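A minimal sketch of the prediction step in Python (the weight and bias values below are illustrative, not taken from the text): the sign of w · x + b determines which side of the hyperplane a point falls on.

```python
def predict(w, b, x):
    # Weighted sum z = w . x + b, then the step activation.
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if z >= 0 else 0

# Hypothetical hyperplane: x1 - x2 + 0.5 = 0
w, b = [1.0, -1.0], 0.5
print(predict(w, b, [2.0, 1.0]))  # z = 1.5  -> class 1
print(predict(w, b, [0.0, 2.0]))  # z = -1.5 -> class 0
```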
The Learning Rule: For each training example (x, y), update the weights and bias as w ← w + η(y − ŷ)x and b ← b + η(y − ŷ),
where η is the learning rate, y is the true label, and ŷ is the predicted output. When the prediction is correct, y − ŷ = 0 and no update occurs.
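A worked single update, under assumed values (zero initial weights, learning rate η = 0.1, and one misclassified point): since z = 0 triggers the step function, the example is predicted as class 1 while its true label is 0, so the update pushes the weights away from it.

```python
eta = 0.1
w, b = [0.0, 0.0], 0.0          # assumed initial parameters
x, y = [2.0, 1.0], 0            # one training example, true label 0

z = w[0] * x[0] + w[1] * x[1] + b   # weighted sum = 0.0
y_hat = 1 if z >= 0 else 0          # step function -> 1 (misclassified)
err = y - y_hat                     # y - y_hat = -1

# Apply the learning rule: w <- w + eta * err * x,  b <- b + eta * err
w = [wj + eta * err * xj for wj, xj in zip(w, x)]
b = b + eta * err
print(w, b)  # [-0.2, -0.1] -0.1
```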
Training proceeds in epochs: each epoch processes all training examples once. The algorithm converges when an entire epoch produces zero errors. The Perceptron Convergence Theorem guarantees convergence in finite steps if the data is linearly separable. For non-separable data, the perceptron will never converge and continues updating indefinitely.
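The full training loop described above can be sketched as follows (a minimal illustration, not a production implementation; the `max_epochs` cap is an assumption added to guard against the non-separable case, where the loop would otherwise run forever). The AND function is used as example data because it is linearly separable, so the convergence theorem applies.

```python
def train_perceptron(X, y, eta=0.1, max_epochs=100):
    """Train a perceptron; stop when an epoch produces zero errors."""
    n = len(X[0])
    w, b = [0.0] * n, 0.0
    for epoch in range(max_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            y_hat = 1 if z >= 0 else 0
            err = yi - y_hat
            if err != 0:
                # Learning rule: w <- w + eta*(y - y_hat)*x, b <- b + eta*(y - y_hat)
                w = [wj + eta * err * xj for wj, xj in zip(w, xi)]
                b += eta * err
                errors += 1
        if errors == 0:          # converged: a full error-free epoch
            return w, b
    return w, b                  # cap reached (e.g. non-separable data)

# Linearly separable example: the AND function
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
preds = [1 if sum(wj * xj for wj, xj in zip(w, xi)) + b >= 0 else 0 for xi in X]
print(preds)  # [0, 0, 0, 1]
```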
Developed by Kevin Yu & Panagiotis Angeloudis