> Perceptron_
1958 — Frank Rosenblatt
output = step(w1*x1 + w2*x2 + bias)
> Manual Adjustment
| x1 | x2 | Target | Pred | Result |
|---|---|---|---|---|
| 0 | 0 | 0 | 0 | OK |
| 0 | 1 | 0 | 0 | OK |
| 1 | 0 | 0 | 0 | OK |
| 1 | 1 | 1 | 0 | X |
> What is a Perceptron?
The Perceptron (1958) is the simplest neural network. It multiplies inputs by weights, sums them, and outputs 1 if the sum exceeds a threshold, 0 otherwise. It draws a single straight line (decision boundary) to classify data into two groups.
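The formula above can be written directly as code. This is a minimal sketch; the function names and the AND weights are illustrative, not part of the original text:

```python
def step(z):
    """Heaviside step: output 1 if the weighted sum exceeds 0, else 0."""
    return 1 if z > 0 else 0

def perceptron(x1, x2, w1, w2, bias):
    """Multiply inputs by weights, sum with the bias, apply the step."""
    return step(w1 * x1 + w2 * x2 + bias)

# Example weights that happen to implement AND: the sum crosses the
# threshold only when both inputs are 1 (1.0 + 1.0 - 1.5 = 0.5 > 0).
print(perceptron(1, 1, 1.0, 1.0, -1.5))  # → 1
print(perceptron(0, 1, 1.0, 1.0, -1.5))  # → 0
```

The bias shifts where the decision boundary sits; the weights set its slope.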
> Why AND & OR Work
AND and OR are "linearly separable." A single straight line can perfectly divide the 0-output points from the 1-output points. The perceptron finds this line through training.
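The training process can be sketched with the classic perceptron learning rule (nudge each weight by `lr * error * input` after every misclassification). The learning rate and epoch count here are illustrative choices, not from the text:

```python
def step(z):
    return 1 if z > 0 else 0

def train(data, lr=0.1, epochs=100):
    """Perceptron learning rule: adjust weights toward each misclassified point."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        errors = 0
        for x1, x2, target in data:
            pred = step(w1 * x1 + w2 * x2 + b)
            err = target - pred          # -1, 0, or +1
            if err != 0:
                errors += 1
                w1 += lr * err * x1
                w2 += lr * err * x2
                b  += lr * err
        if errors == 0:                  # a full clean pass: the line separates the classes
            break
    return w1, w2, b

AND = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
w1, w2, b = train(AND)
print([step(w1 * x1 + w2 * x2 + b) for x1, x2, _ in AND])  # → [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches zero errors after finitely many updates.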
> Why XOR Fails
XOR (exclusive or) is not linearly separable. The 0-output points (0,0) and (1,1) lie on one diagonal, while the 1-output points (0,1) and (1,0) lie on the other. No single straight line can correctly separate all four points. This is a geometric limitation: no amount of training can overcome it.
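The failure can be checked empirically: running the same learning rule on XOR never produces a clean pass, because any weight setting must misclassify at least one of the four points. The function name and hyperparameters below are illustrative:

```python
def step(z):
    return 1 if z > 0 else 0

def final_epoch_errors(data, lr=0.1, epochs=1000):
    """Apply the perceptron learning rule and count errors in the last epoch."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        errors = 0
        for x1, x2, target in data:
            pred = step(w1 * x1 + w2 * x2 + b)
            err = target - pred
            errors += abs(err)
            w1 += lr * err * x1
            w2 += lr * err * x2
            b  += lr * err
    return errors

XOR = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(final_epoch_errors(XOR))  # still nonzero after 1000 epochs
```

The weights oscillate indefinitely: each correction for one point breaks another, which is the behavior Minsky and Papert's proof predicts.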
> The First AI Winter (1969-1986)
In 1969, Marvin Minsky and Seymour Papert mathematically proved this limitation in their book "Perceptrons." This devastated funding and interest in neural network research, triggering the "First AI Winter." It took ~17 years before the solution (multi-layer perceptrons + backpropagation) gained widespread recognition.