
> Perceptron_

1958 — Frank Rosenblatt

output = step(w1*x1 + w2*x2 + bias)


> Training Demo (snapshot)

Speed: 5/s
Learning rate: 0.10
Weights: w1 = -0.0356, w2 = 0.3868
Bias: -0.5493
Epoch: 0
Accuracy: 3/4 correct
> SYSTEM LOG

Start training...

x1  x2  Target  Pred
0   0   0       0     OK
0   1   0       0     OK
1   0   0       0     OK
1   1   1       0     X

> What is a Perceptron?

The Perceptron (1958) is the simplest neural network. It multiplies inputs by weights, sums them, and outputs 1 if the sum exceeds a threshold, 0 otherwise. It draws a single straight line (decision boundary) to classify data into two groups.
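The forward pass above can be sketched in a few lines. This is a minimal illustration, not the demo's implementation; the hand-picked weights are just one set of values that happens to compute logical AND.

```python
def step(x: float) -> int:
    """Threshold activation: output 1 if the weighted sum is positive, else 0."""
    return 1 if x > 0 else 0

def perceptron(x1: float, x2: float, w1: float, w2: float, bias: float) -> int:
    # output = step(w1*x1 + w2*x2 + bias), exactly as in the formula above
    return step(w1 * x1 + w2 * x2 + bias)

# Hand-picked weights that compute logical AND (illustrative, not learned):
w1, w2, bias = 1.0, 1.0, -1.5
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, perceptron(x1, x2, w1, w2, bias))  # → 0, 0, 0, 1
```

Only the inputs (1, 1) push the weighted sum above zero, so the output is 1 exactly when both inputs are 1.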

> Why AND & OR Work

AND and OR are "linearly separable." A single straight line can perfectly divide the 0-output points from the 1-output points. The perceptron finds this line through training.
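"Finding the line through training" is the perceptron learning rule: nudge each weight by the error times the input. A sketch of it converging on AND, with illustrative hyperparameters (learning rate 0.1, 20 epochs):

```python
def step(x):
    return 1 if x > 0 else 0

def train(data, lr=0.1, epochs=20):
    """Perceptron learning rule: w += lr * (target - pred) * input."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            pred = step(w1 * x1 + w2 * x2 + b)
            err = target - pred          # -1, 0, or +1
            w1 += lr * err * x1          # nudge weights toward the target
            w2 += lr * err * x2
            b  += lr * err
    return w1, w2, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train(AND)
print([step(w1 * x1 + w2 * x2 + b) for (x1, x2), _ in AND])  # → [0, 0, 0, 1]
```

Because AND is linearly separable, the rule is guaranteed to stop updating once it finds a separating line; here it settles within a handful of epochs.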

> Why XOR Fails

XOR (exclusive or) is not linearly separable. The 0-output points (0,0) and (1,1) sit on one diagonal of the unit square, while the 1-output points (0,1) and (1,0) sit on the other. No single straight line can correctly separate all four points. This is a geometric limitation: no amount of training can overcome it.
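The limitation can be checked empirically: run the same learning rule on XOR and it never reaches 4/4, no matter how long it trains. The weights simply cycle. Hyperparameters here are illustrative.

```python
def step(x):
    return 1 if x > 0 else 0

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w1 = w2 = b = 0.0
best = 0
for epoch in range(1000):
    for (x1, x2), target in XOR:
        err = target - step(w1 * x1 + w2 * x2 + b)
        w1 += 0.1 * err * x1
        w2 += 0.1 * err * x2
        b  += 0.1 * err
    correct = sum(step(w1 * x1 + w2 * x2 + b) == t for (x1, x2), t in XOR)
    best = max(best, correct)

print(best)  # never 4: no single line separates XOR
```

Since no line achieves 4/4 on XOR, the updates never stop, and accuracy is capped below 100% regardless of epoch count.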

> The First AI Winter (1969-1986)

In 1969, Marvin Minsky and Seymour Papert mathematically proved this limitation in their book "Perceptrons." This devastated funding and interest in neural network research, triggering the "First AI Winter." It took ~17 years before the solution (multi-layer perceptrons + backpropagation) gained widespread recognition.
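The multi-layer fix can be sketched by hand, without backpropagation: two hidden perceptrons compute OR and NAND, and a third perceptron ANDs their outputs, which is exactly XOR. The weights below are hand-picked for illustration, not learned.

```python
def step(x):
    return 1 if x > 0 else 0

def xor(x1, x2):
    """Two-layer perceptron network: XOR = AND(OR(x1,x2), NAND(x1,x2))."""
    h_or   = step(1.0 * x1 + 1.0 * x2 - 0.5)      # hidden unit 1: OR
    h_nand = step(-1.0 * x1 - 1.0 * x2 + 1.5)     # hidden unit 2: NAND
    return step(1.0 * h_or + 1.0 * h_nand - 1.5)  # output unit: AND

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, xor(x1, x2))  # → 0, 1, 1, 0
```

The hidden layer bends the decision boundary: each hidden unit draws one line, and the output unit combines the two half-planes into a region no single line could carve out.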
