
> Neural Network_

1986 — Backpropagation

The perceptron couldn't solve XOR. Add a hidden layer and... it works! Backpropagation is the algorithm that makes training that hidden layer possible.

[Interactive demo: a live training run with Epoch, Loss, and Accuracy readouts, alongside two panels, Network Architecture and Decision Boundary. See the perceptron demo for the limitation this overcomes.]

What is a Neural Network?

Neural networks were created to overcome the perceptron's limitations. By adding "hidden layers" between input and output, they can learn non-linear decision boundaries. Backpropagation applies the chain rule to compute how much each weight contributes to the error, i.e. the gradient of the loss with respect to that weight, so every layer can be adjusted efficiently by gradient descent.
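
To make the mechanics concrete, here is a minimal NumPy sketch. It is not the code behind the demo above; the 4-unit hidden layer, learning rate, and epoch count are illustrative assumptions. It trains a tiny sigmoid network on XOR with a hand-written forward and backward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the dataset a single-layer perceptron cannot separate linearly
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer; 2 units suffice in principle, 4 trains more reliably
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

lr = 1.0  # assumed learning rate
for epoch in range(10000):  # assumed epoch count
    # Forward pass
    h = sigmoid(X @ W1 + b1)   # hidden activations
    p = sigmoid(h @ W2 + b2)   # predicted probability of class 1

    # Backward pass: chain rule from mean squared error back to each weight
    dp  = 2 * (p - y) / len(X)   # dL/dp for L = mean((p - y)^2)
    dz2 = dp * p * (1 - p)       # through the output sigmoid
    dW2 = h.T @ dz2
    db2 = dz2.sum(axis=0, keepdims=True)
    dh  = dz2 @ W2.T             # error signal sent back to the hidden layer
    dz1 = dh * h * (1 - h)       # through the hidden sigmoid
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0, keepdims=True)

    # Gradient descent: each weight moves against its contribution to the error
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(p.ravel(), 2))  # approaches [0, 1, 1, 0]: XOR solved
```

The sigmoid units and squared-error loss mirror the 1986-era formulation; modern networks usually prefer ReLU activations and cross-entropy loss, but the chain-rule structure of the backward pass is identical.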

Revival from the AI Winter (1986)

In 1986, Rumelhart, Hinton, and Williams published the backpropagation algorithm, enabling efficient training of multi-layer networks. The perceptron's limitations became history, and the second AI boom began.