1966: First Conversation

> ELIZA_

Joseph Weizenbaum created ELIZA, the first widely known conversational program, at MIT.

> DEEP DIVE_

In 1966, Joseph Weizenbaum, a German-American computer scientist at MIT, created a program that would become one of the most famous and most misunderstood artifacts in the history of computing. ELIZA was a natural language processing program that simulated conversation by using pattern matching and substitution rules. Its most famous script, called DOCTOR, mimicked a Rogerian psychotherapist — the kind of therapist who reflects the patient's statements back as questions. If you typed "I am unhappy," ELIZA might respond "Do you think coming here will help you not to be unhappy?" If you typed "My mother hates me," it would reply "Who else in your family hates you?" The mechanism was shockingly simple: ELIZA scanned each input for keywords, applied transformation rules, and generated a response. It had no understanding whatsoever.
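The keyword-and-transformation mechanism can be illustrated with a minimal Python sketch. This is not Weizenbaum's code (the original was written in MAD-SLIP), and the specific rules and the `reflect` helper are illustrative, but the loop is the same idea: scan the input for a keyword pattern, reflect pronouns, and fill in a canned template.

```python
import re

# Pronoun reflection table: first-person words become second-person,
# so "me" in the input comes back as "you" in the response.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "mine": "yours", "myself": "yourself",
}

def reflect(fragment):
    """Swap first-person words for their second-person counterparts."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

# Keyword rules, tried in order: a regex paired with a response builder.
# These two rules reproduce the famous exchanges quoted above.
RULES = [
    (re.compile(r"i am (.*)", re.I),
     lambda m: f"Do you think coming here will help you not to be {reflect(m.group(1))}?"),
    (re.compile(r"my (?:mother|father|sister|brother) (.*)", re.I),
     lambda m: f"Who else in your family {reflect(m.group(1))}?"),
]

def respond(text):
    """Match the input against each rule; fall back to a stock prompt."""
    text = text.strip().rstrip(".!?")
    for pattern, build in RULES:
        match = pattern.match(text)
        if match:
            return build(match)
    return "Please go on."
```

With these two rules, `respond("I am unhappy.")` yields "Do you think coming here will help you not to be unhappy?" and `respond("My mother hates me")` yields "Who else in your family hates you?" No state, no semantics, no model of the user: just pattern, reflection, template.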

What stunned Weizenbaum was not how clever ELIZA was, but how completely people were fooled by it. His secretary, who knew the program was just a computer script, asked him to leave the room so she could have a "private conversation" with ELIZA. Students at MIT would sit at terminals for hours, pouring out their deepest anxieties to a program that was, in essence, a glorified text substitution engine. Some psychiatrists seriously proposed that ELIZA could help address the shortage of therapists by treating patients automatically. Weizenbaum was horrified.

The experience transformed Weizenbaum from a computer scientist into a philosopher and social critic. In 1976, he published "Computer Power and Human Reason: From Judgment to Calculation," a passionate argument that there are things computers should not be used for, even if they technically could be. He argued that the willingness of people to confide in a machine revealed something disturbing about modern society — a profound loneliness, a desperate need for connection that would grasp at any simulacrum of understanding. He spent the rest of his career warning about the dangers of artificial intelligence and the dehumanizing potential of technology, becoming one of the first prominent tech insiders to turn critic.

The phenomenon Weizenbaum discovered has come to be called the "ELIZA effect" — the human tendency to attribute understanding, empathy, and intelligence to computer programs that exhibit even superficial signs of these qualities. It persists today more powerfully than ever. When users tell ChatGPT about their personal problems, when they thank Siri for her help, when they feel a pang of guilt about being rude to Alexa, they are experiencing the ELIZA effect. Weizenbaum's simple pattern-matching program, written in 420 lines of code on an IBM 7094, anticipated the central psychological challenge of the AI age: not whether machines can think, but whether humans can resist the illusion that they do.