> Markov Chain Text Generator_
Text generation before GPT
> What N-gram Order Means_
1: Nearly random. Only reflects character frequency.
2: Word-like patterns appear. Still nonsensical.
3: Recognizable phrases emerge. Balance of creativity and faithfulness.
4-5: Very close to source. Originality decreases.
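The order effects listed above can be sketched with a minimal character-level model. This is an illustrative sketch, not the demo's actual implementation; `build_model` and `generate` are assumed names, and the starting context is simply taken from the opening of the source text:

```python
import random
from collections import defaultdict

def build_model(text, n):
    """Map each length-n context to every character that follows it in the source."""
    model = defaultdict(list)
    for i in range(len(text) - n):
        model[text[i:i + n]].append(text[i + n])
    return model

def generate(text, n, length=60, seed=None):
    """Generate text by repeatedly sampling a follower of the last n characters."""
    rng = random.Random(seed)
    model = build_model(text, n)
    out = text[:n]                       # seed with the opening of the source
    for _ in range(length):
        followers = model.get(out[-n:])
        if not followers:                # dead end: this context only ends the source
            break
        out += rng.choice(followers)
    return out
```

At n=1 the follower lists mirror raw character frequency, which is why the output is nearly random; by n=4 or 5 most contexts have only one observed follower, so the chain largely replays the source.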
> What is a Markov Chain?_
An order-N Markov chain predicts the next character from the previous N characters; for text, the "state" is simply the last N characters seen. The underlying theory was introduced by the Russian mathematician Andrey Markov in 1906.
Before modern AI like GPT and BERT, much of text generation was based on this principle. Even your phone's predictive text is essentially a Markov chain application.
Higher N-gram orders produce more faithful output at the cost of creativity. This faithfulness-versus-creativity tradeoff is closely analogous to the temperature parameter in modern LLMs: both control how strongly generation sticks to the most probable continuation.
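The temperature analogy can be made concrete. An LLM samples from a softmax of logits divided by temperature; applied to a Markov model's follower counts, that reduces to raising each count to the power 1/temperature before normalizing. A minimal sketch (the function name and counts dict are illustrative, not from the demo):

```python
import random

def sample_with_temperature(counts, temperature, seed=None):
    """Sample a next character from follower counts, reshaped by temperature.

    Equivalent to a softmax over log-counts: weight = count ** (1 / temperature).
    Low temperature sharpens toward the most frequent follower (faithful);
    high temperature flattens toward uniform (creative).
    """
    rng = random.Random(seed)
    chars = list(counts)
    weights = [counts[c] ** (1.0 / temperature) for c in chars]
    r = rng.random() * sum(weights)
    for c, w in zip(chars, weights):
        r -= w
        if r <= 0:
            return c
    return chars[-1]
```

With `counts = {"e": 9, "q": 1}`, a temperature of 0.1 makes "e" overwhelmingly likely, while a high temperature pushes the two choices toward 50/50, much like raising an LLM's temperature.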