Infinite Monkeys


The “modularity” principle we found in new media is the basis of all languages: we combine a set of symbols to compose words, words to make sentences, and so on.
The parts of a text are clearly distinct and easy to recombine, and, especially in poetry, new and unexpected meanings can emerge from arbitrary juxtapositions.
Many writers, poets and artists have experimented with loss of control in writing.

Given enough time, a hypothetical monkey typing at random would, as part of its output, almost surely produce all of Shakespeare’s plays.
– the Infinite Monkey Theorem

Not to be taken literally…

In an experiment conducted at a zoo in England, zookeepers left a computer keyboard in the cage of six macaques for a month. The monkeys produced only a five-page document, consisting mostly of the letter S, until the alpha male bashed the keyboard with a stone and all the other monkeys urinated and defecated on it.
– Dario Maestripieri, Primatologist
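Still, the theorem itself is easy to simulate. A minimal sketch in Python (the function name and the 3-letter target are my own choices): it types random lowercase letters until a target word appears, counting the keystrokes. Even a 3-letter word takes tens of thousands of keystrokes on average, and every extra letter multiplies the wait by the alphabet size.

```python
import random
import string

def monkey_type(target, seed=0):
    """Type random lowercase letters until `target` appears,
    returning the number of keystrokes needed."""
    rng = random.Random(seed)
    window = ""
    keystrokes = 0
    while not window.endswith(target):
        # Keep only the last len(target) characters typed so far.
        window = (window + rng.choice(string.ascii_lowercase))[-len(target):]
        keystrokes += 1
    return keystrokes

# A 3-letter word takes about 26^3 ≈ 17,576 keystrokes on average.
print(monkey_type("ape"))
```

Trying a Shakespeare play instead of "ape" is left as an exercise in patience.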

Writers and artists have been fascinated with randomness and with the combinatorial properties of text since Dada.

To make a Dadaist poem
Take a newspaper.
Take a pair of scissors.
Choose an article as long as you are planning to make your poem.
Cut out the article.
Then cut out each of the words that make up this article and put them in a bag.
Shake it gently.
Then take out the scraps one after the other in the order in which they left the bag.
Copy conscientiously.
The poem will be like you.
And here you are a writer, infinitely original and endowed with a sensibility that is charming though beyond the understanding of the vulgar.
– Tristan Tzara, 1920

In the 1950s the writer William Burroughs applied this technique, dubbed the cut-up technique, to his own writing and recordings. (As did David Bowie, Kurt Cobain, and Thom Yorke…)
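Tzara’s recipe is already an algorithm, so it translates almost line for line into code. A minimal sketch in Python (function name and sample sentence are mine):

```python
import random

def cut_up(article, seed=None):
    """Tzara's recipe: cut an article into words, shake the bag,
    and copy the scraps out in the order they emerge."""
    rng = random.Random(seed)
    words = article.split()   # cut out each of the words
    rng.shuffle(words)        # shake it gently
    return " ".join(words)    # copy conscientiously

print(cut_up("the quick brown fox jumps over the lazy dog", seed=1))
```

The poem will be like you.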

Around the same time, artists were applying randomness not only to text:

In John Cage’s Music of Changes (1951) the composer selected duration, tempo, and dynamics by using the I-Ching, an ancient Chinese divination method for arriving at random numbers.

Principle of variability in new media

A new media object is not something fixed once and for all, but can exist in different, potentially infinite, versions. This is another consequence of the numerical coding of media and the modular structure of a media object.
– Lev Manovich, The Language of New Media

The first example of a computer-based text generator is Loveletter by Christopher Strachey (1952).
Web version

A House of Dust – Alison Knowles

Context free grammars
Generation of text according to formal rules.
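A context-free grammar rewrites symbols according to production rules until only words remain. A minimal sketch in Python (the toy grammar and its vocabulary are my own, loosely inspired by Taroko Gorge):

```python
import random

# A toy context-free grammar: each nonterminal maps to a list of
# possible expansions; each expansion is a sequence of symbols.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["a", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["gorge"], ["stone"], ["forest"]],
    "V":  [["shapes"], ["crosses"]],
}

def generate(symbol="S", rng=random):
    if symbol not in GRAMMAR:              # terminal: emit the word
        return symbol
    expansion = rng.choice(GRAMMAR[symbol])
    return " ".join(generate(s, rng) for s in expansion)

print(generate())   # prints a random five-word sentence
```

Swapping in a different vocabulary (a different “database”) while keeping the rules is exactly how the Taroko Gorge derivatives below work.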

Taroko Gorge by Nick Montfort. Many derivative works exist, using different databases and algorithms.

Text generation as parody:

Postmodernism generator

Automatic CS Paper Generator
Art critique generator
Artist Statement
Bad literature

And so on…

Machine imagined art
– based on the Tate database

Game Definitions

Jim Campbell – Formula for computer art

Meaning by Eugenio Tisselli, 2005
Each time this page is visited, one of the words in the following section will be replaced by a synonym. The replaced word is shown in bold. Refresh the page to replace another word.

Twitter Bots
Automating humor

You must be – from dictionary
Two headlines – from headlines
Amrite – from twitter trends and rhymes
Bracketsmemebot – tournament meme generator with data from Wikipedia
– from dictionary
Tiny Starfields
Every Word
Horse ebooks
(turned out to be fake)
Olivia Taters
in dialog with the Met museum bot.
Art assignment bot
Restroom Gender
Magic Realism Bot
Freeze Frame Bot
Shiv integer – randomly generated 3D objects
Strange voyage – and some handmade illustrations responding to it

Markov Chain generation
The algorithm analyzes a source text and, for every sequence of n characters, stores all the characters that can follow it.
Text generation starts from a random n-character sequence present in the table; then, out of the possible continuations, it randomly picks the next symbol. In the example below, n = 6:

'hello ' -> [m,w,p] chooses 'm' (randomly)
'ello m' -> [a,a,a,o,o] chooses 'a' (randomly)
'llo ma' -> [m]
'lo mam' -> [a,b]
'o mama' -> [' ']
' mama ' -> [w,!]
'mama w'....

The result is text that sounds right but doesn’t make any sense.
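The trace above can be sketched as a small Python program (the function names and the toy source string are mine; real uses feed it a whole novel):

```python
import random
from collections import defaultdict

def build_table(text, n=6):
    """Map every n-character sequence to the characters that follow it."""
    table = defaultdict(list)
    for i in range(len(text) - n):
        table[text[i:i + n]].append(text[i + n])
    return table

def markov(text, length=200, n=6, seed=None):
    rng = random.Random(seed)
    table = build_table(text, n)
    out = rng.choice(list(table))      # start from a random n-gram
    while len(out) < length:
        followers = table.get(out[-n:])
        if not followers:              # dead end: stop early
            break
        out += rng.choice(followers)   # pick a continuation at random
    return out

source = "hello mama ! hello wolf ! hello papa ! mama waves !"
print(markov(source, length=40, seed=2))
```

A larger n makes the output copy longer stretches of the source; a smaller n makes it more garbled.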

Example: Dissociated Press

The most common and utilitarian use of Markov chains for text generation is a type of spam:

living with bubble bounce fairy from diskette.
Still eat her from cream puff
from, ignore her around widow with fairy living with freight train.cleavage
ignore from industrial complex.

Because is predestined friends,
Designer Shoes
we meet by chance in the space,
because is sincere,
we become the friend who separates the screen,
a blessing,

Wholesale designer shoes
our invariable subject
stays behind in your space
belongs to my footprint
to remain the regard your space
to wish your joyful happy each day!

From Spam Poetry Institute

What would I say
– based on your Facebook updates

Machine learning
Neural networks trained on a corpus of literature can produce an even more convincing effect than Markov chains.

p5 has an approachable machine learning library (ml5.js) that can work with text.

Memo Akten’s Word of Math and Word of Math Bias are two bots that explore the potential, humor, and bias of a word-vector model created with machine learning algorithms.

Recognizing image content
Tag self bot 

Generating stories from images

TV comment bot – a bot that comments on TV in real time – technical description here