Last year for National Novel Writing Month I trained a neural net called torch-rnn on 10,096 unique ways to begin a novel. It came up with some intriguing possibilities, my personal favorite being “I am forced to write to my neighbors about the beast.” But many of its sentences used made-up words, or had such weird grammar that they were difficult to read, or meandered too erratically. (“The first day of the world was born in the year 1985, in an old side of the world, and the air of the old sky of lemon and waves and berries.”) The neural net was struggling to write more than a few words at a time.
This year, I decided to revisit this dataset with a larger, more-powerful neural net called GPT-2. Unlike most earlier neural nets, GPT-2 can write entire essays with readable sentences that stay mostly on topic (even if it has a tendency to lose its train of thought or get very weird). I fine-tuned the largest GPT-2 model that the gpt-2-simple library could easily handle, the 355M size. Would a more-powerful neural net produce better first lines?
One of the parameters I can tweak when I’m getting a trained neural net to generate text is temperature - this controls whether the neural net chooses the most likely next bit of text as it writes, or whether it’s permitted to use its less-likely predictions. At a default of 0.7, a relatively conservative temperature, the neural net’s first lines not only make grammatical sense, but they even have the rhythm of a novel’s first line. This is DRAMATICALLY better than torch-rnn did.
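To make the temperature knob concrete, here's a minimal sketch (plain Python with made-up scores, not the actual GPT-2 internals) of how sampling temperature rescales a model's next-word scores before they become probabilities:

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide each score by the temperature before the softmax.
    # Low temperature sharpens the distribution toward the top choice;
    # high temperature flattens it toward uniform.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for three candidate next words.
logits = [2.0, 1.0, 0.5]
conservative = softmax_with_temperature(logits, 0.7)
default = softmax_with_temperature(logits, 1.0)
# At temperature 0.7, the most likely word soaks up more of the
# probability mass than it does at temperature 1.0.
```

At 0.7, the neural net still samples, but it leans much harder on its safest predictions, which is why the output reads so smoothly.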
I am, or was.
At the mid-day meal the sun began to set and the quiet dragged on.
There was once a man who lived for a very long time; perhaps three thousand years, or perhaps a thousand million years, maybe a trillion or so, depending on how the scientists look at it.
He had the heart of a lion, and the fangs of a man-eater.
“I am Eilie, and I am here to kill the world.”
The old woman was sitting on a rock near the sea, smoking a pipe.
I have just been informed, that the debate over the question ‘is it right or wrong to have immortal souls’ has been finally brought to a conclusion.
When I was a boy, I was fond of the story of the pirate god.
He had a strange name, and he was a very big boy indeed.
The purple-haired woman came to the clearing in the plain, and without looking up from her book, said, “It’s too late to be thinking about baby names.”
The village of Pembrokeshire, in the county of Mersey, lies on a wide, happy plain, which, in a few years, was to become known as the “Land of the Endless Mountains.”
I don’t think the neural net plagiarized any of these? They are so good that I’m suspicious. Other lines are weirder still, but effective in their own way: each opens with an intriguing premise.
The moon had gone out.
I was playing with my dog, Mark the brown Labrador, and I had forgotten that I was also playing with a dead man.
The black stone was aching from the rain.
The short, dirty, and dirty-looking ship that weighed three tons and was three feet in diameter landed on a desolate and green plain.
How many times have I had the misfortune to die?
The first black dog in the park had been captured alive.
Behold the Sky Rabbits!
In the belly of the great beast that was the bovine Aurore there lived, upon the right hand of the throne, the Empress Penelope; and she had, as it were, a heart of gold.
The moon stood on its own two feet.
The reeking maw of the blood-drunk ship, the enemy’s flagship, was silent and empty.
The first day I met my future self, I was aboard the old dirigible that lay in wait for me on the far side of the moon.
The child of two cats, and a tiger, a clown, a horse, a bird, a ship, and a dragon, stood on either side of the threshold of the Gatehouse, watching the throng of travelers who came in from all around the world, before he had any idea what was going on.
I think it’s probably doing this accidentally, stringing likely words and phrases together without understanding what any of them really mean. It’s not that it’s good at science fiction or magical realism; it’s that it’s trying and failing to predict what would fit in with the usual human-written stuff. Some of the neural net’s first lines really betray its lack of understanding of the laws of physics. It really likes to describe the weather, but it doesn’t really understand how weather works. Or other things, really.
The moon was low in the sky, as though it had been shipped in from the farthest reaches of the solar system.
The first star I saw was a blue one, which became a scarlet one, and then a gold one, and green, and finally a yellow one, which for some years afterwards seemed to be an ebony one, or even a bubbling mass.
The sun rose slowly, like a mighty black cat, and then sank into a state of deep sleep.
The sea of stars was filled with the serenity of a million little birds.
The great blue field was all white, swept away by the blue-gold breeze that blew from the south.
The sky was cold and dark, and the cold wind, if it had not been for the clouds, would have lashed the children to the roof of the house.
The morning sun was shining brightly, but the sky was grey and the clouds aching.
The night that he finally made up his mind to kill the dog, the man was walking home from the store with his wife and child in the back seat.
Arthur the lion had been pretty much extinct for some time, until the time when he was petted by Abernathy the old woman, and her son, Mr. Popp.
One of the disadvantages of having a neural net that can string together a grammatical sentence is that its sentences can now be terrible in a more-human sense, rather than merely incomprehensible. It ventures into the realm of the awful simile, or the mind-numbingly repetitive, and it makes a decent stab at the 19th-century style of bombastic wordiness. I selected the examples above for uncomprehending brilliance, but the utter tediousness below is more the norm.
The whites of my eyes shimmered, as if my mind were dancing.
I once went to a party where the dress code was as strict as a chicken coop with no leggings and no boots.
A black cloud drifted by, a mottled mass of hydrogen, a black cloud of hydrogen, with the definite characteristic of being black.
I say I am at sea, because I am standing upon the ocean, and look out across the barren, vast throng of the sea.
It is, of course, a trifling matter in the ordinary course of things, if a certain writer were to write a novel, which is a book of stories, which is a book of characters, wherein every detail of the story is stated, together with a brief description of the theme which it concerns.
There was a boy with blue eyes, with sandy hair and blue eyes that looked at all times like he had been pushed through a million compartments.
The Sun, with its rolling shaft of bright light, the brilliant blue of the distant golden sun, and the red glow of its waning corona, was shining.
The man who was not Jack the Ripper had been promoted four times in the last two years.
Felix the Paw was sitting at the table of his favorite restaurant, the “Bordeaux” in the town of Bordeaux, when his father, Cincinnata, came in to say good-by to the restaurant.
It, sir, gives me the greatest pleasure to hear that the Court be not too long in passing away: but that I may have leisure to prepare a new work for the publication of my friend and colleague, the renowned Epistemology, which is now finished; and in which I shall endeavour to show, that this very point is of the highest importance in the subject of the philosophy which I am about to treat of.
It was a rainy, drizzling day in the summer of 1869 and the people of New York, who had become accustomed to the warm, kissable air of the city, were having another bad one.
Repetitiveness is also common, especially at this conservative temperature setting. Once the neural net gets itself into a repetitive state, it doesn’t seem to rescue itself - it’s a problem that people have noticed in several versions of this algorithm. (It doesn’t help that I forgot to scrub the “title” that someone submitted to the dataset that consists of the word “sand” repeated 2,000 times.)
The sky was blue and the stars were blue and the sun was blue and the water was blue and the clouds were blue and the blue sky was like a piece of glass.
At the end of the world, where the tides burst upon the drowned, there exists a land of dragons, of dragons, which is the land of the dragons.
It’s the end of the world, it’s the end of the world, it’s the end of the world, it’s the end of the world, it’s the end of the world, you’re dead.
There was once a land of sand, and sand, sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand sand
Increasing the sampling temperature would, in theory, help the repetitiveness problem, letting the neural net venture into more interesting territory. But at a temperature of 1.0, the text tends to leave everyday surrealism behind for wordy yet distractible incomprehensibility.
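The trade-off shows up even in a toy model. Here's a minimal sketch (with invented next-word probabilities, not GPT-2's) of why always taking the most likely word can get stuck in a loop while hotter sampling can escape:

```python
import math
import random

# Invented next-word probabilities mimicking the "sand" failure mode:
# after "sand", the single most likely next word is "sand" again.
next_word_probs = {
    "sand": {"sand": 0.6, "and": 0.3, "dunes": 0.1},
    "and": {"sand": 0.5, "the": 0.5},
    "the": {"sand": 0.7, "dunes": 0.3},
    "dunes": {"and": 0.6, "the": 0.4},
}

def greedy_next(word):
    # Temperature near zero: always take the single most likely next word.
    dist = next_word_probs[word]
    return max(dist, key=dist.get)

def sample_next(word, temperature, rng):
    # Rescale probabilities in log space by 1/temperature, then sample.
    dist = next_word_probs[word]
    weights = [math.exp(math.log(p) / temperature) for p in dist.values()]
    return rng.choices(list(dist), weights=weights, k=1)[0]

# Greedy decoding locks into the loop: sand sand sand sand sand...
word = "sand"
greedy_run = []
for _ in range(5):
    word = greedy_next(word)
    greedy_run.append(word)

# Sampling at temperature 1.0 is free to pick the less-likely words
# and wander out of the loop.
rng = random.Random(42)
word = "sand"
sampled_run = []
for _ in range(5):
    word = sample_next(word, 1.0, rng)
    sampled_run.append(word)
```

Once greedy-style decoding enters the “sand” state it can never leave; the only way out is to allow less-likely choices, which is exactly what raising the temperature does - at the cost of also allowing the incoherence below.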
The praying mules on the top of the hills sounded the final klaxon, lifting their spiked front hoofs as they crept the last few feet of desert landscape past the crest of the enormous swathe of prehistoric sand.
In the glen of the Loch is a ladder that winds way up through a passage to a ledge with soft, moss-laden environmental standards.
Someone whipped a dead squash gibbet across the room, like some formidable war lord unleashing a heavy hunk of silver at home.
One blue eyed child stood up and cried out: “Douay, saurines, my Uncle – Fanny Pemble the loader!”
Jud - an elderly despot, or queen in emopheles, was sitting across the table from the king, looking very thoughtfully into the perplexions of the proceedings.
Oh, you’re a coward little fool, as if you couldn’t bear to leer at a Prunker or white-clad bodyguard quickly emerging from a shady, storm-damaged area of the city.
Hanging presently in his little bell-bottomed chamber on the landing-house, early in the morning, the iron traveler sat on a broad-blonde sandbricksannel blanket outside the gate of a vast and ancient island.
Long, glowing tongues trailed from your mouth as you listened to what was being said across this kingdom of ours, but growing a little more somber since the week that caused us to proclaim general war.
The night I first met Winnie the Pooh, I had sat in the Tasting-House and heard the Chef unpack the last of the poison upon his quiet dinnertable.
There is, of course, no perfect setting at which the neural net churns out sensible yet non-repetitive first lines. There are just varying shades of general awfulness, interspersed with accidental brilliance.
No matter how much you’re struggling with your novel, at least you can take comfort in the fact that AI is struggling even more.
I generated all the neural net sentences above using a generic “It” as the prompt that the neural net had to build on (it would usually go on to generate another 20-30 sentences at a time). But although the sentences are independent in my training data, GPT-2 is used to large blocks of text that go together. The result is that if I prompt it instead with, say, a line from Harry Potter fanfic, the neural net will tend to stick with that vein for a while. AI Weirdness supporters can read the results as bonus content.
Update: I now have a few thousand unfiltered examples of neural net-generated first lines at the GitHub repository where I have the original crowdsourced dataset. Themes include: Harry Potter, Victorian, My Little Pony, and Ancient Gods.