Exercises in Meta-cohesion
A linguistic and computational investigation into creative writing with GPT-2.
In 2019, OpenAI released GPT-2, a language model capable of generating whole paragraphs of text at a time. GPT-2's output, stripped of inhibition and ego, offers delightful linguistic surprises run after run. Eventually, though, the novelty wears off. When it does, we're left to wonder: how do we make this statistical trick, an assembly of words no longer contingent on an author's intention, mean something to us?
In Exercises in Meta-cohesion, my mechanical co-writer (GPT-2) and I tell a fictional tale of characters whose connections to each other build a society out of selves. Underneath this surface, we tell our own tale of human and machine working together through formulas, improv, and endless material to put words artfully together.
Exercises in Meta-cohesion uses GPT-2 by OpenAI, with the help of gpt-2-simple by Max Woolf. All custom fine-tuning datasets were assembled using the Python Reddit API Wrapper (PRAW).
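For readers curious about the pipeline, here is a minimal sketch of how a fine-tuning corpus might be assembled from Reddit posts with PRAW. The subreddit name, post limit, credentials placeholders, and file path are illustrative assumptions, not the project's actual settings.

```python
# Sketch: turning Reddit posts into a plain-text corpus for GPT-2 fine-tuning.
# The subreddit, limit, and credentials below are illustrative assumptions,
# not the settings actually used in Exercises in Meta-cohesion.

def posts_to_corpus(posts, delimiter="<|endoftext|>"):
    """Join (title, body) pairs into one training text,
    separated by GPT-2's end-of-text token."""
    chunks = []
    for title, body in posts:
        text = f"{title}\n\n{body}".strip()
        if text:
            chunks.append(text)
    return f"\n{delimiter}\n".join(chunks)

def fetch_posts(subreddit_name, limit=100):
    """Pull (title, selftext) pairs with PRAW; requires Reddit API credentials."""
    import praw  # imported lazily so the corpus helper stays dependency-free
    reddit = praw.Reddit(
        client_id="YOUR_CLIENT_ID",        # placeholder
        client_secret="YOUR_CLIENT_SECRET",  # placeholder
        user_agent="meta-cohesion-sketch",
    )
    subreddit = reddit.subreddit(subreddit_name)
    return [(post.title, post.selftext) for post in subreddit.hot(limit=limit)]

if __name__ == "__main__":
    # Hypothetical end-to-end run; the subreddit choice is an assumption.
    corpus = posts_to_corpus(fetch_posts("WritingPrompts"))
    with open("dataset.txt", "w", encoding="utf-8") as f:
        f.write(corpus)
```

The resulting dataset.txt could then be passed to gpt-2-simple's fine-tuning routine; the `<|endoftext|>` delimiter is GPT-2's own document separator, so the model learns where one post ends and the next begins.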