Adventures in AI Text Generation
Aug. 10th, 2021 10:21 am

After my success--or "success"--with the AI art generation in previous posts, I got to thinking about the text generators I'd been messing with a couple of years back (not my online generators--those work more on the basis of Mad Libs than true text generation). I decided to poke around and see what the new generation of text generators available to the public was like. I found one based on the GPT-2 neural network, set up as a Google Colaboratory notebook by Max Woolf, saved a copy of it for myself, and set to work.
To start with, the text-generation model has been trained on a bunch of text from the intartubes. You can use it as-is, or you can fine-tune it by feeding it a bunch of your own data and letting it chew over that data for a number of iterations, recalculating its probabilities.
So naturally I scraped the Zoe Chant website and fed a bunch of shifter romance descriptions into it.
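For the curious, here's a rough sketch of the kind of fine-tuning workflow the notebook wraps, assuming Woolf's gpt-2-simple library (which the Colab notebook is built on); the corpus filename is a placeholder, not what I actually called mine:

```python
import gpt_2_simple as gpt2

# Download the 124M-parameter GPT-2 model (the smallest released checkpoint).
gpt2.download_gpt2(model_name="124M")

# Start a TensorFlow session and fine-tune on the scraped text file.
sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="shifter_blurbs.txt",  # placeholder: one text file of scraped descriptions
              model_name="124M",
              steps=1000)                    # number of training iterations

# Generate new text from the fine-tuned model.
gpt2.generate(sess)
```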