Aug. 10th, 2021

telophase: (Default)
After my success--or "success"--with the AI art generation in previous posts, I thought about the text generator I'd been messing with a couple of years back (not my online generators--those work more on the basis of Mad Libs than true text generation). I decided to poke around and see what the new generation of text generators available to the public was like, and I found one based on the GPT-2 neural network, set up as a Google Colaboratory notebook by Max Woolf. I saved a copy of it for myself and set to work.

The text-generation model comes pre-trained on a bunch of text from the intartubes. You can use it as-is, or you can fine-tune it by feeding it a bunch of your own data and letting it chew over that data for a number of iterations, calculating probabilities.
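The "chew over data, calculating probabilities" idea can be illustrated with a deliberately tiny toy: a bigram model that counts which word tends to follow which, then samples from those counts to generate text. This is a sketch of the general statistical-text-generation concept only, not GPT-2 (which uses a neural network rather than a lookup table), and the sample corpus here is made up.

```python
import random
from collections import defaultdict, Counter

def train_bigrams(text):
    """Count, for each word, how often each possible next word follows it."""
    words = text.split()
    counts = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length=8, seed=0):
    """Walk the bigram table, sampling each next word weighted by frequency."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = counts.get(out[-1])
        if not followers:
            break  # dead end: the last word never appeared mid-corpus
        words = list(followers.keys())
        weights = list(followers.values())
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

# Hypothetical miniature corpus for illustration.
corpus = ("the bear shifter guarded the small town "
          "and the bear shifter loved the curvy baker")
model = train_bigrams(corpus)
print(generate(model, "the"))
```

After training, "the" is followed by "bear" twice but "small" and "curvy" only once each, so generation drifts toward the more frequent continuations--the same probability-weighting idea, writ very small.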

( So naturally I scraped the Zoe Chant website and fed a bunch of shifter romance descriptions into it. )
...on the giant dump of generated shifter romance book descriptions I asked a neural network to give me (see previous post).

( During one iteration, it got stuck on boots )
I have done what nobody everybody was clamoring for me to do: I have downloaded three years of my Dreamwidth posts and trained an AI text generator on them!

( Dare you see what lurks beneath? )
