Generating Dragon Ball Fan Fiction with OpenAI's GPT-3 and Twilio SMS

August 20, 2020
Written by
Sam Agnew
Twilion


Fans of all types of media often have fun coming up with alternate stories that take place in the universe of their favorite pieces of fiction. Sometimes, particularly talented writers or artists working on fan fiction even end up being selected to work professionally on the licensed material.

OpenAI's new GPT-3 (Generative Pre-trained Transformer 3) model was trained on a massive corpus of text, making it incredibly powerful. It can generate full dialogue between characters from surprisingly little input text given to it as a prompt.

Let's walk through how to create a text-message powered bot to generate fan fiction in Python using Twilio Programmable Messaging and OpenAI's API for GPT-3. We'll use the Dragon Ball universe as an example because I had a lot of fun watching the series from my childhood through adulthood.

Before moving on, you'll need the following:

- Python 3 and pip installed on your machine
- An OpenAI API key (covered in the next section)
- A Twilio account and an SMS-capable Twilio phone number
- ngrok, so Twilio can reach the Flask app running on your machine
- A phone to send text messages with

Introduction to GPT-3

GPT-3 (Generative Pre-trained Transformer 3) is a highly advanced language model trained on a very large corpus of text. In spite of its internal complexity, it is surprisingly simple to operate: you feed it some text, and the model generates some more, following a similar style and structure.

We can see how versatile it is by giving it a tiny amount of input text. Just from a character's name followed by a colon, it's able to determine that this text is structured as character dialogue, figure out which series the character is from, and generate somewhat coherent scenes with other characters from that series. Here's an example from Dragonball Z, with the input text being simply "Goku:"

Example scenario

As mentioned above, this project requires an API key from OpenAI. At the time I’m writing this, the only way to obtain one is by being accepted into their private beta program. You can apply on their site.

Once you have an OpenAI account, you can use the Playground to experiment with GPT-3 by typing in text and having it generate more. The Playground also has a cool feature that lets you export runnable Python code, using OpenAI's Python library, for whatever you did in the Playground.

Open AI Playground code export

Working with GPT-3 in Python using the OpenAI helper library

To test this out yourself, you'll have to install the OpenAI Python module. You can do this with the following command, using pip:

pip install openai==0.2.4

Now create an environment variable for your OpenAI API key, which you can find when you try to export code from the Playground:

export OPENAI_KEY='YOUR_API_KEY_HERE'

We will access this in our Python code. To try this out, open a Python shell and run the following code, most of which I took directly from the Playground:

import os
import openai

openai.api_key = os.environ.get("OPENAI_KEY")
prompt_text = "Goku:"

response = openai.Completion.create(
  engine="davinci",
  prompt=prompt_text,
  temperature=0.7,
  max_tokens=256
)

print(prompt_text + response['choices'][0]['text'])

You should see something like this printed to your terminal:

Goku vs Android 17

Now let's change the code to begin the dialogue with a randomly selected character from the series, so it isn't always Goku:

from random import choice
import os
import openai

characters = ['Goku', 'Gohan', 'Piccolo', 'Vegeta', 'Krillin', 'Bulma',
              'Frieza', 'Trunks', 'Tien', 'Yamcha', 'Android 17', 'Android 18']

openai.api_key = os.environ.get("OPENAI_KEY")
prompt_text = "{}:".format(choice(characters))

response = openai.Completion.create(
  engine="davinci",
  prompt=prompt_text,
  temperature=0.7,
  max_tokens=256
)

print(prompt_text + response['choices'][0]['text'])

Run this code a few times to see what happens!

Gohan and Trunks sense a high power level

Configuring a Twilio phone number

Before being able to respond to messages, you’ll need a Twilio phone number, which you can buy in the Twilio Console.

We're going to create a web application using the Flask framework that will need to be visible from the Internet in order for Twilio to send requests to it. We will use ngrok for this, which you’ll need to install if you don’t have it. In your terminal run the following command:

ngrok http 5000

ngrok URL

This provides us with a publicly accessible URL to the Flask app. Configure your phone number as seen in this image so that when a text message is received, Twilio will send a POST request to the /sms route on the app we are going to build, which will sit behind your ngrok URL:

Configuring your Twilio number

With this taken care of, we can move on to actually building the Flask app to respond to text messages with some Dragonball fanfic.

Responding to text messages in Python with Flask

So you have a Twilio number and are able to have the OpenAI API generate character dialogue. It's time to let users text your phone number to get a scene of dialogue from the Dragonball universe. Before moving on, open a new terminal tab or window and install the Twilio and Flask Python libraries with the following command:

pip install twilio==6.44.2 Flask==1.1.2

Now create a file called app.py and add the following code to it for our Flask application:

import os
from random import choice

import openai
from flask import Flask
from twilio.twiml.messaging_response import MessagingResponse

openai.api_key = os.environ.get('OPENAI_KEY')

characters = ['Goku', 'Gohan', 'Piccolo', 'Vegeta', 'Krillin', 'Bulma',
              'Frieza', 'Trunks', 'Tien', 'Yamcha', 'Android 17', 'Android 18']
app = Flask(__name__)


@app.route('/sms', methods=['POST'])
def sms():
    prompt_text = '{}:'.format(choice(characters))

    response = openai.Completion.create(
      engine="davinci",
      prompt=prompt_text,
      temperature=0.7,
      max_tokens=128,
      presence_penalty=0.3
    )

    story = response['choices'][0]['text']
    print(story)

    twiml_resp = MessagingResponse()

    # Respond with the full text including the initial prompt, up to
    # the last line break, in case generation stopped mid-sentence.
    # If there is no line break at all, keep the whole story
    # (str.rfind returns -1 there, which would drop a character).
    cutoff = story.rfind('\n')
    twiml_resp.message(prompt_text + (story[:cutoff] if cutoff != -1 else story))

    return str(twiml_resp)


if __name__ == '__main__':
    app.run(debug=True)

This Flask app contains one route, /sms, which will receive a POST request from Twilio whenever a text message is sent to our phone number, as configured before with the ngrok URL.
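For reference, what the route returns to Twilio is TwiML. The XML that MessagingResponse builds for a single outgoing message looks like this (with the generated story standing in as placeholder text):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Response>
  <Message>Goku: ...generated story text...</Message>
</Response>
```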

A few changes were made to the OpenAI API code. First, we have it generate only 128 tokens. We're using the davinci engine, which is the most advanced model and can take a lot of time to generate text, so keeping the number of tokens small helps make sure the HTTP response to Twilio doesn't time out. We also added presence_penalty=0.3, which nudges the model away from repeating itself. Finally, we omit everything after the final newline character, because generation will sometimes stop mid-sentence, as you can see in this example, where it stopped at the word "have":

Goku and Hercule
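That truncation step has a subtle edge case: str.rfind returns -1 when the text contains no newline at all, and slicing with an index of -1 silently drops the last character instead of keeping the full text. Here's that logic isolated into a small helper you could unit-test (trim_incomplete is a hypothetical name, not part of the app above):

```python
def trim_incomplete(story: str) -> str:
    """Drop a trailing partial line from generated text.

    str.rfind returns -1 when there is no newline at all; slicing
    with -1 would silently drop the last character, so in that case
    the text is returned unchanged.
    """
    cut = story.rfind('\n')
    return story if cut == -1 else story[:cut]


print(trim_incomplete('Goku: Hello!\nVegeta: I have'))  # trailing partial line removed
print(trim_incomplete('Goku: Hello!'))                  # no newline: returned unchanged
```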

Save your code, and run it with the following command:

python app.py

Because we have ngrok running on port 5000, the default port for Flask, you can now text your Twilio number and receive some hilarious or questionably sensical Dragonball scenarios!

Tweaking it for better results

We now have a Dragonball fanfic text bot! However, this is just the beginning. You can improve this in many ways, by experimenting with all of the different parameters in the API request, or even the prompts that you use to generate the text. For example, maybe you can try just entering a sentence that describes a scene, and seeing what type of dialogue the AI generates:

Generated text with a different input prompt

In this example, it even generated some more descriptive text after my initial description, and the dialogue is in a different format than before.
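You could also let the person texting choose the prompt themselves. Twilio includes the text of the incoming message in the Body parameter of its POST request, which you can read with request.form in Flask. A sketch of that idea as a standalone helper (build_prompt is a hypothetical name, not part of the app above), falling back to a random character when the message is empty:

```python
from random import choice

characters = ['Goku', 'Gohan', 'Piccolo', 'Vegeta', 'Krillin', 'Bulma',
              'Frieza', 'Trunks', 'Tien', 'Yamcha', 'Android 17', 'Android 18']


def build_prompt(incoming_body: str) -> str:
    """Use the texter's own scene description as the prompt,
    falling back to a random character's name plus a colon."""
    text = incoming_body.strip()
    return text if text else '{}:'.format(choice(characters))


print(build_prompt('Goku and Vegeta spar on a mountaintop.'))
print(build_prompt(''))  # e.g. 'Vegeta:'
```

In the Flask route, you would call it with build_prompt(request.form.get('Body', '')) instead of picking a character directly.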

Now you can create different types of fanfic bots for any series, movies, or books you want! Here's an example using Star Wars characters:

Star Wars dialogue

In terms of technical changes, you can also experiment with having it generate more tokens. To make up for the added generation time, you could use Redis Queue to send the text message asynchronously once generation finishes, rather than risking a timeout by responding to Twilio's HTTP request right away with TwiML.

If you want to do more with Twilio and GPT-3, check out this tutorial on how to build a text message based chatbot.

Feel free to reach out if you have any questions or comments or just want to show off any stories you generate.