Building a Chatbot with the ChatGPT API, Twilio Programmable Messaging and Python

March 16, 2023

A chatbot is a software application that can conduct a conversation, usually in written form, with a human user. In this tutorial I’m going to show you how easy it is to build an SMS chatbot using ChatGPT, the extremely popular chat engine from OpenAI whose responses are practically indistinguishable from a human’s. The chatbot will be built with the Flask framework for Python.

Introduction to ChatGPT

ChatGPT is a highly advanced language model trained on large amounts of text. It is based on the GPT-3 family of language models developed by OpenAI. In spite of its internal complexity, ChatGPT is surprisingly simple to operate: you feed it some text, such as a question or prompt or maybe a list of questions and answers, followed by an open question. Then, the model generates some text as a response, following a similar style and structure to the input data. In this article you will learn how to embed this amazing chat engine into your Twilio SMS application!

OpenAI provides an interactive website where you can try out ChatGPT at https://chat.openai.com. As an example, I just had the following conversation with it:

 

ChatGPT example chat

Pretty amazing, right?

Tutorial requirements

To follow this tutorial you need the following items:

  • Python 3.7 or newer. If your operating system does not provide a Python interpreter, you can go to python.org to download an installer.
  • A Twilio account. If you are new to Twilio click here to create a free account.
  • An OpenAI API key. Request access here.
  • A phone that can send and receive SMS.

Create a Python virtual environment

Following Python best practices, as a first step in creating the chatbot you are going to create a separate directory for the project, and inside it you are going to create a virtual environment. Then, you are going to install the Python packages that are needed for the chatbot in it.

If you are using a Unix or macOS system, open a terminal and enter the following commands to do the tasks described above:

mkdir twilio-chatgpt
cd twilio-chatgpt
python3 -m venv venv
source venv/bin/activate
pip install openai twilio flask python-dotenv pyngrok

For those of you following the tutorial on Windows, enter the following commands in a command prompt window:

md twilio-chatgpt
cd twilio-chatgpt
python -m venv venv
venv\Scripts\activate
pip install openai twilio flask python-dotenv pyngrok

The last command uses pip, the Python package installer, to install the five packages that are needed for this project, which are:

  • The openai package, the official Python client library for the OpenAI API, used to send requests to the ChatGPT engine.
  • The twilio package, the Twilio helper library for Python, used to generate the TwiML responses to incoming SMS messages.
  • The flask framework, used to create the web application that implements the Twilio webhook.
  • The python-dotenv package, used to import the contents of the .env file as environment variables.
  • The pyngrok package, a Python wrapper for ngrok, the utility that will make the application temporarily accessible on the Internet.

Configuration

As mentioned above, this project requires an API key from OpenAI. The Python application will need to have access to this key, so you are going to create a .env file and store it there. The application will then import it from this file as an environment variable.

Create a .env file in your project directory (note the leading dot) and paste the following into the file, replacing your-openai-api-key-here with your OpenAI key:

OPENAI_KEY=your-openai-api-key-here

You will learn how to work with this file in the next section.

If you plan on putting your project under source control, make sure this file is excluded, as you would not want to accidentally share your OpenAI key.
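
If you use git, for example, a .gitignore file in the project directory along these lines (the entries shown are just a suggestion) will keep the key and the virtual environment out of the repository:

.env
venv/
__pycache__/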

Sending ChatGPT requests from Python

In this section you are going to create the support code that works with the ChatGPT engine. The code will be stored in a file called chatbot.py. Below you can see the initialization section of this file:

import os
from dotenv import load_dotenv
import openai

load_dotenv()
openai.api_key = os.environ.get('OPENAI_KEY')
completion = openai.ChatCompletion()

start_chat_log = [
    {"role": "system", "content": "You are a helpful assistant."},
]

The load_dotenv() function imports data stored in the .env file as environment variables, which in this case is done to read the OpenAI key you stored in this file earlier. Note how the OPENAI_KEY variable is used in the following line to initialize OpenAI with the key. The completion variable holds the actual client to the ChatGPT engine. This is the object that will interact with the OpenAI service.

The start_chat_log variable is a list that is going to contain all the interactions between the user and ChatGPT. At this initial stage, it has only one entry, which has the purpose of initializing the chat engine and setting the tone for it.

Each message in the chat log is assigned a role. The three available roles are:

  • system: only used in the first prompt to give the chat engine some guidelines on its personality and how to respond and interact with the user.
  • user: the questions or prompts entered by the user.
  • assistant: the responses returned by the ChatGPT engine for previous questions.

The system message can be changed to create different styles of chatbots. Once you have the bot up and running, I encourage you to try different system prompts to see how the responses change accordingly.
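
As an example, here is an alternative start_chat_log you could use in chatbot.py (the prompt text is just an illustration; invent your own):

start_chat_log = [
    {"role": "system",
     "content": "You are a sarcastic assistant that likes to answer "
                "questions with movie quotes."},
]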

Let’s now write a function that sends a query to ChatGPT and gets a response. Add the following function at the bottom of chatbot.py:

def askgpt(question, chat_log=None):
    if chat_log is None:
        chat_log = start_chat_log
    chat_log = chat_log + [{'role': 'user', 'content': question}]
    response = completion.create(model='gpt-3.5-turbo', messages=chat_log)
    answer = response.choices[0]['message']['content']
    chat_log = chat_log + [{'role': 'assistant', 'content': answer}]
    return answer, chat_log

The askgpt() function takes the question from the user as a first argument, followed by an optional chat log. If the chat log is not provided then the function uses start_chat_log to initialize a new chat with the default system prompt.

The function adds the question from the user as the last item in the chat_log list with the user role. It then sends the list to the completion.create() function, which returns the chat engine’s response. The model argument that is passed along with the chat log is used to specify which language model to use. The gpt-3.5-turbo model is designed to work with ChatGPT at the time I’m writing this. But make sure to check the ChatGPT API documentation as this model might have been superseded by another one by the time you read this.
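
If you are curious which models your API key has access to, the openai package can retrieve the list from the service. Here is a quick sketch (run it from the project directory so that the .env file is found):

import os
from dotenv import load_dotenv
import openai

load_dotenv()
openai.api_key = os.environ.get('OPENAI_KEY')

# print the identifier of every model available to this API key
for model in openai.Model.list()['data']:
    print(model['id'])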

The response from ChatGPT is an object that has a choices attribute, which is a list of possible responses. At this time, this list always comes back with a single response. The message element in this response has the text of the answer in its content sub-element. Before it ends, the function extracts the text returned by ChatGPT from the response object and adds it to the chat log using the assistant role. It then returns both the answer and the updated chat log.
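
For reference, this is roughly the shape of the response object (the values below are placeholders, not real output):

{
    "id": "chatcmpl-...",
    "object": "chat.completion",
    "created": 1678900000,
    "model": "gpt-3.5-turbo-...",
    "usage": {"prompt_tokens": 25, "completion_tokens": 31, "total_tokens": 56},
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Tom Hanks played ..."},
            "finish_reason": "stop"
        }
    ]
}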

Start a Python shell to test the askgpt() function:

>>> from chatbot import askgpt
>>> answer, log = askgpt('Who played Forrest Gump in the movie?')
>>> answer
'Tom Hanks played the role of Forrest Gump in the movie "Forrest Gump" which was released in 1994.'
>>> answer, log = askgpt('Was he also in Godfather II?', log)
>>> answer
'No, Tom Hanks did not appear in The Godfather Part II. He was not involved in any of The Godfather films.'

Pretty cool, right? Note how by passing the chat log in the second question the engine knows that the discussion is centered on movies in which Tom Hanks has appeared, even though the name isn’t explicitly mentioned.
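
If you want to keep experimenting from the terminal before wiring up SMS, a minimal chat loop built on top of askgpt() could look like this (this script is just a testing convenience and is not part of the project):

from chatbot import askgpt

chat_log = None
while True:
    question = input('You: ')
    if not question:
        break
    # pass the running chat log so the engine remembers the conversation
    answer, chat_log = askgpt(question, chat_log)
    print('Bot:', answer)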

The askgpt() function is all we need to manage the chat portion of this project. In the next sections you are going to integrate this function with Twilio SMS messaging.

Configure Twilio Programmable Messaging

The next step is to configure a Twilio phone number for the chatbot to use. If you haven’t created a Twilio account yet, create it now and log in.

Buying a Twilio phone number

If you already have a phone number in your Twilio account, you can proceed to the following section. Read on if you need to acquire a phone number.

Open the Twilio Console in your web browser and visit the Phone Numbers section to buy a phone number. Select your country on the “Buy a Number” screen, and check SMS in the capabilities field. If you’d like to request a number from your region, you can enter your area code in the "Number" field.

Buy a Twilio phone number

Click the "Search" button to see what numbers are available, and then click “Buy” for the number that you like from the results. After you confirm your purchase, click the “Close” button.

The documentation provides more details about the phone number buying process.

Configuring the webhook URL

The Twilio Programmable Messaging API uses a webhook to notify an application when there is an incoming message. The chatbot application presented in this article will implement a webhook that will be called by Twilio each time the user sends an SMS to the Twilio phone number. The webhook is going to use the askgpt() function you created earlier to respond to the user.

The Flask web framework greatly simplifies the task of creating a webhook. Below is a skeleton application with a webhook definition. Copy the following code into a new file called app.py, in the same directory as the chatbot.py file:

from flask import Flask, request
from twilio.twiml.messaging_response import MessagingResponse

app = Flask(__name__)


@app.route('/bot', methods=['POST'])
def bot():
    incoming_msg = request.values['Body']

    # use the incoming message to generate the response here

    r = MessagingResponse()
    r.message('this is the response')
    return str(r)

This code creates a Flask application that has an endpoint with the URL /bot. When Twilio is configured to know about this endpoint, it will send a request each time an SMS is received on the phone number you purchased earlier. The application can read the incoming message, do some processing and generate a response, which is returned at the end of the function.

You can start the above application as follows:

(venv) $ flask run
 * Serving Flask app "app" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)

The application is now running on your computer at the http://localhost:5000 URL, but it is not accessible from the Internet. The next step is to provision a temporary public URL that we can give to Twilio so that it can send requests to this webhook.

For this you are going to use the ngrok tool. Leave the application running and open a second terminal. Activate the virtual environment on this terminal as you have done on the first at the start of the tutorial and then enter the following command:

(venv) $ ngrok http 5000

You should see a screen similar to this:

Ngrok screenshot

Note the lines beginning with “Forwarding”. These show the public URLs that ngrok uses to redirect requests into our service, both in http:// and https:// format. What we need to do now is tell Twilio to send incoming SMS notifications to this URL.

Go back to the Twilio Console, click on Phone Numbers / Manage / Active numbers, and then on the phone number that you purchased above. Scroll down to the “Messaging” section, copy the https:// URL from the ngrok output and then paste it on the “A message comes in” field with /bot added at the end of the URL. Make sure the request method is set to HTTP POST. Don’t forget to click the blue "Save" button at the bottom of the page to record these changes.

Configure Twilio SMS Webhook

To test your webhook, send an SMS to your Twilio phone number from your personal phone. You should get the generic response that comes from the Flask application:

Testing the Twilio SMS webhook

Now we know how to run the ChatGPT engine, and how to receive and reply to SMS messages with Twilio. In the next section we will integrate the two into a complete chatbot project.

SMS chatbot with ChatGPT and Twilio

Let’s expand app.py to send the messages received from users to the askgpt() function that we created above, and to maintain the log of the chat. Here is the updated app.py:

from flask import Flask, request, session
from twilio.twiml.messaging_response import MessagingResponse
from chatbot import askgpt

app = Flask(__name__)
app.config['SECRET_KEY'] = 'top-secret!'


@app.route('/bot', methods=['POST'])
def bot():
    incoming_msg = request.values['Body']
    chat_log = session.get('chat_log')

    answer, chat_log = askgpt(incoming_msg, chat_log)
    session['chat_log'] = chat_log

    r = MessagingResponse()
    r.message(answer)
    return str(r)

The most convenient place to store the chat log with a user is in the session variable from Flask, which uses an HTTP cookie for storage. Twilio works similarly to a web browser in this respect, and will accept, store, and resend cookies during a conversation with a user for a period of up to four hours. If there are two or more users chatting with the bot at the same time, each will have their own cookie.

Since session cookies are signed, the Flask application needs to have a secret key configured to be able to generate signatures. For security, you should replace the placeholder value I used for the secret key with something unique.
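
One easy way to generate a random value for the secret key is with the secrets module from the Python standard library:

(venv) $ python -c "import secrets; print(secrets.token_hex(16))"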

The bot() endpoint begins by obtaining the user message as before, and the chat log from the session. The session object uses the familiar dictionary syntax, so session.get() will return a previously stored chat log if it exists or None if this user does not have a session. This is convenient because the askgpt() function uses the initial chat log if None is passed.

Calling the askgpt() function works exactly as before. We pass the user message and the current chat log, and we get back the chatbot’s response and the updated chat log.

The now expanded chat log is stored in the Flask user session so that it is recalled when the user writes again. To end the function, the answer is returned as the response, and this will make Twilio send it back to the user over SMS.

Ready to try your new bot? If you are still running your Flask application, stop it with Ctrl-C and then run it again so that it picks up the latest changes. If you are not running ngrok anymore, go to your second terminal and restart it, and then follow the instructions in the previous section to update the webhook URL in the Twilio Console, since ngrok allocates a different URL every time it runs.

With the Flask application and ngrok running, you are ready to start sending SMS to the bot. Send a text with your first question to the Twilio phone number like you did before, and in a few seconds the response should arrive. Feel free to try the bot on different topics; it knows about a lot of things!

Continuing with the theme of movies and Tom Hanks, here is a short conversation I had with ChatGPT through the Twilio SMS integration:

Example chat session with ChatGPT over SMS

Conclusion

I hope you had as much fun as I had with this project! The possibilities of the ChatGPT engine are endless, so I encourage you to experiment with it by adapting this project and making it your own.

If you are interested in learning more about OpenAI’s language models, I invite you to read my GPT-3 posts as well. GPT-3 is the underlying technology that powers ChatGPT, and it has its own API with additional features and customization options.

If you build something cool I’d love to hear about it!

Miguel Grinberg is a Principal Software Engineer for Technical Content at Twilio. Reach out to him at mgrinberg [at] twilio [dot] com if you have a cool project you’d like to share on this blog!