
Unlock the Power of ChatGPT with the Python OpenAI Chat API

Most of us are familiar with the ChatGPT UI and are impressed with the stunning results it generates. Clearly, this is a revolution in the AI world. The ChatGPT UI is very friendly and lets us start experimenting with different prompts, but in order to unlock its full power, we need to interact with it programmatically. This allows us to automate tasks and build applications.

Python Library

We will need to install the openai Python library as follows:

pip install --upgrade openai

OpenAI Chat API

Using the Chat API, we can interact programmatically with ChatGPT, and more specifically with the gpt-3.5-turbo and gpt-4 models. The other benefit of using the Chat API instead of the UI is that you can play with the request parameters, such as model, role, temperature, top_p, n, stream, stop, max_tokens, and so on. You can find more details in the official OpenAI API documentation.
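
For example, a call that sets some of these parameters explicitly could look like the sketch below; the prompt and the parameter values are illustrative and not part of the original example.

import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what the Chat API does in one sentence."}
    ],
    temperature=0.7,   # higher values make the output more random
    top_p=1,           # nucleus sampling; usually tune either temperature or top_p
    n=1,               # number of completions to generate
    max_tokens=100,    # upper bound on the length of the completion
    stop=None          # optional sequence(s) where generation should stop
)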

The main difference between the Chat Completion models and the Text Completion models is that the Chat models take a series of messages as input. However, this does not mean that the Chat models cannot be used for single-turn tasks without any conversation. Keep in mind that gpt-3.5-turbo performs similarly to text-davinci-003 but is 10 times cheaper, and as a result, it is recommended to use gpt-3.5-turbo for most use cases.
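
For instance, a single-turn task can be handled with a single user message and no prior conversation; here is a minimal sketch (the prompt is illustrative):

import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Translate 'good morning' to French."}]
)
print(response['choices'][0]['message']['content'])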

OpenAI Chat API Calls

An example of an API call looks like this:

# Note: you need to be using OpenAI Python v0.27.0 for the code below to work
import openai

openai.ChatCompletion.create(
  model="gpt-3.5-turbo",
  messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
        {"role": "user", "content": "Where was it played?"}
    ]
)

Here, messages must be a list of dictionaries, where each dictionary has two keys: role and content. The role can be either "system", "user", or "assistant", and the content is simply the text of the message. Since this is a chat, the content can be either the "prompt" (a message with "role": "user") or the "completion" (a message with "role": "assistant"). Usually, every conversation should start with a "role": "system" message, which helps set the behavior of the model. Finally, since it is a conversation, the user and assistant messages should alternate.

Example of Chat API Calls

Independent API Calls

Let’s start with a simple example of a Chat API call. We will ask the question:

Where did the World Cup 2006 take place?

Also, we will set the system message to: “You’re an expert in football history”.

import openai
import os
openai.api_key = os.getenv("OPENAI_API_KEY")

response = openai.ChatCompletion.create(
  model="gpt-3.5-turbo",
  messages=[
        {"role": "system", "content": "You're an expert in football history"},
        {"role": "user", "content": "Where did the World Cup 2006 take place?"}
    ]
)

print(response)

We get the response object.

<OpenAIObject chat.completion id=chatcmpl-77kDp5JYVEX3tl8LbGB90l5mm1O7O at 0x244d26cc590> JSON: {
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "content": "The FIFA World Cup 2006 was held in Germany.",
        "role": "assistant"
      }
    }
  ],
  "created": 1682080329,
  "id": "chatcmpl-77kDp5JYVEX3tl8LbGB90l5mm1O7O",
  "model": "gpt-3.5-turbo-0301",
  "object": "chat.completion",
  "usage": {
    "completion_tokens": 12,
    "prompt_tokens": 31,
    "total_tokens": 43
  }
}

As we can see, we got the answer:

The FIFA World Cup 2006 was held in Germany.

We can simply extract the answer as follows:

response['choices'][0]['message']['content']

And we get:

'The FIFA World Cup 2006 was held in Germany.'

Now, if we ask the question:

Which teams did make it to the finals?

response = openai.ChatCompletion.create(
  model="gpt-3.5-turbo",
  messages=[
        {"role": "system", "content": "You're an expert in football history"},
        {"role": "user", "content": "Which teams did make it to the finals?"}
    ]
)

response['choices'][0]['message']['content']

We get this answer:

I assume you are asking about football World Cup finals. Here is a list of all the teams that have been to a World Cup final:

1. Uruguay (1930, 1950)
2. Argentina (1930, 1978, 1986, 1990)
3. Italy (1934, 1938, 1982, 2006)
4. Czechoslovakia (1934)
5. Hungary (1938)
6. Brazil (1950, 1958, 1962, 1970, 1994, 2002, 2006, 2018)
7. West Germany/Germany (1954, 1966, 1974, 1982, 1986, 1990, 2002, 2014)
8. Netherlands (1974, 1978, 2010)
9. Poland (1974)
10. France (1998, 2006)
11. Sweden (1958)
12. England (1966)
13. Soviet Union/Russia (1966, 1970)
14. Croatia (2018)

It is worth noting that during the 1930s and 1950s, the format of the World Cup was different and there were no semi-finals, so the teams that reached the final had different paths to get there.

Clearly, ChatGPT was not aware that we were referring to the World Cup 2006 final. In order to achieve that, we need to pass the history of the conversation in the messages.

Conversational API Calls

We can make API calls that take the conversation history into account. What we have to do is append each new message to the messages list. Let’s start from the beginning by asking ChatGPT “Where did the World Cup 2006 take place?”

messages=[{"role": "system", "content": "You're an expert in football history"},
          {"role": "user", "content": "Where did the World Cup 2006 take place?"}]

response = openai.ChatCompletion.create(
  model="gpt-3.5-turbo",
  messages = messages)

print(response['choices'][0]['message'])

We get:

{
  "content": "The World Cup 2006 took place in Germany.",
  "role": "assistant"
}

Now, we have to add the response message to the “messages” list. Notice that we have to convert it to a plain dictionary using dict().

messages.append(dict(response['choices'][0]['message']))

And the messages list becomes:

[{'role': 'system', 'content': "You're an expert in football history"},
 {'role': 'user', 'content': 'Where did the World Cup 2006 take place?'},
 {'role': 'assistant', 'content': 'The World Cup 2006 took place in Germany.'}]

Now, let’s ask the follow-up question:

Which teams did make it to the finals?

Thus, we have to append it to the messages list:

messages.append({"role": "user", "content": "Which teams did make it to the finals?"})

So the messages list becomes:

[{'role': 'system', 'content': "You're an expert in football history"},
 {'role': 'user', 'content': 'Where did the World Cup 2006 take place?'},
 {'role': 'assistant', 'content': 'The World Cup 2006 took place in Germany.'},
 {'role': 'user', 'content': 'Which teams did make it to the finals?'}]

Now, let’s try again and see what answer we get this time.

response = openai.ChatCompletion.create(
  model="gpt-3.5-turbo",
  messages = messages)

print(response['choices'][0]['message'])

We get:

{
  "content": "The final of the World Cup 2006 was played between Italy and France.",
  "role": "assistant"
}

This time we got the right answer. ChatGPT understood that we were referring to the World Cup 2006 and answered correctly that the final was played between Italy and France.

Make a ChatGPT Chatbot

Finally, let’s see how we can build a chatbot by creating a class. The chatbot will take the system prompt as input and will keep track of the conversation. The conversation will end once the user enters the word “END“.

class CreateBot:
    
    def __init__(self, system_prompt):
        '''
        system_prompt: [str] Describes context for Chat Assistant
        '''
        self.system = system_prompt
        self.messages = [{"role": "system", "content": system_prompt}]
        
    
    def chat(self):
        '''
        Tracks dialogue history and takes in user input
        '''
        print('To end conversation, type END')
        while True:
            # Get user question
            question = input("")

            # Stop without sending the END keyword to the API
            if question == 'END':
                break

            # Add to messages/dialogue history
            self.messages.append({'role': 'user', 'content': question})

            # Send to ChatGPT and get the response
            response = openai.ChatCompletion.create(
                model="gpt-3.5-turbo",
                messages=self.messages)

            # Get content of assistant reply
            content = response['choices'][0]['message']['content']
            print('\n')
            print(content)
            print('\n')

            # Add assistant reply to dialogue history
            self.messages.append({'role': 'assistant', 'content': content})

Let’s test it with the same example.

football_tutor = CreateBot(system_prompt="You are an expert in football history")
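
Then, we start the conversation by calling the chat method, which reads our questions from standard input:

football_tutor.chat()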
 

We pass the string: “Where did the World Cup 2006 take place?”

Then, we pass the string: “Which teams did make it to the finals?”

We can end the conversation by passing the string “END”.
