Example OpenAI API request

It may be shocking, but you already have the requisite skills to integrate ChatGPT into your Ruby projects! We’ll do so with the ruby-openai gem.

Here is a video for this lesson. You should read the lesson and try things for yourself as well. Please note that the OpenAI playground linked below has changed a bit since we recorded this video. If you notice some differences, just follow along with the video, where I cover some general conceptual topics related to ChatGPT.

Explore the OpenAI API

If you’ve used ChatGPT before, then you have been interacting with OpenAI’s API. ChatGPT is a product OpenAI built to show off the capabilities of their API, but they really make their money by selling API access to developers.

If you haven’t used ChatGPT before, or even if you have, the video OpenAI published introducing GPT-4 gives a good sense of its unique capabilities, and might give you ideas of prompts to experiment with.

If you want to explore this new revolution we’re living through, I recommend that you sign up for an OpenAI developer account. You do not need to sign up now; you can also use the available key in the “Secrets” tab above. If you plan to use OpenAI in your capstone project, you will need to sign up for your own account and get an API key. Do not use our key for your capstone project.

As of this writing, you’ll have to purchase at least $5 worth of API credits in order to gain access to the latest and most powerful model — gpt-4-turbo.

After you sign up, you will be able to experiment in the API playground.

Note: OpenAI API tokens are hot commodities right now amongst thieves who use them to produce SEO spam, scams, etc. So be very careful with yours (and mine, if I give you one). When you get to using the token in your own code, be sure to store it securely in an environment variable and then reference it using the ENV hash.

If you didn’t sign up for your own OpenAI key (remember to do so for your capstone project!), you can use the key in the “Secrets” tab above.

Making API requests using http.rb

After reading OpenAI’s API documentation for a while, I discovered that in order to use the Chat Completions API (which is the one that’s taking over the world) we must send HTTP requests that look like this:

POST /v1/chat/completions HTTP/1.1
Host: api.openai.com
Authorization: Bearer OUR_SECRET_API_TOKEN_GOES_HERE
content-type: application/json

{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant who talks like Shakespeare."
    },
    {
      "role": "user",
      "content": "Hello! What are the best spots for pizza in Chicago?"
    }
  ]
}

Things to note about the above HTTP request:

  • It’s a POST request.

  • The resource path is /v1/chat/completions.

  • The host is api.openai.com.

  • Our API token must be in an Authorization header after "Bearer ".

  • The Content-Type header must be set to application/json.

  • The body of the request must be a string containing JSON.

    • The JSON must have a key "model" whose value is a string naming the model (e.g. “gpt-3.5-turbo” or “gpt-4”).

    • The JSON must have a key "messages" whose value is an array of hashes.

    • Each hash must have keys "role" and "content".

We can put together a request like this in Ruby using the http.rb gem as follows:
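Here is a sketch of what that can look like (this assumes the http gem is installed, and that your API key is stored in an environment variable called OPENAI_API_KEY):

require "http"
require "json"

api_key = ENV.fetch("OPENAI_API_KEY")

# Compose the request body as a Ruby Hash (it will be serialized into the JSON shown above)
request_body = {
  "model" => "gpt-3.5-turbo",
  "messages" => [
    {
      "role" => "system",
      "content" => "You are a helpful assistant who talks like Shakespeare."
    },
    {
      "role" => "user",
      "content" => "Hello! What are the best spots for pizza in Chicago?"
    }
  ]
}

# HTTP.auth sets the Authorization header; the json: option serializes
# the Hash into a JSON string and sets the Content-Type header for us
raw_response = HTTP.auth("Bearer #{api_key}").post(
  "https://api.openai.com/v1/chat/completions",
  json: request_body
)

# The response body is a JSON string; parse it back into a Ruby Hash
parsed_response = JSON.parse(raw_response.to_s)

pp parsed_response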

Voila! You can add as many elements into the messages array as you like; that’s how you keep the conversation going.

And that’s that! Now, it’s up to you to decide how to apply GPT’s responses to solve some problem in the world.

Making API requests using ruby-openai

It gets even better: someone in the Ruby community has written and released a gem that makes it even more straightforward for us to interact with this powerful API — the ruby-openai gem.

Setup

Let’s write a real Ruby program in a proper Codespace. Head over to the appdev-projects/ruby-openai-demo repository, fork it, and set up a Codespace as usual.

Create a file to contain our Ruby program — let’s call it chat.rb. Write some code to pretty-print “howdy” and run the program with ruby chat.rb to make sure everything is set up correctly.

Bring in the gem

The crux of the gem is a class called OpenAI::Client. Within this class, the author of the gem has defined many helpful methods that “wrap” the OpenAI API endpoint URLs, so that we don’t have to know the required headers, verbs, bearer tokens, etc, as we did when using the HTTP class above.

All we need to do is learn what Ruby methods the OpenAI::Client has, and what arguments they expect. Then the methods will do the heavy lifting for us, and return the API responses.

So, in our program, we first want to require "openai":

require "openai"

pp "howdy"

If you try that and run the program, you should get an error from require: cannot load such file -- openai (LoadError). That’s because our Codespace doesn’t come with the ruby-openai gem installed out-of-the-box.

As per the gem docs, let’s install the gem by running this at the bash prompt in your terminal:

(Note: the gem name is ruby-openai, not openai.)

ruby-openai-demo main % gem install ruby-openai

Fetching multipart-post-2.4.0.gem
Fetching ruby-openai-6.5.0.gem
Fetching faraday-2.9.0.gem
Fetching faraday-net_http-3.1.0.gem
Fetching faraday-multipart-1.0.4.gem
Fetching event_stream_parser-1.0.0.gem
Successfully installed multipart-post-2.4.0
Successfully installed faraday-multipart-1.0.4
Successfully installed faraday-net_http-3.1.0
Successfully installed faraday-2.9.0
Successfully installed event_stream_parser-1.0.0
Successfully installed ruby-openai-6.5.0

Voila! Now, try running your program again — it shouldn’t error out this time.

Instantiate a client

Now, we can finally create a new instance of the OpenAI::Client class:

require "openai"
require "dotenv/load"

client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_API_KEY"))

pp client

This relies upon there being an environment variable set called OPENAI_API_KEY. Follow these instructions to set up your .env file with the dotenv gem, then create a new secret by that name, whose value is an API key from your OpenAI Developer Dashboard.
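For example, once the dotenv gem is set up, your .env file should contain a line like this (with your actual key in place of the placeholder):

OPENAI_API_KEY="sk-your-actual-key-goes-here"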

Now, run your program — if all went well, you should see a new instance of client being pretty-printed.

Use the client

To make the same API call we made above using the http.rb gem, we can now do this (which I learned by looking at the examples in the gem docs):

require "openai"
require "dotenv/load"

client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_API_KEY"))

# Prepare an Array of previous messages
message_list = [
  {
    "role" => "system",
    "content" => "You are a helpful assistant who talks like Shakespeare."
  },
  {
    "role" => "user",
    "content" => "Hello! What are the best spots for pizza in Chicago?"
  }
]

# Call the API to get the next message from GPT
api_response = client.chat(
  parameters: {
    model: "gpt-3.5-turbo",
    messages: message_list
  }
)

pp api_response

Much nicer! We were able to avoid writing a lot of code by using OpenAI::Client instead of HTTP directly.
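Since client.chat returns the API's JSON response already parsed into a Ruby Hash, you can dig into it to pull out just the assistant's reply. Something like this should work:

# The reply lives inside the first element of the "choices" array
new_message = api_response.dig("choices", 0, "message", "content")

pp new_message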

The ruby-openai gem has lots of other methods that help you interact with other parts of the API — DALL-E for generating images, Whisper for converting speech-to-text, and more. Awesome!
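For example, per the gem docs, generating an image with DALL-E looks something like this (the prompt and size here are just made-up values to experiment with):

image_response = client.images.generate(
  parameters: {
    prompt: "A deep-dish pizza in the style of a Renaissance painting",
    size: "256x256"
  }
)

# The response includes a URL where the generated image can be viewed
pp image_response.dig("data", 0, "url")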

But, the most popular endpoint, the one that’s revolutionizing everything right now, is the one we’ve seen above — Chat Completions.

Challenge

Here’s a challenge for you. Write a program that:

Part 1

  • Prints “Hello! How can I help you today?”

  • Prints a line of fifty "-".

  • Waits for the user to type in a request.

  • Sends the request to the Chat Completions endpoint and prints the response.

  • Prints a line of fifty "-".

Part 2

Enclose the above in a loop such that the user can continue to type requests and get answers until the user types “bye”.

Part 3

Build up a history of the conversation such that you are sending the entire list of requests and responses with each new API request.

That way, the user can have an ongoing conversation, rather than a series of one-off requests.
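If you get stuck on the overall shape, here is a rough sketch of one possible approach (it is not the solution file, and the details are still yours to work through and refine):

require "openai"
require "dotenv/load"

client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_API_KEY"))

# Start the conversation history with just a system message
message_list = [
  { "role" => "system", "content" => "You are a helpful assistant." }
]

puts "Hello! How can I help you today?"
puts "-" * 50

loop do
  user_input = gets.chomp

  # Stop when the user says goodbye
  break if user_input == "bye"

  # Add the user's request to the running history
  message_list.push({ "role" => "user", "content" => user_input })

  api_response = client.chat(
    parameters: {
      model: "gpt-3.5-turbo",
      messages: message_list
    }
  )

  new_message = api_response.dig("choices", 0, "message", "content")

  # Add GPT's reply to the history so the next request has full context
  message_list.push({ "role" => "assistant", "content" => new_message })

  puts new_message
  puts "-" * 50
end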

Solution

Did you try the steps above? If not, please do so! When you get stuck, you’re allowed to peek at the solution in do_not_peek_possible_solution.rb. Before you peek, try and run the solution with ruby do_not_peek_possible_solution.rb and see if you can figure out how it works.


It’s incredible that we, as novice developers, can access the most powerful model ever developed — for the first time, something actually approaching artificial intelligence.

I believe the next few years in tech will be all about finding ways to solve valuable problems using these new generative AI methods. Can you think of any problems to apply it to?

