The confusing world of OpenAI pricing
As we all know by now, a free version of ChatGPT exists with unpredictable levels of availability. This free version is based on a model called GPT-3.5. If you want higher availability or if you want to be able to switch to the newer GPT-4 model, you need a ChatGPT Plus subscription. That will cost you $20 per month (excl. tax). So far so good.
Confusingly, this subscription will not help you when you want to use the OpenAI API to access GPT models. The API is billed separately with a different pricing model: instead of a fixed price per month, you pay per 1,000 tokens as shown in the table below.
model | 1K prompt tokens | 1K completion tokens | context size |
---|---|---|---|
gpt-3.5-turbo | $0.002 | $0.002 | 4,096 tokens |
gpt-4 | $0.030 | $0.060 | 8,192 tokens |
gpt-4-32k | $0.060 | $0.120 | 32,768 tokens |
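To put these numbers in perspective, here is a minimal sketch of how the per-token pricing adds up for a single request. The prices are copied from the table above; the token counts are made-up example values, not measurements.

```python
# Rough per-request cost estimate based on the per-1K-token prices above.
# The example token counts (500 prompt, 300 completion) are made up.
PRICES = {  # model: (prompt $ per 1K tokens, completion $ per 1K tokens)
    "gpt-3.5-turbo": (0.002, 0.002),
    "gpt-4": (0.030, 0.060),
    "gpt-4-32k": (0.060, 0.120),
}

def estimate_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    prompt_price, completion_price = PRICES[model]
    return prompt_tokens / 1000 * prompt_price + completion_tokens / 1000 * completion_price

for model in PRICES:
    print(f"{model}: ${estimate_cost(model, 500, 300):.4f}")
```

For such a request, gpt-3.5-turbo comes out at about $0.0016 and gpt-4 at about $0.033, a factor of roughly 20.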
On average, a token corresponds to roughly 4 characters of text. Check out this interactive tool to see exactly how text is parsed into tokens. To give you an idea of how to interpret the context size, the current version of the ChatGPT Wikipedia page up to and including the "See also" section contains a little under 8,000 tokens. That is about 12.5 pages. That means the 32,000 tokens of gpt-4-32k correspond to about 50 pages.
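If you want to count tokens programmatically instead of via the interactive tool, OpenAI's tiktoken library tokenizes text locally. A minimal sketch, assuming the package is installed with pip install tiktoken:

```python
import tiktoken

# Pick the tokenizer that matches the model you intend to call.
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

text = "ELI5: quantum computing"
tokens = enc.encode(text)

print(len(tokens))               # number of tokens this text would be billed as
print(len(text) / len(tokens))   # average characters per token, roughly 4
```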
When I wanted to try the API, I installed the openai Python package in a conda environment and created the following code snippet.
```shell
conda create -n openai python=3 openai
conda activate openai
```
```python
import os

import openai

# The API key is read from an environment variable rather than hard-coded.
openai.api_key = os.environ["OPENAI_API_KEY"]

# Uncomment to list the models available to your account:
# print([model["id"] for model in openai.Model.list()["data"]])

# model = "gpt-3.5-turbo"
model = "gpt-4"
# model = "gpt-4-32k"

response = openai.ChatCompletion.create(
    model=model,
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "ELI5: quantum computing"},
    ],
)

# print(response)  # full API response, including token usage
print(response["choices"][0]["message"]["content"])
```
To my surprise, the requested model could not be found. Turns out there is a GPT-4 API waiting list that you have to sign up for, even if you are already a ChatGPT Plus subscriber and as such have access to GPT-4 via the chat interface.
In conclusion: this code snippet will unfortunately only work once you have signed up for API access and received an invite from the waiting list. You could switch to gpt-3.5-turbo while waiting for the invitation (see the fallback sketch below). For light to medium usage, that might be a cheaper and more reliable way to access a GPT assistant than spending $20 per month on ChatGPT Plus.
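If you want the snippet above to degrade gracefully while you wait for GPT-4 access, one option is to check which models your API key can see and fall back to gpt-3.5-turbo otherwise. This is only a sketch, built on the same pre-1.0 openai package as the snippet above:

```python
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# Model ids your API key currently has access to.
available = {model["id"] for model in openai.Model.list()["data"]}
model = "gpt-4" if "gpt-4" in available else "gpt-3.5-turbo"

response = openai.ChatCompletion.create(
    model=model,
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "ELI5: quantum computing"},
    ],
)
print(f"answered by {model}:")
print(response["choices"][0]["message"]["content"])
```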
Update 2023-07-06: the GPT-4 API is now generally available to all paying customers without having to join a waitlist.