In this article we'll take a look at how you can use the Generative Pre-trained Transformer v3 API (GPT-3) from OpenAI to generate text content from a string prompt using Python code.

If you're familiar with ChatGPT, the GPT-3 API works in a similar manner to that application: you give it a piece of text and it gives you back a piece of text in response. This is because ChatGPT is related to GPT-3, but presents its output in a chat application instead of via direct API calls.

If you ever thought it'd be cool to integrate transformer-based applications into your own code, GPT-3 allows you to do that via its Python API. In this article we'll explore how to work with GPT-3 from Python code to generate content from your own prompts, and how much that will cost you.

Note: This article is heavily inspired by part of Scott Showalter's excellent talk on building a personal assistant at CodeMash 2023, and I owe him credit for showing me how simple it was to call the OpenAI API from Python.

Why use GPT-3 when ChatGPT is Available?

Before we go deeper, I should state that GPT-3 is older than ChatGPT and is a precursor to that technology. Given these things, a natural question that arises is "why would I use the old way of doing things?"

The answer to this is fairly simple: ChatGPT is built as an interactive chat application that directly faces the user. GPT-3, on the other hand, is a full API that can be given whatever prompts you want.

For example, if you needed to quickly draft an e-mail to a customer for review and revision, you could give GPT-3 a prompt of "Generate a polite response to this customer question (customer question here) that gives them a high-level overview of the topic". GPT-3 would then give you back the direct text it generated, and your support team could make modifications to that text and then send it on.

This way, GPT-3 allows you to save time writing responses but still gives you editorial control to fact-check its outputs and avoid the types of confusion you might see from interactions with ChatGPT, for example. GPT-3 is useful for any circumstance where you want to be able to generate text given a specific prompt and then review it for potential use later on.

Getting a GPT-3 API Key

Before you can use GPT-3, you need to create an account and get an API key from OpenAI. Creating an account is fairly simple, and you may have even done so already if you've interacted with ChatGPT.

First, go to the Log in page and either log in using your existing account or Google or Microsoft accounts. If you do not have an account yet, you can click the Sign up link to register.

Once you've logged in, you should see a screen like the following: [screenshot omitted]

While there are a number of interesting links to documentation and examples, what we care about is getting an API key that we can use in Python code. To get this API key, click on your picture and organization name in the upper right and then select View API Keys.

From here, click + Create new secret key to generate a new secret key. This key will only be visible once and you will not be able to recover it, so copy it down somewhere safe. Once you have the API Key, it's time to move into Python code.

Importing OpenAI and Specifying your API Key

For the remainder of this article, I'll be giving you bits and pieces of code that might go into a single Python file. If you're following along with these steps, you might choose to call this gpt3.py and use some edition of PyCharm.

The first thing we'll need to do is install the OpenAI dependency. You can use PyCharm's package manager pane to do this, but a more universal way would be to use pip or similar to install the openai dependency as follows:

    pip install openai

    def generate_gpt3_response(user_text, print_output=False):
        """
        Query OpenAI GPT-3 for the specific key and get back a response
        :type user_text: str the user's text to query for
        """