OpenAI Proxy
Integrate via REST API
If you’ve already built your application against OpenAI’s models using the REST API directly but want to experiment with Langtail, you can point your existing code at the OpenAI Proxy.
In this guide, we’ll generate a Langtail API key, modify your existing code to send LLM requests via the proxy, and then use Langtail’s observability tools to see the requests coming from your app.
Let’s get started:
1. Prerequisites
Before moving on, make sure you have:
- A Langtail account (create one here)
- An OpenAI API key already added to your Langtail workspace
2. Generate a project API key
- In Langtail, open an existing project or create a new one.
- In the left sidebar, click Secrets.
- Click New API Key, optionally set a name and a budget limit, then click Create.
- Copy the key.
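One way to keep the copied key out of your source code is to store it in an environment variable. The variable name `LANGTAIL_API_KEY` below matches the snippet in the next step; adapt it to however your app manages secrets:

```bash
# Store the Langtail project API key in an environment variable for later use
export LANGTAIL_API_KEY="<your-langtail-project-api-key>"
```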
3. Modify your code to use the proxy
- Add your Langtail project API key to an environment variable. In the following snippet, we use `LANGTAIL_API_KEY`.
- In your codebase, find where you call the OpenAI API. Then, make the following adjustments:
  - In the URL, replace `https://api.openai.com` with `https://proxy.langtail.com`.
  - In the `Authorization` header, use your Langtail project API key instead of your OpenAI key.
- Here’s an example calling the proxy using cURL:
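The model and message below are placeholders for whatever your app already sends; only the host and the API key differ from a direct OpenAI call:

```bash
# Same request you would send to OpenAI, with two changes:
#   1. Host: api.openai.com -> proxy.langtail.com (the path stays the same)
#   2. Key:  your OpenAI key -> your Langtail project API key
curl https://proxy.langtail.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LANGTAIL_API_KEY" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [
      {"role": "user", "content": "Say hello from the Langtail proxy."}
    ]
  }'
```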
4. Test it out
Run your app and make a request to verify that everything works properly.
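As a quick smoke test from the terminal, you can also call the proxy directly and print just the assistant’s reply. This sketch assumes `jq` is installed and relies on the standard OpenAI chat completions response shape; the model name is illustrative:

```bash
# Send a one-off request through the proxy and extract the reply (requires jq)
curl -s https://proxy.langtail.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LANGTAIL_API_KEY" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "ping"}]}' \
  | jq -r '.choices[0].message.content'
```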
5. Observe metrics and logs
- Back in Langtail, navigate to the project and then click Logs in the left sidebar.
- You should see log entries coming from your app (they have an environment value of “proxy”).
- Click on a log to see detailed information about the request and response from OpenAI.
- If you want to experiment with this prompt, click Open in Playground, where you can adjust parameters, edit any part of the prompt, and re-run it to see the result.