TypeScript SDK
Langtail’s TypeScript SDK wraps OpenAI’s API client so that it can be used seamlessly with Langtail. You can store the entire prompt in Langtail and reference it by prompt slug and environment. By routing requests through Langtail, you get logs, metrics, and many other benefits.
For more, check out the GitHub repository.
Installation
Use NPM or your package manager of choice to install the langtail package:
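For example, with npm:

```bash
npm install langtail
```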
Usage
Here are the different ways you can use the SDK.
1. Invoke a prompt that’s deployed on Langtail (recommended)
To invoke a deployed prompt, you can use lt.prompts.invoke like this:
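A minimal sketch; the prompt slug, variable name, and LANGTAIL_API_KEY environment-variable name are illustrative, and the exported client class is assumed to be Langtail (documented below as LangtailNode):

```ts
import { Langtail } from "langtail"

const lt = new Langtail({
  apiKey: process.env.LANGTAIL_API_KEY ?? "",
})

// Invoke the prompt deployed under a slug, in a given environment,
// filling in any template variables the prompt declares.
const completion = await lt.prompts.invoke({
  prompt: "stock-assistant",          // illustrative slug
  environment: "production",
  variables: { stockName: "AAPL" },   // illustrative variable
})

console.log(completion.choices[0].message.content)
```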
If you call a prompt that isn’t deployed yet, an error will be thrown.
2. Invoke a prompt that’s stored in code
If you’re storing your prompt in your codebase and want to give Langtail a try, use this method. You can use lt.chat.completions.create as a light wrapper over the OpenAI API that still sends logs and metrics to Langtail.
For more, see the example in this doc.
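For instance, a sketch reusing the lt client constructed above (the parameters are the standard OpenAI chat-completion parameters; the model name is illustrative):

```ts
// The request is proxied through Langtail, so the call is logged and metered,
// but the prompt itself lives in your codebase.
const completion = await lt.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Say hello." },
  ],
})

console.log(completion.choices[0].message.content)
```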
3. Fetch prompt from Langtail and invoke directly (proxyless)
If you’re storing your prompt in Langtail but want to invoke it directly from your code without using Langtail as a proxy, use this method.
You can call LangtailPrompts.get to retrieve the contents of the prompt:
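A sketch, assuming LangtailPrompts is exported from the package and that the parameter names follow the field descriptions in the API reference below (prompt, environment, version):

```ts
import { LangtailPrompts } from "langtail"

const prompts = new LangtailPrompts({
  apiKey: process.env.LANGTAIL_API_KEY ?? "",
})

// Fetch the stored prompt state for a given slug and environment.
// A version string is only required for the preview environment.
const promptState = await prompts.get({
  prompt: "stock-assistant",   // illustrative slug
  environment: "staging",
  // version: "<version-id>",
})
```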
The response contains the stored prompt state: the message template along with the model and its parameters. You can then pass that state to build to produce the final request object.
Finally, you can directly call the OpenAI SDK with the returned object:
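A sketch of those two steps, assuming build accepts template variables in its options argument (the exact option shape isn’t shown here):

```ts
import OpenAI from "openai"

// Turn the stored prompt state into plain OpenAI request parameters,
// filling in any template variables.
const openAiBody = prompts.build(promptState, {
  variables: { stockName: "AAPL" },  // illustrative variable
})

// No Langtail proxy in the request path: call OpenAI directly.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })
const completion = await openai.chat.completions.create(openAiBody)

console.log(completion.choices[0].message.content)
```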
Using this method, you get all of the power of Langtail prompts (like variables) while sending requests directly to OpenAI. This can be helpful if you’re especially sensitive to performance. If you’re taking this route for security reasons, let us know and we’ll give you a deeper look into Langtail.
API reference
LangtailNode
Constructor
The constructor accepts an options object with the following properties:
- The API key for Langtail.
- The base URL for the Langtail API.
- A boolean indicating whether to record the API calls.
- The organization ID.
- The project ID.
- The fetch function to use for making HTTP requests. It is passed to the OpenAI client under the hood.
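For example (the Langtail export name and the LANGTAIL_API_KEY environment-variable name are assumptions; the class is documented here as LangtailNode):

```ts
import { Langtail } from "langtail"

// Only the API key is required; the remaining options override the
// base URL, organization, project, and fetch implementation.
const lt = new Langtail({
  apiKey: process.env.LANGTAIL_API_KEY ?? "",
})
```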
Properties
- chat: An object containing a completions object with a create method.
- prompts: An instance of the LangtailPrompts class.
Methods
chat.completions.create
This method accepts two parameters:
- An object of type ChatCompletionCreateParamsNonStreaming & ILangtailExtraProps, ChatCompletionCreateParamsStreaming & ILangtailExtraProps, ChatCompletionCreateParamsBase & ILangtailExtraProps, or ChatCompletionCreateParams & ILangtailExtraProps.
- An optional OpenAI Core.RequestOptions object.

It returns a promise that resolves to a ChatCompletion or a Stream<ChatCompletionChunk>, depending on whether you are using streaming or not.
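For example, a streaming call might look like this (a sketch reusing the lt client from the usage section; the model name is illustrative):

```ts
// With stream: true, the promise resolves to a Stream<ChatCompletionChunk>
// that can be consumed with for await.
const stream = await lt.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Write a haiku about the sea." }],
  stream: true,
})

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "")
}
```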
Exceptions
- Throws an error if the apiKey is not provided in the options object or as an environment variable.
LangtailPrompts
Constructor
The constructor accepts an options object with the following properties:
- The API key for Langtail.
- The base URL for the Langtail API.
- The organization ID.
- The project ID.
- The fetch function to use for making HTTP requests. It is passed to the OpenAI client under the hood.
Properties
- The API key for Langtail.
- The base URL for the Langtail API.
- An object containing the options for the Langtail API.
Methods
invoke
This method accepts:
- An IRequestParams or IRequestParamsStream object.

It returns a promise that resolves to an OpenAIResponseWithHttp or a StreamResponseType, depending on whether you use streaming or not.
get
This method accepts one parameter with these fields:
- A string representing the prompt slug.
- An Environment string identifier. Accepts the values "preview" | "staging" | "production". Defaults to production.
- A string for the version. Necessary for the preview environment.

It returns the Langtail prompt state, which is the object accepted by the build method.
build
This method accepts two parameters:
- The state object returned from the get method.
- An object containing the options for the Langtail API.

It returns an object that can be passed to the OpenAI SDK.
Exceptions
- Throws an error if the fetch operation fails.
- Throws an error if there is no body in the response when streaming is enabled.