The `langtail` package gives you two ways to call your prompts. You can invoke a prompt deployed in Langtail with `lt.prompts.invoke`, or you can use `lt.chat.completions.create` as a light wrapper over the OpenAI API that still sends logs and metrics to Langtail. For more, see the example in this doc.
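A minimal sketch of both call styles, assuming the package's `Langtail` client export; the prompt slug, template variable, and model are placeholders:

```typescript
import { Langtail } from "langtail"

// Assumes LANGTAIL_API_KEY is set in the environment; you can also pass it explicitly.
const lt = new Langtail({ apiKey: process.env.LANGTAIL_API_KEY })

// Invoke a prompt deployed in Langtail ("<prompt-slug>" is a placeholder).
const invoked = await lt.prompts.invoke({
  prompt: "<prompt-slug>",
  environment: "production",
  variables: { topic: "observability" }, // hypothetical template variable
})

// Or call the OpenAI-style wrapper; logs and metrics still flow to Langtail.
const completion = await lt.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello!" }],
})
```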
Use `LangtailPrompts.get` to retrieve the contents of the prompt.
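A hedged sketch of fetching a prompt's contents with `LangtailPrompts.get`; the slug is a placeholder:

```typescript
import { LangtailPrompts } from "langtail"

const prompts = new LangtailPrompts({ apiKey: process.env.LANGTAIL_API_KEY })

// Fetch the prompt template as stored in Langtail, without executing it.
const playgroundState = await prompts.get({
  prompt: "<prompt-slug>",
  environment: "preview", // "preview" | "staging" | "production"
})
```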
The client exposes a `completions` object with a `create` method, while prompt operations live on the `LangtailPrompts` class.

`chat.completions.create(body, options?)`

- `body`: a `ChatCompletionCreateParamsNonStreaming & ILangtailExtraProps`, `ChatCompletionCreateParamsStreaming & ILangtailExtraProps`, `ChatCompletionCreateParamsBase & ILangtailExtraProps`, or `ChatCompletionCreateParams & ILangtailExtraProps` object.
- `options`: a `Core.RequestOptions` object (optional).

Returns a `ChatCompletion` or a `Stream<ChatCompletionChunk>`, depending on whether you are using streaming or not. Throws an error if the `apiKey` is not provided in the options object or as an environment variable.

`LangtailPrompts.invoke(options)`

- `options`: an `IRequestParams` or `IRequestParamsStream` object.

Returns an `OpenAIResponseWithHttp` or a `StreamResponseType`, depending on whether you use streaming or not.
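A sketch of the non-streaming and streaming call shapes, assuming the `LangtailPrompts` client; the slug is a placeholder and the response fields follow the OpenAI chat-completion shape:

```typescript
import { LangtailPrompts } from "langtail"

const prompts = new LangtailPrompts({ apiKey: process.env.LANGTAIL_API_KEY })

// Non-streaming: resolves to a full completion response.
const full = await prompts.invoke({ prompt: "<prompt-slug>" })
console.log(full.choices[0].message.content)

// Streaming: resolves to a stream of chunks you can iterate over.
const stream = await prompts.invoke({ prompt: "<prompt-slug>", stream: true })
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "")
}
```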
The `environment` option of the `get` method is a string identifier. Accepts the values `"preview" | "staging" | "production"`. Defaults to `production`.
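For example, since `environment` defaults to `production`, the two calls below should be equivalent (a sketch; the slug is a placeholder):

```typescript
import { LangtailPrompts } from "langtail"

const prompts = new LangtailPrompts({ apiKey: process.env.LANGTAIL_API_KEY })

// Explicit environment:
const a = await prompts.get({ prompt: "<prompt-slug>", environment: "production" })

// Omitting `environment` falls back to "production":
const b = await prompts.get({ prompt: "<prompt-slug>" })
```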