The Langtail Playground is a prompt engineer’s dream. It offers prompt templating, variables, tools, versioning, sharing, and more in a streamlined interface that helps you build and ship faster.

Here’s a quick overview of its interface:

  1. Templates Panel: Here you can compose your prompt from ‘System’, ‘Assistant’, and ‘User’ templates that are used when the prompt is invoked or deployed. Insert variables using {{myVariable}} and format prompts with the templating syntax.

The ‘System’, ‘Assistant’, and ‘User’ templates let you steer the conversation in a specific direction. They are also useful when you keep simple instructions in the System message and fill in variables through subsequent User messages, as in the example below. In most cases, though, filling in just the System message is enough.
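For example, a System template paired with a User template might look like this (the variable names are purely illustrative):

```
System: You are a support assistant for {{companyName}}.
        Answer in {{language}} and keep replies under three sentences.

User: {{customerQuestion}}
```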

  2. Parameters Panel: Here you can choose an AI model and adjust its parameters to tailor the model’s behavior to your project.
  3. Variables Panel: Here you can find and modify the variables defined within your templates to refine your prompt’s outputs.
  4. Tools Panel: Here you can create a custom tool specification that can enrich the AI model’s response (see the example specification after this list). If the AI model decides to use the tool, the Assistant response will be a function call. You can mock the response, send it back to the model, and see what the final output will be.
  5. Send: Click to invoke the prompt and display the response.
  6. Messages Area: Here you can see the responses generated by the model after invoking the prompt.
  7. Prompt versions: Here you can access and manage previous versions of your prompt and easily revert if necessary.
  8. Save: Click to save your prompt and, if desired, create a named version.
  9. Deploy: Click to publish your prompt as an API into various environments. After deploying, click the “Deployments” tab to see the API endpoint and how to call it; a sketch of such a call appears after this list.
  10. Share: Click to share your prompt with your team or publicly for the world to see.
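Tool specifications typically follow the OpenAI function-calling schema. The sketch below defines a hypothetical `get_weather` tool; the name and fields are illustrative, not a Langtail requirement:

```json
{
  "name": "get_weather",
  "description": "Look up the current weather for a given city",
  "parameters": {
    "type": "object",
    "properties": {
      "city": {
        "type": "string",
        "description": "Name of the city, e.g. Prague"
      }
    },
    "required": ["city"]
  }
}
```

If the model decides to call this tool, the Assistant turn contains the function name and its arguments. You can then mock a return value (for example, `{"temperature": 21}`), send it back, and watch how the model incorporates it into the final answer.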
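As a rough illustration, a deployed prompt can be invoked over HTTP. The TypeScript sketch below is only an assumption of what such a call looks like; the endpoint URL, header names, and request body are placeholders, so copy the real values shown in the “Deployments” tab.

```typescript
// Illustrative sketch only: copy the real endpoint URL and header names
// from the Deployments tab; everything below is a placeholder.
const response = await fetch(
  "https://api.langtail.com/<workspace>/<project>/<prompt>/<environment>", // placeholder URL
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-API-Key": "<your-api-key>", // placeholder auth header
    },
    // Supply values for the {{variables}} defined in your templates
    body: JSON.stringify({
      variables: { myVariable: "some value" },
    }),
  },
);

console.log(await response.json());
```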
