Intro

Navigating the complexities of prompt engineering with large language models (LLMs) can be challenging. Because LLMs are opaque, the impact of specific words in a prompt on the output is often hard to predict. Crafting the right, concise prompt is thus a significant task. Prompter is designed to ease this process, guiding you towards effective and succinct prompts.

Prompter is a state-of-the-art prompt debugging tool that simplifies testing and refining prompts for large language models like GPT-3.5 and GPT-4.

Utilizing Prompter, you gain access to powerful features:

  • 🚀 Generate chat completions and analyze results.

  • ⚙️ Customize parameters such as temperature and top-p.

  • ⏳ Record and label execution histories for easy comparison.

  • 🆚 Conduct batch testing of prompts for streamlined optimization.

  • 🧩 Incorporate variables for enhanced flexibility and precision in prompt crafting.

  • 𝒇 Mimic external function calls for comprehensive testing scenarios.

Prompter is your go-to tool for efficient prompt debugging, helping you swiftly identify and refine the most effective prompts for your specific needs.

Features

Some key features and benefits of Prompter:

  • Lightweight & User-Friendly

    Prompter is lightweight and runs fully in the browser without installation. It also boasts a stylish and intuitive UI/UX design that makes prompt engineering fast and easy.


  • Customizable Parameters

    Fine-tune parameters like temperature, top-p, etc. to shape model responses.


  • Prompt History

    Save and restore past prompts for reference. Easily re-run previous tests.


  • Variables & Function Calling

Adjust prompts dynamically and simulate interaction with external APIs, including support for OpenAI's parallel function calling feature.


  • Batch Testing

    Test multiple prompts in batch mode for faster iteration.


  • Custom Endpoints

    Use your own API endpoint in addition to the official OpenAI API.


  • Seed & JSON format

    Prompter supports seed and system fingerprint, and offers JSON output format for enhanced functionality and data handling.
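As a rough sketch of what these options map onto, the request below uses the `seed` and `response_format` fields of the OpenAI Chat Completions API; the model name, messages, and mocked response values are placeholders, not output from Prompter itself:

```python
import json

# Hypothetical Chat Completions request body; the model and messages
# are placeholder values.
request_body = {
    "model": "gpt-4-1106-preview",
    "seed": 42,                                  # best-effort reproducible sampling
    "response_format": {"type": "json_object"},  # constrain output to valid JSON
    "messages": [
        {"role": "system", "content": "Reply in JSON."},
        {"role": "user", "content": "Give three colors as a JSON list."},
    ],
}

# A mocked response: system_fingerprint identifies the backend configuration,
# so comparing it across runs tells you whether seeded results are comparable.
mock_response = {
    "system_fingerprint": "fp_abc123",
    "choices": [{"message": {"content": '{"colors": ["red", "green", "blue"]}'}}],
}
colors = json.loads(mock_response["choices"][0]["message"]["content"])["colors"]
```

Because JSON mode guarantees syntactically valid JSON, the content string can be parsed directly, as in the last line above.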


🌟 We believe developers deserve the best – a tool that's both efficient and aesthetically pleasing.

Tutorial

How to use Prompter

Manual

1. Setup API Key

On first launch, you'll be prompted to enter your OpenAI API key or a custom endpoint URL + API key. This allows Prompter to call the API on your behalf.

Prompter: setup API key

You can find your OpenAI API key in Settings.


At Prompter, we prioritize your security: your API key is stored safely and never accessed, ensuring your sensitive information remains private and protected.

Prompter: settings

2. Create Prompt

The prompt editor has four main sections👇

Parameters

Configurations like model, max tokens, temperature that shape the API request.

Prompter: Parameters tab

You can set N to output multiple results for the same input (prompts) all at once. It's recommended to set Temperature > 0 when N > 1; otherwise, all N results will be identical.
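To illustrate how these parameters fit together, here is a sketch of the Chat Completions request body they map onto (the model name and messages are made-up placeholders):

```python
import json

# Hypothetical request body. n asks for multiple completions per request,
# and a temperature above 0 makes those completions differ from one another.
request_body = {
    "model": "gpt-4",
    "max_tokens": 256,
    "temperature": 0.8,   # > 0, so the n samples will vary
    "top_p": 1.0,
    "n": 3,               # return three completions for the same prompt
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Suggest a name for a coffee shop."},
    ],
}

print(json.dumps(request_body, indent=2))
```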

Messages

Enter a system message and a user message here. See OpenAI's documentation for details about the “System” and “User” roles.

Prompter: Messages tab

Variables

Create a new variable by entering the variable name and its value. To use a variable, enclose its name in double curly braces {{ and }} in the Messages tab. You can also test variables in batches by clicking the Batch Request button.
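This kind of {{variable}} substitution can be sketched in a few lines; the template text and variable names below are made-up examples:

```python
import re

def render(template: str, variables: dict[str, str]) -> str:
    """Replace each {{name}} placeholder with its value from the mapping."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: variables[m.group(1)], template)

prompt = "Translate {{text}} into {{language}}."
print(render(prompt, {"text": "'hello'", "language": "French"}))
# -> Translate 'hello' into French.
```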

Prompter: Variables tab

Functions

To simulate function calling, create a function by entering its name and description, then input the JSON schema of the function definition and a mock return value.
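For illustration, a function definition in the JSON-schema form the OpenAI API expects might look like the sketch below; the function name, parameters, and mock value are hypothetical examples, not part of Prompter itself:

```python
import json

# Hypothetical function definition: name, description, and a JSON schema
# describing the parameters the model may fill in.
function_definition = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}

# The mock value a tool like Prompter would feed back as the function's
# result, instead of calling a real weather API.
mock_result = json.dumps({"city": "Paris", "temp_c": 18, "conditions": "cloudy"})
```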

Prompter: Functions tab

3. Tweak and Iterate

To start, accept the default parameters or tweak them as needed. Then enter your prompts and hit "Run" to submit the requests. The result will appear on the right, along with a copy button.

Prompter: tweak and iterate

Now you can iterate on your prompt:


  • Change parameters like Temperature and Top-p.

  • Edit the prompts and resend.

  • Restore past prompts from history for comparison.

👏 Prompter makes it fast to try small tweaks and see the impact on results.

4. Results & History

You can easily label results as "like" or "dislike". In the History, you can filter by these labels, making it easy to compare outputs across different input scenarios and select the most appropriate parameters and prompts.

Prompter: history panel

5. Batch Testing ✨

To test multiple prompts at once, you can click the batch request button to switch to Batch Mode. This lets you test many variations efficiently in parallel. Results will display sequentially as they are returned. You can also copy any individual result.

Prompter: batch mode

🙋 If the N value in the Parameters tab exceeds 1, the batch request button in the Prompt tab will be disabled. Conversely, if the batch request mode is enabled, N cannot be set to a value greater than 1.

6. Prompt Management

Prompts can be organized into different projects and groups in the left sidebar, as shown below.

Prompter: project & prompt management

Misc

Changelog

v1.0.0 (Nov. 28th 2023)

Officially launched 🚀


v0.0.3 (Oct. 11th 2023)

Support function calling


v0.0.2 (Sep. 4th 2023)

Variables & labelling

  • Support variables;

  • Results labelling and filter;

  • Minor improvements;


v0.0.1 (Oct. 11th 2023)

Alpha release