Prompt utilities¶
The prompt_utils module contains functions to help convert message dictionaries into prompts that can be used with ChatCompletion clients.
Supported prompt formats:
- Llama 2 Chat
- Vicuna Chat
- Hugging Face ChatML (StarChat, Falcon)
- WizardLM Chat
- StableBeluga2 Chat
- Open Assistant Chat
- Anthropic Claude Chat
The module also exports a mapping dictionary PROMPT_MAPPING that maps model names to prompt builder functions. It can be used to select the correct prompt builder function, for example via an environment variable.
```python
PROMPT_MAPPING = {
    "chatml_falcon": build_chatml_falcon_prompt,
    "chatml_starchat": build_chatml_starchat_prompt,
    "llama2": build_llama2_prompt,
    "open_assistant": build_open_assistant_prompt,
    "stablebeluga": build_stablebeluga_prompt,
    "vicuna": build_vicuna_prompt,
    "wizardlm": build_wizardlm_prompt,
}
```
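The lookup is plain dictionary dispatch. A self-contained sketch of the pattern (the builder here is a trivial stand-in so the snippet runs on its own; with the library installed you would instead import PROMPT_MAPPING from easyllm.prompt_utils, and the PROMPT_BUILDER environment variable name is illustrative):

```python
import os

# Stand-in builder so this sketch is self-contained; easyllm ships the real ones.
def build_llama2_prompt(messages):
    return "".join(f"[{m['role']}] {m['content']}\n" for m in messages)

PROMPT_MAPPING = {"llama2": build_llama2_prompt}

# Select the builder by name, e.g. from an environment variable.
builder_name = os.environ.get("PROMPT_BUILDER", "llama2")
prompt_builder = PROMPT_MAPPING[builder_name]

messages = [{"role": "user", "content": "Hello!"}]
prompt = prompt_builder(messages)
```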
Set prompt builder for client¶
```python
from easyllm.clients import huggingface

# one of: llama2, vicuna, chatml_falcon, chatml_starchat, wizardlm, stablebeluga, open_assistant
huggingface.prompt_builder = "llama2"
```
Llama 2 Chat builder¶
Creates a Llama 2 chat prompt for chat conversations. Learn more in the Hugging Face Blog on how to prompt Llama 2. If a Message with an unsupported role is passed, an error will be thrown.
Example Models:
```python
from easyllm.prompt_utils import build_llama2_prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain asynchronous programming in the style of the pirate Blackbeard."},
]
prompt = build_llama2_prompt(messages)
```
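For reference, the Llama 2 chat template wraps the system prompt in `<<SYS>>` tags inside the first `[INST]` block. A minimal single-turn sketch (illustrative only; build_llama2_prompt's exact output may differ, e.g. in BOS/EOS token handling):

```python
def llama2_single_turn(system: str, user: str) -> str:
    # The system prompt is embedded inside the first [INST] block.
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

prompt = llama2_single_turn(
    "You are a helpful assistant.",
    "Explain asynchronous programming in the style of the pirate Blackbeard.",
)
```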
Vicuna Chat builder¶
Creates a Vicuna prompt for a chat conversation. If a Message with an unsupported role is passed, an error will be thrown. Reference
Example Models:
```python
from easyllm.prompt_utils import build_vicuna_prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain asynchronous programming in the style of the pirate Blackbeard."},
]
prompt = build_vicuna_prompt(messages)
```
Hugging Face ChatML builder¶
Creates a Hugging Face ChatML prompt for a chat conversation. The ChatML format uses different prompts for different models, e.g. StarChat or Falcon. If a Message with an unsupported role is passed, an error will be thrown. Reference
Example Models:
- HuggingFaceH4/starchat-beta
StarChat¶
```python
from easyllm.prompt_utils import build_chatml_starchat_prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain asynchronous programming in the style of the pirate Blackbeard."},
]
prompt = build_chatml_starchat_prompt(messages)
```
Falcon¶
```python
from easyllm.prompt_utils import build_chatml_falcon_prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain asynchronous programming in the style of the pirate Blackbeard."},
]
prompt = build_chatml_falcon_prompt(messages)
```
WizardLM Chat builder¶
Creates a WizardLM prompt for a chat conversation. If a Message with an unsupported role is passed, an error will be thrown. Reference
Example Models:
```python
from easyllm.prompt_utils import build_wizardlm_prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain asynchronous programming in the style of the pirate Blackbeard."},
]
prompt = build_wizardlm_prompt(messages)
```
StableBeluga2 Chat builder¶
Creates a StableBeluga2 prompt for a chat conversation. If a Message with an unsupported role is passed, an error will be thrown. Reference
```python
from easyllm.prompt_utils import build_stablebeluga_prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain asynchronous programming in the style of the pirate Blackbeard."},
]
prompt = build_stablebeluga_prompt(messages)
```
Open Assistant Chat builder¶
Creates an Open Assistant ChatML template. Uses the <|prompter|>, </s>, <|system|>, and <|assistant|> tokens. If a Message with an unsupported role is passed, an error will be thrown. Reference
Example Models:
```python
from easyllm.prompt_utils import build_open_assistant_prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain asynchronous programming in the style of the pirate Blackbeard."},
]
prompt = build_open_assistant_prompt(messages)
```
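Based on the tokens listed above, a single-turn Open Assistant prompt can be sketched as follows (illustrative only; build_open_assistant_prompt's exact output may differ):

```python
def open_assistant_single_turn(system: str, user: str) -> str:
    # Each segment is closed with the </s> end-of-sequence token;
    # the trailing <|assistant|> token cues the model to respond.
    return (
        f"<|system|>{system}</s>"
        f"<|prompter|>{user}</s>"
        "<|assistant|>"
    )

prompt = open_assistant_single_turn(
    "You are a helpful assistant.",
    "Explain asynchronous programming in the style of the pirate Blackbeard.",
)
```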
Anthropic Claude Chat builder¶
Creates an Anthropic Claude template. Uses the \n\nHuman: and \n\nAssistant: turn prefixes. If a Message with an unsupported role is passed, an error will be thrown. Reference
Example Models:
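Since this section has no snippet, here is a self-contained sketch of the Human/Assistant turn format (illustrative only; the library's builder may differ, e.g. in how it handles system messages, which are folded into a Human turn here as an assumption):

```python
def claude_prompt(messages):
    # Map chat roles onto Claude's "\n\nHuman:" / "\n\nAssistant:" turns.
    # Assumption: system messages are treated like user/Human turns.
    parts = []
    for message in messages:
        if message["role"] in ("system", "user"):
            parts.append(f"\n\nHuman: {message['content']}")
        elif message["role"] == "assistant":
            parts.append(f"\n\nAssistant: {message['content']}")
        else:
            raise ValueError(f"Unsupported role: {message['role']}")
    # End with an open Assistant turn so the model completes it.
    return "".join(parts) + "\n\nAssistant:"

prompt = claude_prompt([
    {"role": "user", "content": "Explain asynchronous programming in the style of the pirate Blackbeard."},
])
```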