# Basic Usage
This example demonstrates the core PromptPack workflow: loading packs, creating templates, and formatting prompts.
## Loading a PromptPack

```python
from pathlib import Path

from promptpack import parse_promptpack

# Load from a file
pack = parse_promptpack("path/to/pack.json")

# Or parse from a string
from promptpack import parse_promptpack_string

pack_json = '{"id": "my-pack", ...}'
pack = parse_promptpack_string(pack_json)
```

## Creating Templates

Use `PromptPackTemplate` to create LangChain-compatible templates:
```python
from promptpack_langchain import PromptPackTemplate

# Create a template from a specific prompt in the pack
template = PromptPackTemplate.from_promptpack(pack, "support")

# Check input variables
print(template.input_variables)  # ['role', 'company']

# Get LLM parameters
params = template.get_parameters()
print(params)  # {'temperature': 0.7, 'max_tokens': 1500}
```

## Formatting Prompts

Format the template with your variables:
```python
# Format with variables
formatted = template.format(
    role="customer support agent",
    issue_type="billing",
)
print(formatted)
```

Output:

```
You are a customer support agent assistant for TechCorp.

# Company Context
TechCorp provides cloud infrastructure, SaaS products, and enterprise solutions.

# Your Role
Handle billing customer inquiries effectively.

# Guidelines
Maintain a professional yet friendly tone. Be concise and solution-oriented.
```

## Using Fragments

PromptPacks support reusable fragments that are automatically resolved:
```json
{
  "fragments": {
    "company_context": "TechCorp provides cloud infrastructure...",
    "tone_guidelines": "Maintain a professional yet friendly tone..."
  },
  "prompts": {
    "support": {
      "system_template": "{{fragment:company_context}}\n\n{{fragment:tone_guidelines}}"
    }
  }
}
```

Fragments are resolved automatically when you call `template.format()`.
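Conceptually, fragment resolution is a placeholder-substitution pass over the template text. A minimal sketch of the idea, assuming a regex-based substitution (`resolve_fragments` is a hypothetical helper for illustration, not the library's actual implementation):

```python
import re

# Hypothetical helper illustrating {{fragment:name}} resolution;
# not part of the promptpack API.
def resolve_fragments(template: str, fragments: dict) -> str:
    """Replace each {{fragment:name}} marker with its fragment text."""
    def substitute(match):
        name = match.group(1)
        if name not in fragments:
            raise KeyError(f"unknown fragment: {name}")
        return fragments[name]

    return re.sub(r"\{\{fragment:([\w-]+)\}\}", substitute, template)

fragments = {
    "company_context": "TechCorp provides cloud infrastructure...",
    "tone_guidelines": "Maintain a professional yet friendly tone...",
}
print(resolve_fragments(
    "{{fragment:company_context}}\n\n{{fragment:tone_guidelines}}",
    fragments,
))
```

Raising on an unknown fragment name (rather than leaving the marker in place) surfaces typos early instead of shipping a broken prompt.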
## Model Overrides

Templates can have model-specific configurations:
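The pack schema for such overrides isn't shown on this page; conceptually, a pack carries per-model parameter sets alongside its defaults. A hypothetical sketch of what that could look like (the `models` key and its layout are assumptions, not the documented schema):

```json
{
  "prompts": {
    "support": {
      "parameters": { "temperature": 0.7, "max_tokens": 1500 },
      "models": {
        "gpt-4": {
          "parameters": { "temperature": 0.5, "max_tokens": 2000 }
        }
      }
    }
  }
}
```

Under a scheme like this, passing `model_name="gpt-4"` would let the override parameters take precedence over the prompt's defaults.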
```python
# Create template with model-specific overrides
template = PromptPackTemplate.from_promptpack(
    pack,
    "support",
    model_name="gpt-4",
)

# The template and parameters will use GPT-4 specific settings
params = template.get_parameters()
```

## Using with LangChain

Convert to a `ChatPromptTemplate` for use with LangChain:
```python
from langchain_openai import ChatOpenAI

# Create chat template
chat_template = template.to_chat_prompt_template(
    role="support agent",
    company="Acme Corp",
)

# Create chain
model = ChatOpenAI(
    model="gpt-4o-mini",
    temperature=template.get_parameters().get("temperature", 0.7),
)
chain = chat_template | model

# Invoke
response = chain.invoke({
    "messages": [("human", "I was charged twice for my subscription")]
})
print(response.content)
```

## Complete Example

Here's the complete `basic_usage.py` example:
```python
#!/usr/bin/env python3
from pathlib import Path

from promptpack import parse_promptpack
from promptpack_langchain import PromptPackTemplate


def main():
    # Load PromptPack
    pack_path = Path(__file__).parent / "packs" / "customer-support.json"
    pack = parse_promptpack(pack_path)

    print(f"Loaded pack: {pack.name} (v{pack.version})")
    print(f"Available prompts: {list(pack.prompts.keys())}")

    # Create template
    template = PromptPackTemplate.from_promptpack(pack, "support")
    print(f"Input variables: {template.input_variables}")
    print(f"Parameters: {template.get_parameters()}")

    # Format
    formatted = template.format(
        role="customer support agent",
        issue_type="billing",
    )
    print(formatted)


if __name__ == "__main__":
    main()
```