In a previous tutorial, we showed you how to work with LangChain Prompt Templates. Prompt templates let us automate repetitive prompting tasks and work much more efficiently. The good news is that you can save your templates as JSON files, allowing you to reuse them later or share them with others. In this tutorial, we will show you how to save and load prompt templates. First, let’s start with a simple prompt template:
from langchain.llms import OpenAI
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

llm = OpenAI(model_name='text-davinci-003')
chat = ChatOpenAI()

single_input_prompt = PromptTemplate(
    input_variables=['product'],
    template='What is a good name for a company that makes {product}?'
)
We can see the template by printing it.
single_input_prompt
Output:
PromptTemplate(input_variables=['product'], output_parser=None, partial_variables={}, template='What is a good name for a company that makes {product}?', template_format='f-string', validate_template=True)
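Note the template_format='f-string' field in the output: it means the placeholders are filled using Python’s str.format-style substitution. As a quick plain-Python illustration (no LangChain required):

```python
# The same substitution PromptTemplate performs under 'f-string' formatting,
# shown with Python's built-in str.format.
template = 'What is a good name for a company that makes {product}?'
result = template.format(product='colorful socks')
print(result)
# What is a good name for a company that makes colorful socks?
```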
Save the Prompt Template
We can easily save the prompt template using the save method. The template will be written to disk as a JSON file, which in our case we will call “myprompt.json”.
single_input_prompt.save("myprompt.json")
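To get a feel for what this step writes to disk, here is a minimal stand-in in plain Python (a sketch of the idea, not LangChain’s actual implementation): the template and its input variables are serialized to JSON, then read back and formatted.

```python
import json

# Hypothetical minimal stand-in for the save/load round trip:
# the template and its input variables become a JSON object on disk.
template_data = {
    "input_variables": ["product"],
    "template": "What is a good name for a company that makes {product}?",
}

# Save the template as JSON, analogous to single_input_prompt.save(...).
with open("myprompt.json", "w") as f:
    json.dump(template_data, f)

# Read it back and fill in the placeholder, analogous to load_prompt(...).
with open("myprompt.json") as f:
    loaded = json.load(f)

print(loaded["template"].format(product="colorful socks"))
# What is a good name for a company that makes colorful socks?
```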
Load the Prompt Template
Now let’s see how we can load the saved template.
from langchain.prompts import load_prompt

loaded_prompt = load_prompt("myprompt.json")
Before we run the prompt, let’s make sure that the loaded prompt is the expected one.
loaded_prompt.format(product='colorful socks')
Output:
'What is a good name for a company that makes colorful socks?'
Finally, since we have verified that the prompt was saved and loaded correctly, it is time to run it!
print(llm(loaded_prompt.format(product='colorful socks')))
Output:
Colorful Comfort Socks