Predictive Hacks

How to Save and Load Prompt Templates with LangChain

In a previous tutorial, we showed you how to work with LangChain Prompt Templates. Prompt templates allow us to automate tasks and work much more efficiently. The good news is that you can save your templates as JSON objects, so you can reuse them later or share them with others. In this tutorial, we will show you how to save and load prompt templates. First, let's start with a simple prompt template:

from langchain.llms import OpenAI
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

llm = OpenAI(model_name='text-davinci-003')
chat = ChatOpenAI()

single_input_prompt = PromptTemplate(
    input_variables=['product'],
    template='What is a good name for a company that makes {product}?'
)

We can see the template by printing it:

print(single_input_prompt)



PromptTemplate(input_variables=['product'], output_parser=None, partial_variables={}, template='What is a good name for a company that makes {product}?', template_format='f-string', validate_template=True)

Save the Prompt Template

We can easily save the prompt template using the save method. The template will be saved as a JSON object; in our case, we will call it "myprompt.json".

single_input_prompt.save("myprompt.json")
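The saved file is plain JSON, so you can inspect it with the standard library. Here is a minimal sketch of what such a file might contain; the exact keys (such as "_type") are an assumption about LangChain's serialization format, not output shown in this article:

```python
import json

# Hypothetical file contents mirroring what PromptTemplate.save() writes
# (the exact schema is an assumption, written here by hand for illustration).
saved = {
    "_type": "prompt",
    "input_variables": ["product"],
    "template": "What is a good name for a company that makes {product}?",
}

with open("myprompt.json", "w") as f:
    json.dump(saved, f, indent=2)

# Read the file back and inspect its fields.
with open("myprompt.json") as f:
    data = json.load(f)

print(data["input_variables"])
print(data["template"])
```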

Load the Prompt Template

Let’s now see how we can load the saved template.

from langchain.prompts import load_prompt

loaded_prompt = load_prompt("myprompt.json")
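Conceptually, load_prompt just reads the JSON file and rebuilds a prompt from its fields. The following stand-in loader is a sketch under the assumption that the file stores "input_variables" and "template" keys (the real load_prompt supports more formats and options, and load_prompt_sketch is a hypothetical helper):

```python
import json

def load_prompt_sketch(path):
    """Minimal stand-in: read the JSON file and return its fields."""
    with open(path) as f:
        data = json.load(f)
    return data["input_variables"], data["template"]

# Write a sample file first so the sketch is self-contained.
with open("myprompt.json", "w") as f:
    json.dump({
        "_type": "prompt",
        "input_variables": ["product"],
        "template": "What is a good name for a company that makes {product}?",
    }, f)

input_variables, template = load_prompt_sketch("myprompt.json")
print(input_variables)
print(template)
```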

Before we run the prompt, let’s make sure that the loaded prompt is the expected one.

loaded_prompt.format(product='colorful socks')


'What is a good name for a company that makes colorful socks?'
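Since the default template_format is 'f-string', formatting a prompt is essentially plain Python string substitution. A minimal sketch of the same behavior with str.format:

```python
# The template string from the article; formatting it directly with
# str.format mirrors what PromptTemplate.format does for f-string templates.
template = "What is a good name for a company that makes {product}?"

prompt = template.format(product="colorful socks")
print(prompt)  # What is a good name for a company that makes colorful socks?
```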

Finally, since we have verified that the prompt was saved and loaded correctly, it is time to run it!

print(llm(loaded_prompt.format(product='colorful socks')))


Colorful Comfort Socks
