
LangChain - 1.2.4 Few-shot prompt templates

Reference: https://python.langchain.com/docs/modules/model_io/prompts/few_shot_examples

 


Reference: https://python.langchain.com/docs/modules/model_io/prompts/few_shot_examples_chat

 


 

 

Few-shot prompt templates

 

Use Case

Few-shot prompt templates can be used in two ways: by passing the examples to the template directly, or by using an example_selector that picks the most relevant examples at run time. Both paths are shown step by step below.

Steps 1 and 2 below are shared by both approaches.

Step 1. Create the example set
examples = [
    {'question': 'who lived longer~~~~~',
     'answer': '~~~~~~'},
    ~~~~
]

Step 2. Create a formatter for a single example
example_prompt = PromptTemplate(
    input_variables=["question", "answer"],
    template="Question: {question}\n{answer}"
)
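To sanity-check the formatter, it can be applied to a single example. The question/answer strings below are placeholders, since the post abbreviates the actual example text:

from langchain.prompts import PromptTemplate

# Placeholder example; the post abbreviates the real question/answer text.
example = {"question": "Who lived longer, X or Y?", "answer": "X lived longer."}

example_prompt = PromptTemplate(
    input_variables=["question", "answer"],
    template="Question: {question}\n{answer}",
)

# Renders: "Question: Who lived longer, X or Y?" followed by the answer line
print(example_prompt.format(**example))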
 
Using an example set

Step 3. Feed the examples and the formatter into FewShotPromptTemplate
prompt = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Question: {input}",
    input_variables=["input"],
)

final_prompt = prompt.format(input="Who was the father of Mary Ball Washington?")
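Put together, a minimal end-to-end sketch of the example-set path looks roughly like this. The import path assumes the classic langchain package (newer versions also expose these classes from langchain_core.prompts), and the example contents are placeholders:

from langchain.prompts import PromptTemplate, FewShotPromptTemplate

# Placeholder examples standing in for the abbreviated ones above
examples = [
    {"question": "Who lived longer, X or Y?", "answer": "X lived longer."},
    {"question": "When was Z born?", "answer": "Z was born in 1952."},
]

# Formatter for a single example
example_prompt = PromptTemplate(
    input_variables=["question", "answer"],
    template="Question: {question}\n{answer}",
)

# Every example is rendered with example_prompt, then the suffix appends the new question
prompt = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Question: {input}",
    input_variables=["input"],
)

print(prompt.format(input="Who was the father of Mary Ball Washington?"))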
 
Using an example selector

Step 3. Feed the examples into an ExampleSelector
example_selector = SemanticSimilarityExampleSelector.from_examples(
    examples,
    OpenAIEmbeddings(),
    Chroma,
    k=1,
)

question = "Who was the father of Mary Ball Washington?"
selected_examples = example_selector.select_examples({"question": question})

Step 4. Feed the example selector into FewShotPromptTemplate
prompt = FewShotPromptTemplate(
    example_selector=example_selector,
    example_prompt=example_prompt,
    suffix="Question: {input}",
    input_variables=["input"],
)

final_prompt = prompt.format(input="Who was the father of Mary Ball Washington?")
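A self-contained sketch of the selector path, assuming the langchain-openai and Chroma integrations are installed and an OpenAI API key is set (import paths can differ between LangChain versions; the example contents are placeholders):

from langchain.prompts import PromptTemplate, FewShotPromptTemplate
from langchain_core.example_selectors import SemanticSimilarityExampleSelector
from langchain_community.vectorstores import Chroma
from langchain_openai import OpenAIEmbeddings

# Placeholder examples standing in for the abbreviated ones above
examples = [
    {"question": "Who lived longer, X or Y?", "answer": "X lived longer."},
    {"question": "When was Z born?", "answer": "Z was born in 1952."},
]

example_prompt = PromptTemplate(
    input_variables=["question", "answer"],
    template="Question: {question}\n{answer}",
)

# Embeds every example into a Chroma index and keeps only the k most similar
# example(s) for each incoming question.
example_selector = SemanticSimilarityExampleSelector.from_examples(
    examples,
    OpenAIEmbeddings(),  # requires OPENAI_API_KEY
    Chroma,              # vectorstore class used to index the examples
    k=1,
)

prompt = FewShotPromptTemplate(
    example_selector=example_selector,
    example_prompt=example_prompt,
    suffix="Question: {input}",
    input_variables=["input"],
)

print(prompt.format(input="Who was the father of Mary Ball Washington?"))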
 

 

 

 

Few-shot examples for chat models

Two cases are shown below: a small, fixed set of examples passed in directly, and examples stored in a vectorstore and selected dynamically.

Case 1. Fixed examples

Step 1. Define the examples
examples = [
    {"input": "2+2", "output": "4"},
    {"input": "2+3", "output": "5"},
]

Step 2. Prepare the per-example prompt
example_prompt = ChatPromptTemplate.from_messages(
    [
        ("human", "{input}"),
        ("ai", "{output}"),
    ]
)

Step 3. Build the few-shot prompt template
few_shot_prompt = FewShotChatMessagePromptTemplate(
    example_prompt=example_prompt,
    examples=examples,
)

print(few_shot_prompt.format())

Step 4. Assemble the final prompt
final_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a wondrous wizard of math."),
        few_shot_prompt,
        ("human", "{input}"),
    ]
)
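The fixed-example case end to end, as a sketch: the model class and model name are only examples here, and any chat model can be piped after the prompt.

from langchain_core.prompts import ChatPromptTemplate, FewShotChatMessagePromptTemplate
from langchain_anthropic import ChatAnthropic  # any chat model integration works

examples = [
    {"input": "2+2", "output": "4"},
    {"input": "2+3", "output": "5"},
]

# How a single example is rendered: one human turn, one AI turn
example_prompt = ChatPromptTemplate.from_messages(
    [("human", "{input}"), ("ai", "{output}")]
)

few_shot_prompt = FewShotChatMessagePromptTemplate(
    example_prompt=example_prompt,
    examples=examples,
)

final_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a wondrous wizard of math."),
        few_shot_prompt,
        ("human", "{input}"),
    ]
)

# Model name is just an example; requires ANTHROPIC_API_KEY.
chain = final_prompt | ChatAnthropic(model="claude-3-haiku-20240307", temperature=0.0)
print(chain.invoke({"input": "What's 3+3?"}))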
 
Case 2. Dynamic few-shot prompting (example selector backed by a vectorstore)

Step 1. Prepare a vectorstore to select examples from
examples = [
    {"input": "2+2", "output": "4"},
    {"input": "2+3", "output": "5"},
    {"input": "2+4", "output": "6"},
    {"input": "What did the cow say to the moon?", "output": "nothing at all"},
    {
        "input": "Write me a poem about the moon",
        "output": "One for the moon, and one for me, who are we to talk about the moon?",
    },
]

to_vectorize = [" ".join(example.values()) for example in examples]
embeddings = OpenAIEmbeddings()

vectorstore = Chroma.from_texts(to_vectorize, embeddings, metadatas=examples)

Step 2. Create the example selector
example_selector = SemanticSimilarityExampleSelector(
    vectorstore=vectorstore,
    k=2,
)

example_selector.select_examples({"input": "horse"})

Step 3. Create the prompt template
few_shot_prompt = FewShotChatMessagePromptTemplate(
    input_variables=["input"],
    example_selector=example_selector,
    example_prompt=ChatPromptTemplate.from_messages(
        [("human", "{input}"), ("ai", "{output}")]
    ),
)

Step 4. Assemble the final prompt
final_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a wondrous wizard of math."),
        few_shot_prompt,
        ("human", "{input}"),
    ]
)

Step 5. Use the prompt with an LLM
chain = final_prompt | ChatAnthropic(temperature=0.0)

chain.invoke({"input": "What's 3+3?"})
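For reference, the imports the dynamic example assumes, as a sketch (module paths follow the post-0.1 package split; older releases expose the same classes under the top-level langchain package):

from langchain_core.prompts import (
    ChatPromptTemplate,
    FewShotChatMessagePromptTemplate,
)
from langchain_core.example_selectors import SemanticSimilarityExampleSelector
from langchain_community.vectorstores import Chroma
from langchain_openai import OpenAIEmbeddings   # requires langchain-openai + OPENAI_API_KEY
from langchain_anthropic import ChatAnthropic   # requires langchain-anthropic + ANTHROPIC_API_KEY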
 

 

 

 

 

 
