- LLMs are good at conditional generation
- They model P(next token | prompt)
- LLMs do not store state between calls
- Token budget (context window) is key for better answers
- LangChain - build apps with LLMs
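The conditional-generation idea above can be sketched in plain Python: a model assigns a score (logit) to every candidate token given the prompt, and softmax turns those scores into the distribution P(next token | prompt). The vocabulary and logits here are made up for illustration; a real LLM produces them from the prompt.

```python
import math
import random

def softmax(logits):
    # Subtract the max for numerical stability, then normalize.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

vocab = ["sad", "happy", "tall"]          # toy vocabulary
logits = [2.0, 0.5, -1.0]                 # pretend these came from the model
probs = softmax(logits)                   # P(next token | prompt)

# Sample the next token from the distribution (temperature-style sampling).
next_token = random.choices(vocab, weights=probs, k=1)[0]
```

At temperature 0 you would instead take the argmax; higher temperatures flatten this distribution before sampling.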
Different prompt styles: zero-shot prompts with minimal context, few-shot prompts that carry worked examples, and step-by-step (chain-of-thought) prompts that preserve intermediate reasoning.
A prompt template can contain:
- instructions to the language model,
- a set of few-shot examples to help the language model generate a better response,
- a question to the language model.
A few-shot prompt template can be constructed either from a fixed set of examples or from an ExampleSelector object that picks examples at format time.
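To show the idea behind an ExampleSelector without pulling in the library, here is a dependency-free stand-in: instead of hard-coding the example list, a selection policy decides which examples make it into the prompt (LangChain ships real selectors such as `LengthBasedExampleSelector`; the trivial policy below is only an illustration).

```python
examples = [
    {"word": "happy", "antonym": "sad"},
    {"word": "tall", "antonym": "short"},
    {"word": "energetic", "antonym": "lethargic"},
]

def select_examples(examples, max_examples):
    # Trivial policy: keep only the first `max_examples` entries,
    # e.g. to stay within the model's context window.
    return examples[:max_examples]

selected = select_examples(examples, max_examples=2)

# Assemble prefix + formatted examples + suffix, mirroring what a
# FewShotPromptTemplate does internally.
prompt = "Give the antonym of every input\n\n"
prompt += "\n".join(f"Word: {e['word']}\nAntonym: {e['antonym']}" for e in selected)
prompt += "\n\nWord: fast\nAntonym:"
```

A length-based selector would apply the same pattern, but drop examples based on the token length of the input rather than a fixed count.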
Few-shot examples for chat models
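For chat models, few-shot examples are typically passed as alternating user/assistant messages rather than one flat string. The sketch below uses the OpenAI chat message format; LangChain wraps the same idea in its chat prompt templates.

```python
# Few-shot prompting in chat-message form: the instruction goes in the
# system message, each example is a user/assistant pair, and the real
# query is the final user message.
messages = [
    {"role": "system", "content": "Give the antonym of every input."},
    {"role": "user", "content": "happy"},
    {"role": "assistant", "content": "sad"},
    {"role": "user", "content": "tall"},
    {"role": "assistant", "content": "short"},
    {"role": "user", "content": "fast"},  # the actual query
]
```

This list would be sent as the `messages` argument of a chat-completion call; the model continues the pattern and answers with the antonym.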
!pip -q install openai langchain

import os
os.environ['OPENAI_API_KEY'] = 'your_key'

from langchain.llms import OpenAI

llm = OpenAI(model_name='text-davinci-003',
             temperature=0.9,
             max_tokens=256)

text = "Why did the recession come after COVID?"
print(llm(text))
"""## Prompt Templates""" | |
from langchain import PromptTemplate

restaurant_template = """
I want you to act as a naming consultant for new restaurants.
Return a list of restaurant names. Each name should be short, catchy and easy to remember. It should relate to the type of restaurant you are naming.
What are some good names for a restaurant that is {restaurant_description}?
"""

# An example prompt with one input variable
prompt_template = PromptTemplate(
    input_variables=["restaurant_description"],
    template=restaurant_template,
)

description = "Fresh South Indian Food with Idly, Sambar"
description_02 = "Hyderabad Biryani and Tandoor items"
description_03 = "Jain food and vegetarian menu"

## to see what the prompt will be like
print(prompt_template.format(restaurant_description=description))

## querying the model with the prompt template
from langchain.chains import LLMChain

chain = LLMChain(llm=llm, prompt=prompt_template)
# Run the chain, specifying only the input variable.
print(chain.run(description_03))
"""## with Few Shot Learning""" | |
from langchain import PromptTemplate, FewShotPromptTemplate | |
# First, create the list of few shot examples. | |
examples = [ | |
{"word": "happy", "antonym": "sad"}, | |
{"word": "tall", "antonym": "short"}, | |
] | |
# Next, we specify the template to format the examples we have provided. | |
# We use the `PromptTemplate` class for this. | |
example_formatter_template = """ | |
Word: {word} | |
Antonym: {antonym}\n | |
""" | |
example_prompt = PromptTemplate( | |
input_variables=["word", "antonym"], | |
template=example_formatter_template, | |
) | |
# Finally, we create the `FewShotPromptTemplate` object. | |
few_shot_prompt = FewShotPromptTemplate( | |
# These are the examples we want to insert into the prompt. | |
examples=examples, | |
# This is how we want to format the examples when we insert them into the prompt. | |
example_prompt=example_prompt, | |
# The prefix is some text that goes before the examples in the prompt. | |
# Usually, this consists of intructions. | |
prefix="Give the antonym of every input", | |
# The suffix is some text that goes after the examples in the prompt. | |
# Usually, this is where the user input will go | |
suffix="Word: {input}\nAntonym:", | |
# The input variables are the variables that the overall prompt expects. | |
input_variables=["input"], | |
# The example_separator is the string we will use to join the prefix, examples, and suffix together with. | |
example_separator="\n\n", | |
) | |
# We can now generate a prompt using the `format` method. | |
print(few_shot_prompt.format(input="fast")) | |
from langchain.chains import LLMChain | |
chain = LLMChain(llm=llm, prompt=few_shot_prompt) | |
# Run the chain only specifying the input variable. | |
print(chain.run("fast")) |
Keep Exploring!!!