What is Zero-Shot Prompting?
Zero-shot prompting is a fundamental prompting technique and is considered the simplest type of prompt.
Here's a breakdown of what the sources say about zero-shot prompting:
- Definition: A zero-shot prompt provides only a description of a task and some text for the Large Language Model (LLM) to get started with.
- Key Characteristic: The name "zero-shot" explicitly stands for 'no examples'. Unlike one-shot or few-shot prompting, it does not include demonstrations or examples of the desired input/output pattern. Only an instruction in natural language is given to the model.
- Input: The starting text can take many forms, such as a question, the beginning of a story, or instructions.
- Purpose: It is often used as a baseline prompting technique.
- Usage: When zero-shot prompting is insufficient to achieve the desired results, providing examples (one-shot or few-shot prompting) can be necessary.
- Variations: There are standalone zero-shot techniques, as well as techniques that combine zero-shot prompting with other concepts, such as Zero-Shot Chain-of-Thought (CoT). Zero-Shot-CoT involves appending a thought-inducing phrase to the prompt and does not require exemplars.
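The distinction between plain zero-shot and Zero-Shot-CoT comes down to what text is placed in the prompt. A minimal sketch, in which the helper function names and the exact thought-inducing phrase are illustrative assumptions rather than a fixed API:

```python
def zero_shot_prompt(instruction: str, text: str) -> str:
    """A plain natural-language instruction plus the input text -- no examples."""
    return f"{instruction}\n\n{text}"

def zero_shot_cot_prompt(instruction: str, text: str) -> str:
    """The same prompt with a thought-inducing phrase appended (Zero-Shot-CoT).
    'Let's think step by step.' is one commonly cited phrase; others work too."""
    return f"{zero_shot_prompt(instruction, text)}\n\nLet's think step by step."
```

Either string would then be sent to the LLM as-is; neither variant includes any input/output exemplars.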
For example, a zero-shot prompt might simply be "Classify movie reviews as POSITIVE, NEUTRAL or NEGATIVE." followed by the review text; the model is expected to produce the sentiment classification from the instruction and input alone, without ever seeing example reviews paired with their classifications. In one benchmarking study, the zero-shot baseline ran questions directly through the model with only a base instruction and the question, with no exemplars or thought inducers.
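The movie-review example above can be sketched as a prompt builder. The function name and the "Review:/Sentiment:" layout are assumptions for illustration; only the instruction text comes from the example:

```python
def classify_review_prompt(review: str) -> str:
    """Build a zero-shot sentiment-classification prompt.

    Note there are no labeled example reviews in the prompt -- only the
    instruction and the input text the model should classify.
    """
    return (
        "Classify movie reviews as POSITIVE, NEUTRAL or NEGATIVE.\n\n"
        f"Review: {review}\n"
        "Sentiment:"
    )
```

The resulting string is the entire prompt; whatever API is used to call the model, the model must infer the task format from the instruction alone.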