Fundamentals
Zero-Shot Learning
Quick Answer
Asking an LLM to perform a task without providing any examples.
Zero-shot learning means asking the model to perform a task with no examples or prior demonstrations. You simply describe what you want: "Translate this French sentence to English." The model relies entirely on its pretraining to understand and execute the task. Zero-shot performance varies by task: some tasks work well (summarization, translation), while others need examples to pin down the expected format or behavior. Modern large models are surprisingly capable at zero-shot tasks due to their vast training data and scale. Zero-shot is the fastest approach to set up, but it sometimes requires longer, more detailed prompts to compensate for the missing demonstrations.
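The distinction is easiest to see in the prompt itself: a zero-shot prompt contains only an instruction and the input, while a few-shot prompt would also include worked examples. A minimal sketch, assuming a generic chat-style API (the `zero_shot_prompt` helper and the message format are illustrative, not any specific vendor's SDK):

```python
def zero_shot_prompt(instruction: str, text: str) -> list[dict]:
    """Build a zero-shot prompt: a task description plus the input,
    with no demonstrations. (Illustrative helper, not a real API.)"""
    return [
        {"role": "system", "content": instruction},
        {"role": "user", "content": text},
    ]

# Instruction-only prompt; the model must infer the task from
# its pretraining, since no example translations are provided.
messages = zero_shot_prompt(
    "Translate the following French sentence to English.",
    "Le chat dort sur le canapé.",
)
for m in messages:
    print(f"{m['role']}: {m['content']}")
```

A few-shot variant would simply insert assistant/user example pairs between the instruction and the final input; everything else about the request stays the same.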
Last verified: 2026-04-08