Fundamentals

Temperature

Quick Answer

A sampling parameter that controls the randomness of model outputs (0 = deterministic, higher values = more random).

Temperature controls how deterministic or random an LLM's outputs are. Mechanically, the model's logits are divided by the temperature before the softmax, so lower values sharpen the probability distribution and higher values flatten it. A temperature of 0 makes the model always pick the most likely next token, producing deterministic results. Higher temperatures (0.7-1.0) introduce randomness, making outputs more creative but less consistent. Very high temperatures (>1.0) produce increasingly unpredictable outputs. Temperature is essential for tuning behavior: use 0 for factual tasks like data extraction, medium values (0.5-0.7) for balanced tasks, and higher values (0.8+) for creative work. It's often paired with top-p (nucleus sampling) for finer-grained control.
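The effect on the token distribution can be sketched directly. This is an illustrative implementation (the logit values are made up, not from any real model): logits are divided by the temperature before the softmax, and a temperature of 0 is treated as greedy argmax to avoid division by zero, which matches how most APIs behave.

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert logits to probabilities, scaled by temperature."""
    # Temperature 0 would divide by zero; treat it as greedy argmax.
    if temperature == 0:
        probs = [0.0] * len(logits)
        probs[logits.index(max(logits))] = 1.0
        return probs
    # Divide logits by temperature, then apply a numerically
    # stable softmax (subtract the max before exponentiating).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for three candidate tokens.
logits = [2.0, 1.0, 0.5]
for t in (0.2, 1.0, 2.0):
    probs = [round(p, 3) for p in softmax_with_temperature(logits, t)]
    print(f"temperature={t}: {probs}")
```

Running this shows the pattern described above: at low temperature nearly all probability mass lands on the top token, while at high temperature the distribution flattens and unlikely tokens become plausible picks.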

Last verified: 2026-04-08
