Temperature (model temperature)
A parameter controlling the degree of randomness in AI model responses, from fully deterministic (0) to highly random and creative (2).
What is temperature?
Temperature is a parameter you set when calling an LLM API that determines how "creative" or "deterministic" the model should be when choosing the next token. The value typically ranges from 0 to 2.
How temperature works
- Temperature 0: The model always picks the most probable token (greedy decoding). Outputs are consistent and predictable. Best for data extraction, classification, or structured output.
- Temperature 0.5–0.8: Balanced creativity. Good for chatbots and assistants.
- Temperature 1–2: High randomness. Outputs are variable and original, but at the top of the range they can become incoherent. Best for creative writing or brainstorming.
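Under the hood, temperature works by dividing the model's raw scores (logits) before they are converted into probabilities with softmax: low temperatures sharpen the distribution toward the top token, high temperatures flatten it. The sketch below illustrates this with three hypothetical candidate tokens (the logit values are made up for demonstration):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Turn raw logits into probabilities, scaled by temperature.
    Lower temperature sharpens the distribution; higher flattens it.
    (Temperature 0 is handled by APIs as greedy argmax, since
    dividing by zero is undefined.)"""
    scaled = [score / temperature for score in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for three candidate next tokens
logits = [2.0, 1.0, 0.5]

low = softmax_with_temperature(logits, 0.2)   # near-deterministic
high = softmax_with_temperature(logits, 2.0)  # much more uniform

print(low)
print(high)
```

With temperature 0.2 the top token gets almost all of the probability mass, while at 2.0 the three candidates end up much closer together, so sampling becomes far more varied.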
Practical recommendation
For AI automations where you need reliable, structured output (JSON, data extraction), set temperature to 0 or a low value such as 0.2. For conversational chatbots, use 0.5–0.7.
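As an illustration, a request body for a chat-completions-style API might pin the temperature to 0 for a structured extraction task. Parameter names here follow the OpenAI Chat Completions API; the model name and prompt are placeholders:

```json
{
  "model": "gpt-4o",
  "temperature": 0,
  "messages": [
    {"role": "system", "content": "Extract the invoice total and return it as JSON."},
    {"role": "user", "content": "Invoice #123, total due: $49.90"}
  ]
}
```

With the same input, a temperature of 0 makes repeated calls return essentially the same answer, which is what you want when downstream code parses the result.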