Daniel Hladik, AI Automation Engineer


Chain of Thought (CoT)

A prompting technique that makes an LLM reason step by step, improving accuracy on complex tasks.

What is Chain of Thought?

Chain of Thought (CoT) is a prompting technique in which you explicitly ask the model to lay out its reasoning step by step. Instead of a one-sentence answer, the LLM first writes out intermediate thoughts and only then reaches the final result. On logical, math, and multi-step tasks this noticeably improves accuracy.

How to use CoT

  • Zero-shot CoT: Simply add a phrase like "Think step by step" or "First break down the problem, then answer" to the prompt
  • Few-shot CoT: Include several examples in the prompt that contain step-by-step solutions (builds on few-shot prompting)
  • Reasoning models: Modern models (e.g., o1 or Claude with extended thinking) have CoT built in and reason on their own
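
The two prompting variants above can be sketched as plain prompt builders. This is a minimal illustration: the trigger phrase and the worked example are typical choices, not fixed conventions.

```python
# Trigger phrase commonly used for zero-shot CoT.
ZERO_SHOT_TRIGGER = "Let's think step by step."

# A worked example whose answer spells out the reasoning (few-shot CoT).
FEW_SHOT_EXAMPLE = (
    "Q: A pack has 12 pencils. How many pencils are in 3 packs?\n"
    "A: One pack has 12 pencils. 3 packs have 3 * 12 = 36 pencils. "
    "The answer is 36.\n\n"
)

def zero_shot_cot(question: str) -> str:
    # Append the trigger so the model writes out its reasoning first.
    return f"Q: {question}\nA: {ZERO_SHOT_TRIGGER}"

def few_shot_cot(question: str) -> str:
    # Prepend worked examples; the model imitates their step-by-step style.
    return f"{FEW_SHOT_EXAMPLE}Q: {question}\nA:"

print(zero_shot_cot("What is 15% of 80?"))
```

Either string would then be sent to the model as the user message; with a reasoning model, neither addition is necessary.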

When CoT helps

  • Math and logic problems
  • Analysis of contracts and documents with many conditions
  • Decision-making in AI agents that have to pick the next action
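
In agent pipelines like the last case, the reasoning steps are for the model; downstream code usually only needs the conclusion. A common pattern (a sketch, assuming the prompt told the model to end with a `Final answer:` line, which is a convention, not a standard) is to parse the trace:

```python
import re

def extract_final_answer(cot_output: str) -> str:
    """Pull the final result out of a step-by-step response.

    Assumes the prompt instructed the model to end with a line
    'Final answer: ...'; falls back to the whole text otherwise.
    """
    match = re.search(r"Final answer:\s*(.+)", cot_output)
    return match.group(1).strip() if match else cot_output.strip()

trace = (
    "Step 1: The invoice total is 3 * 120 = 360.\n"
    "Step 2: A 10% discount applies, so 360 * 0.9 = 324.\n"
    "Final answer: 324"
)
print(extract_final_answer(trace))  # -> 324
```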

Downside: a longer answer means more output tokens, and therefore higher cost and latency.
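
The cost effect is easy to quantify, since output tokens are billed linearly. A rough sketch with illustrative numbers (the per-token price and token counts here are made up, not real rates):

```python
def completion_cost(output_tokens: int, usd_per_1k_tokens: float = 0.01) -> float:
    # Output tokens are billed linearly; the price here is illustrative only.
    return output_tokens / 1000 * usd_per_1k_tokens

direct = completion_cost(50)   # terse direct answer, ~50 tokens (assumed)
cot = completion_cost(400)     # same answer plus reasoning steps (assumed)
print(f"CoT answer costs {cot / direct:.0f}x the direct answer")
```

The same multiplier applies to generation time, since tokens are produced sequentially.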