
Temperature Scaling

Implement Temperature Scaling from “On Calibration of Modern Neural Networks” (Guo et al., 2017).

Temperature scaling is a simple post-hoc calibration method. A single scalar temperature T, fit on a held-out validation set by minimizing negative log-likelihood, divides the logits before the softmax:

$$\hat{q}_i = \text{softmax}(z_i / T)$$

  • T > 1: softer (more uniform) distribution — reduces overconfidence
  • T < 1: sharper (more peaked) distribution
  • T = 1: no change
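
The three regimes can be illustrated with a plain NumPy softmax (a sketch for intuition; the exercise itself may expect a tensor library):

```python
import numpy as np

def softmax(z):
    # subtract the max logit for numerical stability
    e = np.exp(z - z.max())
    return e / e.sum()

z = np.array([2.0, 1.0, 0.0])

p_base  = softmax(z / 1.0)  # T = 1: unchanged
p_soft  = softmax(z / 2.0)  # T = 2: pulled toward uniform, lower confidence
p_sharp = softmax(z / 0.5)  # T = 0.5: mass concentrates on the argmax
```

Note that dividing by T is monotone, so the predicted class (argmax) never changes; only the confidence does.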

Input:

  • logits: shape (batch, n_classes) — raw model outputs
  • temperature: float T > 0

Output: Tensor of shape (batch, n_classes) — calibrated probabilities.
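
A minimal NumPy sketch matching this input/output contract (the exercise likely expects PyTorch tensors, but the logic is identical):

```python
import numpy as np

def temperature_scale(logits, temperature):
    """Return softmax(logits / T) row-wise.

    logits: array-like of shape (batch, n_classes)
    temperature: scalar T > 0
    """
    if temperature <= 0:
        raise ValueError("temperature must be > 0")
    z = np.asarray(logits, dtype=np.float64) / temperature
    z -= z.max(axis=-1, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)
```

The max-subtraction before `exp` leaves the softmax value unchanged but prevents overflow for large logits.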

