Difficulty: medium · Category: primitives

Implement Cross-Entropy Loss

Implement the Cross-Entropy loss for multi-class classification.

Given class probabilities (after softmax) and integer class labels:

$$\text{CE} = -\frac{1}{N} \sum_{i=1}^{N} \log(p_{i, y_i})$$

where $p_{i, y_i}$ is the predicted probability for the true class of sample $i$.

Input:

  • probs: a 2D tensor of shape (N, C) with predicted probabilities (rows sum to 1)
  • targets: a 1D integer tensor of shape (N,) with class indices

Output: A scalar representing the mean cross-entropy loss

Note: Add a small epsilon (1e-7) to avoid log(0).
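As a sketch, the spec above can be implemented in a few lines of NumPy (the function name and signature here are my own, not part of the problem statement):

```python
import numpy as np

def cross_entropy(probs: np.ndarray, targets: np.ndarray, eps: float = 1e-7) -> float:
    """Mean cross-entropy loss from class probabilities and integer labels.

    probs:   (N, C) array of predicted probabilities, rows summing to 1
    targets: (N,) array of integer class indices
    """
    n = probs.shape[0]
    # Select p_{i, y_i}: the predicted probability of each sample's true class.
    true_class_probs = probs[np.arange(n), targets]
    # Add epsilon inside the log to avoid log(0), per the note above.
    return float(-np.mean(np.log(true_class_probs + eps)))
```

The fancy-indexing line `probs[np.arange(n), targets]` gathers one probability per row, which avoids building a one-hot matrix.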
