Implement Softmax

Implement the softmax function from scratch.

The softmax function converts a vector of real numbers into a probability distribution:

$$\text{softmax}(x_i) = \frac{e^{x_i}}{\sum_{j} e^{x_j}}$$

Input: A 1D tensor x of shape (n,)

Output: A 1D tensor of the same shape whose values are nonnegative and sum to 1.0

Note: For numerical stability, subtract the max value before exponentiating: $\text{softmax}(x_i) = \frac{e^{x_i - \max(x)}}{\sum_{j} e^{x_j - \max(x)}}$
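A minimal sketch of the stable version, assuming NumPy as the tensor library (the same logic ports directly to PyTorch or JAX):

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    """Softmax over a 1D array, using the max-subtraction trick."""
    # Shifting by max(x) leaves the result unchanged, since the
    # factor e^{-max(x)} cancels between numerator and denominator,
    # but it keeps every exponent <= 0 and avoids overflow.
    shifted = x - np.max(x)
    exps = np.exp(shifted)
    return exps / exps.sum()
```

Without the shift, an input like `np.array([1000.0, 1000.0])` would overflow `np.exp` to `inf`; with it, the function returns `[0.5, 0.5]` as expected.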
