Difficulty: easy · Tags: primitives

Implement Leaky ReLU

Implement the Leaky ReLU activation function.

$$\text{LeakyReLU}(x) = \begin{cases} x & \text{if } x > 0 \\ \alpha x & \text{if } x \leq 0 \end{cases}$$

where $\alpha$ is a small positive constant (default 0.01).

Input: a tensor `x` and a scalar `alpha` (default 0.01)

Output: a tensor of the same shape as `x`
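One possible solution sketch using NumPy (the function name and signature are assumed here, not mandated by the problem):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Apply Leaky ReLU elementwise: x where x > 0, alpha * x otherwise."""
    x = np.asarray(x, dtype=float)
    # np.where selects from x or alpha * x elementwise, preserving shape
    return np.where(x > 0, x, alpha * x)
```

For example, `leaky_relu(np.array([-2.0, 0.0, 3.0]))` returns `[-0.02, 0.0, 3.0]`: negative inputs are scaled by `alpha`, non-negative inputs pass through unchanged.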

Hints

Hint: review activation-function basics.