
Implement ReLU

Implement the ReLU (Rectified Linear Unit) activation function.

$$\text{ReLU}(x) = \max(0, x)$$

Input: A tensor x of any shape

Output: A tensor of the same shape with all negative values replaced by 0
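Since ReLU is a pure elementwise maximum, one line of NumPy covers tensors of any shape. A minimal sketch, assuming NumPy arrays as the tensor type (the function name `relu` is illustrative):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # Elementwise max(0, x): negatives become 0, non-negatives pass through.
    # np.maximum broadcasts the scalar 0, so the output shape matches x.
    return np.maximum(0, x)
```

For example, `relu(np.array([-2.0, 0.0, 3.0]))` returns `array([0., 0., 3.])`, and the same call works unchanged on a 2-D or higher-rank array.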

Hints

- Activation basics: ReLU passes positive values through unchanged and clamps negative values to zero, so an elementwise maximum with 0 is all you need.