Implement the ReLU (Rectified Linear Unit) activation function.
$$\text{ReLU}(x) = \max(0, x)$$
Input: A tensor x of any shape
Output: A tensor of the same shape with all negative values replaced by 0
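A minimal sketch of one way to implement this, assuming the input is a NumPy array (the function name `relu` is illustrative, not prescribed by the problem):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Element-wise ReLU: max(0, x), preserving the input shape."""
    # np.maximum broadcasts the scalar 0 against x of any shape
    return np.maximum(0, x)

# Example usage
print(relu(np.array([[-2.0, 0.0], [3.5, -0.1]])))
```

`np.maximum` computes the element-wise maximum, so negative entries become 0 while non-negative entries pass through unchanged, and the output shape matches the input.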