Two-Layer MLP

Build a simple two-layer feedforward neural network (MLP).

The forward pass computes:

  1. Hidden layer: $h = \text{ReLU}(x \cdot W_1 + b_1)$
  2. Output layer: $y = h \cdot W_2 + b_2$

Input:

  • x: input tensor of shape (batch, in_features)
  • W1: weight matrix of shape (in_features, hidden)
  • b1: bias vector of shape (hidden,)
  • W2: weight matrix of shape (hidden, out_features)
  • b2: bias vector of shape (out_features,)

Output: Tensor of shape (batch, out_features)
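The two steps above can be sketched in NumPy; the function name `mlp_forward` and the concrete shapes in the usage example are illustrative assumptions, not part of the problem statement:

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    # Hidden layer: h = ReLU(x @ W1 + b1)
    h = np.maximum(x @ W1 + b1, 0.0)
    # Output layer: y = h @ W2 + b2
    return h @ W2 + b2

# Usage with example shapes (batch=4, in_features=3, hidden=5, out_features=2)
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))   # (batch, in_features)
W1 = rng.standard_normal((3, 5))  # (in_features, hidden)
b1 = np.zeros(5)                  # (hidden,)
W2 = rng.standard_normal((5, 2))  # (hidden, out_features)
b2 = np.zeros(2)                  # (out_features,)

y = mlp_forward(x, W1, b1, W2, b2)
print(y.shape)  # (4, 2)
```

Note that the biases broadcast across the batch dimension, so no explicit tiling is needed.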
