
Weight Initialization

Compute the recommended initialization scale for neural network weights.

Xavier (Glorot) initialization — for layers with tanh/sigmoid: $$\text{scale} = \sqrt{\frac{2}{\text{fan\_in} + \text{fan\_out}}}$$

He (Kaiming) initialization — for layers with ReLU: $$\text{scale} = \sqrt{\frac{2}{\text{fan\_in}}}$$

Given fan_in and fan_out dimensions, compute both initialization scales.

Input:

  • fan_in: number of input units (int)
  • fan_out: number of output units (int)

Output: A dict with "xavier_scale" and "he_scale" (both floats).
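A minimal sketch of a solution, directly translating the two formulas above (the function name `init_scales` is just an illustrative choice, not part of the task spec):

```python
import math

def init_scales(fan_in: int, fan_out: int) -> dict:
    """Return the Xavier and He initialization scales for a layer."""
    return {
        # Xavier (Glorot): sqrt(2 / (fan_in + fan_out)), suited to tanh/sigmoid
        "xavier_scale": math.sqrt(2.0 / (fan_in + fan_out)),
        # He (Kaiming): sqrt(2 / fan_in), suited to ReLU
        "he_scale": math.sqrt(2.0 / fan_in),
    }

# Example: fan_in=128, fan_out=64
# he_scale = sqrt(2/128) = 0.125; xavier_scale = sqrt(2/192) ≈ 0.102
scales = init_scales(128, 64)
```

Note that He initialization ignores `fan_out` entirely: it only compensates for the variance halving that ReLU applies to its input.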
