neural_tangents.stax.LeakyRelu

neural_tangents.stax.LeakyRelu(alpha, do_stabilize=False)

Leaky ReLU nonlinearity, i.e. alpha * min(x, 0) + max(x, 0).

Parameters:
  • alpha (float) – slope for x < 0.

  • do_stabilize (bool) – set to True to improve numerical stability in very deep networks.

Return type:

tuple[InitFn, ApplyFn, LayerKernelFn]

Returns:

(init_fn, apply_fn, kernel_fn).
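The elementwise formula above can be sketched in plain NumPy for illustration (this is not the library's implementation; in neural_tangents the nonlinearity is applied through the returned apply_fn):

```python
import numpy as np

def leaky_relu(x, alpha):
    # Leaky ReLU: alpha * min(x, 0) + max(x, 0)
    # Negative inputs are scaled by alpha; non-negative inputs pass through.
    return alpha * np.minimum(x, 0) + np.maximum(x, 0)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x, alpha=0.1))
```

In a network, the layer is typically composed with other stax layers, e.g. stax.serial(stax.Dense(512), stax.LeakyRelu(0.1), stax.Dense(1)), which yields the combined (init_fn, apply_fn, kernel_fn) triple for the whole model.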