LayerNormalizationConfig

Module: fast_llm.layers.common.normalization.config

Variant of: NormalizationConfig — select with type: layer_norm

Inherits from: LayerNormalizationBaseConfig, NormalizationConfig, ModuleConfig

Fields

bias (architecture)

Type: ParameterConfig    Default: (sub-fields optional)

Configuration for the bias.

epsilon (architecture)

Type: float    Default: 1e-05

Small constant added to the variance to regularize the division and avoid dividing by zero.

weight (architecture)

Type: ParameterConfig    Default: (sub-fields optional)

Configuration for the weight.

zero_centered (architecture)

Type: bool    Default: False

Store the normalization weight as w = 1 + w', which improves numerical accuracy when the weight is close to one.
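To illustrate the zero-centered parameterization, here is a minimal NumPy sketch of layer normalization (not Fast-LLM's actual implementation) in which the stored parameter is w' and the effective scale is 1 + w', so a freshly initialized w' = 0 yields an identity scale:

```python
import numpy as np

def layer_norm_zero_centered(x, w_prime, bias, eps=1e-5):
    # Normalize over the last dimension.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # With zero_centered, the stored parameter is w' and the
    # effective weight is w = 1 + w'.
    return (1.0 + w_prime) * x_hat + bias
```

Storing w' instead of w keeps the parameter's magnitude small around initialization, which reduces floating-point rounding error when the weight stays near one.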

lr_scale (feature)

Type: float or None    Default: None

Scaling factor for the layer learning rate. Combines multiplicatively with the scale set by the parent layer and individual parameters, if applicable.

implementation (performance)

Type: NormalizationImplementation    Default: "auto"

The implementation to use for the normalization layer.
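As an illustration, a configuration fragment selecting this variant might look like the following. The field names and defaults are taken from this page; the surrounding `normalization:` key is an assumption about where this block sits in a larger layer configuration.

```yaml
normalization:
  type: layer_norm      # selects LayerNormalizationConfig
  epsilon: 1.0e-05      # regularizer for the division
  zero_centered: false  # store the weight as w = 1 + w'
  implementation: auto  # NormalizationImplementation
  lr_scale: null        # no extra learning-rate scaling
```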