LayerNormalizationBaseConfig

Abstract

This class cannot be instantiated directly. Use one of the variants listed below.

Module: fast_llm.layers.common.normalization.config

Inherits from: NormalizationConfig, ModuleConfig

Fields

epsilon (architecture)

Type: float    Default: 1e-05

Regularizer added to the variance before the division, to avoid dividing by zero.
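As a minimal sketch of the role epsilon plays (a generic layer-norm computation, not the library's actual implementation), the constant sits inside the square root so the denominator never reaches zero:

```python
import numpy as np

def layer_norm(x, weight, epsilon=1e-5):
    # Normalize over the last axis; epsilon regularizes the division so that
    # an all-constant input (variance 0) does not produce NaN/inf.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + epsilon) * weight
```

With a zero-variance input the output is simply zero instead of NaN, which is the behavior the regularizer buys.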

weight (architecture)

Type: ParameterConfig    Default: (sub-fields optional)

Configuration for the weight.

zero_centered (architecture)

Type: bool    Default: False

Store the normalization weight as w = 1 + w', to improve numerical accuracy when the weight is close to one.
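A minimal sketch of this parameterization (illustrative only, not the library's code): the stored parameter w' is initialized at zero and the effective weight is 1 + w', so values near one are represented as small offsets around zero rather than as perturbations of 1.0, where low-order bits are scarcer:

```python
import numpy as np

def layer_norm_zero_centered(x, w_prime, epsilon=1e-5):
    # w_prime is the stored parameter (initialized to 0); the effective
    # weight applied to the normalized activations is 1 + w_prime.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + epsilon) * (1.0 + w_prime)
```

With w' = 0 this reduces to a plain layer norm with unit weight, which matches the default initialization.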

lr_scale (feature)

Type: float or None    Default: None

Scaling factor for the layer's learning rate. Combines multiplicatively with the scales set by the parent layer and by individual parameters, where applicable.

implementation (performance)

Type: NormalizationImplementation    Default: "auto"

The implementation to use for the normalization layer.
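To summarize the fields above, a hedged sketch of a configuration as a plain dictionary (the field names and defaults come from this page; how a concrete variant of `LayerNormalizationBaseConfig` is actually constructed is not shown here and is an assumption):

```python
# Field values mirroring this page's documented defaults, overriding two of
# them. This is an illustrative dict, not a call into fast_llm's API.
norm_config = {
    "epsilon": 1e-5,          # regularizer for the division
    "zero_centered": True,    # store the weight as w = 1 + w'
    "lr_scale": 0.5,          # multiplies the parent layer's LR scale
    "implementation": "auto", # let the library pick the kernel
}
```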