NormalizationConfig

Abstract

This class cannot be instantiated directly. Use one of the variants listed below.

Module: fast_llm.layers.common.normalization.config

Inherits from: ModuleConfig

Fields

lr_scale

Type: float or None    Default: None

Scaling factor for the layer learning rate. Combines multiplicatively with the scale set by the parent layer and individual parameters, if applicable.
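A hedged sketch of how this field might appear in a YAML config. The `normalization` key and the sibling `lr_scale` on the parent layer are assumptions for illustration; only the multiplicative combination is stated by this page.

```yaml
# Hypothetical layer config: the effective learning-rate scale for the
# normalization weights is the product of the applicable scales,
# e.g. 0.5 (parent layer) * 0.1 (normalization) = 0.05.
lr_scale: 0.5
normalization:
  type: rms_norm
  lr_scale: 0.1
```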

Variants

Select a variant by setting `type:` to one of the following values.

| `type` value | Class | Description |
| --- | --- | --- |
| `gated_rms_norm` | GatedRMSNormalizationConfig | Configuration for gated RMS normalization, which applies a learned activation gate alongside the norm weight |
| `layer_norm` | LayerNormalizationConfig | Configuration for layer normalization |
| `none` | NoNormalizationConfig | Disables normalization |
| `rms_norm` | RMSNormalizationConfig | Configuration for RMS normalization |
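A minimal sketch of selecting a variant. The surrounding `normalization` key is an assumption about where this config nests; the `type` values are the ones listed above.

```yaml
# Hypothetical parent-layer config selecting the layer-norm variant.
normalization:
  type: layer_norm
```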

Used in