| Config class | Description |
| --- | --- |
| FixedRMSNormConfig | RMS normalization without a learnable weight (fixed unit scale). Used for value norms in Gemma-family models. |
| GatedRMSNormalizationConfig | Configuration for gated RMS normalization, which applies a learned activation gate alongside the norm weight. |
| LayerNormalizationBaseConfig (abstract) | Common base configuration shared by layer norm and RMS norm. |
| LayerNormalizationConfig | Configuration for standard layer normalization. |
| NoNormalizationConfig | Disables normalization (identity pass-through). |
| NormalizationConfig (abstract) | Abstract base configuration for normalization layers. Use `type: layer_norm`, `rms_norm`, `gated_rms_norm`, or `none`. |
| RMSNormalizationConfig | Configuration for RMS normalization with a learnable weight. |
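To make the distinction between the variants concrete, here is a minimal numpy sketch of the three RMS-norm flavors named above. This is an illustrative assumption, not the library's implementation: the function names, the treatment of the gate (elementwise SiLU applied to the normalized output), and `eps` default are all hypothetical, chosen to match common formulations.

```python
import numpy as np

def rms_norm(x, weight=None, eps=1e-6):
    """RMS-normalize x over its last axis.

    weight=None corresponds to the fixed (unit-scale) variant,
    as in FixedRMSNormConfig; passing a learned weight vector
    corresponds to the standard RMS norm.
    """
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    y = x / rms
    return y if weight is None else y * weight

def gated_rms_norm(x, weight, gate, eps=1e-6):
    """Gated variant (assumed scheme): the normalized, scaled output
    is multiplied elementwise by SiLU(gate). The actual gating
    formulation in the library may differ."""
    silu = gate / (1.0 + np.exp(-gate))  # silu(g) = g * sigmoid(g)
    return rms_norm(x, weight, eps) * silu
```

The fixed variant is just the standard one with the weight omitted, which matches the table's contrast between FixedRMSNormConfig and RMSNormalizationConfig.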