LoRAConfig

Module: fast_llm.layers.common.peft.config

Variant of: PeftConfig (select with `type: lora`)

Inherits from: PeftConfig

Fields

alpha

Type: float    Default: 8.0

The LoRA scaling parameter (alpha). In standard LoRA, the low-rank update is scaled by alpha / rank before being added to the frozen layer's output.

dropout

Type: float    Default: 0.0

Dropout rate applied to the input of the LoRA adapter.

freeze_others

Type: bool    Default: True

Whether to freeze all non-LoRA parameters during training.

rank

Type: int    Default: 8

The LoRA rank, i.e. the size of the intermediate dimension.
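To illustrate how these fields interact, here is a minimal NumPy sketch of a standard LoRA forward pass. It is not Fast-LLM's implementation; the function name, argument names, and shapes are assumptions made for this example. It shows the roles of `rank` (the intermediate dimension of the two adapter matrices), `alpha` (the numerator of the `alpha / rank` scaling factor), and `dropout` (applied to the adapter's input):

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=8.0, rank=8, dropout=0.0, rng=None):
    """Illustrative LoRA forward pass: y = x @ W + (alpha / rank) * drop(x) @ A @ B.

    W is the frozen base weight; A (in_dim x rank) and B (rank x out_dim)
    are the trainable low-rank adapter matrices. This is a sketch of the
    standard LoRA formulation, not Fast-LLM's actual code.
    """
    h = x
    if dropout > 0.0 and rng is not None:
        # Inverted dropout on the adapter input only; the frozen path is untouched.
        mask = (rng.random(h.shape) >= dropout) / (1.0 - dropout)
        h = h * mask
    return x @ W + (alpha / rank) * (h @ A @ B)

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 16))
W = rng.standard_normal((16, 32))   # frozen base weight
A = rng.standard_normal((16, 8)) * 0.01
B = np.zeros((8, 32))               # B starts at zero, so the adapter is initially a no-op
y = lora_forward(x, W, A, B, alpha=8.0, rank=8)
assert np.allclose(y, x @ W)        # with B = 0, the output equals the base layer's
```

Note that with the defaults (`alpha=8.0`, `rank=8`) the scaling factor is exactly 1, so the adapter update is added unscaled; raising `alpha` while keeping `rank` fixed amplifies the adapter's contribution.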