LLM Parameters

These parameters apply only to experiments in the Text Classification category.

To set LoRA-related parameters, first enable LoRA using the following command:

trainingObject.enable_lora(True)

Once LoRA is enabled, the following parameters can be set for the experiment (a sketch of how they interact appears after this list):

  • lora_r: Specifies the rank for the LoRA layer. Must be a positive integer. The default value is 256.
  • lora_alpha: Defines the scaling factor alpha for LoRA. Must be a positive integer. The default value is 512.
  • lora_dropout: Sets the dropout rate for LoRA layers. Must be a float between 0 and 1. The default value is 0.05.
  • q_lora: Enables or disables QLoRA (quantized LoRA). Must be a boolean value (True or False). The default value is False.
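
To make these parameters concrete, here is a minimal, hypothetical PyTorch sketch of a LoRA-wrapped linear layer; the LoRALinear class is illustrative only and not part of this library. It shows how lora_r sets the rank of the two adapter matrices, how lora_dropout is applied to the adapter input, and how the update is scaled by lora_alpha / lora_r (with the defaults above, 512 / 256 = 2.0):

import torch.nn as nn

class LoRALinear(nn.Module):
    # Illustrative only: shows where lora_r, lora_alpha, and
    # lora_dropout enter the computation. Not this library's code.
    def __init__(self, base: nn.Linear, lora_r=256, lora_alpha=512, lora_dropout=0.05):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # pretrained weights stay frozen

        # Low-rank adapter pair: A projects down to rank r, B projects back up.
        self.lora_A = nn.Linear(base.in_features, lora_r, bias=False)
        self.lora_B = nn.Linear(lora_r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_B.weight)  # adapters start as a no-op

        self.dropout = nn.Dropout(lora_dropout)
        self.scaling = lora_alpha / lora_r  # effective scale of the update

    def forward(self, x):
        # Frozen base output plus the scaled low-rank update.
        return self.base(x) + self.scaling * self.lora_B(self.lora_A(self.dropout(x)))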

Use the following command to set any or all of these parameters; the default value of each parameter is shown in the example:

trainingObject.set_lora_parameters(q_lora=False, lora_alpha=512, lora_dropout=0.05, lora_r=256)
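
For reference, a hypothetical end-to-end configuration using only the two documented calls might look like the following; the non-default values are illustrative, and trainingObject is assumed to have been created as described earlier in these docs:

trainingObject.enable_lora(True)  # LoRA must be enabled before setting parameters

trainingObject.set_lora_parameters(
    lora_r=128,       # lower rank: fewer trainable adapter parameters
    lora_alpha=256,   # preserves alpha / r = 2.0, the same ratio as the defaults
    lora_dropout=0.1,
    q_lora=True,      # enable QLoRA
)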

Note: These parameters are supported only for PyTorch.