Fine-Tuning Parameters Guide

Adjusting the fine-tuning configuration parameters allows you to customize the training process to better suit your data and use case. Here's a list of the supported parameters and their impact on the fine-tuning process:

Learning Rate

  • Controls how large each weight update is, and therefore how quickly the model adapts to the training data.

  • Lower rates are ideal for minor adjustments, while higher rates speed up learning but risk overshooting.

  • Default value: 0.0001
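As a rough illustration (a sketch of plain gradient descent, not this service's internals), the learning rate scales each update to the model's weights; the weight and gradient values below are hypothetical:

```python
def sgd_step(weight: float, gradient: float, learning_rate: float = 0.0001) -> float:
    """Return a single weight after one gradient-descent update."""
    return weight - learning_rate * gradient

# A larger learning rate moves the weight further in one step,
# which speeds up learning but risks overshooting the optimum.
small_step = sgd_step(1.0, 2.0, learning_rate=0.0001)  # close to 0.9998
large_step = sgd_step(1.0, 2.0, learning_rate=0.01)    # close to 0.98
```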

Batch Size

  • Specifies the number of training examples processed together in each training step.

  • Smaller batch sizes can improve accuracy but take longer to train.

  • Default value: 16
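For a fixed dataset, the batch size determines how many optimization steps make up one epoch; this sketch uses a hypothetical dataset of 1,000 examples:

```python
import math

def steps_per_epoch(num_examples: int, batch_size: int = 16) -> int:
    """Batches needed to cover the dataset once (the last batch may be partial)."""
    return math.ceil(num_examples / batch_size)

# Smaller batches mean more (and noisier) updates per epoch,
# which is why training takes longer.
many_steps = steps_per_epoch(1000, batch_size=16)  # 63 steps per epoch
few_steps = steps_per_epoch(1000, batch_size=64)   # 16 steps per epoch
```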

Epochs

  • Indicates how many times the model will iterate through the entire dataset.

  • More epochs can improve accuracy but increase computation time.

  • Default value: 10
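The epoch count multiplies the per-epoch step count into the total number of training updates, which is what drives computation time; the dataset size below is hypothetical:

```python
import math

def total_steps(num_examples: int, batch_size: int, epochs: int = 10) -> int:
    """Total optimization steps across the whole training run."""
    return epochs * math.ceil(num_examples / batch_size)

# Doubling the epochs doubles the compute spent on training.
steps = total_steps(1000, batch_size=16, epochs=10)  # 630 total steps
```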

LoRA Rank

  • The rank of the low-rank matrices that LoRA trains in place of each full weight-matrix update.

  • Higher ranks can capture more information but require more memory and computation.

  • Default value: 16
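To see why rank drives memory and computation: LoRA approximates a full d × k weight update with two factors A (d × r) and B (r × k), so the trainable parameter count grows linearly with the rank r. The layer dimensions below are hypothetical:

```python
def lora_trainable_params(d: int, k: int, rank: int = 16) -> int:
    """Trainable parameters for one LoRA-adapted d x k weight matrix."""
    return d * rank + rank * k  # params in A (d x r) plus B (r x k)

full_matrix = 4096 * 4096                              # 16,777,216 params
lora_params = lora_trainable_params(4096, 4096, 16)    # 131,072 params
```

At rank 16 on a 4096 × 4096 layer, LoRA trains well under 1% of the parameters a full update would; doubling the rank doubles that count.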