mmlearn.modules.lr_schedulers.linear_warmup_cosine_lr

Linear warmup cosine annealing learning rate scheduler.

Functions

linear_warmup_cosine_annealing_lr(optimizer, warmup_steps, max_steps, start_factor=0.3333333333333333, eta_min=0.0, last_epoch=-1)[source]

Create a linear warmup cosine annealing learning rate scheduler.

Parameters:
  • optimizer (Optimizer) – The optimizer for which to schedule the learning rate.

  • warmup_steps (int) – Number of iterations for the linear warmup phase.

  • max_steps (int) – Total number of iterations (warmup plus cosine annealing).

  • start_factor (float, optional, default=1/3) – Multiplicative factor for the learning rate at the start of the warmup phase.

  • eta_min (float, optional, default=0) – Minimum learning rate.

  • last_epoch (int, optional, default=-1) – The index of the last epoch. If set to -1, the learning rate is initialized to the base learning rate.

Returns:

The learning rate scheduler.

Return type:

LRScheduler

Raises:

ValueError – If warmup_steps is greater than or equal to max_steps or if warmup_steps is less than or equal to 0.
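
Example

A minimal usage sketch (not from the source docs): the linear model, SGD optimizer, and step counts below are illustrative placeholders. The scheduler is stepped once per training iteration, following the usual PyTorch pattern for step-based schedulers.

import torch

from mmlearn.modules.lr_schedulers.linear_warmup_cosine_lr import (
    linear_warmup_cosine_annealing_lr,
)

model = torch.nn.Linear(16, 4)  # placeholder model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Warm up linearly for 500 steps (starting at lr * start_factor), then
# anneal with a cosine schedule until step 10_000, decaying toward eta_min.
scheduler = linear_warmup_cosine_annealing_lr(
    optimizer,
    warmup_steps=500,
    max_steps=10_000,
    start_factor=1 / 3,
    eta_min=1e-6,
)

for step in range(10_000):
    optimizer.step()   # parameter update for the current batch (loss/backward omitted)
    scheduler.step()   # advance the learning rate once per iteration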
