Scheduled Optim Pretraining
ScheduledOptimPretraining
Bases: Optimizer
DEPRECATED: moved to AcousticModule.
A custom optimizer that uses AdamW for optimization and a LambdaLR for learning rate scheduling.
Source code in notebooks/experiments/optimizer/scheduled_optim_pretraining.py
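The description above names the two building blocks. Below is a minimal sketch of how AdamW and LambdaLR are typically combined, not the actual implementation in the linked source; the hyperparameters and the warmup constant are assumptions for illustration only.

```python
import torch
from torch import nn
from torch.optim import AdamW
from torch.optim.lr_scheduler import LambdaLR

# Toy model standing in for the acoustic model.
model = nn.Linear(80, 80)

# AdamW does the parameter updates; the betas/eps values here are assumed.
optimizer = AdamW(model.parameters(), lr=1.0, betas=(0.9, 0.98), eps=1e-9)

# LambdaLR scales the base lr by a step-dependent factor.
# Transformer-style warmup/decay; the warmup length of 4000 is an assumption.
warmup = 4000
scheduler = LambdaLR(
    optimizer,
    lr_lambda=lambda step: min((step + 1) ** -0.5, (step + 1) * warmup ** -1.5),
)

for _ in range(3):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 80)).pow(2).mean()
    loss.backward()
    optimizer.step()   # parameter update
    scheduler.step()   # advance the learning-rate schedule
```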
__init__(train_config, model_config, parameters, defaults={}, step=0)
Initializes the ScheduledOptimPretraining optimizer.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
train_config | AcousticPretrainingConfig | The training configuration. | required |
model_config | AcousticModelConfigType | The model configuration. | required |
parameters | Iterable | The model parameters to optimize. | required |
defaults | Dict[str, Any] | Default optimization options. Defaults to an empty dictionary. | {} |
step | int | The current training step. Defaults to 0. | 0 |
Source code in notebooks/experiments/optimizer/scheduled_optim_pretraining.py
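Construction follows the signature above. The snippet below is a hypothetical usage sketch: train_config, model_config, and model are assumed to already exist and are not defined on this page.

```python
# Hypothetical usage; the config objects are assumed to be instantiated elsewhere.
optimizer = ScheduledOptimPretraining(
    train_config=train_config,      # AcousticPretrainingConfig
    model_config=model_config,      # AcousticModelConfigType
    parameters=model.parameters(),  # iterable of model parameters
    step=0,                         # set to a later step when resuming training
)
```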
get_lr()
load_state_dict(state_dict)
Loads the optimizer state dictionary.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
state_dict | Dict[str, Any] | The optimizer state dictionary. | required |
Source code in notebooks/experiments/optimizer/scheduled_optim_pretraining.py
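Because the class subclasses Optimizer, state can be round-tripped the usual PyTorch way. The sketch below assumes the class follows the standard state_dict/load_state_dict contract; the checkpoint filename is illustrative.

```python
import torch

# Save the optimizer state alongside the model checkpoint (illustrative path).
torch.save(optimizer.state_dict(), "optim_checkpoint.pt")

# ...later, after rebuilding the optimizer with the same configs...
state = torch.load("optim_checkpoint.pt")
optimizer.load_state_dict(state)
```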
step(closure)
get_lr_lambda(model_config, train_config, current_step=0)
DEPRECATED: moved to AcousticModule. Returns the custom lambda function for the learning rate schedule.
Returns:

Type | Description |
---|---|
function | The custom lambda function for the learning rate schedule. |
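A sketch of how the returned lambda could be used, assuming model_config, train_config, and model exist; the actual schedule the lambda encodes is defined in the linked source file, not here.

```python
from torch.optim import AdamW
from torch.optim.lr_scheduler import LambdaLR

# get_lr_lambda returns the scaling function for the schedule; plug it into LambdaLR.
lr_lambda = get_lr_lambda(model_config, train_config, current_step=0)

optimizer = AdamW(model.parameters(), lr=1.0)  # base lr of 1.0 is an assumption
scheduler = LambdaLR(optimizer, lr_lambda=lr_lambda)
```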