fl4health.clients.scaffold_client module¶
- class DPScaffoldClient(data_path, metrics, device, loss_meter_type=LossMeterType.AVERAGE, checkpoint_and_state_module=None, reporters=None, progress_bar=False, client_name=None)[source]¶
Bases:
ScaffoldClient
,InstanceLevelDpClient
Federated Learning client for the Instance Level Differentially Private Scaffold strategy.
Implemented as specified in https://arxiv.org/abs/2111.09278.
- class ScaffoldClient(data_path, metrics, device, loss_meter_type=LossMeterType.AVERAGE, checkpoint_and_state_module=None, reporters=None, progress_bar=False, client_name=None)[source]¶
Bases:
BasicClient
- __init__(data_path, metrics, device, loss_meter_type=LossMeterType.AVERAGE, checkpoint_and_state_module=None, reporters=None, progress_bar=False, client_name=None)[source]¶
Federated Learning Client for Scaffold strategy.
Implementation based on https://arxiv.org/pdf/1910.06378.pdf.
- Parameters:
data_path (Path) – path to the data to be used to load the data for client-side training
metrics (Sequence[Metric]) – Metrics to be computed based on the labels and predictions of the client model
device (torch.device) – Device indicator for where to send the model, batches, labels etc. Often ‘cpu’ or ‘cuda’
loss_meter_type (LossMeterType, optional) – Type of meter used to track and compute the losses over each batch. Defaults to LossMeterType.AVERAGE.
checkpoint_and_state_module (ClientCheckpointAndStateModule | None, optional) – A module meant to handle both checkpointing and state saving. The module, and its underlying model and state checkpointing components will determine when and how to do checkpointing during client-side training. No checkpointing (state or model) is done if not provided. Defaults to None.
reporters (Sequence[BaseReporter] | None, optional) – A sequence of FL4Health reporters which the client should send data to. Defaults to None.
progress_bar (bool, optional) – Whether or not to display a progress bar during client training and validation. Uses tqdm. Defaults to False.
client_name (str | None, optional) – An optional client name that uniquely identifies a client. If not passed, a hash is randomly generated. Client state will use this as part of its state file name. Defaults to None.
- compute_parameters_delta(params_1, params_2)[source]¶
Computes the element-wise difference of two lists of NDArrays, where elements in params_2 are subtracted from the corresponding elements in params_1.
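As a rough illustration (a standalone sketch, not the library's actual implementation), the operation amounts to a pairwise subtraction over the two weight lists:

```python
import numpy as np

# Illustrative re-implementation of the element-wise delta described above:
# elements of params_2 are subtracted from the corresponding elements of params_1.
def compute_parameters_delta(params_1, params_2):
    return [p1 - p2 for p1, p2 in zip(params_1, params_2)]

updated = [np.array([3.0, 5.0]), np.array([[2.0]])]
previous = [np.array([1.0, 2.0]), np.array([[0.5]])]
delta = compute_parameters_delta(updated, previous)  # [array([2., 3.]), array([[1.5]])]
```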
- compute_updated_control_variates(local_steps, delta_model_weights, delta_control_variates)[source]¶
Computes the updated local control variates according to option 2 in Equation 4 of the paper (https://arxiv.org/pdf/1910.06378.pdf).
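Option 2 of Equation 4 in the SCAFFOLD paper updates the local control variate as c_i ← c_i − c + (x − y_i) / (K·η), where x − y_i is the change in model weights over K local steps taken with learning rate η. A minimal layer-wise sketch (standalone, with illustrative values; not the library's code):

```python
import numpy as np

def updated_control_variates(client_cv, server_cv, delta_model_weights, local_steps, learning_rate):
    # c_i_plus = c_i - c + (x - y_i) / (K * lr), applied layer-wise.
    # delta_model_weights is x - y_i: initial weights minus post-training weights.
    coef = 1.0 / (local_steps * learning_rate)
    return [
        c_i - c + coef * delta_w
        for c_i, c, delta_w in zip(client_cv, server_cv, delta_model_weights)
    ]

c_i = [np.array([0.1])]      # local control variate
c = [np.array([0.3])]        # server control variate
delta_w = [np.array([2.0])]  # x - y_i: the weights moved by -2.0 over training
new_c_i = updated_control_variates(c_i, c, delta_w, local_steps=4, learning_rate=0.5)
# 0.1 - 0.3 + 2.0 / (4 * 0.5) = 0.8
```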
- get_parameter_exchanger(config)[source]¶
Returns a Full Parameter Exchanger. Subclasses that require custom Parameter Exchangers can override this.
- Parameters:
config (Config) – The config from server.
- Returns:
Used to exchange parameters between server and client.
- Return type:
ParameterExchanger
- get_parameters(config)[source]¶
Packs the model parameters and control variates into a single NDArrays object to be sent to the server for aggregation.
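Conceptually, the packing concatenates the two weight lists into one flat NDArrays, which the receiver can split again at a known offset. A hypothetical sketch (the helper names here are illustrative, not the library's packer API):

```python
import numpy as np

def pack(model_weights, control_variates):
    # One flat list; the receiver splits it knowing the model has
    # len(model_weights) arrays at the front.
    return model_weights + control_variates

def unpack(packed, num_model_arrays):
    return packed[:num_model_arrays], packed[num_model_arrays:]

weights = [np.ones(2), np.zeros(3)]
cvs = [np.full(2, 0.5), np.full(3, -0.5)]
packed = pack(weights, cvs)
restored_weights, restored_cvs = unpack(packed, len(weights))
```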
- modify_grad()[source]¶
Modifies the gradient of the local model to correct for client drift. To be called after the gradients have been computed on a batch of data. Updates are not applied to the parameters until step is called on the optimizer.
- Return type:
None
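The drift correction replaces the raw gradient g with g − c_i + c (client control variate subtracted, server control variate added), per the SCAFFOLD local update rule. A self-contained PyTorch sketch, with hypothetical tensors standing in for the stored control variates:

```python
import torch

def modify_grad(model, client_cv, server_cv):
    # g <- g - c_i + c, applied in place to each parameter's gradient.
    for param, c_i, c in zip(model.parameters(), client_cv, server_cv):
        if param.grad is not None:
            param.grad += c - c_i

model = torch.nn.Linear(1, 1, bias=False)
loss = model(torch.ones(1, 1)).sum()
loss.backward()  # grad w.r.t. the single weight is 1.0
client_cv = [torch.tensor([[0.5]])]  # illustrative local control variate
server_cv = [torch.tensor([[1.5]])]  # illustrative server control variate
modify_grad(model, client_cv, server_cv)  # grad becomes 1.0 + 1.5 - 0.5 = 2.0
```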
- set_parameters(parameters, config, fitting_round)[source]¶
Assumes that the parameters being passed contain model parameters concatenated with server control variates. They are unpacked for the clients to use in training. If it is the first time the model is being initialized, we assume the full model is being initialized and use the FullParameterExchanger() to set all model weights.
- Parameters:
parameters (NDArrays) – Parameters containing information about the model state to be added to the relevant client model and also the server control variates (initial or after aggregation)
config (Config) – The config is sent by the FL server to allow for customization in the function if desired.
- Return type:
None
- setup_client(config)[source]¶
Set dataloaders, optimizers, parameter exchangers and other attributes derived from these. Then set initialized attribute to True. Extends the basic client to extract the learning rate from the optimizer and set the learning_rate attribute (used to compute updated control variates).
- Parameters:
config (Config) – The config from the server.
- Return type:
None
- transform_gradients(losses)[source]¶
Hook function for model training, called only after the backward pass but before the optimizer step. Used to modify the gradient to correct for client drift in Scaffold.
- Return type:
None
- update_after_train(local_steps, loss_dict, config)[source]¶
Called after training with the number of local_steps performed over the FL round and the corresponding loss dictionary.
- Return type:
None
- update_control_variates(local_steps)[source]¶
Updates the local control variates along with the corresponding updates according to option 2 in Equation 4 of https://arxiv.org/pdf/1910.06378.pdf. To be called after the weights of the local model have been updated.
- Return type:
None