Auto-tuning Module References

federatedscope.autotune.choice_types

class federatedscope.autotune.choice_types.Continuous(lb, ub)[source]

Represents a continuous search space, e.g., in the range [0.001, 0.1].

grid(grid_cnt)[source]

Generate a given number of grids from this search space.

Parameters

grid_cnt (int) – the number of grids.

Returns

the generated grid values.

Return type

list of float

sample()[source]

Sample a value from this search space.

Returns

the sampled value.

Return type

float
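A minimal usage sketch, relying only on the signatures documented above (whether grid() spaces its values linearly or logarithmically is an implementation detail):

    from federatedscope.autotune.choice_types import Continuous

    # A continuous search space over the learning-rate range [0.001, 0.1].
    lr_space = Continuous(0.001, 0.1)

    lr = lr_space.sample()      # one random value from [0.001, 0.1]
    lr_grid = lr_space.grid(5)  # 5 grid values spanning the range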

class federatedscope.autotune.choice_types.Discrete(*args)[source]

Represents a discrete search space, e.g., {‘abc’, ‘ijk’, ‘xyz’}.

sample()[source]

Sample a value from this search space.

Returns

the sampled value.

Return type

depends on the original choices.
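A minimal sketch; since sample() returns one of the original choices, its return type follows from theirs:

    from federatedscope.autotune.choice_types import Discrete

    # A discrete search space over three candidate optimizer names.
    opt_space = Discrete('sgd', 'adam', 'rmsprop')

    opt = opt_space.sample()  # one of 'sgd', 'adam', 'rmsprop' (a str)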

federatedscope.autotune.choice_types.discretize(contd_choices, num_bkt)[source]

Discretize a given continuous search space into the given number of buckets.

Parameters
  • contd_choices (Continuous) – continuous choices.

  • num_bkt (int) – number of buckets.

Returns

discretized choices.

Return type

Discrete
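A minimal sketch combining the two space types (how each bucket is represented, e.g., by its midpoint, is an implementation detail):

    from federatedscope.autotune.choice_types import Continuous, discretize

    lr_space = Continuous(0.001, 0.1)

    # Split the range into 4 buckets, yielding a Discrete search space.
    lr_choices = discretize(lr_space, 4)
    lr = lr_choices.sample()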

federatedscope.autotune.algos

class federatedscope.autotune.algos.IterativeScheduler(cfg)[source]

Bases: federatedscope.autotune.algos.ModelFreeBase

The base class for HPO algorithms that divide the whole optimization procedure into iterations.

optimize()[source]

Optimize the hyperparameters, i.e., execute the HPO algorithm and return the results.

class federatedscope.autotune.algos.ModelFreeBase(cfg)[source]

Bases: federatedscope.autotune.algos.Scheduler

Tries a collection of configurations exhaustively.

optimize()[source]

Optimize the hyperparameters, i.e., execute the HPO algorithm and return the results.

class federatedscope.autotune.algos.SHAWrapFedex(cfg)[source]

Bases: federatedscope.autotune.algos.SuccessiveHalvingAlgo

This SHA is customized as a wrapper for FedEx algorithm.

class federatedscope.autotune.algos.Scheduler(cfg)[source]

Bases: object

The base class for describing HPO algorithms.

optimize()[source]

Optimize the hyperparameters, i.e., execute the HPO algorithm and return the results.

class federatedscope.autotune.algos.SuccessiveHalvingAlgo(cfg)[source]

Bases: federatedscope.autotune.algos.IterativeScheduler

Successive Halving Algorithm (SHA) tailored to the FL setting, where, in each iteration, only a limited number of communication rounds is allowed for each trial.
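For intuition, a schematic sketch of the successive-halving loop, not the library's implementation; evaluate is a hypothetical callback that runs one trial for a given budget of communication rounds and returns a validation loss:

    def successive_halving(candidates, evaluate, elim_rate=2, init_budget=1):
        # Schematic SHA loop. `evaluate(cfg, budget)` is assumed to run
        # `budget` FL communication rounds with configuration `cfg` and
        # return a validation loss (lower is better).
        budget = init_budget
        while len(candidates) > 1:
            # Evaluate every surviving trial under the current round budget.
            scored = [(evaluate(cfg, budget), cfg) for cfg in candidates]
            scored.sort(key=lambda pair: pair[0])
            # Keep the best 1/elim_rate fraction of trials; drop the rest.
            keep = max(1, len(candidates) // elim_rate)
            candidates = [cfg for _, cfg in scored[:keep]]
            # Survivors receive a proportionally larger budget next round.
            budget *= elim_rate
        return candidates[0]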

class federatedscope.autotune.algos.TrialExecutor(cfg_idx, signal, returns, trial_config)[source]

Bases: threading.Thread

This class is responsible for executing the FL procedure with a given trial configuration in another thread.

run()[source]

Method representing the thread’s activity.

You may override this method in a subclass. The standard run() method invokes the callable object passed to the object’s constructor as the target argument, if any, with sequential and keyword arguments taken from the args and kwargs arguments, respectively.
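An illustrative sketch of this thread pattern; the constructor arguments match the signature above, while run_fl_trial is a hypothetical stand-in for the actual FL procedure:

    import threading

    def run_fl_trial(trial_config):
        # Hypothetical stand-in for launching one FL run; returns a metric.
        return 0.0

    class TrialThread(threading.Thread):
        # Simplified analogue of TrialExecutor.
        def __init__(self, cfg_idx, signal, returns, trial_config):
            super().__init__()
            self.cfg_idx = cfg_idx            # index of this trial's config
            self.signal = signal              # threading.Event set when done
            self.returns = returns            # shared container for results
            self.trial_config = trial_config

        def run(self):
            # Execute the trial and report its result back to the scheduler.
            self.returns[self.cfg_idx] = run_fl_trial(self.trial_config)
            self.signal.set()

The scheduler can then start several such threads in parallel, wait on their signals, and read each trial's outcome from the shared returns container.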

federatedscope.autotune.algos.get_scheduler(init_cfg)[source]

To instantiate a scheduler object for conducting HPO.

Parameters

init_cfg (yacs.Node) – the configuration.
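A typical entry point might look as follows, a sketch assuming the standard yacs workflow; the file name hpo_config.yaml is hypothetical:

    from federatedscope.core.configs.config import global_cfg
    from federatedscope.autotune.algos import get_scheduler

    init_cfg = global_cfg.clone()
    init_cfg.merge_from_file('hpo_config.yaml')  # hypothetical HPO config

    scheduler = get_scheduler(init_cfg)
    results = scheduler.optimize()  # run the HPO algorithm, collect results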