Auto-tuning Module References

federatedscope.autotune.choice_types

class federatedscope.autotune.choice_types.Continuous(lb, ub)[source]

Represents a continuous search space, e.g., in the range [0.001, 0.1].

grid(grid_cnt)[source]

Generate a given number of grids from this search space.

Parameters

grid_cnt (int) – the number of grids.

Returns

the generated grid values.

Return type

list

sample()[source]

Sample a value from this search space.

Returns

the sampled value.

Return type

float
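
A minimal usage sketch based on the signatures above (the variable names and the learning-rate interpretation are illustrative, and the grid spacing depends on the implementation):

    from federatedscope.autotune.choice_types import Continuous

    lr_space = Continuous(0.001, 0.1)  # e.g., a learning-rate range
    lr = lr_space.sample()             # a single float drawn from [0.001, 0.1]
    grids = lr_space.grid(5)           # 5 grid values covering the space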

class federatedscope.autotune.choice_types.Discrete(*args)[source]

Represents a discrete search space, e.g., {‘abc’, ‘ijk’, ‘xyz’}.

sample()[source]

Sample a value from this search space.

Returns

the sampled value.

Return type

depends on the original choices.

federatedscope.autotune.choice_types.discretize(contd_choices, num_bkt)[source]

Discretize a given continuous search space into the given number of buckets.

Parameters
  • contd_choices (Continuous) – continuous choices.

  • num_bkt (int) – number of buckets.

Returns

discretized choices.

Return type

Discrete
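
A corresponding sketch for discrete spaces (the concrete choices are illustrative):

    from federatedscope.autotune.choice_types import Continuous, Discrete, discretize

    act_space = Discrete('relu', 'tanh', 'sigmoid')  # Discrete(*args)
    act = act_space.sample()                         # one of the three choices

    lr_space = Continuous(0.001, 0.1)
    lr_buckets = discretize(lr_space, 4)             # a Discrete space with 4 buckets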

federatedscope.autotune.algos

class federatedscope.autotune.algos.IterativeScheduler(cfg, client_cfgs=None)[source]

Bases: ModelFreeBase

The base class for HPO algorithms that divide the whole optimization procedure into iterations.

_generate_next_population(configs, perfs)[source]

To generate the configurations for the next stage.

Parameters
  • configs (list) – the configurations of the last stage.

  • perfs (list) – their corresponding performances.

Returns

configurations for the next stage.

Return type

list

_iteration(configs)[source]

To evaluate the given collection of configurations at this stage.

Parameters

configs (list) – each element is a trial configuration.

Returns

the performances of the given configurations.

Return type

list

_setup()[source]

Prepare the initial configurations based on the search space.

_stop_criterion(configs, last_results)[source]

To determine whether the algorithm should be terminated.

Parameters
  • configs (list) – each element is a trial configuration.

  • last_results (DataFrame) – each row corresponds to a specific configuration as well as its latest performance.

Returns

whether to terminate.

Return type

bool

optimize()[source]

To optimize the hyperparameters, i.e., execute the HPO algorithm and return the results.
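
To illustrate the contract above, a hypothetical subclass might fill in the hooks as follows (the halving logic and the assumption that lower performance values are better are illustrative, not part of the API):

    from federatedscope.autotune.algos import IterativeScheduler

    class HalvingSketch(IterativeScheduler):  # hypothetical subclass
        def _generate_next_population(self, configs, perfs):
            # Keep the better half of the last stage's trials
            # (assumes lower perf values are better).
            ranked = [c for _, c in sorted(zip(perfs, configs),
                                           key=lambda pair: pair[0])]
            return ranked[:max(1, len(ranked) // 2)]

        def _stop_criterion(self, configs, last_results):
            # Terminate once a single candidate configuration remains.
            return len(configs) <= 1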

class federatedscope.autotune.algos.ModelFreeBase(cfg, client_cfgs=None)[source]

Bases: Scheduler

To attempt a collection of configurations exhaustively.

_evaluate(configs)[source]

To evaluate (i.e., conduct the FL procedure) for a given collection of configurations.

_setup()[source]

Prepare the initial configurations based on the search space.

optimize()[source]

To optimize the hyperparameters, i.e., execute the HPO algorithm and return the results.

class federatedscope.autotune.algos.SHAWrapFedex(cfg, client_cfgs=None)[source]

Bases: SuccessiveHalvingAlgo

This SHA is customized as a wrapper for the FedEx algorithm.

_setup()[source]

Prepare the initial configurations based on the search space.

class federatedscope.autotune.algos.Scheduler(cfg, client_cfgs=None)[source]

Bases: object

The base class for describing HPO algorithms.

_evaluate(configs)[source]

To evaluate (i.e., conduct the FL procedure) for a given collection of configurations.

_setup()[source]

Prepare the initial configurations based on the search space.

optimize()[source]

To optimize the hyperparameters, i.e., execute the HPO algorithm and return the results.

class federatedscope.autotune.algos.SuccessiveHalvingAlgo(cfg, client_cfgs=None)[source]

Bases: IterativeScheduler

Successive Halving Algorithm (SHA) tailored to the FL setting, where, in each iteration, only a limited number of communication rounds are allowed for each trial.

_generate_next_population(configs, perfs)[source]

To generate the configurations for the next stage.

Parameters
  • configs (list) – the configurations of the last stage.

  • perfs (list) – their corresponding performances.

Returns

configurations for the next stage.

Return type

list

_setup()[source]

Prepare the initial configurations based on the search space.

_stop_criterion(configs, last_results)[source]

To determine whether the algorithm should be terminated.

Parameters
  • configs (list) – each element is a trial configuration.

  • last_results (DataFrame) – each row corresponds to a specific configuration as well as its latest performance.

Returns

whether to terminate.

Return type

bool

class federatedscope.autotune.algos.TrialExecutor(cfg_idx, signal, returns, trial_config, client_cfgs)[source]

Bases: Thread

This class is responsible for executing the FL procedure with a given trial configuration in another thread.

run()[source]

Method representing the thread’s activity.

You may override this method in a subclass. The standard run() method invokes the callable object passed to the object’s constructor as the target argument, if any, with sequential and keyword arguments taken from the args and kwargs arguments, respectively.

federatedscope.autotune.algos.get_scheduler(init_cfg, client_cfgs=None)[source]

To instantiate a scheduler object for conducting HPO.

Parameters
  • init_cfg – configuration.

  • client_cfgs – client-specific configuration.
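
A minimal sketch of driving an HPO run end to end (the global_cfg import path is an assumption about the surrounding package):

    from federatedscope.core.configs.config import global_cfg
    from federatedscope.autotune.algos import get_scheduler

    init_cfg = global_cfg.clone()
    # ... populate init_cfg, in particular its HPO-related options ...
    scheduler = get_scheduler(init_cfg)
    results = scheduler.optimize()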

federatedscope.autotune.hpbandster

class federatedscope.autotune.hpbandster.MyWorker(cfg, ss, sleep_interval=0, client_cfgs=None, *args, **kwargs)[source]

compute(config, budget, **kwargs)[source]

The function you have to overload to implement your computation.

Parameters
  • config_id (tuple) – a triplet of ints that uniquely identifies a configuration. The convention is id = (iteration, budget index, running index) with the following meaning:

    - iteration: the iteration of the optimization algorithm, e.g., for Hyperband this is one round of Successive Halving.

    - budget index: the budget (of the current iteration) for which this configuration was sampled by the optimizer. This is only nonzero if the majority of the runs fail and Hyperband resamples to fill empty slots, or if you use a more ‘advanced’ optimizer.

    - running index: an int >= 0 that sorts the configs into the order in which they were sampled, i.e., (x,x,0) was sampled before (x,x,1).

  • config (dict) – the actual configuration to be evaluated.

  • budget (float) – the budget for the evaluation

  • working_directory (str) – a name of a directory that is unique to this configuration. Use this to store intermediate results on lower budgets that can be reused later for a larger budget (for iterative algorithms, for example).

Returns

needs to return a dictionary with two mandatory entries:
  • ’loss’: a numerical value that is MINIMIZED

  • ’info’: This can be pretty much any built-in Python type, e.g., a dict with lists as values. Due to Pyro4 handling the remote function calls, 3rd-party types like numpy arrays are not supported!

Return type

dict
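
A sketch of a return value satisfying this contract (the concrete numbers are placeholders):

    result = {
        'loss': 0.42,               # numerical value the optimizer MINIMIZES
        'info': {'val_acc': 0.58},  # any built-in Python type
    }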

federatedscope.autotune.smac

federatedscope.autotune.utils

federatedscope.autotune.utils.arm2dict(kvs)[source]

Parameters

kvs (dict) – key is hyperparameter name in the form aaa.bb.cccc, and value is the choice.

Returns

the same specification for creating a cfg node.

Return type

config (dict)
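
An illustrative call (the exact shape of the returned specification is assumed from the docstring above):

    from federatedscope.autotune.utils import arm2dict

    arm = {'train.optimizer.lr': 0.05}
    spec = arm2dict(arm)
    # Presumably a nested specification along the lines of
    # {'train': {'optimizer': {'lr': 0.05}}}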

federatedscope.autotune.utils.config2cmdargs(config)[source]

Parameters

config (dict) – key is cfg node name, value is the specified value.

Returns

cmd args

Return type

results (list)
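
An illustrative call (the exact shape of the returned list is assumed):

    from federatedscope.autotune.utils import config2cmdargs

    config = {'train.optimizer.lr': 0.01, 'federate.total_round_num': 50}
    args = config2cmdargs(config)
    # Presumably a flat list alternating names and values, e.g.,
    # ['train.optimizer.lr', 0.01, 'federate.total_round_num', 50]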

federatedscope.autotune.utils.config2str(config)[source]

Parameters

config (dict) – key is cfg node name, value is the choice of hyper-parameter.

Returns

the string representation of this config

Return type

name (str)

federatedscope.autotune.utils.eval_in_fs(cfg, config=None, budget=0, client_cfgs=None, trial_index=0)[source]

Parameters
  • cfg – the FederatedScope cfg.

  • config – the sampled trial CS.Configuration.

  • budget – the budget round for this trial.

  • client_cfgs – the client-wise cfg.

Returns

The best results returned from FedRunner

federatedscope.autotune.utils.parse_condition_param(condition, ss)[source]

Parse the condition parameter to generate ConfigSpace.conditions.

Condition parameters: EqualsCondition, NotEqualsCondition, LessThanCondition, GreaterThanCondition, InCondition

Parameters
  • condition (dict) – configspace condition dict, which is supposed to have four keys.

  • ss (CS.ConfigurationSpace) – configspace

Returns

the conditions for configspace

Return type

ConfigSpace.conditions
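
For reference, the kind of condition this function produces can also be built directly with ConfigSpace's own API (the hyperparameters here are illustrative):

    import ConfigSpace as CS
    import ConfigSpace.hyperparameters as CSH

    ss = CS.ConfigurationSpace()
    opt = CSH.CategoricalHyperparameter('optimizer', ['sgd', 'adam'])
    mom = CSH.UniformFloatHyperparameter('momentum', 0.0, 0.99)
    ss.add_hyperparameters([opt, mom])
    # momentum is only active when optimizer == 'sgd'
    ss.add_condition(CS.EqualsCondition(mom, opt, 'sgd'))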

federatedscope.autotune.utils.parse_search_space(config_path)[source]

Parse a YAML-format configuration file to generate the search space.

Parameters

config_path (str) – the path of the yaml file.

Returns

the search space.

Return type

ConfigSpace object
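
A minimal usage sketch ('hpo_space.yaml' is a hypothetical file path):

    from federatedscope.autotune.utils import parse_search_space

    search_space = parse_search_space('hpo_space.yaml')
    print(search_space)  # a ConfigSpace.ConfigurationSpace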