Hypermodules

A hypermodule is a (PyTorch) module that contains many architecture/hyperparameter candidates for that module. By using hypermodules in a user-defined model, NNI helps users automatically find the best architecture/hyperparameters of the hypermodules for the model. This follows the design philosophy of Retiarii that users write a DNN model as a space.

Several hypermodules have been proposed in the NAS community, such as AutoActivation and AutoDropout. Some of them are implemented in the Retiarii framework.
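For example, a hypermodule can be dropped into a model like any other layer. The sketch below is a minimal illustration, assuming NNI 2.x's Retiarii API (the nni.retiarii.nn.pytorch namespace and the model_wrapper decorator); it uses AutoActivation in place of a fixed activation, so the model defines a space rather than a single network.

import torch
import nni.retiarii.nn.pytorch as nn   # Retiarii's drop-in replacement for torch.nn
from nni.retiarii import model_wrapper


@model_wrapper  # marks the class as a model space for Retiarii
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(32, 64)
        # AutoActivation is a hypermodule: the activation function itself is
        # searched rather than fixed, so this model becomes a search space.
        self.act = nn.AutoActivation()
        self.head = nn.Linear(64, 10)

    def forward(self, x):
        return self.head(self.act(self.fc(x)))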

class nni.retiarii.nn.pytorch.AutoActivation(unit_num: int = 1, label: Optional[str] = None)

This module is an implementation of the paper “Searching for Activation Functions” (https://arxiv.org/abs/1710.05941). Note: currently, beta is not a per-channel parameter.

Parameters

unit_num (int) – the number of core units
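As a rough illustration (the actual candidate chosen depends on the exploration strategy), AutoActivation can be constructed and applied like an ordinary activation module; unit_num and label follow the signature above, and the label value here is only an example.

import torch
from nni.retiarii.nn.pytorch import AutoActivation

act = AutoActivation(unit_num=1, label='auto_act_1')  # label is optional
x = torch.randn(4, 16)
y = act(x)   # applied element-wise; output shape matches the input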

forward(x)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance instead of calling this method directly, since the former takes care of running the registered hooks while the latter silently ignores them.
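In practice this means invoking the module object rather than forward, as in this short sketch:

import torch
from nni.retiarii.nn.pytorch import AutoActivation

act = AutoActivation()
x = torch.randn(2, 8)
y = act(x)            # preferred: __call__ runs registered hooks, then forward
# y = act.forward(x)  # bypasses hooks; avoid calling forward directly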