Evaluator

FunctionalEvaluator

class nni.retiarii.evaluator.FunctionalEvaluator(function, **kwargs)[source]

Functional evaluator that directly takes a function, and is thus the most general-purpose evaluator.

function

The full name of the function.

arguments

Keyword arguments for the function other than model.
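As a hedged sketch, the function below shows the expected shape of an evaluation function: it receives the mutated model class as its first argument, while any other keyword arguments come from the evaluator's constructor (`learning_rate` here is a hypothetical extra argument, not part of the API):

```python
# Illustrative sketch of a function usable with FunctionalEvaluator.
# `model_cls` is the mutated model class injected by the framework; any
# other keyword arguments ("arguments" above) come from the evaluator's
# constructor. `learning_rate` is a hypothetical example argument.
def evaluate_model(model_cls, learning_rate=0.001):
    model = model_cls()          # instantiate the searched architecture
    # ... train and validate `model` here, then report the metric, e.g.
    # nni.report_final_result(accuracy)
    return model

# evaluator = FunctionalEvaluator(evaluate_model, learning_rate=0.005)
```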

Classification

class nni.retiarii.evaluator.pytorch.Classification(criterion=<class 'torch.nn.modules.loss.CrossEntropyLoss'>, learning_rate=0.001, weight_decay=0.0, optimizer=<class 'torch.optim.adam.Adam'>, train_dataloaders=None, val_dataloaders=None, export_onnx=True, train_dataloader=None, **trainer_kwargs)[source]

Evaluator that is used for classification.

Parameters
  • criterion (nn.Module) – Class for criterion module (not an instance). default: nn.CrossEntropyLoss

  • learning_rate (float) – Learning rate. default: 0.001

  • weight_decay (float) – L2 weight decay. default: 0

  • optimizer (Optimizer) – Class for optimizer (not an instance). default: Adam

  • train_dataloaders (DataLoader) – Used in trainer.fit(). A PyTorch DataLoader with training samples. If the lightning_module has a predefined train_dataloader method this will be skipped.

  • val_dataloaders (DataLoader or List of DataLoader) – Used in trainer.fit(). Either a single PyTorch DataLoader or a list of them, specifying validation samples. If the lightning_module has a predefined val_dataloaders method this will be skipped.

  • export_onnx (bool) – If true, the model will be exported to model.onnx before training starts. default: true

  • trainer_kwargs (dict) – Optional keyword arguments passed to trainer. See Lightning documentation for details.

Examples

>>> evaluator = Classification()

To use customized criterion and optimizer:

>>> evaluator = Classification(nn.LabelSmoothingCrossEntropy, optimizer=torch.optim.SGD)

Extra keyword arguments will be passed to trainer, some of which might be necessary to enable GPU acceleration:

>>> evaluator = Classification(accelerator='gpu', devices=2, strategy='ddp')

Regression

class nni.retiarii.evaluator.pytorch.Regression(criterion=<class 'torch.nn.modules.loss.MSELoss'>, learning_rate=0.001, weight_decay=0.0, optimizer=<class 'torch.optim.adam.Adam'>, train_dataloaders=None, val_dataloaders=None, export_onnx=True, train_dataloader=None, **trainer_kwargs)[source]

Evaluator that is used for regression.

Parameters
  • criterion (nn.Module) – Class for criterion module (not an instance). default: nn.MSELoss

  • learning_rate (float) – Learning rate. default: 0.001

  • weight_decay (float) – L2 weight decay. default: 0

  • optimizer (Optimizer) – Class for optimizer (not an instance). default: Adam

  • train_dataloaders (DataLoader) – Used in trainer.fit(). A PyTorch DataLoader with training samples. If the lightning_module has a predefined train_dataloader method this will be skipped.

  • val_dataloaders (DataLoader or List of DataLoader) – Used in trainer.fit(). Either a single PyTorch DataLoader or a list of them, specifying validation samples. If the lightning_module has a predefined val_dataloaders method this will be skipped.

  • export_onnx (bool) – If true, model will be exported to model.onnx before training starts. default: true

  • trainer_kwargs (dict) – Optional keyword arguments passed to trainer. See Lightning documentation for details.

Examples

>>> evaluator = Regression()

Extra keyword arguments will be passed to trainer, some of which might be necessary to enable GPU acceleration:

>>> evaluator = Regression(gpus=1)

Utilities

class nni.retiarii.evaluator.pytorch.Trainer(logger=True, checkpoint_callback=None, enable_checkpointing=True, callbacks=None, default_root_dir=None, gradient_clip_val=None, gradient_clip_algorithm=None, process_position=0, num_nodes=1, num_processes=None, devices=None, gpus=None, auto_select_gpus=False, tpu_cores=None, ipus=None, log_gpu_memory=None, progress_bar_refresh_rate=None, enable_progress_bar=True, overfit_batches=0.0, track_grad_norm=- 1, check_val_every_n_epoch=1, fast_dev_run=False, accumulate_grad_batches=None, max_epochs=None, min_epochs=None, max_steps=- 1, min_steps=None, max_time=None, limit_train_batches=None, limit_val_batches=None, limit_test_batches=None, limit_predict_batches=None, val_check_interval=None, flush_logs_every_n_steps=None, log_every_n_steps=50, accelerator=None, strategy=None, sync_batchnorm=False, precision=32, enable_model_summary=True, weights_summary='top', weights_save_path=None, num_sanity_val_steps=2, resume_from_checkpoint=None, profiler=None, benchmark=None, deterministic=None, reload_dataloaders_every_n_epochs=0, auto_lr_find=False, replace_sampler_ddp=True, detect_anomaly=False, auto_scale_batch_size=False, prepare_data_per_node=None, plugins=None, amp_backend='native', amp_level=None, move_metrics_to_cpu=False, multiple_trainloader_mode='max_size_cycle', stochastic_weight_avg=False, terminate_on_nan=None)[source]

Traced version of pytorch_lightning.Trainer. See https://pytorch-lightning.readthedocs.io/en/stable/common/trainer.html

class nni.retiarii.evaluator.pytorch.DataLoader(dataset, batch_size=1, shuffle=False, sampler=None, batch_sampler=None, num_workers=0, collate_fn=None, pin_memory=False, drop_last=False, timeout=0, worker_init_fn=None, multiprocessing_context=None, generator=None, *, prefetch_factor=2, persistent_workers=False)[source]

Traced version of torch.utils.data.DataLoader. See https://pytorch.org/docs/stable/data.html
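The idea behind "traced" classes can be sketched in plain Python: the wrapper records the constructor arguments so that an equivalent object can be re-created in another process. The `Traced` class below is only an illustrative stand-in for that idea, not NNI's actual serialization mechanism:

```python
# Minimal stand-in illustrating why traced classes exist: recording the
# constructor arguments makes the object serializable and re-creatable
# elsewhere. This is NOT nni's implementation, just the concept.
class Traced:
    def __init__(self, cls, *args, **kwargs):
        self.symbol = cls                      # which class to rebuild
        self.args, self.kwargs = args, kwargs  # how to rebuild it
        self.instance = cls(*args, **kwargs)   # usable immediately

    def recreate(self):
        # rebuild an equivalent object from the recorded arguments
        return self.symbol(*self.args, **self.kwargs)

loader = Traced(list, range(3))  # stand-in for a traced DataLoader
```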

Customization

class nni.retiarii.Evaluator[source]

Evaluator of a model. An evaluator defines where the training code is and how it is configured. The configuration includes basic runtime information the trainer needs to know (such as the number of GPUs) and tunable parameters (such as learning rate), depending on the implementation of the training code.

Each config should define how it is interpreted in _execute(), which takes only one argument: the mutated model class. For example, a functional evaluator might directly import and call the function.

evaluate(model_cls)[source]

Run evaluation of a model. The model can be either a concrete model or a callable returning a model.

The concrete implementation of evaluate depends on the implementation of _execute() in sub-class.
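A minimal sketch of a custom evaluator following the contract above. In real code the class would subclass nni.retiarii.Evaluator; the method bodies here are illustrative only:

```python
# Hedged sketch of a custom evaluator. Per the contract above,
# _execute() receives only the mutated model class. In real code,
# subclass nni.retiarii.Evaluator instead of writing a bare class.
class MyEvaluator:
    def __init__(self, epochs=10):
        self.epochs = epochs        # a tunable configuration item

    def _execute(self, model_cls):
        model = model_cls()         # instantiate the mutated model
        # ... train `model` for self.epochs epochs, report the metric
        return model

    def evaluate(self, model_cls):
        # the base class dispatches to _execute(); simplified here
        return self._execute(model_cls)
```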

class nni.retiarii.evaluator.pytorch.Lightning(lightning_module, trainer, train_dataloaders=None, val_dataloaders=None, train_dataloader=None)[source]

Delegate the whole training to PyTorch Lightning.

Since the arguments passed at initialization need to be serialized, the LightningModule, Trainer, and DataLoader defined in this module should be used. Another option is to hide the dataloaders inside the Lightning module, in which case dataloaders are not required for this class to work.

Following the programming style of Lightning, metrics sent to NNI should be obtained from callback_metrics in trainer. Two hooks are added at the end of validation epoch and the end of fit, respectively. The metric name and type depend on the specific task.

Warning

Lightning evaluators are stateful. If you reuse a previous Lightning evaluator, note that the inner lightning_module and trainer will be reused as well.

Parameters
  • lightning_module (LightningModule) – Lightning module that defines the training logic.

  • trainer (Trainer) – Lightning trainer that handles the training.

  • train_dataloaders – Used in trainer.fit(). A PyTorch DataLoader with training samples. If the lightning_module has a predefined train_dataloader method this will be skipped. It can be any type of dataloader supported by Lightning.

  • val_dataloaders (Optional[Any]) – Used in trainer.fit(). Either a single PyTorch DataLoader or a list of them, specifying validation samples. If the lightning_module has a predefined val_dataloaders method this will be skipped. It can be any type of dataloader supported by Lightning.

fit(model)[source]

Fit the model with provided dataloader, with Lightning trainer.

Parameters

model (nn.Module) – The model to fit.

class nni.retiarii.evaluator.pytorch.LightningModule(*args, **kwargs)[source]

Basic wrapper of generated model. Lightning modules used in NNI should inherit this class.

It’s a subclass of pytorch_lightning.LightningModule. See https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_module.html

running_mode: Literal['multi', 'oneshot'] = 'multi'

An indicator of whether the current module is running in a multi-trial experiment or a one-shot one. This flag should be set automatically by experiments when they start running.

set_model(model)[source]

Set the inner model (architecture) to train / evaluate.

Parameters

model (callable or nn.Module) – Can be a callable returning nn.Module or nn.Module.
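The dual contract above can be sketched as follows. `ModuleBase` is a stand-in for torch.nn.Module and `resolve_model` is a hypothetical helper mirroring the documented behavior, not NNI's actual implementation:

```python
# Illustrative sketch of the set_model contract: accept either a ready
# module or a callable (factory) that builds one. ModuleBase stands in
# for torch.nn.Module; this is not NNI's actual implementation.
class ModuleBase:
    pass

def resolve_model(model):
    # instantiate only when given a factory rather than a module
    return model if isinstance(model, ModuleBase) else model()
```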

Cross-graph Optimization (experimental)

class nni.retiarii.evaluator.pytorch.cgo.evaluator.MultiModelSupervisedLearningModule(criterion, metrics, learning_rate=0.001, weight_decay=0.0, optimizer=<class 'torch.optim.adam.Adam'>)[source]

Lightning module for supervised learning with cross-graph optimization. Users who need cross-graph optimization should use this module.

Parameters
  • criterion (nn.Module) – Class for criterion module (not an instance). default: nn.CrossEntropyLoss

  • learning_rate (float) – Learning rate. default: 0.001

  • weight_decay (float) – L2 weight decay. default: 0

  • optimizer (Optimizer) – Class for optimizer (not an instance). default: Adam

class nni.retiarii.evaluator.pytorch.cgo.evaluator.Classification(criterion=<class 'torch.nn.modules.loss.CrossEntropyLoss'>, learning_rate=0.001, weight_decay=0.0, optimizer=<class 'torch.optim.adam.Adam'>, train_dataloader=None, val_dataloaders=None, **trainer_kwargs)[source]

Evaluator that is used for classification.

Parameters
  • criterion (nn.Module) – Class for criterion module (not an instance). default: nn.CrossEntropyLoss

  • learning_rate (float) – Learning rate. default: 0.001

  • weight_decay (float) – L2 weight decay. default: 0

  • optimizer (Optimizer) – Class for optimizer (not an instance). default: Adam

  • train_dataloaders (DataLoader) – Used in trainer.fit(). A PyTorch DataLoader with training samples. If the lightning_module has a predefined train_dataloader method this will be skipped.

  • val_dataloaders (DataLoader or List of DataLoader) – Used in trainer.fit(). Either a single PyTorch Dataloader or a list of them, specifying validation samples. If the lightning_module has a predefined val_dataloaders method this will be skipped.

  • trainer_kwargs (dict) – Optional keyword arguments passed to trainer. See Lightning documentation for details.

class nni.retiarii.evaluator.pytorch.cgo.evaluator.Regression(criterion=<class 'torch.nn.modules.loss.MSELoss'>, learning_rate=0.001, weight_decay=0.0, optimizer=<class 'torch.optim.adam.Adam'>, train_dataloader=None, val_dataloaders=None, **trainer_kwargs)[source]

Evaluator that is used for regression.

Parameters
  • criterion (nn.Module) – Class for criterion module (not an instance). default: nn.MSELoss

  • learning_rate (float) – Learning rate. default: 0.001

  • weight_decay (float) – L2 weight decay. default: 0

  • optimizer (Optimizer) – Class for optimizer (not an instance). default: Adam

  • train_dataloaders (DataLoader) – Used in trainer.fit(). A PyTorch DataLoader with training samples. If the lightning_module has a predefined train_dataloader method this will be skipped.

  • val_dataloaders (DataLoader or List of DataLoader) – Used in trainer.fit(). Either a single PyTorch Dataloader or a list of them, specifying validation samples. If the lightning_module has a predefined val_dataloaders method this will be skipped.

  • trainer_kwargs (dict) – Optional keyword arguments passed to trainer. See Lightning documentation for details.