Model Evaluators

A model evaluator is used to train and validate each generated model.

Usage of Model Evaluator

In multi-trial NAS, a sampled model should be able to be executed on a remote machine or a training platform (e.g., AzureML, OpenPAI). Thus, both the model and its model evaluator need to be correctly serialized. To make NNI serialize the model evaluator correctly, users should apply serialize to some of their functions and objects.

serialize enables re-instantiation of the model evaluator in another process or on another machine. It works by recording the initialization parameters of the user-instantiated evaluator.
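
For intuition, here is a minimal sketch of what wrapping an object with serialize means (the Normalize statistics are simply the MNIST values from the example further below):

from nni.retiarii import serialize
from torchvision import transforms

# serialize records the class and its initialization arguments, so that the
# same object can be re-constructed in another process or on another machine.
normalize = serialize(transforms.Normalize, (0.1307,), (0.3081,))
# `normalize` can be used like a plain transforms.Normalize((0.1307,), (0.3081,)),
# but NNI now also knows how to re-instantiate it on the trial side.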

The evaluator-related APIs provided by Retiarii (e.g., pl.Classification, pl.DataLoader) already support serialization, so there is no need to apply serialize to them. In the following case, users should apply the serialize API manually.

If the arguments passed to the evaluator APIs (e.g., pl.Classification, pl.DataLoader) are not primitive types (e.g., int, string), they should be wrapped with serialize. If those arguments' own initialization arguments are in turn not primitive types, serialize should be applied to them as well. In a word, serialize should be applied recursively when necessary.

Below is an example in which transforms.Compose, transforms.Normalize, and MNIST are serialized manually with serialize. serialize takes a class cls as its first argument, and its remaining arguments are the arguments used to initialize that class. pl.Classification does not need serialize because, as an API provided by NNI, it is already serializable.

import nni.retiarii.evaluator.pytorch.lightning as pl
from nni.retiarii import serialize
from torchvision import transforms
from torchvision.datasets import MNIST

transform = serialize(transforms.Compose, [
    serialize(transforms.ToTensor),
    serialize(transforms.Normalize, (0.1307,), (0.3081,)),
])
train_dataset = serialize(MNIST, root='data/mnist', train=True, download=True, transform=transform)
test_dataset = serialize(MNIST, root='data/mnist', train=False, download=True, transform=transform)
evaluator = pl.Classification(train_dataloader=pl.DataLoader(train_dataset, batch_size=100),
                              val_dataloaders=pl.DataLoader(test_dataset, batch_size=100),
                              max_epochs=10)
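
Once constructed, the evaluator is typically passed to a multi-trial NAS experiment together with a model space and an exploration strategy. The following is only a rough sketch of that step; the experiment APIs come from the multi-trial NAS documentation rather than this page, and base_model is a placeholder for a user-defined model space:

from nni.retiarii.experiment.pytorch import RetiariiExperiment, RetiariiExeConfig
import nni.retiarii.strategy as strategy

# base_model is a user-defined model space (not shown here).
exp = RetiariiExperiment(base_model, evaluator, [], strategy.Random())
exp_config = RetiariiExeConfig('local')
exp_config.trial_concurrency = 2
exp_config.max_trial_number = 10
exp.run(exp_config, 8081)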

Supported Model Evaluators

NNI provides some commonly used model evaluators for users' convenience. If these model evaluators do not meet users' requirements, they can customize new model evaluators following the tutorial here.
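
As a rough illustration of what a customized evaluator can look like, the sketch below assumes the FunctionalEvaluator API described in that tutorial; evaluate_model and its metric reporting are placeholders rather than part of this page:

import nni
from nni.retiarii.evaluator import FunctionalEvaluator

def evaluate_model(model_cls):
    # model_cls is the class of a sampled model; instantiate it, run a
    # training/validation loop, and report the final metric to NNI.
    model = model_cls()
    accuracy = 0.0  # placeholder for a real training and evaluation loop
    nni.report_final_result(accuracy)

evaluator = FunctionalEvaluator(evaluate_model)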

class nni.retiarii.evaluator.pytorch.lightning.Classification(criterion: torch.nn.modules.module.Module = <class 'torch.nn.modules.loss.CrossEntropyLoss'>, learning_rate: float = 0.001, weight_decay: float = 0.0, optimizer: torch.optim.optimizer.Optimizer = <class 'torch.optim.adam.Adam'>, train_dataloader: Optional[torch.utils.data.dataloader.DataLoader] = None, val_dataloaders: Optional[Union[torch.utils.data.dataloader.DataLoader, List[torch.utils.data.dataloader.DataLoader]]] = None, export_onnx: bool = True, **trainer_kwargs)

Trainer that is used for classification.

Parameters
  • criterion (nn.Module) – Class for criterion module (not an instance). default: nn.CrossEntropyLoss

  • learning_rate (float) – Learning rate. default: 0.001

  • weight_decay (float) – L2 weight decay. default: 0

  • optimizer (Optimizer) – Class for optimizer (not an instance). default: Adam

  • train_dataloader (DataLoader) – Used in trainer.fit(). A PyTorch DataLoader with training samples. If the lightning_module has a predefined train_dataloader method this will be skipped.

  • val_dataloaders (DataLoader or List of DataLoader) – Used in trainer.fit(). Either a single PyTorch DataLoader or a list of them, specifying validation samples. If the lightning_module has a predefined val_dataloaders method this will be skipped.

  • export_onnx (bool) – If true, model will be exported to model.onnx before training starts. default: true

  • trainer_kwargs (dict) – Optional keyword arguments passed to trainer. See Lightning documentation for details.

class nni.retiarii.evaluator.pytorch.lightning.Regression(criterion: torch.nn.modules.module.Module = <class 'torch.nn.modules.loss.MSELoss'>, learning_rate: float = 0.001, weight_decay: float = 0.0, optimizer: torch.optim.optimizer.Optimizer = <class 'torch.optim.adam.Adam'>, train_dataloader: Optional[torch.utils.data.dataloader.DataLoader] = None, val_dataloaders: Optional[Union[torch.utils.data.dataloader.DataLoader, List[torch.utils.data.dataloader.DataLoader]]] = None, export_onnx: bool = True, **trainer_kwargs)

Trainer that is used for regression.

Parameters
  • criterion (nn.Module) – Class for criterion module (not an instance). default: nn.MSELoss

  • learning_rate (float) – Learning rate. default: 0.001

  • weight_decay (float) – L2 weight decay. default: 0

  • optimizer (Optimizer) – Class for optimizer (not an instance). default: Adam

  • train_dataloader (DataLoader) – Used in trainer.fit(). A PyTorch DataLoader with training samples. If the lightning_module has a predefined train_dataloader method this will be skipped.

  • val_dataloaders (DataLoader or List of DataLoader) – Used in trainer.fit(). Either a single PyTorch DataLoader or a list of them, specifying validation samples. If the lightning_module has a predefined val_dataloaders method this will be skipped.

  • export_onnx (bool) – If true, model will be exported to model.onnx before training starts. default: true

  • trainer_kwargs (dict) – Optional keyword arguments passed to trainer. See Lightning documentation for details.
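
For completeness, below is a usage sketch of pl.Regression that mirrors the classification example above. The toy dataset and hyper-parameters are illustrative placeholders only; in a real experiment the dataset class should live in an importable module so it can be re-created on the trial side.

import torch
from torch.utils.data import Dataset
import nni.retiarii.evaluator.pytorch.lightning as pl
from nni.retiarii import serialize

class ToyRegressionDataset(Dataset):
    # A tiny synthetic dataset (y = sum(x) + noise), standing in for real data.
    def __init__(self, size, seed=0):
        g = torch.Generator().manual_seed(seed)
        self.x = torch.randn(size, 16, generator=g)
        self.y = self.x.sum(dim=1, keepdim=True) + 0.1 * torch.randn(size, 1, generator=g)

    def __len__(self):
        return len(self.x)

    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

# As in the classification example, the dataset's constructor arguments are
# primitive types, so wrapping the class with serialize is straightforward.
train_dataset = serialize(ToyRegressionDataset, size=1000, seed=0)
test_dataset = serialize(ToyRegressionDataset, size=200, seed=1)
evaluator = pl.Regression(train_dataloader=pl.DataLoader(train_dataset, batch_size=64),
                          val_dataloaders=pl.DataLoader(test_dataset, batch_size=64),
                          max_epochs=10)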