NNI Documentation

NNI (Neural Network Intelligence) is a lightweight but powerful toolkit that helps users automate hyperparameter tuning, neural architecture search, model compression (pruning and quantization), and feature engineering.

Get Started

To install the current release:

$ pip install nni

See the installation guide if you need additional help.

Try your first NNI experiment

$ nnictl hello

Note

You need to have PyTorch (as well as torchvision) installed to run this experiment.

To start your journey now, please follow the NNI quickstart!

Why choose NNI?

NNI makes AutoML techniques plug-and-play


Hyperparameter Tuning

import nni
import torch.nn as nn
import torch.optim as optim

# fetch the hyperparameters chosen by the tuner for this trial
params = nni.get_next_parameter()

class Net(nn.Module):
    ...

model = Net()
optimizer = optim.SGD(model.parameters(),
                      lr=params['lr'],
                      momentum=params['momentum'])

for epoch in range(10):
    train(...)

# report the final metric back to the tuner
accuracy = test(model)
nni.report_final_result(accuracy)
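
The values returned by nni.get_next_parameter() come from a search space that you define and hand to the experiment. A minimal space matching the snippet above could look like the sketch below (the ranges are illustrative only); built-in tuners such as TPE or random search sample concrete values from it for every trial.

search_space = {
    'lr': {'_type': 'loguniform', '_value': [1e-4, 0.1]},
    'momentum': {'_type': 'uniform', '_value': [0, 1]},
}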


Model Pruning

from nni.compression.pytorch.pruning import L1NormPruner
from nni.compression.pytorch.speedup import ModelSpeedup

# define a config_list
config = [{
    'sparsity': 0.8,
    'op_types': ['Conv2d']
}]

# generate masks for simulated pruning
pruner = L1NormPruner(model, config)
wrapped_model, masks = pruner.compress()

# remove the pruner's wrappers, then
# apply the masks for real speedup
pruner._unwrap_model()
ModelSpeedup(model, dummy_input, masks). \
    speedup_model()


Quantization

# define a config_list
config = [{
    'quant_types': ['input', 'weight'],
    'quant_bits': {'input': 8, 'weight': 8},
    'op_types': ['Conv2d']
}]

# in case the quantizer needs extra training
quantizer = QAT_Quantizer(model, config)
quantizer.compress()
# training...

# export the calibration config and
# generate a TensorRT engine for real speedup
calibration_config = quantizer.export_model(
    model_path, calibration_path)
engine = ModelSpeedupTensorRT(
    model, input_shape, config=calibration_config)
engine.compress()


Neural Architecture Search

# define model space
class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv2 = nn.LayerChoice([
            nn.Conv2d(32, 64, 3, 1),
            DepthwiseSeparableConv(32, 64)
        ])
model_space = Model()

# search strategy + evaluator
strategy = RegularizedEvolution()
evaluator = FunctionalEvaluator(
    train_eval_fn)

# run experiment
RetiariiExperiment(model_space,
    evaluator, strategy).run()


One-shot NAS

# define model space
space = AnySearchSpace()

# get a darts trainer
trainer = DartsTrainer(space, loss, metrics)
trainer.fit()

# get final searched architecture
arch = trainer.export()


Feature Engineering

selector = GBDTSelector()
selector.fit(
    X_train, y_train,
    lgb_params=lgb_params,
    eval_ratio=eval_ratio,
    early_stopping_rounds=10,
    importance_type='gain',
    num_boost_round=1000)

# get selected features
features = selector.get_selected_features()

NNI eases the effort to scale and manage AutoML experiments


Training Service

An AutoML experiment requires many trials to explore feasible and potentially well-performing models. Training service aims to make the tuning process easily scalable across distributed platforms. It provides a unified user experience for diverse computation resources (e.g., a local machine, remote servers, AKS). Currently, NNI supports more than nine kinds of training services.
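
As a sketch of how little changes between platforms, the Python experiment API below runs a tuning job locally; switching the platform string (e.g. to 'remote' or 'aml') and filling in the corresponding machine settings is, in essence, what moves it to another training service. The trial script name train.py and the numbers are assumptions for illustration.

from nni.experiment import Experiment

# choose a training service by name: 'local', 'remote', 'openpai', 'aml', ...
experiment = Experiment('local')

# what each trial runs and where the trial code lives (assumed script name)
experiment.config.trial_command = 'python train.py'
experiment.config.trial_code_directory = '.'

# hyperparameter search space (a dict like the one in the tuning example above)
experiment.config.search_space = search_space
experiment.config.tuner.name = 'TPE'

# budget: total number of trials and how many run in parallel
experiment.config.max_trial_number = 20
experiment.config.trial_concurrency = 2

# start the experiment and serve the web portal on port 8080
experiment.run(8080)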


Web Portal

The web portal visualizes the tuning process and lets you inspect, monitor, and control the experiment.


Experiment Management

DNN model tuning often requires more than one experiment. Users might try different tuning algorithms, fine-tune their search space, or switch to another training service. Experiment management provides the power to aggregate and compare tuning results from multiple experiments, so that the tuning workflow becomes clean and organized.
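
The same management tasks are available from the nnictl command line; a few representative commands are sketched below (the experiment ID is a placeholder taken from the web portal or the list output, and nnictl --help shows the full set for your version):

$ nnictl experiment list        # list experiments and their status
$ nnictl view <experiment_id>   # reopen the web portal of a stopped experiment
$ nnictl stop <experiment_id>   # stop a running experiment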

Get Support and Contribute Back

NNI is maintained on the NNI GitHub repository, where we collect feedback and new proposals/ideas. You can open issues or submit pull requests there, or reach the team through the channels below:

Gitter

WeChat


Citing NNI

If you use NNI in a scientific publication, please consider citing NNI in your references.

Microsoft. Neural Network Intelligence (version v3.0pt1). https://github.com/microsoft/nni

Bibtex entry (please replace the version with the particular version you are using):

@software{nni2021,
   author = {{Microsoft}},
   month = {1},
   title = {{Neural Network Intelligence}},
   url = {https://github.com/microsoft/nni},
   version = {2.0},
   year = {2021}
}