Uncategorized Modules

nni.common.framework

nni.common.framework.get_default_framework()[source]

Retrieve the default deep learning framework, set either via environment variable or manually.
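
A minimal sketch of retrieving the current default (assuming NNI_FRAMEWORK is unset and no framework has been set manually, so the default pytorch is returned):

>>> from nni.common.framework import get_default_framework
>>> get_default_framework()
'pytorch'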

nni.common.framework.set_default_framework(framework)[source]

Set default deep learning framework to simplify imports.

Some functionalities in NNI (e.g., NAS / Compression) rely on an underlying DL framework. For different DL frameworks, the implementation of NNI can be very different. Thus, users need to import things tailored for their own framework. For example:

from nni.nas.xxx.pytorch import yyy

rather than:

from nni.nas.xxx import yyy

By setting a default framework, shortcuts are created, so that nni.nas.xxx becomes equivalent to nni.nas.xxx.pytorch.

Another way to set it is through the environment variable NNI_FRAMEWORK, which needs to be set before the whole process starts.

If you set the framework with set_default_framework(), it should be done before all imports (except nni itself), because it affects the behavior of other imports. The behavior is undefined if the framework is re-set in the middle.

The supported frameworks are listed below. This does not mean that they are all fully supported by NAS / Compression in NNI.

  • pytorch (default)

  • tensorflow

  • mxnet

  • none (to disable the shortcut-import behavior).

Examples

>>> import nni
>>> nni.set_default_framework('tensorflow')
>>> # then other imports
>>> from nni.nas.xxx import yyy

nni.common.framework.shortcut_framework(current)[source]

Make current a shortcut of current.framework.

nni.common.framework.shortcut_module(current, target, package=None)[source]

Make current module an alias of target module in package.

nni.common.serializer

class nni.common.serializer.Traceable[source]

A traceable object has copy and dict. Copy and mutate are used to copy the object for further mutations; dict returns a TraceDictType to enable serialization.

get()[source]

Get the original object. Usually used together with trace_copy.

property trace_args: List[Any]

List of positional arguments passed to symbol. Usually empty if kw_only is true, in which case all the positional arguments are converted into keyword arguments.

trace_copy()[source]

Perform a shallow copy. NOTE: NONE of the attributes will be preserved. This is the one that should be used when you want to “mutate” a serializable object.

property trace_kwargs: Dict[str, Any]

Dict of keyword arguments.

property trace_symbol: Any

Symbol object. Could be a class or a function. get_hybrid_cls_or_func_name and import_cls_or_func_from_hybrid_name are a pair that converts the symbol into a string and the string back to the symbol.
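
A minimal sketch of the mutate-and-rebuild pattern these members support (the class and its parameters below are hypothetical, and exact behavior may differ across NNI versions):

import nni

@nni.trace
class Conv:
    def __init__(self, channels, kernel_size=3):
        self.channels = channels
        self.kernel_size = kernel_size

conv = Conv(16, kernel_size=5)
conv.trace_symbol                       # the original Conv class
conv.trace_kwargs                       # the recorded init arguments

# "Mutate": copy the trace, tweak an argument, then re-create the object.
mutated = conv.trace_copy()
mutated.trace_kwargs['channels'] = 32
new_conv = mutated.get()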

class nni.common.serializer.Translatable[source]

Inherit this class and implement translate when the wrapped class needs a different parameter from the wrapper class in its init function.

Deprecated since version 3.0.

nni.common.serializer.dump(obj, fp=None, *, use_trace=True, pickle_size_limit=4096, allow_nan=True, **json_tricks_kwargs)[source]

Convert a nested data structure to a JSON string, and save it to a file if fp is specified. json-tricks is used as the main backend; for cases json-tricks cannot handle, cloudpickle is used. The serializer is not designed for long-term storage, but rather for copying data between processes. The format is also subject to change between NNI releases.

It’s recommended to use dump together with trace. A traced object can be stored with its traced arguments. For more complex objects, the serializer looks for a _dump and _load pair in the class; if not found, it falls back to a binary dump with cloudpickle.

To compress the payload, please use dump_bytes().

Parameters:
  • obj (any) – The object to dump.

  • fp (file handler or path) – File to write to. Keep it None if you want dump to return a string.

  • pickle_size_limit (int) – This is set to avoid overly long serialization results. Set to -1 to disable the size check.

  • allow_nan (bool) – Whether to allow NaN to be serialized. Unlike the default in json-tricks, our default value is true.

  • json_tricks_kwargs (dict) – Other keyword arguments passed to json tricks (backend), e.g., indent=2.

Returns:

Normally str. Sometimes bytes (if compressed).

Return type:

str or bytes
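
A minimal sketch of dumping a plain data structure (the payload and file name below are illustrative):

from nni.common.serializer import dump

payload = {'learning_rate': 0.1, 'layers': [32, 64]}
json_string = dump(payload)         # serialize to a JSON string
dump(payload, fp='params.json')     # or write to a file / file handler instead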

nni.common.serializer.is_traceable(obj, must_be_instance=False)[source]

Check whether an object is a traceable instance or type.

Note that an object being traceable only means that it implements the “Traceable” interface and that the properties have been implemented. It doesn’t necessarily mean that its type is wrapped with trace, because the properties could be added after the instance has been created.

If must_be_instance is set to true, the check returns false if obj is a type.

nni.common.serializer.is_wrapped_with_trace(cls_or_func)[source]

Check whether a function or class is already wrapped with @nni.trace. If a class or function is already wrapped with trace, then the created object must be “traceable”.
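
A minimal sketch of both checks (the class below is hypothetical):

import nni
from nni.common.serializer import is_traceable, is_wrapped_with_trace

@nni.trace
class Model:
    def __init__(self, hidden_size):
        self.hidden_size = hidden_size

is_wrapped_with_trace(Model)                 # True: the class is wrapped with @nni.trace
is_traceable(Model(128))                     # True: instances expose the trace_* properties
is_traceable(Model, must_be_instance=True)   # False: Model is a type, not an instance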

nni.common.serializer.load(string=None, *, fp=None, preserve_order=False, ignore_comments=True, **json_tricks_kwargs)[source]

Load from a string or from a file, and convert the content to a complex data structure. At least one of string and fp has to be not None.

Parameters:
  • string (str) – JSON string to parse. Can be set to none if fp is used.

  • fp (str) – File path to load JSON from. Can be set to none if string is used.

  • preserve_order (bool) – json_tricks parameter to use OrderedDict instead of dict. The order is in fact always preserved even when this is False.

  • ignore_comments (bool) – Remove comments (starting with # or //). Default is true.

Returns:

The loaded object.

Return type:

any
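
A minimal round-trip sketch combining dump() and load():

from nni.common.serializer import dump, load

original = {'learning_rate': 0.1, 'layers': [32, 64]}
restored = load(dump(original))
assert restored == original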

nni.common.serializer.trace(cls_or_func=None, *, kw_only=True, inheritable=False)[source]

Annotate a function or a class if you want to preserve where it comes from. This is usually used in the following scenarios:

  1. Care more about the execution configuration than about the results, which is usually the case in AutoML. For example, you want to mutate the parameters of a function.

  2. Repeated execution is not an issue (e.g., it is reproducible, fast, and has no side effects).

When a class/function is annotated, all the instances/calls will return an object as they normally would. Although the object might act like a normal object, it’s actually a different object with NNI-specific properties. One exception is that if your function returns None, it will return an empty traceable object instead, which you should keep in mind when checking whether a return value is None.

When parameters of functions are received, they are first stored as trace_args and trace_kwargs. _unwrap_parameter() will be invoked if it’s defined on a parameter, to do some transformations (e.g., Mutable parameters can be transformed to fixed values to make the wrapped function happy). Then a shallow copy is passed to the wrapped function/class. This is to prevent mutable objects from being modified inside the wrapped function/class. When the function finishes execution, we also record extra information about where this object comes from. That’s why it’s called “trace”. When nni.dump is called, that information is used by default.

If kw_only is true, trace tries to convert all parameters into keyword arguments. This is done by inspecting the argument list and types. This can be useful to extract semantics, but can be tricky in some corner cases. Therefore, in some cases, some positional arguments will still be kept.

If inheritable is true, the trace information from the superclass will also be available in the subclass. This, however, will make the subclass un-trace-able. Note that this argument has no effect when tracing functions.

Warning

Generators will first be expanded into a list, and the resulting list will be passed to the wrapped function/class. This might hang when a generator produces an infinite sequence. We might introduce an API to control this behavior in the future.

Examples

@nni.trace
def foo(bar):
    pass
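
Building on the example above, a sketch of what gets recorded (the function and values below are illustrative): the object returned by a traced call carries the call arguments.

import nni
import types

@nni.trace
def make_optimizer(lr, momentum=0.9):
    return types.SimpleNamespace(lr=lr, momentum=momentum)

opt = make_optimizer(0.01, momentum=0.99)
# With kw_only=True (the default), the positional lr is converted to a keyword
# argument, so it appears in trace_kwargs rather than trace_args.
opt.trace_kwargs    # expected: {'lr': 0.01, 'momentum': 0.99}
opt.trace_symbol    # the original make_optimizer function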

nni.typehint

Types for static checking.

class nni.typehint.ParameterRecord[source]

The format which is used to record parameters at NNI manager side.

MsgDispatcher packs the parameters generated by tuners into a ParameterRecord and sends it to the NNI manager. The NNI manager saves the record into the database and sends it to trial jobs when they ask for parameters. TrialCommandChannel receives the ParameterRecord and then hands it over to the trial.

Most users don’t need to use this class directly.

nni.typehint.Parameters

Return type of nni.get_next_parameter().

For built-in tuners, this is a dict whose content is defined by search space.

Customized tuners do not need to follow the constraint and can use anything serializable.

alias of Dict[str, Any]
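
A minimal sketch of how this type appears in trial code (the key below is hypothetical and depends on your search space):

import nni
from nni.typehint import Parameters

params: Parameters = nni.get_next_parameter()
learning_rate = params['learning_rate']   # hypothetical search space key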

nni.typehint.SearchSpace

Type of experiment.config.search_space.

For built-in tuners, the format is detailed in Search Space.

Customized tuners do not need to follow the constraint and can use anything serializable, except None.

alias of Dict[str, _ParameterSearchSpace]
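
A minimal sketch of a search space in the built-in format (the keys and value ranges are illustrative):

from nni.typehint import SearchSpace

search_space: SearchSpace = {
    'learning_rate': {'_type': 'loguniform', '_value': [1e-4, 1e-1]},
    'batch_size': {'_type': 'choice', '_value': [16, 32, 64]},
}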

nni.typehint.TrialMetric

Type of the metrics sent to nni.report_final_result() and nni.report_intermediate_result().

For built-in tuners it must be a number (float, int, numpy.float32, etc).

Customized tuners do not need to follow this constraint and can use anything serializable.
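
A minimal sketch of reporting metrics from a trial (the numbers are illustrative):

import nni

nni.report_intermediate_result(0.93)   # e.g., validation accuracy after an epoch
nni.report_final_result(0.95)          # the final metric of the trial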