Retrieve the default deep learning framework, set either with environment variables or manually.
Set the default deep learning framework to simplify imports.
Some functionalities in NNI (e.g., NAS / Compression) rely on an underlying DL framework. For different DL frameworks, the implementation of NNI can be very different. Thus, users need to import things tailored for their own framework. For example:

from nni.nas.xxx.pytorch import yyy

By setting a default framework, shortcuts will be made, so that

from nni.nas.xxx import yyy

becomes equivalent to the framework-specific import above: nni.nas.xxx will be equivalent to nni.nas.xxx.pytorch.
Another way to set it is through the environment variable NNI_FRAMEWORK, which needs to be set before the whole process starts.
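The environment-variable route described above can be sketched as follows; the variable must already be in the environment when the Python process launches (the `python3 -c` invocation here is just a stand-in for your actual entry script):

```shell
# NNI_FRAMEWORK must be set before the Python process starts;
# launching with the variable on the command line guarantees that.
NNI_FRAMEWORK=tensorflow python3 -c 'import os; print(os.environ["NNI_FRAMEWORK"])'
```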
If you set the framework with set_default_framework(), do so before all imports (except nni itself) happen, because it affects the behavior of other imports. The behavior is undefined if the framework is re-set in the middle of the process.
The supported frameworks here are listed below. Being listed does not mean that a framework is fully supported by NAS / Compression in NNI.
- none (to disable the shortcut-import behavior)
>>> import nni
>>> nni.set_default_framework('tensorflow')
>>> # then other imports
>>> from nni.nas.xxx import yyy
Make current a shortcut of current.framework.
- class nni.common.serializer.Traceable[source]¶
A traceable object has copy and dict capabilities: copy and mutate are used to copy the object for further mutation, and dict returns a TraceDictType to enable serialization.
- property trace_args: List[Any]¶
List of positional arguments passed to symbol. Usually empty if kw_only is true, in which case all the positional arguments are converted into keyword arguments.
- trace_copy()¶
Perform a shallow copy. NOTE: none of the attributes will be preserved. This is the method to use when you want to “mutate” a serializable object.
- property trace_kwargs: Dict[str, Any]¶
Dict of keyword arguments.
- property trace_symbol: Any¶
Symbol object. Could be a class or a function.
import_cls_or_func_from_hybrid_name and its counterpart form a pair that converts the symbol into a string and the string back into the symbol.
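A minimal stdlib sketch of the three properties above; this is an illustrative stand-in, not NNI's implementation (the real interface lives in nni.common.serializer):

```python
from typing import Any, Dict, List

class TraceableSketch:
    """Illustrative stand-in for the Traceable interface described above."""

    def __init__(self, symbol: Any, args: List[Any], kwargs: Dict[str, Any]):
        self._symbol = symbol
        self._args = args
        self._kwargs = kwargs

    @property
    def trace_symbol(self) -> Any:
        # The class or function this object was created from.
        return self._symbol

    @property
    def trace_args(self) -> List[Any]:
        # Positional arguments; empty when everything was converted to kwargs.
        return self._args

    @property
    def trace_kwargs(self) -> Dict[str, Any]:
        # Keyword arguments passed to the symbol.
        return self._kwargs

    def trace_copy(self) -> 'TraceableSketch':
        # Shallow copy of the trace record, for further mutation.
        return TraceableSketch(self._symbol, list(self._args), dict(self._kwargs))

t = TraceableSketch(dict, [], {'lr': 0.1})
mutated = t.trace_copy()
mutated.trace_kwargs['lr'] = 0.2  # the original record is unaffected
```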
- class nni.common.serializer.Translatable[source]¶
Inherit this class and implement translate when the wrapped class needs a different parameter from the wrapper class in its init function.
Deprecated since version 3.0.
- nni.common.serializer.dump(obj, fp=None, *, use_trace=True, pickle_size_limit=4096, allow_nan=True, **json_tricks_kwargs)[source]¶
Convert a nested data structure to a json string. Save to file if fp is specified. Use json-tricks as main backend. For unhandled cases in json-tricks, use cloudpickle. The serializer is not designed for long-term storage use, but rather to copy data between processes. The format is also subject to change between NNI releases.
It’s recommended to use trace. Traced objects can be stored together with their traced arguments. For more complex objects, the serializer will look for a _load pair in the class; if not found, it falls back to a binary dump with cloudpickle.
To compress the payload, please use dump_bytes.
- Parameters:
- obj (any) – The object to dump.
- fp (file handler or path) – File to write to. Keep it none if you want to dump a string.
- pickle_size_limit (int) – This is set to avoid an overly long serialization result. Set to -1 to disable the size check.
- allow_nan (bool) – Whether to allow nan to be serialized. Different from the default value in json-tricks, our default value is true.
- json_tricks_kwargs (dict) – Other keyword arguments passed to json-tricks (the backend), e.g., indent=2.
- Returns:
Normally str. Sometimes bytes (if compressed).
- Return type:
str or bytes
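The fallback strategy described above (a JSON backend for plain data, a pickle-based binary dump for everything else) can be sketched with the standard library alone; the real implementation uses json-tricks and cloudpickle, so this is a simplified illustration, not NNI's code:

```python
import base64
import json
import pickle

def dump_sketch(obj):
    """Try JSON first; fall back to a base64-wrapped pickle for
    objects JSON cannot handle (mirroring the cloudpickle fallback)."""
    try:
        return json.dumps(obj)
    except TypeError:
        payload = base64.b64encode(pickle.dumps(obj)).decode()
        return json.dumps({"__pickled__": payload})

def load_sketch(string):
    """Inverse of dump_sketch: detect the pickle wrapper and unpickle."""
    data = json.loads(string)
    if isinstance(data, dict) and "__pickled__" in data:
        return pickle.loads(base64.b64decode(data["__pickled__"]))
    return data

# Plain data goes through JSON; a complex number takes the pickle path.
restored_plain = load_sketch(dump_sketch({"lr": 0.1}))
restored_complex = load_sketch(dump_sketch(complex(1, 2)))
```

As the docstring above warns for the real serializer, such a format suits inter-process copying, not long-term storage.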
- nni.common.serializer.is_traceable(obj, must_be_instance=False)[source]¶
Check whether an object is a traceable instance or type.
Note that an object being traceable only means that it implements the “Traceable” interface and that the properties have been implemented. It doesn’t necessarily mean that its type is wrapped with trace, because the properties could be added after the instance has been created. If must_be_instance is set to true, the check returns false if obj is a type.
- nni.common.serializer.is_wrapped_with_trace(cls_or_func)[source]¶
Check whether a function or class is already wrapped with @nni.trace. If a class or function is already wrapped with trace, then the created object must be “traceable”.
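The duck-typed notion of "traceable" described for is_traceable above (interface present, regardless of how it got there) can be illustrated with a small stdlib sketch; is_traceable_sketch is a hypothetical helper, not NNI's function:

```python
def is_traceable_sketch(obj, must_be_instance=False):
    """'Traceable' here means the Traceable properties exist on the
    object, even if they were attached after instance creation."""
    if must_be_instance and isinstance(obj, type):
        return False
    return all(hasattr(obj, name) for name in
               ('trace_symbol', 'trace_args', 'trace_kwargs', 'trace_copy'))

class Plain:
    pass

obj = Plain()
# Properties attached after creation still make the object traceable.
obj.trace_symbol, obj.trace_args, obj.trace_kwargs = Plain, [], {}
obj.trace_copy = lambda: obj
```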
- nni.common.serializer.load(string=None, *, fp=None, preserve_order=False, ignore_comments=True, **json_tricks_kwargs)[source]¶
Load from a string or a file, and convert it to a complex data structure. At least one of string and fp must not be none.
- Parameters:
- string (str) – JSON string to parse. Can be set to none if fp is used.
- fp (str) – File path to load JSON from. Can be set to none if string is used.
- preserve_order (bool) – json_tricks parameter to use OrderedDict instead of dict. The order is in fact always preserved, even when this is False.
- ignore_comments (bool) – Remove comments (starting with //). Default is true.
The loaded object.
- Return type:
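The ignore_comments behavior described above can be sketched with the standard library; the real backend is json-tricks, which strips comments natively, so this regex-based load_with_comments_sketch is only an illustration:

```python
import json
import re

def load_with_comments_sketch(string):
    """Strip //-style line comments, then parse as ordinary JSON."""
    stripped = re.sub(r'^\s*//.*$', '', string, flags=re.MULTILINE)
    return json.loads(stripped)

text = """{
// learning rate chosen by the tuner
"lr": 0.1,
"layers": [64, 32]
}"""
result = load_with_comments_sketch(text)
```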
- nni.common.serializer.trace(cls_or_func=None, *, kw_only=True, inheritable=False)[source]¶
Annotate a function or a class if you want to preserve where it comes from. This is usually used in the following scenarios:
- You care more about the execution configuration than the results, which is usually the case in AutoML. For example, you want to mutate the parameters of a function.
- Repeated execution is not an issue (e.g., it is reproducible, or it is fast and free of side effects).
When a class/function is annotated, all the instances/calls will return an object as they normally would. Although the object might act like a normal object, it’s actually a different object with NNI-specific properties. One exception: if your function returns None, it will return an empty traceable object instead, which should raise your attention when you want to check whether the result is None.
When parameters of functions are received, they are first stored as received; _unwrap_parameter() will be invoked, if it’s defined on the parameter, to do some transformations (e.g., Mutable parameters can be transformed to fixed values to make the wrapped function happy). Then a shallow copy is passed to the wrapped function/class, to prevent mutable objects from being modified in the wrapped function/class. When the function finishes execution, we also record extra information about where this object comes from. That’s why it’s called “trace”. When nni.dump is called, that information will be used by default.
If kw_only is true, trace will try to convert all parameters into keyword arguments. This is done by inspecting the argument list and types. It can be useful to extract semantics, but can be tricky in some corner cases; therefore, some positional arguments may still be kept.
If inheritable is true, the trace information from the superclass will also be available in the subclass. This, however, makes the subclass un-trace-able. Note that this argument has no effect when tracing functions.
Generators will first be expanded into a list, and the resulting list will be passed into the wrapped function/class. This might hang when a generator produces an infinite sequence. We might introduce an API to control this behavior in the future.
@nni.trace
def foo(bar):
    pass
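The kw_only conversion described above (turning positional arguments into keyword arguments by inspecting the signature) can be sketched with the standard library; to_kwargs_sketch is a hypothetical helper for illustration, not part of NNI:

```python
import inspect

def to_kwargs_sketch(func, *args, **kwargs):
    """Bind the given arguments against func's signature and return
    everything as a mapping of keyword arguments."""
    bound = inspect.signature(func).bind(*args, **kwargs)
    return dict(bound.arguments)

def foo(bar, baz=2):
    pass

# Positional arguments become keyword arguments keyed by parameter name.
converted = to_kwargs_sketch(foo, 1, baz=3)
```

Parameters that inspect cannot bind by name (e.g. positional-only or *args) are the "corner cases" where the real trace keeps positional arguments.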
Types for static checking.
- class nni.typehint.ParameterRecord(_typename, _fields=None, /, **kwargs)[source]¶
The format which is used to record parameters at NNI manager side.
MsgDispatcher packs the parameters generated by tuners into a ParameterRecord and sends it to the NNI manager. The NNI manager saves the record into the database and sends it to trial jobs when they ask for parameters. On the trial side, the runtime receives the ParameterRecord and then hands it over to the trial.
Most users don’t need to use this class directly.
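The tuner → manager → trial flow above can be sketched with a typed record; the field names below are purely illustrative and do not reproduce the real ParameterRecord schema:

```python
from typing import Any, Dict, TypedDict

class ParameterRecordSketch(TypedDict):
    """Hypothetical record; see nni.typehint.ParameterRecord for the real one."""
    parameter_id: int
    parameters: Dict[str, Any]

# Tuner side: pack the generated parameters into a record.
record: ParameterRecordSketch = {"parameter_id": 0, "parameters": {"lr": 0.1}}

# Trial side: unpack the parameters and hand them to the trial code.
trial_params = record["parameters"]
```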
Return type of nni.get_next_parameter().
For built-in tuners, this is a dict whose content is defined by the search space.
Customized tuners do not need to follow the constraint and can use anything serializable.
For built-in tuners, the format is detailed in Search Space.
Customized tuners do not need to follow the constraint and can use anything serializable, except
Type of the metrics sent to nni.report_final_result() and nni.report_intermediate_result().
For built-in tuners it must be a number (float or int).
Customized tuners do not need to follow this constraint and can use anything serializable.