neon.transforms.activation.Rectlin

class neon.transforms.activation.Rectlin(slope=0, name=None)[source]

Bases: neon.transforms.transform.Transform

Rectified Linear Unit (ReLU) activation function, \(f(x) = \max(x, 0)\). Optionally, a nonzero slope for the negative domain can be set, which makes this a Leaky ReLU: \(f(x) = \max(x, 0) + \text{slope} \cdot \min(x, 0)\).
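
A minimal usage sketch follows; the gen_backend call, the Gaussian initializer, and the 0.01 slope are illustrative choices, not library defaults:

    from neon.backends import gen_backend
    from neon.initializers import Gaussian
    from neon.layers import Affine
    from neon.transforms import Rectlin

    be = gen_backend(backend='cpu', batch_size=128)

    relu = Rectlin()             # standard ReLU: f(x) = max(x, 0)
    leaky = Rectlin(slope=0.01)  # Leaky ReLU: negative inputs scaled by 0.01

    # Activations are usually attached to layers rather than called directly:
    hidden = Affine(nout=100, init=Gaussian(scale=0.01), activation=leaky)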

__init__(slope=0, name=None)[source]

Class constructor.

Parameters:
  • slope (float, optional) – Slope for negative domain. Defaults to 0.
  • name (string, optional) – Name to assign this class instance.

Methods

__init__([slope, name]) Class constructor.
bprop(x[, nglayer, error, deltas]) Returns the derivative.
gen_class(pdict)
get_description([skip]) Returns a dict that contains all necessary information needed to serialize this object.
recursive_gen(pdict, key) Helper method to check whether the definition dictionary defines a NervanaObject child.
be = None

Computational backend shared by all NervanaObject instances.
bprop(x, nglayer=None, error=None, deltas=None)[source]

Returns the derivative: 1 where the input is positive, slope where it is negative.

Parameters: x (Tensor or optree) – Input value
Returns: Derivative
Return type: Tensor or optree
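
The NumPy sketch below mirrors that formula for illustration only; neon evaluates it on backend tensors or optrees, and rectlin_bprop is a hypothetical helper, not part of the library:

    import numpy as np

    def rectlin_bprop(x, slope=0.0):
        # Derivative of max(x, 0) + slope * min(x, 0):
        # 1 where x > 0, slope where x < 0 (and 0 at exactly 0 here)
        return (x > 0).astype(x.dtype) + slope * (x < 0).astype(x.dtype)

    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(rectlin_bprop(x))             # [0. 0. 0. 1. 1.]
    print(rectlin_bprop(x, slope=0.1))  # [0.1 0.1 0.  1.  1. ]
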
classnm

Returns the class name.

gen_class(pdict)
get_description(skip=[], **kwargs)

Returns a dict that contains all necessary information needed to serialize this object.

Parameters: skip (list) – Objects to omit from the dictionary.
Returns: Dictionary format for object information.
Return type: dict
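
As a rough sketch, the description dictionary can be inspected directly; the exact keys it contains depend on the neon version:

    from neon.transforms import Rectlin

    act = Rectlin(slope=0.01, name='leaky')
    desc = act.get_description()
    print(desc)  # a dict capturing the class identity and constructor arguments
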
modulenm

Returns the full module path.

recursive_gen(pdict, key)

Helper method to check whether the definition dictionary defines a NervanaObject child; if so, it instantiates that object and replaces the dictionary element with an instance of that object.