neon.optimizers.optimizer.ExpSchedule

class neon.optimizers.optimizer.ExpSchedule(decay)[source]

Bases: neon.optimizers.optimizer.Schedule

Exponential learning rate schedule. This schedule implements

\[\alpha(t) = \frac{\alpha_\circ}{1 + \beta t}\]

where \(\beta\) is the decay rate, and \(\alpha_\circ\) is the initial learning rate.
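The schedule above can be sketched as a plain function (a minimal illustration of the formula, not the neon implementation itself):

```python
def exp_schedule_lr(initial_lr, decay, epoch):
    """Apply the documented schedule: alpha(t) = alpha_0 / (1 + beta * t),
    where initial_lr is alpha_0, decay is beta, and epoch is t."""
    return initial_lr / (1.0 + decay * epoch)

# The learning rate shrinks as training progresses.
print(exp_schedule_lr(0.1, 0.5, 0))  # 0.1 (epoch 0: unchanged)
print(exp_schedule_lr(0.1, 0.5, 2))  # 0.05 (halved once 1 + beta*t = 2)
```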

__init__(decay)[source]

Class constructor.

Parameters: decay (float) – Decay rate.

Methods

__init__(decay) Class constructor.
gen_class(pdict)
get_description([skip]) Returns a dict that contains all necessary information needed to serialize this object.
get_learning_rate(learning_rate, epoch) Returns the current learning rate given the epoch and initial learning rate.
recursive_gen(pdict, key) Helper method to check whether the definition dictionary defines a NervanaObject child.
be = None
classnm

Returns the class name.

gen_class(pdict)
get_description(skip=[], **kwargs)

Returns a dict that contains all necessary information needed to serialize this object.

Parameters: skip (list) – Objects to omit from the dictionary.
Returns: Dictionary format for object information.
Return type: (dict)
get_learning_rate(learning_rate, epoch)[source]

Returns the current learning rate given the epoch and initial learning rate.

Parameters:
  • learning_rate (float) – Initial learning rate.
  • epoch (int) – Current epoch, used to calculate the adjusted learning rate.
Returns: The adjusted learning rate.
Return type: (float)
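For example, `get_learning_rate` can be exercised through a minimal stand-in class that mirrors the documented interface (a hypothetical sketch, not neon's own implementation):

```python
class ExpScheduleSketch:
    """Stand-in with the same constructor and method signature as the
    documented ExpSchedule: alpha(t) = alpha_0 / (1 + beta * t)."""

    def __init__(self, decay):
        self.decay = decay

    def get_learning_rate(self, learning_rate, epoch):
        # Adjust the initial learning rate for the given epoch.
        return float(learning_rate / (1.0 + self.decay * epoch))

sched = ExpScheduleSketch(decay=0.1)
rates = [sched.get_learning_rate(1.0, e) for e in range(4)]
# Rates decay monotonically toward zero as the epoch count grows.
print(rates)
```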

modulenm

Returns the full module path.

recursive_gen(pdict, key)

Helper method that checks whether the definition dictionary defines a NervanaObject child; if so, it instantiates that object and replaces the dictionary element with the resulting instance.