neon.optimizers.optimizer.PowerSchedule

class neon.optimizers.optimizer.PowerSchedule(step_config, change)[source]

Bases: neon.optimizers.optimizer.Schedule

Multiplies the learning rate by a factor at regular epoch intervals.

This schedule will multiply the learning rate by the factor change every step_config epochs. For example,

schedule = PowerSchedule(step_config=2, change=0.5)
optimizer = GradientDescentMomentum(0.1, 0.9, schedule=schedule)

will yield a learning rate schedule of:

Epoch LR
0 0.1
1 0.1
2 0.05
3 0.05
4 0.025
5 0.025
6 0.0125
7 0.0125
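The decay behavior in the table above can be reproduced with a short standalone sketch (the helper name `power_schedule_lr` is hypothetical, not part of neon; it assumes the multiplicative-step behavior described above):

```python
def power_schedule_lr(initial_lr, change, step_config, epoch):
    """Multiply initial_lr by `change` once every `step_config` epochs."""
    return initial_lr * change ** (epoch // step_config)

# Reproduce the table above: step_config=2, change=0.5, initial LR 0.1
for epoch in range(8):
    print(epoch, power_schedule_lr(0.1, 0.5, 2, epoch))
```

Note that the schedule is a function of the epoch only; it does not accumulate state between calls, so querying an arbitrary epoch gives the same result as stepping through epochs in order.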
__init__(step_config, change)[source]

Class constructor.

Parameters:
  • step_config (int) – Learning rate update interval (in epochs)
  • change (float) – Multiplicative update factor applied to the learning rate

Methods

__init__(step_config, change) Class constructor.
gen_class(pdict)
get_description([skip]) Returns a dict that contains all necessary information needed to serialize this object.
get_learning_rate(learning_rate, epoch) Returns the current learning rate given the epoch and initial learning rate.
recursive_gen(pdict, key) Helper method to check whether the definition dictionary defines a NervanaObject child.
be = None
classnm

Returns the class name.

gen_class(pdict)
get_description(skip=[], **kwargs)

Returns a dict that contains all necessary information needed to serialize this object.

Parameters: skip (list) – Objects to omit from the dictionary.
Returns: Dictionary format for object information.
Return type: (dict)
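The serialization pattern that get_description implements can be illustrated with a minimal sketch (the class `Demo` and helper `get_description_sketch` are hypothetical stand-ins, not neon code; neon's actual implementation may record different fields):

```python
class Demo:
    """Stand-in for a schedule-like object with two constructor args."""
    def __init__(self, step_config, change):
        self.step_config = step_config
        self.change = change

def get_description_sketch(obj, skip=()):
    """Serialize an object's attributes into a plain dict, omitting any
    attribute names listed in `skip` -- a sketch of the pattern described
    above, not neon's actual implementation."""
    return {"type": type(obj).__name__,
            "config": {k: v for k, v in vars(obj).items() if k not in skip}}

desc = get_description_sketch(Demo(2, 0.5), skip=["change"])
```

Here `desc` contains only the `step_config` attribute, since `change` was listed in `skip`.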
get_learning_rate(learning_rate, epoch)[source]

Returns the current learning rate given the epoch and initial learning rate.

Parameters:
  • learning_rate (float) – Initial learning rate
  • epoch (int) – Current epoch, used to calculate the adjusted learning rate.
Returns:

The adjusted learning rate.

Return type:

(float)

modulenm

Returns the full module path.

recursive_gen(pdict, key)

Helper method to check whether the definition dictionary defines a NervanaObject child; if so, it instantiates that object and replaces the dictionary element with the resulting instance.
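The instantiate-from-definition-dict pattern that recursive_gen describes can be sketched generically (the `registry` argument, the `type`/`config` dict layout, and `DummySchedule` are assumptions for illustration, not neon's actual representation):

```python
def recursive_gen_sketch(pdict, key, registry):
    """If pdict[key] is a definition dict whose 'type' names a registered
    class, replace it in place with an instantiated object.  This is a
    hedged sketch of the pattern, not neon's implementation."""
    val = pdict[key]
    if isinstance(val, dict) and val.get("type") in registry:
        cls = registry[val["type"]]
        pdict[key] = cls(**val.get("config", {}))
    return pdict[key]

class DummySchedule:
    """Minimal stand-in for a deserializable schedule class."""
    def __init__(self, step_config=1, change=1.0):
        self.step_config = step_config
        self.change = change

params = {"schedule": {"type": "DummySchedule",
                       "config": {"step_config": 2, "change": 0.5}}}
obj = recursive_gen_sketch(params, "schedule", {"DummySchedule": DummySchedule})
```

After the call, `params["schedule"]` holds a live `DummySchedule` instance rather than the definition dict, which is the replacement behavior the docstring describes.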