neon.transforms.cost.CrossEntropyBinary

class neon.transforms.cost.CrossEntropyBinary(scale=1)[source]

Bases: neon.transforms.cost.Cost

Binary cross-entropy cost.

The binary cross-entropy cost is used when the labels have two classes: 0 and 1. The cost is computed as \(C = \sum -t\log(y)-(1-t)\log(1-y)\), where \(t\) is the target label and \(y\) is the network output.
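For concreteness, the forward cost can be sketched in plain NumPy (a hypothetical standalone version; neon evaluates this on backend tensors, and the clipping guard here is an illustrative assumption, not neon's exact implementation):

```python
import numpy as np

def binary_cross_entropy(y, t, eps=1e-12):
    """Sum of -t*log(y) - (1-t)*log(1-y) over all elements."""
    y = np.clip(y, eps, 1 - eps)  # guard against log(0)
    return np.sum(-t * np.log(y) - (1 - t) * np.log(1 - y))

y = np.array([0.9, 0.2, 0.8])  # network outputs in (0, 1)
t = np.array([1.0, 0.0, 1.0])  # binary targets
cost = binary_cross_entropy(y, t)
```

Each correctly-confident prediction (y near t) contributes a small term, while a confident wrong prediction contributes a large one.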

Note: The backpropagation assumes that this cost is coupled with an output layer that uses the Logistic() activation function. This allows for a shortcut in the derivative that saves computation.
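The shortcut can be checked numerically in a standalone NumPy sketch (illustrative values, not neon backend code): with a logistic output, the derivative of the cost with respect to the pre-activation z collapses to y - t.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce_cost(z, t):
    # C = sum(-t*log(y) - (1-t)*log(1-y)) with y = sigmoid(z)
    y = sigmoid(z)
    return np.sum(-t * np.log(y) - (1 - t) * np.log(1 - y))

z = np.array([0.5, -1.0, 2.0])   # pre-activations (illustrative values)
t = np.array([1.0, 0.0, 1.0])    # binary targets
y = sigmoid(z)

# Shortcut: dC/dz simplifies to (y - t) when the output layer is logistic
shortcut = y - t

# Central-difference numeric gradient of C with respect to z, for comparison
eps = 1e-6
numeric = np.zeros_like(z)
for i in range(z.size):
    zp, zm = z.copy(), z.copy()
    zp[i] += eps
    zm[i] -= eps
    numeric[i] = (bce_cost(zp, t) - bce_cost(zm, t)) / (2 * eps)
```

The shortcut avoids ever materializing the separate dC/dy and dy/dz factors, which is the computation saving the note refers to.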

__init__(scale=1)[source]
Parameters: scale (float, optional) – Amount by which to scale the backpropagated error (default: 1)

Methods

__init__([scale]) – Initialize the cost, with an optional scale applied to the backpropagated error (default: 1).
bprop(y, t) – Returns the derivative of the binary cross-entropy cost.
gen_class(pdict)
get_description([skip]) – Returns a dict that contains all necessary information needed to serialize this object.
recursive_gen(pdict, key) – Helper method to check whether the definition dictionary defines a NervanaObject child.
be = None
bprop(y, t)[source]

Returns the derivative of the binary cross entropy cost.

Parameters:
  • y (Tensor or OpTree) – Output of previous layer or model
  • t (Tensor or OpTree) – True targets corresponding to y
Returns:

Returns the (mean) shortcut derivative of the binary cross-entropy cost function, (y - t) / y.shape[1]

Return type:

OpTree
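To illustrate the (y - t) / y.shape[1] expression, a NumPy sketch (assuming neon's (features, batch_size) tensor layout, so axis 1 is the minibatch; the values are hypothetical):

```python
import numpy as np

# Hypothetical values: one output unit, minibatch of 3 examples.
# In neon's layout, activations are (features, batch_size).
y = np.array([[0.9, 0.2, 0.7]])  # network outputs
t = np.array([[1.0, 0.0, 1.0]])  # binary targets

# Shortcut derivative, averaged over the minibatch by dividing
# by the batch size y.shape[1]
delta = (y - t) / y.shape[1]
```

Dividing by y.shape[1] is what makes this the mean (per-example) gradient rather than the sum over the minibatch.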

classnm

Returns the class name.

gen_class(pdict)
get_description(skip=[], **kwargs)

Returns a dict that contains all necessary information needed to serialize this object.

Parameters: skip (list) – Objects to omit from the dictionary.
Returns: Dictionary format for object information.
Return type: (dict)
modulenm

Returns the full module path.

recursive_gen(pdict, key)

Helper method that checks whether the definition dictionary defines a NervanaObject child; if so, it instantiates that object and replaces the dictionary element with that instance.