elektronn2.neuromancer.variables module

class elektronn2.neuromancer.variables.VariableParam(value=None, name=None, apply_train=True, apply_reg=True, dtype=None, strict=False, allow_downcast=None, borrow=False, broadcastable=None)[source]

Bases: theano.tensor.sharedvar.TensorSharedVariable

Extension of theano TensorSharedVariable. The additional features are described by the parameters below; otherwise it is identical to its base class.

Parameters:
  • value
  • name (str) –
  • apply_train (bool) – whether to train this parameter (as opposed to a meta-parameter or a parameter that is kept constant during a training phase)
  • apply_reg (bool) – whether to apply regularisation (e.g. L2) on this parameter
  • dtype
  • strict (bool) –
  • allow_downcast (bool) –
  • borrow (bool) –
  • broadcastable
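The apply_train and apply_reg flags exist so that downstream code can decide which parameters enter the gradient update and which receive regularisation. A minimal pure-Python sketch of that filtering logic (the Param class here is a stand-in for illustration, not the real VariableParam):

```python
# Stand-in for VariableParam: only the two flags matter for this sketch.
class Param:
    def __init__(self, name, apply_train=True, apply_reg=True):
        self.name = name
        self.apply_train = apply_train
        self.apply_reg = apply_reg

params = [
    Param("W"),                                          # trained and regularised
    Param("b", apply_reg=False),                         # bias: trained, no L2
    Param("gamma", apply_train=False, apply_reg=False),  # frozen meta-parameter
]

# A training loop would compute gradients only for the first list and
# add an L2 penalty only over the second.
trainable   = [p.name for p in params if p.apply_train]
regularised = [p.name for p in params if p.apply_reg]
```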
clone()[source]

Return a new Variable like self.

Returns:A new Variable instance (or subclass instance) with no owner or index.
Return type:Variable instance

Notes

Tags are copied to the returned instance.

Name is copied to the returned instance.

updates
class elektronn2.neuromancer.variables.VariableWeight(shape=None, init_kwargs=None, value=None, name=None, apply_train=True, apply_reg=True, dtype=None, strict=False, allow_downcast=None, borrow=False, broadcastable=None)[source]

Bases: elektronn2.neuromancer.variables.VariableParam

set_value(new_value, borrow=False)[source]

Set the non-symbolic value associated with this SharedVariable.

Parameters:
  • borrow (bool) – True to use the new_value directly, potentially creating problems related to aliased memory.

Changes to this value will be visible to all functions using this SharedVariable.
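The borrow flag follows theano's shared-variable semantics: with borrow=True the container may keep a reference to the caller's array instead of copying it, so a later in-place mutation of that array silently changes the stored value. A small numpy illustration of the aliasing hazard (independent of theano; store is a hypothetical helper, not part of the library):

```python
import numpy as np

def store(value, borrow=False):
    # With borrow=True the "stored" value aliases the caller's array;
    # with borrow=False a private copy is made (the safe default).
    return value if borrow else value.copy()

a = np.ones(3)
safe    = store(a, borrow=False)
aliased = store(a, borrow=True)
a[0] = 99.0          # mutate the caller's array afterwards
# safe is unaffected; aliased reflects the mutation
```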
class elektronn2.neuromancer.variables.ConstantParam(value, name=None, dtype=None, make_singletons_broadcastable=True)[source]

Bases: theano.tensor.var.TensorConstant

Identical to VariableParam except that the two additional attributes apply_train and apply_reg are both False. This is just to tell ELEKTRONN2 that this parameter is exempt from training. The set_value method raises an exception because this is a true constant. Constants are faster in the theano graph.
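A tiny sketch of the behaviour described above, using a plain Python class (an illustration, not the real ConstantParam) whose set_value refuses to run:

```python
class FrozenParam:
    """Illustrative constant parameter: readable, never writable."""
    def __init__(self, value):
        self._value = value
        # Both flags False: exempt from training and regularisation.
        self.apply_train = False
        self.apply_reg = False

    def get_value(self):
        return self._value

    def set_value(self, new_value):
        # A real constant cannot change after graph construction.
        raise AttributeError("constant parameters cannot be changed")
```

Attempting `FrozenParam(3.0).set_value(4.0)` raises AttributeError, while get_value() keeps returning the original value.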

clone()[source]

We clone this object, but we don't clone the data, to lower the memory requirement. We assume that the data will never change.

get_value(borrow=False)[source]
set_value(new_value, borrow=False)[source]
updates
elektronn2.neuromancer.variables.initweights(shape, dtype='float64', scale='glorot', mode='normal', pool=None, spatial_axes=None)[source]
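The 'glorot' scale presumably refers to Glorot/Xavier initialisation, which scales the initial weights by the layer's fan-in and fan-out. A numpy sketch of glorot-normal initialisation under that assumption (the real initweights also handles pooling and spatial axes, which are omitted here, and glorot_normal is a hypothetical helper name):

```python
import numpy as np

def glorot_normal(shape, dtype='float64', rng=None):
    # Glorot/Xavier: std = sqrt(2 / (fan_in + fan_out)) for a dense layer.
    rng = rng or np.random.default_rng(0)
    fan_in, fan_out = shape[0], shape[1]
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=shape).astype(dtype)

W = glorot_normal((200, 100))
```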