Merge layers

class lasagne.layers.ConcatLayer(incomings, axis=1, cropping=None, **kwargs)[source]

Concatenates multiple inputs along the specified axis. Inputs should have the same shape except along the dimension given by axis, which may differ in size.

Parameters:

incomings : a list of Layer instances or tuples

The layers feeding into this layer, or expected input shapes

axis : int

The axis along which the inputs are joined.

cropping : None or [crop]

Cropping for each input axis. Cropping is described in the docstring for autocrop(). Cropping is always disabled for the concatenation axis (axis).
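
For illustration, a minimal usage sketch (the input size and unit counts below are arbitrary):

>>> from lasagne.layers import InputLayer, DenseLayer, ConcatLayer
>>> from lasagne.layers import get_output_shape
>>> l_in = InputLayer((None, 100))
>>> l_branch_a = DenseLayer(l_in, num_units=20)   # shape (None, 20)
>>> l_branch_b = DenseLayer(l_in, num_units=30)   # shape (None, 30)
>>> l_concat = ConcatLayer([l_branch_a, l_branch_b], axis=1)
>>> get_output_shape(l_concat)
(None, 50)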

get_output_for(inputs, **kwargs)[source]

Propagates the given inputs through this layer (and only this layer).

Parameters:

inputs : list of Theano expressions

The Theano expressions to propagate through this layer.

Returns:

Theano expression

The output of this layer given the inputs to this layer.

Notes

This is called by the base lasagne.layers.get_output() to propagate data through a network.

This method should be overridden when implementing a new Layer class with multiple inputs. By default it raises NotImplementedError.

get_output_shape_for(input_shapes)[source]

Computes the output shape of this layer, given a list of input shapes.

Parameters:

input_shapes : list of tuple

A list of tuples, with each tuple representing the shape of one of the inputs (in the correct order). These tuples should have as many elements as there are input dimensions, and the elements should be integers or None.

Returns:

tuple

A tuple representing the shape of the output of this layer. The tuple has as many elements as there are output dimensions, and the elements are all either integers or None.

Notes

This method must be overridden when implementing a new Layer class with multiple inputs. By default it raises NotImplementedError.
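
As a quick sketch of the shape computation for ConcatLayer (the sizes are arbitrary), the layer can also be built from expected input shapes rather than Layer instances:

>>> from lasagne.layers import ConcatLayer
>>> l_concat = ConcatLayer([(None, 20), (None, 30)], axis=1)
>>> l_concat.get_output_shape_for([(None, 20), (None, 30)])
(None, 50)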

lasagne.layers.concat[source]

alias of lasagne.layers.merge.ConcatLayer

class lasagne.layers.ElemwiseMergeLayer(incomings, merge_function, cropping=None, **kwargs)[source]

This layer performs an elementwise merge of its input layers. It requires all input layers to have the same output shape.

Parameters:

incomings : a list of Layer instances or tuples

The layers feeding into this layer, or expected input shapes, with all incoming shapes being equal

merge_function : callable

The merge function to use. It should take two arguments and return the updated value. Possible merge functions include theano.tensor.mul, theano.tensor.add, theano.tensor.maximum and theano.tensor.minimum (see the sketch below).

cropping : None or [crop]

Cropping for each input axis. Cropping is described in the docstring for autocrop()

See also

ElemwiseSumLayer
Shortcut for an elementwise sum.
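
For illustration, a minimal sketch that merges two equally shaped inputs with an elementwise maximum (the shapes are arbitrary):

>>> import theano.tensor as T
>>> from lasagne.layers import InputLayer, ElemwiseMergeLayer
>>> from lasagne.layers import get_output_shape
>>> l_a = InputLayer((None, 10))
>>> l_b = InputLayer((None, 10))
>>> l_max = ElemwiseMergeLayer([l_a, l_b], merge_function=T.maximum)
>>> get_output_shape(l_max)
(None, 10)
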
get_output_for(inputs, **kwargs)[source]

Propagates the given inputs through this layer (and only this layer).

Parameters:

inputs : list of Theano expressions

The Theano expressions to propagate through this layer.

Returns:

Theano expression

The output of this layer given the inputs to this layer.

Notes

This is called by the base lasagne.layers.get_output() to propagate data through a network.

This method should be overridden when implementing a new Layer class with multiple inputs. By default it raises NotImplementedError.

get_output_shape_for(input_shapes)[source]

Computes the output shape of this layer, given a list of input shapes.

Parameters:

input_shapes : list of tuple

A list of tuples, with each tuple representing the shape of one of the inputs (in the correct order). These tuples should have as many elements as there are input dimensions, and the elements should be integers or None.

Returns:

tuple

A tuple representing the shape of the output of this layer. The tuple has as many elements as there are output dimensions, and the elements are all either integers or None.

Notes

This method must be overridden when implementing a new Layer class with multiple inputs. By default it raises NotImplementedError.

class lasagne.layers.ElemwiseSumLayer(incomings, coeffs=1, cropping=None, **kwargs)[source]

This layer performs an elementwise sum of its input layers. It requires all input layers to have the same output shape.

Parameters:

incomings : a list of Layer instances or tuples

The layers feeding into this layer, or expected input shapes, with all incoming shapes being equal

coeffs : list or scalar

A list of coefficients of the same length as incomings, or a single coefficient that is applied to all inputs. By default, these will not be included in the learnable parameters of this layer.

cropping : None or [crop]

Cropping for each input axis. Cropping is described in the docstring for autocrop()

Notes

Depending on your architecture, this can be used to avoid the more costly ConcatLayer. For example, instead of concatenating layers before a DenseLayer, insert separate DenseLayer instances with the same number of output units and add them up afterwards, as sketched below. (This avoids the copy operations in concatenation, but splits up the dot product.)
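
A sketch of this alternative, assuming two input branches and an arbitrary choice of 40 output units; one branch omits its bias so the sum has a single bias term, and the nonlinearity is applied after summing:

>>> from lasagne.layers import InputLayer, DenseLayer
>>> from lasagne.layers import ElemwiseSumLayer, NonlinearityLayer
>>> from lasagne.nonlinearities import rectify
>>> l_a = InputLayer((None, 20))
>>> l_b = InputLayer((None, 30))
>>> l_da = DenseLayer(l_a, num_units=40, nonlinearity=None)           # W_a: 20 x 40
>>> l_db = DenseLayer(l_b, num_units=40, nonlinearity=None, b=None)   # W_b: 30 x 40
>>> l_sum = ElemwiseSumLayer([l_da, l_db])
>>> l_out = NonlinearityLayer(l_sum, rectify)

This computes the same function as concatenating l_a and l_b and feeding the result into a single 40-unit DenseLayer, but without the copy involved in concatenation.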

get_output_for(inputs, **kwargs)[source]

Propagates the given inputs through this layer (and only this layer).

Parameters:

inputs : list of Theano expressions

The Theano expressions to propagate through this layer.

Returns:

Theano expression

The output of this layer given the inputs to this layer.

Notes

This is called by the base lasagne.layers.get_output() to propagate data through a network.

This method should be overridden when implementing a new Layer class with multiple inputs. By default it raises NotImplementedError.