# npdl.layers

## Base Layers

- `Layer`: The base class representing a single layer of a neural network.

## Core Layers

- `Linear`: A fully connected layer implemented as the dot product of inputs and weights.
- `Dense`: A fully connected layer implemented as the dot product of inputs and weights.
- `Softmax`: A fully connected layer whose output is passed through the softmax function.
- `Dropout`: A dropout layer: randomly sets a fraction of input units to zero during training to reduce overfitting.
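The core layers above all reduce to a few array operations. A minimal plain-NumPy sketch of what they compute (the function names here are illustrative, not the npdl API):

```python
import numpy as np

def dense_forward(x, W, b):
    # Dense/Linear layer: dot product of inputs and weights, plus bias
    return x @ W + b

def softmax(z):
    # Row-wise softmax, shifted by the row max for numerical stability
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dropout_forward(x, p, rng):
    # Inverted dropout: zero units with probability p, rescale the survivors
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4))   # batch of 2 samples, 4 features
W = rng.standard_normal((4, 3))   # 4 inputs -> 3 outputs
b = np.zeros(3)
probs = softmax(dense_forward(x, W, b))  # each row sums to 1
```

A `Softmax` layer combines the dense projection and the softmax nonlinearity, as in the last line.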

## Convolution Layers

- `Convolution`: Convolution operator for filtering windows of two-dimensional inputs.
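Filtering windows of a 2-D input can be sketched as a naive single-channel "valid" convolution (implemented, as in most deep-learning libraries, as cross-correlation; this is an illustration, not npdl's implementation):

```python
import numpy as np

def conv2d_valid(x, k):
    # Slide the kernel k over every full window of x ("valid" padding)
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Elementwise product of the window with the kernel, summed
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out
```

With an `n x n` input and a `k x k` kernel, the output is `(n - k + 1) x (n - k + 1)`.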

## Embedding Layer

## Normalization Layer

- `BatchNormal`: Batch normalization layer (Ioffe and Szegedy, 2014) [R1].
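Batch normalization standardizes each feature across the batch, then applies a learned scale and shift. A minimal sketch of the forward pass (plain NumPy, not the npdl API; running statistics for inference are omitted):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # Normalize each feature (column) over the batch dimension
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learned scale (gamma) and shift (beta) restore representational power
    return gamma * x_hat + beta
```

With `gamma = 1` and `beta = 0`, the output has approximately zero mean and unit variance per feature.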

## Pooling Layers

- `MeanPooling`: Average pooling operation for spatial data.
- `MaxPooling`: Max pooling operation for spatial data.
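Both pooling layers reduce each non-overlapping spatial window to a single value; they differ only in the reduction. A sketch for a single 2-D plane (illustrative, not npdl's implementation):

```python
import numpy as np

def pool2d(x, size, mode="max"):
    # Non-overlapping (size x size) pooling windows; trailing rows/cols
    # that do not fill a full window are dropped
    H, W = x.shape
    x = x[:H - H % size, :W - W % size]
    blocks = x.reshape(H // size, size, W // size, size)
    if mode == "max":
        return blocks.max(axis=(1, 3))   # MaxPooling
    return blocks.mean(axis=(1, 3))      # MeanPooling
```

For a 4x4 input with `size=2`, each 2x2 block collapses to its max (or mean), giving a 2x2 output.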

## Recurrent Layers

- `Recurrent`: A recurrent neural network (RNN) is a class of artificial neural network in which connections between units form a directed cycle.
- `SimpleRNN`: Fully connected RNN where the output is fed back into the input.
- `GRU`: Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks, introduced in 2014.
- `LSTM`: Batch LSTM; supports masking but not training.
- `BatchLSTM`: Batch LSTM; supports training but not masking.
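The feedback loop that defines a simple RNN can be sketched in a few lines: each step's hidden state is computed from the current input and the previous hidden state, then fed back into the next step (plain NumPy, not the npdl API; gated variants such as GRU and LSTM add learned gates to this recurrence):

```python
import numpy as np

def simple_rnn(xs, W_in, W_rec, b, h0):
    # xs: (time_steps, n_in); returns all hidden states (time_steps, n_hidden)
    h = h0
    hs = []
    for x in xs:
        # The previous hidden state h is fed back in at every step
        h = np.tanh(x @ W_in + h @ W_rec + b)
        hs.append(h)
    return np.stack(hs)
```

Because of the `tanh`, every hidden activation lies strictly inside (-1, 1).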

## Shape Layers