
Bidirectional recurrent neural networks

Bidirectional recurrent neural networks (BRNNs) connect two hidden layers running in opposite directions to the same output. With this form of generative deep learning, the output layer can receive information from past (backward) and future (forward) states simultaneously. BRNNs were introduced by Schuster and Paliwal in 1997 to increase the amount of input information available to the network. For example, multilayer perceptrons (MLPs) and time-delay neural networks (TDNNs) have limited flexibility with respect to the input, because their input data must be fixed. Standard recurrent neural networks (RNNs) are also restricted, since future input information cannot be reached from the current state. BRNNs, in contrast, do not require their input data to be fixed, and their future input information is reachable from the current state.

BRNNs are particularly useful when the context of the input is needed. For example, in handwriting recognition, knowing the letters before and after the current letter can improve performance.

Architecture

Figure: Structure of RNN and BRNN

The principle of a BRNN is to split the neurons of a regular RNN into two directions, one for the positive time direction (forward states) and one for the negative time direction (backward states). The outputs of these two sets of states are not connected to the inputs of the states in the opposite direction. The general structures of an RNN and a BRNN are shown in the figure above. By using both time directions, input information from the past and the future of the current time frame can be used, unlike standard RNNs, which require delays to incorporate future information.
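
As a rough illustration of this split, the following is a minimal sketch in NumPy; the function and weight names are invented for this example and do not come from any particular library, and a single hidden bias b_h is shared between the two directions purely for brevity. The forward states are computed left to right, the backward states right to left, and only the output layer combines the two.

  import numpy as np

  def brnn_forward(xs, Wf, Uf, Wb, Ub, V, b_h, b_y):
      # xs: list of T input vectors; Wf/Uf and Wb/Ub are the input->hidden and
      # hidden->hidden weights of the forward and backward directions; V maps the
      # concatenated [forward; backward] state of each time step to the output.
      T, n_hidden = len(xs), Wf.shape[0]

      # Forward states: computed left to right, never reading the backward states.
      h_f, fwd = np.zeros(n_hidden), []
      for t in range(T):
          h_f = np.tanh(Wf @ xs[t] + Uf @ h_f + b_h)
          fwd.append(h_f)

      # Backward states: computed right to left, never reading the forward states.
      h_b, bwd = np.zeros(n_hidden), [None] * T
      for t in reversed(range(T)):
          h_b = np.tanh(Wb @ xs[t] + Ub @ h_b + b_h)
          bwd[t] = h_b

      # Only the output layer sees past (forward) and future (backward) context.
      return [V @ np.concatenate([fwd[t], bwd[t]]) + b_y for t in range(T)]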

Training

BRNNs can be trained with algorithms similar to those used for RNNs, because the neurons in the two directions do not interact. However, when back-propagation through time is applied, additional steps are needed, because the input and output layers cannot be updated at the same time. The general training procedure is as follows: in the forward pass, the forward and backward states are processed first, and then the output neurons. In the backward pass, the output neurons are processed first, and then the forward and backward states. After the forward and backward sweeps, the weights are updated.
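
As a hedged illustration of this procedure, the sketch below uses PyTorch's nn.RNN with bidirectional=True on random stand-in data; the toy dimensions, the linear read-out layer, and the sequence-labelling setup are assumptions made for this example, not part of the original description. A single loss.backward() call performs back-propagation through time over both directions, and the weights are updated only after both sweeps.

  import torch
  import torch.nn as nn

  # Toy sequence-labelling setup with random stand-in data (shapes are illustrative).
  seq_len, batch, d_in, d_h, n_classes = 20, 8, 10, 16, 5
  x = torch.randn(seq_len, batch, d_in)
  targets = torch.randint(n_classes, (seq_len, batch))

  rnn = nn.RNN(input_size=d_in, hidden_size=d_h, bidirectional=True)
  readout = nn.Linear(2 * d_h, n_classes)   # one forward + one backward state per step
  params = list(rnn.parameters()) + list(readout.parameters())
  optimizer = torch.optim.SGD(params, lr=0.1)
  loss_fn = nn.CrossEntropyLoss()

  for step in range(100):
      optimizer.zero_grad()
      states, _ = rnn(x)                     # forward sweep over both directions
      logits = readout(states)               # output neurons use both state sets
      loss = loss_fn(logits.reshape(-1, n_classes), targets.reshape(-1))
      loss.backward()                        # back-propagation through time
      optimizer.step()                       # weights updated after both sweeps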

Applications

BRNN applications include:

  • Translation
  • Handwriting recognition
  • Protein structure prediction
  • Part-of-speech tagging
  • Dependency parsing
  • Entity extraction

External links

  • Implementation of BRNN/LSTM in Python with Theano