Recurrent Neural Network (RNN Part-1)

 

Sequential Data

In mathematics, a sequence is an enumerated collection of objects in which repetitions are allowed and order matters. Like a set, it contains members (also called elements or terms), and the number of elements (possibly infinite) is called the length of the sequence. Unlike a set, however, the same element can appear multiple times at different positions, and the order of the elements matters.

Sequential Data - Representation

  • Every element of a sequence can be a D-dimensional vector as well 
  • In that case a sample is a T x D dimensional array  
  • For scalar elements a sample is a T x 1 dimensional array 
  • Every sample of sequential data therefore has 2 dimensions (T and D), as shown in the sketch below
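A minimal NumPy sketch of this layout; the values of T, D, and N below are arbitrary choices for illustration:

```python
import numpy as np

T, D = 10, 3                      # sequence length and feature dimension (assumed)
sample = np.random.randn(T, D)    # one sample with vector elements: T x D
print(sample.shape)               # (10, 3)

scalar_sample = np.random.randn(T, 1)   # scalar elements: T x 1
print(scalar_sample.shape)              # (10, 1)

N = 32                            # a batch of N samples is an N x T x D array
batch = np.random.randn(N, T, D)
print(batch.shape)                # (32, 10, 3)
```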

Sequence Modeling Problem

  • Input is a sequence of length T  
  • Output is another sequence of length Q 
  • T = Q - POS Tagging 
  • T > Q - Speech to Text 
  • T < Q - Translation (although translation can also have T >= Q) 

MLP Limitations

For any layer with input x, trainable weights W, and bias b, we can write the transformation equation (without the non-linearity) as

y = W.x + b
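A minimal NumPy sketch of this affine transformation; the dimensions are illustrative only, not taken from the post:

```python
import numpy as np

in_dim, out_dim = 4, 3                # assumed layer sizes
x = np.random.randn(in_dim)           # the 1D input an MLP layer expects
W = np.random.randn(out_dim, in_dim)  # trainable weights
b = np.random.randn(out_dim)          # trainable bias

y = W @ x + b                         # the transformation y = W.x + b
print(y.shape)                        # (3,)
```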

  • Some data are inherently SEQUENTIAL in nature, and an MLP is not a natural choice for them 
  • We can model some data as a SEQUENCE for better representation

Why is an MLP not Suitable? 

  • How can we feed a 2D input (T x D) into an MLP that accepts only a 1D input? 
  • We can FLATTEN the 2D input to make it 1D (row-major / column-major ordering), as in the sketch after this list 
  • This destroys the “Temporal Relation” between sequential elements 
  • An MLP considers all elements of the input to be “Temporally Independent” of each other, which is not true
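A minimal sketch of flattening a T x D sample so an MLP can accept it; the shapes are assumed for illustration:

```python
import numpy as np

T, D = 10, 3
sample = np.random.randn(T, D)

flat_row_major = sample.reshape(-1)             # row-major (C) ordering -> shape (30,)
flat_col_major = sample.reshape(-1, order="F")  # column-major (Fortran) ordering

# The MLP now sees a plain 30-dimensional vector: nothing in the model records
# that elements 0..2 came from time step 0 and elements 3..5 from time step 1,
# so the temporal relation between elements is lost.
print(flat_row_major.shape, flat_col_major.shape)
```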

An RNN can solve these problems within a single architecture.

How Does an RNN Work? 

Autoregressive Model

Simple linear regression and autoregressive (AR) models differ in that, in an AR model, Y depends not only on X but also on previous values of Y. In time series modeling, a Nonlinear Autoregressive Exogenous model (NARX) is a nonlinear autoregressive model that has exogenous inputs (see the sketch after the list below). 
This means that the model relates the current value of a time series to 

  • Past values of the same series 
  • Current and past values of the driving (exogenous) series, that is, of the externally determined series that influences the series of interest.
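A minimal sketch of an autoregressive model with exogenous inputs, using the linear special case for simplicity (a NARX model would replace the weighted sums with a nonlinear function). The lag orders, coefficients, and data below are assumptions for illustration:

```python
import numpy as np

p, q = 2, 2                      # number of past y values and of x values used (assumed)
a = np.array([0.6, 0.3])         # assumed coefficients on y_{t-1}, y_{t-2}
c = np.array([0.5, 0.2])         # assumed coefficients on x_t, x_{t-1}

def predict_next(y_hist, x_hist):
    """Predict y_t from the last p values of y and the last q values of x."""
    return a @ y_hist[-p:][::-1] + c @ x_hist[-q:][::-1]

y_hist = np.array([0.1, 0.4])    # past values of the series of interest
x_hist = np.array([1.0, 0.8])    # past and current values of the exogenous series
print(predict_next(y_hist, x_hist))
```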

