Recurrent Neural Networks (RNN Part-3)

 

Implementation in TensorFlow

Class: RNN Model / CNN Model / Model / Encoder

  • Contains the code that implements the neural network itself
  • Application independent
  • Inherits from tf.keras.Model (the Example section below shows one)

Class with an application-specific name

  • Application specific (see the sketch after this list)
  • __init__(): an object of the encoder class is created here
  • train(): loads the training data and runs training batch by batch
  • train_step(): application independent; optimizes the neural network for one batch (forward pass and backward pass)
  • load_model(): restores previously saved weights (from a pre-trained model)
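
For concreteness, here is a minimal sketch of one such application-specific class, assuming a sequence-classification task. The names TextClassifier, vocab_size, num_classes, make_batches and checkpoint_path are illustrative assumptions, and RNN_Encoder refers to the class defined in the Example section below.

import tensorflow as tf

class TextClassifier:                               # hypothetical application-specific name
    def __init__(self, vocab_size, nodes, num_classes):
        # The application-independent encoder (RNN_Encoder from the Example
        # section below) is created here, plus task-specific layers.
        self.embedding = tf.keras.layers.Embedding(vocab_size, 64)
        self.encoder = RNN_Encoder(nodes)
        self.classifier = tf.keras.layers.Dense(num_classes, activation="softmax")
        self.optimizer = tf.keras.optimizers.Adam()
        self.loss_fn = tf.keras.losses.CategoricalCrossentropy()

    def train(self, make_batches, epochs=1):
        # make_batches: a callable returning an iterable of (X, Y) batches,
        # e.g. the load_all_batches() generator sketched further below.
        for epoch in range(epochs):
            for x_batch, y_batch in make_batches():
                loss = self.train_step(x_batch, y_batch)
            print("epoch", epoch, "loss", float(loss))

    def train_step(self, x_batch, y_batch):
        # One optimization step: forward pass, loss, backward pass, weight update.
        with tf.GradientTape() as tape:
            embedded = self.embedding(x_batch)          # (batch, time, 64)
            _, final_state = self.encoder(embedded)     # use the final GRU state
            predictions = self.classifier(final_state)  # (batch, num_classes)
            loss = self.loss_fn(y_batch, predictions)
        variables = (self.embedding.trainable_variables
                     + self.encoder.trainable_variables
                     + self.classifier.trainable_variables)
        gradients = tape.gradient(loss, variables)
        self.optimizer.apply_gradients(zip(gradients, variables))
        return loss

    def load_model(self, checkpoint_path):
        # Restore previously saved (pre-trained) encoder weights.
        self.encoder.load_weights(checkpoint_path)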

Load all data

  • Application specific
  • Loads all the data from file
  • Collects metadata
  • Creates the X (input) and Y (output) matrices from the loaded data
  • Creates the one-hot encoding (see the sketch after this list)
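
A minimal sketch of such a loader, assuming a text file in which each line holds a class label followed by space-separated tokens; the file format and helper names (load_all_data, tok2idx, lab2idx) are assumptions.

import numpy as np

def load_all_data(file_path):
    # Assumed file format: one example per line, "label tok tok tok ...".
    with open(file_path) as f:
        lines = [line.split() for line in f if line.strip()]

    # Collect metadata: the label set and the token vocabulary.
    labels = sorted({line[0] for line in lines})
    vocab = sorted({tok for line in lines for tok in line[1:]})
    tok2idx = {tok: i + 1 for i, tok in enumerate(vocab)}   # index 0 reserved for padding
    lab2idx = {lab: i for i, lab in enumerate(labels)}

    # Create X (list of variable-length index sequences) and the label indices.
    X = [[tok2idx[t] for t in line[1:]] for line in lines]
    y = np.array([lab2idx[line[0]] for line in lines])

    # Create the one-hot encoding of Y: shape (num_examples, num_labels).
    Y = np.eye(len(labels), dtype=np.float32)[y]
    return X, Y, tok2idx, lab2idx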

Load all batches

  • Creates a generator
  • Yields processed batch data
  • Returns padded X
  • Returns one-hot encoded Y (see the sketch after this list)
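
A minimal sketch of such a generator, assuming X and Y have the layout produced by the load_all_data() sketch above (a list of variable-length index sequences and a one-hot label matrix).

import numpy as np

def load_all_batches(X, Y, batch_size=32):
    # Generator: processes and yields one batch at a time.
    for start in range(0, len(X), batch_size):
        batch_seqs = X[start:start + batch_size]

        # Pad X: right-pad each sequence with zeros up to the length of the
        # longest sequence in this batch.
        max_len = max(len(s) for s in batch_seqs)
        padded = np.zeros((len(batch_seqs), max_len), dtype=np.int32)
        for i, seq in enumerate(batch_seqs):
            padded[i, :len(seq)] = seq

        # Y is already one-hot encoded; slice the rows for this batch.
        yield padded, Y[start:start + batch_size]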

Example

import tensorflow as tf
import numpy as np


class RNN_Encoder(tf.keras.Model):
    def __init__(self, nodes):
        super(RNN_Encoder, self).__init__()
        # Two stacked GRU layers; return_sequences gives the full output
        # sequence and return_state gives the final hidden state.
        self.rnn_1 = tf.keras.layers.GRU(nodes[0], return_sequences=True, return_state=True)
        self.rnn_2 = tf.keras.layers.GRU(nodes[1], return_sequences=True, return_state=True)
        print("RNN Ready")

    def call(self, inputs):
        # The final state of the first layer initializes the second layer.
        output_1, state = self.rnn_1(inputs)
        output_2, state = self.rnn_2(output_1, initial_state=state)
        # Return the second layer's sequence output and its final state.
        return output_2, state


T = 12  # sequence length
D = 5   # feature dimension
X = np.random.random([2, T, D])  # batch of 2 random sequences
print("Input shape ", X.shape)

demo_net = RNN_Encoder([16, 16])
rnn_out, rnn_state = demo_net(X)
print("Output shape ", rnn_out.shape)
print("State shape ", rnn_state.shape)
