NLP Historical Tools, Part 2

RNN

Recurrent Neural Network: as the name suggests, "recurrent" means something is repeated. An RNN applies the same set of weights at every time step of a sequence. Okay, let's see what we can get.

The memory cell is what makes an RNN preferable to classical approaches. What is memory? Everyone knows memory: something that stores information, right? In an RNN, the hidden state plays this role, carrying information from earlier time steps forward.
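A minimal sketch of this memory cell, with toy sizes and random weights (all numbers here are assumptions for illustration): the hidden state `h` is updated at each step from the previous `h` and the current input, so it accumulates information about everything seen so far.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_size, input_size = 4, 3
W_xh = rng.normal(size=(hidden_size, input_size)) * 0.1   # input -> hidden
W_hh = rng.normal(size=(hidden_size, hidden_size)) * 0.1  # hidden -> hidden (the "memory")
b = np.zeros(hidden_size)

def rnn_step(h_prev, x):
    """One time step: the new memory depends on the old memory and the current input."""
    return np.tanh(W_hh @ h_prev + W_xh @ x + b)

h = np.zeros(hidden_size)                    # empty memory before the sequence starts
for x in rng.normal(size=(5, input_size)):   # a 5-word toy sequence
    h = rnn_step(h, x)
print(h.shape)  # (4,) -- one vector summarising everything seen so far
```

Note that the same `W_hh` and `W_xh` are reused at every step; only `h` changes.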

Types of RNN

  • one to one
  • one to many
  • many to one
  • many to many (input and output lengths can vary, e.g. language translation)
  • many to many (same input and output length, e.g. POS tagging)
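To make the last two types concrete, here is a sketch (toy sizes, random weights; all assumptions) contrasting many-to-one and same-length many-to-many: both run the same recurrence, but many-to-one keeps only the final hidden state (e.g. sentence-level sentiment) while many-to-many keeps one state per time step (e.g. one POS tag per word).

```python
import numpy as np

rng = np.random.default_rng(1)
H, D, T = 4, 3, 6                          # hidden size, input size, sequence length
W_xh = rng.normal(size=(H, D)) * 0.1
W_hh = rng.normal(size=(H, H)) * 0.1

def run_rnn(xs):
    h, states = np.zeros(H), []
    for x in xs:
        h = np.tanh(W_hh @ h + W_xh @ x)
        states.append(h)
    return np.stack(states)                # (T, H): one state per time step

states = run_rnn(rng.normal(size=(T, D)))
many_to_one = states[-1]                   # one output for the whole sequence: (H,)
many_to_many = states                      # one output per input word: (T, H)
print(many_to_one.shape, many_to_many.shape)
```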

Why RNN ??

An RNN is able to learn a representation of a complete sentence rather than just individual words.

Cons of RNN

  • hard to parallelize, since each time step depends on the previous one
  • backpropagation through variable-length sequences is costly
  • all information is transmitted through a single bottleneck
  • long-term dependencies are hard to learn
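The long-term dependency problem can be seen with simple arithmetic (the weight value below is a made-up illustration): backpropagating through T time steps multiplies the gradient by a factor from the recurrent weight at every step, so a factor below 1 shrinks it exponentially.

```python
# Toy illustration of vanishing gradients over a long sequence.
w = 0.5           # hypothetical per-step factor from the recurrent weight (< 1)
grad = 1.0
for step in range(30):   # backprop through 30 time steps
    grad *= w
print(grad)       # 0.5**30, about 9.3e-10: the learning signal has effectively vanished
```

With a factor above 1 the opposite happens and the gradient explodes, which is the other half of the same problem.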

Encoder and Decoder

The encoder-decoder architecture is used for language translation and natural language generation (NLG).

Language translation with Encoder-Decoder

As shown in the figure, all the input words are compressed into a single final hidden state (the blue filled box), which is then handed to the decoder. The decoder predicts output words one by one until an end-of-sequence state is reached.
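A sketch of that flow, with all weights random and sizes assumed (not a real translation model): the encoder reads the whole source sentence into one bottleneck vector, and the decoder unrolls outputs from it. A fixed maximum length stands in for the end-of-sequence check here.

```python
import numpy as np

rng = np.random.default_rng(2)
H, D, V = 4, 3, 5                     # hidden size, embedding size, vocab size
We, Ue = rng.normal(size=(H, D)) * 0.1, rng.normal(size=(H, H)) * 0.1
Wd, Ud = rng.normal(size=(H, D)) * 0.1, rng.normal(size=(H, H)) * 0.1
W_out = rng.normal(size=(V, H)) * 0.1

def encode(src):
    h = np.zeros(H)
    for x in src:                     # read the whole source sentence...
        h = np.tanh(Ue @ h + We @ x)
    return h                          # ...into a single bottleneck vector

def decode(h, max_len=4):
    out, x = [], np.zeros(D)
    for _ in range(max_len):          # a real decoder stops at an <eos> token
        h = np.tanh(Ud @ h + Wd @ x)
        token = int(np.argmax(W_out @ h))   # pick the most likely word id
        out.append(token)
        x = rng.normal(size=D) * 0.1  # stand-in for the predicted token's embedding
    return out

tokens = decode(encode(rng.normal(size=(6, D))))
print(tokens)                         # four predicted token ids from a 5-word vocab
```

Notice that `decode` sees only the single vector returned by `encode`; that bottleneck is exactly the weakness listed below.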

Cons of Encoder Decoder

  • crammed meaning: everything depends on a single dark-blue final state
  • long-term dependencies (sentences longer than about 30 words), since the backbone is still an RNN, LSTM, or GRU
  • word alignment between source and target is lost
  • word interpretation suffers without source context

Resolving techniques

Not all of the information has to be encapsulated in a single final state.

We can also use the intermediate states…
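The idea can be sketched like this (toy sizes, random weights, and a plain average are all assumptions): keep every intermediate encoder state instead of only the last one, and build the context vector from all of them. Weighting the states differently per decoder step is the direction the next article takes.

```python
import numpy as np

rng = np.random.default_rng(3)
H, D = 4, 3
W_xh = rng.normal(size=(H, D)) * 0.1
W_hh = rng.normal(size=(H, H)) * 0.1

h, states = np.zeros(H), []
for x in rng.normal(size=(8, D)):     # an 8-word toy source sentence
    h = np.tanh(W_hh @ h + W_xh @ x)
    states.append(h)                  # keep each intermediate state, not just the last

context = np.mean(states, axis=0)     # a summary built from ALL states
print(context.shape)                  # (4,) -- same size, but no single bottleneck
```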

Let's move on to the solutions in the next article.
