NLP Historical Tools, Part 2

About RNNs and the encoder-decoder architecture

Adarsha Regmi
2 min read · Oct 11, 2021

RNN

Recurrent neural network: as the name suggests, an RNN is a neural network with a recurrent connection, meaning the same computation is repeated at every time step. Okay, let's see what we can get out of that.

The memory cell is what makes an RNN preferable to classical approaches. What is memory? Everyone knows: it is something that stores information, in this case information carried over from earlier steps in the sequence.
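To make that "memory" concrete, here is a minimal NumPy sketch of a single vanilla RNN step. The weight names (W_x, W_h) and the sizes are illustrative assumptions, not anything specific from this article.

```python
import numpy as np

# Illustrative sizes; not from the article.
input_dim, hidden_dim = 4, 3
rng = np.random.default_rng(0)
W_x = rng.normal(size=(hidden_dim, input_dim))   # input-to-hidden weights
W_h = rng.normal(size=(hidden_dim, hidden_dim))  # hidden-to-hidden weights (the recurrence)
b = np.zeros(hidden_dim)

def rnn_step(x_t, h_prev):
    # The new memory mixes the current input with the previous hidden state.
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

h = np.zeros(hidden_dim)                     # initial memory is empty
sequence = rng.normal(size=(5, input_dim))   # a toy sequence of 5 time steps
for x_t in sequence:
    h = rnn_step(x_t, h)                     # the same weights are reused at every step
print(h)                                     # final hidden state summarizes the whole sequence
```

The key point is the loop: the hidden state h is carried forward and updated with the same weights at every step, which is what lets the network "remember" earlier inputs.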

Types of RNN

  • one to one
  • one to many
  • many to one
  • many to many (input and output lengths can vary, e.g. language translation)
  • many to many (same input and output length), e.g. POS tagging (the sketch after this list shows how the last two types differ in practice)
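As a rough illustration of the difference between many-to-one and many-to-many with equal lengths, here is a small PyTorch sketch. All sizes are made-up assumptions; the point is only which part of the RNN's output you keep.

```python
import torch
import torch.nn as nn

# Hypothetical sizes, just to show how the output shapes differ.
batch, steps, in_dim, hid = 2, 7, 16, 32
x = torch.randn(batch, steps, in_dim)

rnn = nn.RNN(input_size=in_dim, hidden_size=hid, batch_first=True)
outputs, h_n = rnn(x)

# many to many (same length, e.g. POS tagging): use the output at every time step
print(outputs.shape)   # torch.Size([2, 7, 32]) -> one vector per input token
# many to one (e.g. sentence classification): use only the final hidden state
print(h_n.shape)       # torch.Size([1, 2, 32]) -> a single summary vector per sequence
```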

Why RNN?

An RNN is able to learn a representation of a complete sentence rather than just individual words.

Cons of RNN

  • no parallelization across time steps
  • backpropagation through variable-length sequences is expensive
  • all information is transmitted through a single bottleneck
  • long-term dependencies are hard to capture

Encoder and Decoder

Encoder-decoder models are used for language translation and natural language generation (NLG).

Figure: language translation with an encoder-decoder

As shown in the figure, all the input words are compressed into a single final hidden state (the blue-filled box), which is then handed to the decoder. The decoder predicts outputs one at a time until the end of the sequence is reached.
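Here is a minimal PyTorch sketch of that idea. It assumes toy vocabulary sizes, a GRU backbone, and a start-token id of 0 (all of these are my assumptions, not the article's); it is only meant to show how the encoder's single final state feeds the decoder, not to be a real translation model.

```python
import torch
import torch.nn as nn

# Assumed toy sizes: source/target vocabularies, embedding and hidden dims.
src_vocab, tgt_vocab, emb, hid = 100, 120, 32, 64

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(src_vocab, emb)
        self.rnn = nn.GRU(emb, hid, batch_first=True)
    def forward(self, src):                      # src: (batch, src_len)
        _, h = self.rnn(self.embed(src))
        return h                                 # the single "blue box": (1, batch, hid)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(tgt_vocab, emb)
        self.rnn = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, tgt_vocab)
    def forward(self, prev_token, h):            # one decoding step at a time
        o, h = self.rnn(self.embed(prev_token), h)
        return self.out(o), h                    # scores over the target vocabulary

encoder, decoder = Encoder(), Decoder()
src = torch.randint(0, src_vocab, (1, 6))        # a toy source sentence of 6 tokens
h = encoder(src)                                 # everything crammed into one state
token = torch.zeros(1, 1, dtype=torch.long)      # assumed <sos> token id = 0
for _ in range(10):                              # decode up to a length limit (or <eos>)
    scores, h = decoder(token, h)
    token = scores.argmax(-1)                    # greedy choice of the next word
```

Notice that everything the decoder ever learns about the source sentence is the single tensor h. That is exactly the bottleneck the next section complains about.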

Cons of Encoder Decoder

  • crammed meaning: the whole source sentence must fit into the single final state (the dark-blue box in the figure)
  • long-term dependencies (sentences longer than roughly 30 words), since the backbone is still an RNN, LSTM, or GRU
  • word alignment
  • word interpretation

Resolving techniques

The root problem is that the entire information has to be encapsulated in a single state.

We can use the intermediate encoder states instead…

Let's move on to the solution in the next article.
