Historical NLP tools

Algorithmic techniques and drawbacks

Adarsha Regmi
2 min read · Oct 11, 2021

Topics:

* Neural network and Word2Vec

Introduction:

A neural network is a black box that takes in input data and returns an output. In recent years, the use of neural networks has led to better results across many tasks. A neural language model (NLM) is the case where we make use of a neural network to model NLP use cases.

Figure: neural network architecture
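To make the black-box picture concrete, here is a minimal sketch of a feedforward network's input-to-output mapping. The layer sizes, random weights, and ReLU activation are illustrative choices, not taken from the figure:

```python
import numpy as np

# A minimal feedforward network: input -> hidden -> output.
# All sizes and weights here are illustrative.
rng = np.random.default_rng(0)

x = rng.normal(size=4)            # input vector (4 features)
W1 = rng.normal(size=(4, 8))      # input-to-hidden weights
W2 = rng.normal(size=(8, 3))      # hidden-to-output weights

h = np.maximum(0, x @ W1)         # hidden layer with ReLU activation
y = h @ W2                        # output layer (raw scores)

print(y.shape)                    # (3,)
```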
  • About Word2Vec and Word Embeddings

Word2Vec takes input words and tries to build a representation of each word in the form of a vector. The aim is to generate embeddings where words that appear in similar contexts end up close together in the vector space. Its two training techniques are skip-gram and CBOW (Continuous Bag of Words).
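As a quick illustration, a Word2Vec model can be trained with the gensim library roughly like this (assuming gensim 4.x, where the embedding size parameter is vector_size; the toy corpus and parameter values are mine):

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (illustrative only).
sentences = [
    ["have", "a", "great", "day"],
    ["have", "a", "nice", "day"],
    ["what", "a", "great", "movie"],
]

# sg=0 selects CBOW; sg=1 would select skip-gram.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)

# Words used in similar contexts should end up close in vector space.
print(model.wv.most_similar("great", topn=3))
```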

A brief intro to CBOW and skip-gram

CBOW: This method takes the context of each word as input and tries to predict the word corresponding to that context.

e.g. “Have a great day.”

Here, using the context word “great”, we are trying to predict the target word “day”. In the process we learn a vector representation for “day”.
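To make the setup concrete, here is a small sketch of how (context, target) training pairs might be built for CBOW from the example sentence; the window size of 1 and the tokenization are illustrative choices:

```python
# Build (context, target) pairs for CBOW with a window of 1 on each side.
tokens = ["have", "a", "great", "day"]
window = 1

pairs = []
for i, target in enumerate(tokens):
    context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
    pairs.append((context, target))

print(pairs)
# [(['a'], 'have'), (['have', 'great'], 'a'),
#  (['a', 'day'], 'great'), (['great'], 'day')]
```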

Let’s get the terms in the picture right:
- W is the weight matrix that maps the input x to the hidden layer (a V×N matrix, where V is the vocabulary size and N is the hidden/embedding size)
- W′ is the weight matrix that maps the hidden layer to the final output layer (an N×V matrix)

There is no activation in the hidden layer; only the output layer applies a softmax.
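A minimal numpy sketch of this forward pass, assuming a toy vocabulary size V and embedding size N: the context embeddings are averaged with no activation, then scored against every vocabulary word and normalized with a softmax.

```python
import numpy as np

V, N = 8, 4                        # vocabulary size, embedding size (illustrative)
rng = np.random.default_rng(0)
W = rng.normal(size=(V, N))        # input -> hidden weights (V x N)
W_prime = rng.normal(size=(N, V))  # hidden -> output weights (N x V)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# CBOW forward pass: average the context word embeddings (no activation),
# then score every vocabulary word and normalize with softmax.
context_ids = [1, 3]               # indices of the context words
h = W[context_ids].mean(axis=0)    # hidden layer: mean of context embeddings
probs = softmax(h @ W_prime)       # probability of each word being the target

print(probs.round(3), probs.sum())  # probabilities sum to 1.0
```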

Skip-gram is the opposite of the CBOW model: a single target word is used to predict its surrounding context words. It tends to work well with small amounts of training data and represents rare words well.
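Reversing the pairs from the CBOW sketch above gives skip-gram training examples, where each target word predicts its surrounding words (again with an illustrative window of 1):

```python
# Skip-gram: each (target, context_word) pair is a separate training example.
tokens = ["have", "a", "great", "day"]
window = 1

pairs = []
for i, target in enumerate(tokens):
    for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
        if j != i:
            pairs.append((target, tokens[j]))

print(pairs)
# [('have', 'a'), ('a', 'have'), ('a', 'great'),
#  ('great', 'a'), ('great', 'day'), ('day', 'great')]
```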


Moving on, the upcoming topics are:

  • RNN
  • Encoder and Decoder
  • Encoder and Decoder with Attention
