Sequence to Sequence model for NLP

The Sequence to Sequence model is one of the most effective models for Natural Language Processing tasks such as machine translation. There are very few quality papers available on this topic, and today I will summarize a very effective paper titled “Sequence to Sequence Learning with Neural Networks”, written by the Google research team.

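To give a feel for the architecture, here is a minimal encoder-decoder sketch in tf.keras; the vocabulary sizes, embedding width, and number of units are illustrative assumptions, not the paper's exact settings:

```python
import tensorflow as tf

SRC_VOCAB, TGT_VOCAB, EMB, UNITS = 8000, 8000, 256, 512  # assumed sizes, not from the paper

# Encoder: read the source sentence and keep only its final LSTM states.
enc_in = tf.keras.Input(shape=(None,), name="source_tokens")
enc_emb = tf.keras.layers.Embedding(SRC_VOCAB, EMB)(enc_in)
_, state_h, state_c = tf.keras.layers.LSTM(UNITS, return_state=True)(enc_emb)

# Decoder: generate the target sentence, starting from the encoder states.
dec_in = tf.keras.Input(shape=(None,), name="target_tokens")
dec_emb = tf.keras.layers.Embedding(TGT_VOCAB, EMB)(dec_in)
dec_seq = tf.keras.layers.LSTM(UNITS, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c])
probs = tf.keras.layers.Dense(TGT_VOCAB, activation="softmax")(dec_seq)

model = tf.keras.Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```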

Increasing Performance using Multi-Layer LSTM

In this tutorial, we will introduce a multi-layer LSTM to increase the performance of the model. A multi-layer LSTM can also be thought of as multiple stacked LSTM units. The perplexity decreases slightly compared to the single-LSTM-unit model explained in the previous article.

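As a rough illustration of the stacking, the sketch below wires two LSTM layers back to back in tf.keras; the vocabulary size, window length, and layer widths are assumptions for illustration and are not taken from the tutorial itself:

```python
import tensorflow as tf

VOCAB_SIZE, SEQ_LEN, UNITS = 27, 50, 128  # assumed: a-z plus space, 50-character windows

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN,)),
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),
    # The first layer returns the full sequence so the second LSTM
    # receives one vector per time step -- this is the stacking.
    tf.keras.layers.LSTM(UNITS, return_sequences=True),
    tf.keras.layers.LSTM(UNITS, return_sequences=True),
    # Predict the next character at every position.
    tf.keras.layers.Dense(VOCAB_SIZE, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# Perplexity is then exp(average cross-entropy loss).
```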

Bigram Based LSTM with Regularization

In the previous LSTM tutorial, we used a single character at a time; the bigram approach instead predicts the next character using two characters at a time. This tutorial will also introduce a regularization technique known as Dropout in RNNs.

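A minimal sketch of the idea, assuming a 27-character alphabet: each overlapping pair of characters is mapped to a single bigram id, embedded, fed through an LSTM, and regularized with Dropout before predicting the next character. The tutorial's exact preprocessing and graph may differ:

```python
import numpy as np
import tensorflow as tf

VOCAB = 27                    # assumed: a-z plus space
BIGRAM_VOCAB = VOCAB * VOCAB  # one id per pair of characters

def to_bigram_ids(char_ids):
    """Map a sequence of character ids to overlapping bigram ids."""
    char_ids = np.asarray(char_ids)
    return char_ids[:-1] * VOCAB + char_ids[1:]

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(BIGRAM_VOCAB, 64),
    tf.keras.layers.LSTM(128, return_sequences=True),
    tf.keras.layers.Dropout(0.5),                        # the Dropout regularization
    tf.keras.layers.Dense(VOCAB, activation="softmax"),  # predict the next character
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```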

Recurrent Neural Networks (LSTM) Tutorial

Recurrent Neural Networks are one of the most widely used ANN structures for text and speech learning problems. RNNs are designed to work well when the input is sequential and varies in length; speech and text are examples of such input.

Introducing Recurrent Neural Networks (LSTM)
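
As a small illustration of handling variable-length sequences, here is a sketch of an LSTM classifier in tf.keras that accepts sequences of any length via padding and masking; the vocabulary size and number of classes are assumptions:

```python
import tensorflow as tf

VOCAB, CLASSES = 10000, 2  # assumed toy setup: binary classification of token sequences

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None,)),                  # None = any sequence length
    tf.keras.layers.Embedding(VOCAB, 64, mask_zero=True),  # id 0 is padding and is masked out
    tf.keras.layers.LSTM(128),                              # final hidden state summarizes the sequence
    tf.keras.layers.Dense(CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```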

Text Mining Tutorial using Word2Vec (Continuous Bag of Words)

Continuous Bag of Words, also known as CBOW, is another Word2Vec technique used to find the relationships among keywords. It is effectively the opposite of the previous technique, the skip-gram model. We will find out how it differs and how it affects performance on the same dataset.

http://www.tensorflowhub.org/2017/01/word2vec-skip-gram-model-tensorflow.html
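A minimal sketch of the CBOW idea, assuming a toy vocabulary and window size of my own choosing: the surrounding context words are averaged and used to predict the center word, which is the reverse of the skip-gram setup:

```python
import tensorflow as tf

VOCAB, EMB, WINDOW = 5000, 128, 2  # assumed sizes

def cbow_pairs(token_ids, window=WINDOW):
    """(context words, center word) pairs: the context predicts the center."""
    pairs = []
    for i, center in enumerate(token_ids):
        context = [token_ids[j]
                   for j in range(max(0, i - window), min(len(token_ids), i + window + 1))
                   if j != i]
        if len(context) == 2 * window:  # keep only full windows for simplicity
            pairs.append((context, center))
    return pairs

# Average the context embeddings, then predict the center word.
ctx_in = tf.keras.Input(shape=(2 * WINDOW,))
ctx_emb = tf.keras.layers.Embedding(VOCAB, EMB)(ctx_in)
ctx_avg = tf.keras.layers.GlobalAveragePooling1D()(ctx_emb)
center_probs = tf.keras.layers.Dense(VOCAB, activation="softmax")(ctx_avg)
model = tf.keras.Model(ctx_in, center_probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```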

Understanding Text Corpus using Word2Vec (Skip Gram Model) - Tutorial

Understanding a text corpus is really hard for a computer under the older learning styles: they simply memorize the words but are not aware of how words really relate to other words (independent of the context).


Understanding Text Corpus using Word2Vec (Skip Gram Model)
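
To make the skip-gram setup concrete, here is a small sketch in plain Python of how training pairs are generated: each center word is paired with the words around it, so the model learns which words occur in each other's context. The helper name, toy ids, and window size are my own illustrative choices:

```python
def skipgram_pairs(token_ids, window=2):
    """(center word, context word) pairs: the center word predicts each neighbour."""
    pairs = []
    for i, center in enumerate(token_ids):
        for j in range(max(0, i - window), min(len(token_ids), i + window + 1)):
            if j != i:
                pairs.append((center, token_ids[j]))
    return pairs

# Toy example: ids 0..3 standing in for four words of a sentence.
print(skipgram_pairs([0, 1, 2, 3], window=1))
# [(0, 1), (1, 0), (1, 2), (2, 1), (2, 3), (3, 2)]
```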

Transposed ConvNets Tutorial (Deconvolution)

We have discussed convolutional neural networks in the previous tutorials with examples in TensorFlow. In this one, I will introduce the transposed convolution, also called deconvolution or the inverse of convolution, along with the experiments I did in TensorFlow.

Experimenting with Transposed ConvNets (Deconvolution)
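
As a quick shape-level sketch in tf.keras (the sizes are assumptions, not the tutorial's exact experiment): a strided convolution shrinks the feature map, and a transposed convolution with the same stride maps it back to the original spatial size:

```python
import tensorflow as tf

x = tf.random.normal([1, 28, 28, 1])  # a single 28x28 "image" (assumed size)

down = tf.keras.layers.Conv2D(8, kernel_size=3, strides=2, padding="same")
up = tf.keras.layers.Conv2DTranspose(1, kernel_size=3, strides=2, padding="same")

h = down(x)  # strided convolution: (1, 14, 14, 8)
y = up(h)    # transposed convolution: back to (1, 28, 28, 1)
print(h.shape, y.shape)
```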