The sequence-to-sequence model is one of the most effective models for natural language processing tasks such as translation. There are very few quality papers on this topic, and today I will summarize a very effective one titled "Sequence to Sequence Learning with Neural Networks", written by the Google research team.
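For concreteness, here is a minimal encoder-decoder sketch in Keras/TensorFlow. The vocabulary sizes and layer dimensions are illustrative placeholders, not the paper's configuration (the paper uses deep multi-layer LSTMs and far larger vocabularies):

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Illustrative sizes only; not the paper's hyperparameters.
src_vocab, tgt_vocab, emb_dim, hidden = 5000, 5000, 128, 256

# Encoder: reads the source sentence and keeps only its final LSTM state.
enc_in = layers.Input(shape=(None,), dtype="int32")
enc_emb = layers.Embedding(src_vocab, emb_dim)(enc_in)
_, state_h, state_c = layers.LSTM(hidden, return_state=True)(enc_emb)

# Decoder: generates the target sentence, initialized with the encoder state.
dec_in = layers.Input(shape=(None,), dtype="int32")
dec_emb = layers.Embedding(tgt_vocab, emb_dim)(dec_in)
dec_out, _, _ = layers.LSTM(hidden, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c])
logits = layers.Dense(tgt_vocab)(dec_out)

model = Model([enc_in, dec_in], logits)
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.summary()
```

The key idea is that the encoder compresses a variable-length source sentence into a fixed-size state, which the decoder then unrolls into the target sentence.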
In the previous LSTM tutorial we fed in a single character at a time; the bigram approach instead predicts the next character from two characters at a time. This tutorial also introduces a regularization technique known as dropout for RNNs.
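A minimal sketch of the idea, assuming bigram ids are formed from a 27-character alphabet ('a'-'z' plus space); all sizes here are illustrative, not the tutorial's exact setup:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

vocab = 27
bigram_vocab = vocab * vocab   # each input step is a two-character (bigram) id
seq_len, emb_dim, hidden = 20, 64, 128

model = models.Sequential([
    layers.Input(shape=(seq_len,), dtype="int32"),
    layers.Embedding(bigram_vocab, emb_dim),        # embed each bigram id
    layers.LSTM(hidden, return_sequences=True,
                dropout=0.2,             # dropout on the LSTM inputs
                recurrent_dropout=0.2),  # dropout on the recurrent connections
    layers.Dense(vocab),                 # predict the next single character
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.summary()
```

Dropout randomly zeroes activations during training, which discourages the network from memorizing the training text and tends to improve generalization.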
Recurrent neural networks are among the most widely used neural network architectures for text and speech learning problems. RNNs are designed to handle input that arrives as a sequence and varies in length; speech and text are both examples of such input.
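To illustrate, here is a small sketch of how a Keras LSTM can consume sequences of different lengths via padding and masking; the token ids and layer sizes are made up for the example:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Two "sentences" of different lengths, encoded as integer token ids.
seqs = [[4, 12, 7], [9, 3, 15, 8, 2]]

# Pad to a common length; mask_zero below tells the LSTM to ignore the padding.
padded = tf.keras.preprocessing.sequence.pad_sequences(seqs, padding="post")

model = models.Sequential([
    layers.Embedding(input_dim=100, output_dim=16, mask_zero=True),
    layers.LSTM(32),            # one fixed-size vector per sequence
    layers.Dense(2),            # e.g. a two-class text classifier head
])
print(model(padded).shape)      # (2, 2): one prediction per variable-length input
```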
Continuous Bag of Words, also known as CBOW, is the other Word2Vec technique for learning relationships among words. It is essentially the opposite of the previously covered skip-gram model. We will see how it differs and how that affects performance on the same dataset.
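A minimal CBOW sketch in Keras, assuming a fixed context window; the vocabulary size, embedding dimension, and window size are illustrative, not the tutorial's settings:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

vocab, emb_dim, window = 10000, 128, 2  # window=2 -> 4 context words per target

# Input: the 2*window context word ids surrounding the center word.
ctx_in = layers.Input(shape=(2 * window,), dtype="int32")
ctx_emb = layers.Embedding(vocab, emb_dim)(ctx_in)

# CBOW averages the context embeddings and predicts the center word
# (skip-gram is the reverse: it predicts each context word from the center).
avg = layers.GlobalAveragePooling1D()(ctx_emb)
logits = layers.Dense(vocab)(avg)

model = Model(ctx_in, logits)
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.summary()
```

Because CBOW averages a whole context into one prediction, it typically trains faster than skip-gram, which makes one prediction per context word.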
Understanding a text corpus is hard for a computer under older learning approaches: such models simply memorize the words without any sense of how words relate to one another (independent of context).
We discussed convolutional neural networks in the previous tutorials with examples in TensorFlow; in this one, I introduce the transposed convolution, also called deconvolution or the inverse of convolution, along with the experiments I ran in TensorFlow.
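As a quick illustration of the shape arithmetic (sizes chosen arbitrarily for this sketch), a stride-2 transposed convolution upsamples a 4x4 feature map to 8x8, reversing the downsampling that a stride-2 convolution would apply:

```python
import tensorflow as tf

# A batch of one 4x4 single-channel feature map.
x = tf.random.normal([1, 4, 4, 1])

# Transposed convolution with stride 2 upsamples 4x4 -> 8x8.
deconv = tf.keras.layers.Conv2DTranspose(
    filters=1, kernel_size=3, strides=2, padding="same")

print(deconv(x).shape)  # (1, 8, 8, 1)
```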
- Let's get started with TensorFlow
- Classification of Handwritten Letters or Digits using a Neural Network
- Introducing Convolutional Neural Networks
- Understanding Text Corpus using Word2Vec (Skip Gram Model) - Tutorial
- Bigram Based LSTM with Regularization
- TensorFlow Basic Easy Example in Python
- Transposed ConvNets Tutorial (Deconvolution)
- Increasing Performance using Multi-Layer LSTM
- L2 and Dropout Regularization in Neural Network Classification
- Recurrent Neural Networks (LSTM) Tutorial