Sequence to Sequence model for NLP

The sequence-to-sequence model is one of the most effective architectures for Natural Language Processing tasks such as machine translation. Quality introductions to this topic are scarce, so today I will summarize a very influential paper titled "Sequence to Sequence Learning with Neural Networks", written by the Google Research team.

Increasing Performance using Multi-Layer LSTM

In this tutorial, we will introduce a multi-layer LSTM to improve the performance of the model. A multi-layer LSTM can be thought of as several LSTM units stacked on top of each other. The perplexity decreases slightly compared to the single LSTM unit explained in the previous article.
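As a quick refresher on the metric being compared here, this is a plain-Python sketch of perplexity (not the tutorial's TensorFlow code): given the probability the model assigned to each correct next character, perplexity is the exponential of the average negative log-probability.

```python
import math

def perplexity(probs):
    """Perplexity of a sequence, given the probability the model
    assigned to each correct next token. Lower is better."""
    n = len(probs)
    avg_neg_log = -sum(math.log(p) for p in probs) / n
    return math.exp(avg_neg_log)

# A model that always assigns probability 1/4 to the correct token
# has perplexity 4 -- it is "as confused as" a uniform 4-way choice.
print(perplexity([0.25, 0.25, 0.25, 0.25]))
```

Intuitively, a perplexity of k means the model is, on average, as uncertain as if it were choosing uniformly among k options, which is why even a small drop from stacking LSTM layers is meaningful.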

Bigram Based LSTM with Regularization

In the previous LSTM tutorial, we fed the model a single character at a time; the bigram approach instead predicts the next character from the two preceding characters. This tutorial will also introduce a regularization technique for RNNs known as dropout.
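To make the bigram idea concrete, here is a minimal plain-Python sketch (not the tutorial's TensorFlow code) of how training examples change: each input becomes a pair of consecutive characters, and the target is the character that follows them.

```python
def bigram_pairs(text):
    """Build (input, target) examples for bigram-based prediction:
    the input is two consecutive characters, the target is the next one."""
    return [((text[i], text[i + 1]), text[i + 2])
            for i in range(len(text) - 2)]

# From "abcd": predict "c" from ("a","b"), then "d" from ("b","c").
print(bigram_pairs("abcd"))
```

With single characters the model only sees one step of history per input; the bigram encoding bakes one extra character of context directly into each example.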

Recurrent Neural Networks (LSTM) Tutorial

Recurrent Neural Networks are among the most widely used neural network architectures for text and speech learning problems. RNNs are designed to work well when the input is a sequence of varying length; speech and text are typical examples of such input.
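The core idea can be sketched in a few lines of plain Python (scalar weights chosen arbitrarily for illustration, not the tutorial's LSTM): the same update is applied at every step, and the hidden state carries context forward, so sequences of any length are handled by the same small set of parameters.

```python
import math

def rnn_step(h, x, w_h=0.5, w_x=1.0, b=0.0):
    """One recurrent update: new state from previous state and current input."""
    return math.tanh(w_h * h + w_x * x + b)

def run_rnn(xs, h0=0.0):
    """Process a variable-length sequence one element at a time,
    reusing the same weights at every step."""
    h = h0
    for x in xs:
        h = rnn_step(h, x)
    return h

# The same function handles sequences of any length.
print(run_rnn([1.0]))
print(run_rnn([1.0, 0.5, -0.2]))
```

An LSTM replaces this simple tanh update with a gated cell that preserves information over longer spans, but the loop structure is the same.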

Text Mining Tutorial using Word2Vec (Continuous Bag of Words)

Continuous Bag of Words, also known as CBOW, is another Word2Vec technique used to learn the relationships among words. It is essentially the reverse of the previous technique, the skip-gram model: it predicts a center word from its surrounding context rather than the other way around. We will find out how it differs and how it affects performance on the same dataset.
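To see what "reverse of skip-gram" means in terms of training data, here is a plain-Python sketch (a simplification, not the tutorial's TensorFlow pipeline) of how CBOW builds its examples: the context words are the input and the center word is the target.

```python
def cbow_examples(tokens, window=1):
    """Build CBOW (context, center) training examples: the words
    surrounding a position are the input, the center word is the target."""
    examples = []
    for i, center in enumerate(tokens):
        context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        if context:
            examples.append((context, center))
    return examples

# Each center word is predicted from its neighbors.
print(cbow_examples(["the", "quick", "fox"]))
```

In the full model, the embeddings of the context words are averaged before the prediction, which is why CBOW tends to smooth over rare words while skip-gram treats every (center, context) pair individually.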

http://www.tensorflowhub.org/2017/01/word2vec-skip-gram-model-tensorflow.html

Understanding Text Corpus using Word2Vec (Skip Gram Model) - Tutorial

Understanding a text corpus is really hard for a computer under older learning approaches: the model simply memorizes the words but never learns how each word relates to the other words around it.
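The skip-gram model attacks exactly this problem by turning raw text into (center, context) prediction tasks. Here is a plain-Python sketch (a simplification of the tutorial's TensorFlow input pipeline) of that data generation step:

```python
def skipgram_pairs(tokens, window=1):
    """Build skip-gram (center, context) training pairs: each word
    is used to predict every word within its context window."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

# Each word predicts its neighbors, so words with similar neighbors
# end up with similar embeddings.
print(skipgram_pairs(["the", "quick", "fox"]))
```

Training a model to solve these tiny prediction tasks forces words that appear in similar contexts toward similar vector representations, which is where the word relationships come from.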


Transposed ConvNets Tutorial (Deconvolution)

We have discussed convolutional neural networks in the previous tutorials, with examples in TensorFlow. In this one, I will introduce the transposed convolution, also called deconvolution or the inverse of convolution, along with the experiments I did in TensorFlow.
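A rough one-dimensional sketch in plain Python (not the tutorial's TensorFlow experiments) shows the core mechanic: where a convolution gathers a window of inputs into one output, a transposed convolution scatters each input value, scaled by the kernel, into a larger output.

```python
def conv_transpose_1d(x, kernel, stride=2):
    """1-D transposed convolution: each input value scatters a scaled
    copy of the kernel into the output, upsampling the signal."""
    k = len(kernel)
    out = [0.0] * ((len(x) - 1) * stride + k)
    for i, v in enumerate(x):
        for j, w in enumerate(kernel):
            out[i * stride + j] += v * w
    return out

# With stride 2, a length-2 input becomes a length-4 output.
print(conv_transpose_1d([1, 2], [1, 1], stride=2))
```

Overlapping kernel placements sum where they meet (try stride=1), which is why transposed convolutions can upsample smoothly but also produce checkerboard artifacts if stride and kernel size are chosen badly.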

Tuning parameters to improve Accuracy in ConvNets

So far we have built convolutional neural networks with regularization techniques in TensorFlow, but we have not focused on improving their accuracy. In this tutorial, we will tune the parameters to get the highest accuracy possible. I was able to reach 96%; let's see how much accuracy you can get.

Introducing Dropout and L2 Regularization in ConvNets

In this tutorial, I will show how dropout and L2 regularization affect convolutional neural networks, just as we did for the simple neural network. You will see a minor increase in accuracy, but that is not the main concern here; the main goal is to avoid overfitting using these two techniques.
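For readers who want the mechanics rather than the TensorFlow calls, here is a plain-Python sketch of both techniques (the "inverted" dropout variant, with an illustrative seeded RNG; not the tutorial's actual code):

```python
import random

def dropout(values, keep_prob, rng=None):
    """Inverted dropout: zero each activation with probability
    1 - keep_prob and scale survivors by 1/keep_prob, so the
    expected activation is unchanged."""
    rng = rng or random.Random(0)
    return [v / keep_prob if rng.random() < keep_prob else 0.0
            for v in values]

def l2_penalty(weights, beta):
    """L2 regularization term added to the loss: beta * 0.5 * sum(w^2)."""
    return beta * 0.5 * sum(w * w for w in weights)

# keep_prob=1.0 keeps everything; smaller values drop activations at random.
print(dropout([1.0, 2.0, 3.0], 0.5))
print(l2_penalty([3.0, 4.0], 0.1))
```

Dropout fights overfitting by preventing units from co-adapting, while the L2 term keeps weights small; dropout is applied only during training, never at test time.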

Introducing Pooling in Convolutional Neural Networks

In the previous tutorial, we started with convolutional neural networks; in this one, I will explain the concept of pooling and how it improves a CNN. It also includes an upgraded version of the previous Python code with a max-pooling function.
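The idea of max pooling can be sketched in a few lines of plain Python (a simplification of the tutorial's TensorFlow code): slide a small window over the feature map and keep only the largest value in each window.

```python
def max_pool_2d(image, size=2, stride=2):
    """2-D max pooling: slide a size x size window with the given
    stride and keep the maximum value in each window, shrinking the
    feature map while preserving the strongest activations."""
    rows = (len(image) - size) // stride + 1
    cols = (len(image[0]) - size) // stride + 1
    return [[max(image[r * stride + i][c * stride + j]
                 for i in range(size) for j in range(size))
             for c in range(cols)]
            for r in range(rows)]

# A 4x4 map pooled with a 2x2 window and stride 2 becomes 2x2.
print(max_pool_2d([[1, 2, 5, 6],
                   [3, 4, 7, 8],
                   [9, 1, 2, 3],
                   [4, 5, 6, 7]]))
```

Pooling halves each spatial dimension here, which cuts computation in the following layers and makes the features slightly translation-invariant, since a strong activation survives small shifts within its window.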

Introducing Convolutional Neural Networks

In this tutorial, I will introduce Convolutional Neural Networks, which are computationally more expensive but much more accurate than the simple neural network. The previous Python code will be converted into a convolutional neural network.
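The operation at the heart of a CNN can be sketched in plain Python (a minimal "valid" 2-D convolution, written as cross-correlation the way deep-learning libraries do it; not the tutorial's TensorFlow code):

```python
def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution: slide the kernel over the image and
    sum the elementwise products at each position. The output shrinks
    by (kernel size - 1) in each dimension."""
    kr, kc = len(kernel), len(kernel[0])
    rows = len(image) - kr + 1
    cols = len(image[0]) - kc + 1
    return [[sum(image[r + i][c + j] * kernel[i][j]
                 for i in range(kr) for j in range(kc))
             for c in range(cols)]
            for r in range(rows)]

# A 1x2 summing kernel over a 1x3 row adds adjacent pairs.
print(conv2d_valid([[1, 2, 3]], [[1, 1]]))
```

Because the same small kernel is reused at every position, a convolutional layer needs far fewer parameters than a fully connected layer over the same image, while still learning location-independent features; the extra cost comes from sliding it over every position.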
