Stokastik

Machine Learning, AI and Programming

Dynamic Programming in NLP - Skip Grams

In this series of posts, I will explore how the dynamic programming technique is used in machine learning and natural language processing. Dynamic programming is very popular in programming interviews. The DP technique applies mainly to problems that have optimal substructure and can be defined recursively, which means that a problem of size N can be solved by solving smaller sub-problems of size m < N. Such problems […]
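
As a toy illustration of that optimal substructure (the excerpt cuts off before the post's own formulation, so the k-skip-n-gram recursion, function names and parameters below are my assumptions, not necessarily the post's), here is a minimal memoized sketch in Python:

```python
from functools import lru_cache

def skip_grams(tokens, n, k):
    """Enumerate all k-skip-n-grams of a token list."""
    tokens = tuple(tokens)

    @lru_cache(maxsize=None)
    def extend(i, remaining, skips_left):
        # Base case: a complete gram needs no further tokens.
        if remaining == 0:
            return ((),)
        grams = []
        # Continue the gram at any of the next skips_left + 1 positions:
        # the sub-problem is the same function on a smaller suffix.
        for j in range(i, min(i + skips_left + 1, len(tokens))):
            for rest in extend(j + 1, remaining - 1, skips_left - (j - i)):
                grams.append((tokens[j],) + rest)
        return tuple(grams)

    return [(tokens[s],) + rest
            for s in range(len(tokens))
            for rest in extend(s + 1, n - 1, k)]

print(skip_grams("the quick brown fox".split(), n=2, k=1))
# [('the', 'quick'), ('the', 'brown'), ('quick', 'brown'),
#  ('quick', 'fox'), ('brown', 'fox')]
```

Because `extend(i, remaining, skips_left)` depends only on its arguments, `lru_cache` can reuse results across overlapping sub-problems, which is exactly the property the excerpt describes.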

Neural Networks as a Function Approximator

For the past few days, I have been reading quite a lot of research papers, articles and blogs related to artificial neural networks and their transition towards deep learning. With so many different methods of selecting the best neural network architecture for a problem, the optimal hyper-parameters, the best optimization algorithm and so on, it becomes a little overwhelming to connect all the dots when we ourselves start to […]

Building a Neural Network from scratch in Python

In this post I am going to build an artificial neural network from scratch. Although there exist many advanced neural network libraries written in a variety of programming languages, the idea is not to re-invent the wheel but to understand the components required to make a workable neural network. A full-fledged industrial-scale neural network might require a lot of research and experimentation with the dataset. Building a simple […]
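
Since the excerpt stops before the build itself, here is a minimal sketch of the kind of components such a network needs: a forward pass, a backward pass, and a weight update. The layer sizes, sigmoid activations, learning rate and use of NumPy are my assumptions, not the post's actual code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy task: XOR, the classic non-linearly-separable example.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(2, 4))  # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))  # hidden -> output weights
b2 = np.zeros((1, 1))
lr = 1.0

for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the squared error through each layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent update.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # should end up close to [[0], [1], [1], [0]]
```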

Using Word Vectors in Multi-Class Text Classification

Earlier we have seen how, instead of representing words in a text document as isolated features (or as N-grams), we can encode them into multidimensional vectors where each dimension represents some kind of semantic or relational similarity with other words in the corpus. Machine learning problems such as classification or clustering require documents to be represented as a document-feature matrix (with TF or TF-IDF weighting), thus we need some […]
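
One common bridge between per-word vectors and a document-feature matrix (a sketch under my own assumptions; the excerpt does not say which aggregation scheme the post uses) is to average each document's word vectors into one fixed-length row and hand the resulting matrix to an ordinary classifier:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical 2-dimensional word vectors; in practice they would come
# from word2vec or a similar tool trained on the corpus.
word_vectors = {
    "good":  np.array([0.9, 0.1]),
    "great": np.array([0.8, 0.2]),
    "bad":   np.array([0.1, 0.9]),
    "awful": np.array([0.2, 0.8]),
}

def doc_vector(doc, dim=2):
    """Average the vectors of known words into one row of the
    document-feature matrix; documents with no known words map to zeros."""
    vecs = [word_vectors[w] for w in doc.split() if w in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

docs = ["good great", "great good good", "bad awful", "awful bad bad"]
labels = [1, 1, 0, 0]

X = np.vstack([doc_vector(d) for d in docs])
clf = LogisticRegression().fit(X, labels)
print(clf.predict(doc_vector("good bad bad").reshape(1, -1)))  # likely [0]
```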

Understanding Word Vectors and Word2Vec

Quite recently I have been exploring the Word2Vec tool for representing words in text documents as vectors. I got the initial ideas about the word2vec utility from Google's code archive webpage. The idea behind this kind of utility caught my interest, and I later went on to read the following papers by Mikolov et al. to better understand the algorithm and its implementation: Efficient Estimation of Word Representations […]
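
The post follows Google's original tool, but for readers who want to reproduce the experiments quickly, the gensim library (my substitution here, not something the excerpt mentions) implements the same skip-gram model described in the Mikolov et al. papers:

```python
from gensim.models import Word2Vec

# A toy corpus: one tokenized sentence per list entry.
sentences = [
    ["machine", "learning", "is", "fun"],
    ["deep", "learning", "extends", "machine", "learning"],
    ["word", "vectors", "capture", "word", "similarity"],
]

# sg=1 selects the skip-gram architecture; min_count=1 keeps every
# word, which only makes sense for a corpus this tiny.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

print(model.wv["learning"].shape)                 # (50,)
print(model.wv.most_similar("learning", topn=2))  # nearest neighbours
```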
