Tag Archives: MNIST

A Simple Deep Network

During this spring break, I worked on building a simple deep network, which has two parts: a sparse autoencoder and softmax regression. The method is exactly the same as the “Building Deep Networks for Classification” part of the UFLDL tutorial. To understand it better, I re-implemented it in C++ with OpenCV. GENERAL OUTLINE Read dataset (including training data […]
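The post itself walks through the full training code; as a rough illustration of how the two parts fit together at prediction time, here is a minimal sketch of the forward pass in C++ with OpenCV. The sizes and weight matrices (W1, b1, theta) are toy stand-ins, not the parameters the post actually learns from MNIST.

```cpp
// Minimal sketch of the forward pass only (not the post's training code):
// a sparse-autoencoder layer produces features, and a softmax layer turns
// those features into class probabilities. All matrices are hypothetical
// stand-ins for weights that would be learned from MNIST.
#include <opencv2/core/core.hpp>
#include <iostream>

// Element-wise logistic sigmoid on a CV_64F matrix.
static cv::Mat sigmoid(const cv::Mat &z) {
    cv::Mat e;
    cv::exp(-z, e);
    return 1.0 / (1.0 + e);
}

// Softmax over a (numClasses x 1) score vector.
static cv::Mat softmax(const cv::Mat &z) {
    cv::Mat e;
    cv::exp(z, e);
    return e / cv::sum(e)[0];
}

int main() {
    const int nInput = 4, nHidden = 3, nClasses = 2;   // toy sizes, not MNIST

    // Hypothetical learned parameters (would come from training).
    cv::Mat W1    = cv::Mat::ones(nHidden, nInput, CV_64F) * 0.1;
    cv::Mat b1    = cv::Mat::zeros(nHidden, 1, CV_64F);
    cv::Mat theta = cv::Mat::ones(nClasses, nHidden, CV_64F) * 0.05;

    cv::Mat x = cv::Mat::ones(nInput, 1, CV_64F);       // one toy input vector

    cv::Mat features = sigmoid(W1 * x + b1);             // autoencoder hidden layer
    cv::Mat probs    = softmax(theta * features);        // softmax class probabilities

    std::cout << "class probabilities:\n" << probs << std::endl;
    return 0;
}
```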

Posted in Algorithm, Machine Learning, OpenCV | 37 Responses

[UFLDL Exercise] Implement deep networks for digit classification

I’m learning Prof. Andrew Ng’s Unsupervised Feature Learning and Deep Learning tutorial. This is the 6th exercise, which combines the sparse autoencoder, softmax regression, and fine-tuning algorithms. It builds a sparse autoencoder network with two hidden layers topped by a one-layer softmax regression classifier. We first train this network layer by layer, from left to right, then […]
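As a rough sketch of the training order described here (not the post’s actual optimization code), the loop below pre-trains the stack greedily and marks where the softmax training and fine-tuning would follow; the trainer and feed-forward functions are hypothetical placeholders.

```cpp
// Sketch of greedy layer-by-layer pre-training followed by fine-tuning.
// The trainer below is a hypothetical stub standing in for real
// sparse-autoencoder training; only the control flow is the point here.
#include <opencv2/core/core.hpp>
#include <iostream>
#include <vector>

struct Layer { cv::Mat W, b; };

// Hypothetical stub: train one sparse autoencoder on `input` (one sample per
// column) and return its weights. Real code would run an optimizer here.
static Layer trainSparseAutoencoder(const cv::Mat &input, int hiddenSize) {
    Layer l;
    l.W = cv::Mat::zeros(hiddenSize, input.rows, CV_64F);  // placeholder weights
    l.b = cv::Mat::zeros(hiddenSize, 1, CV_64F);
    return l;
}

// Forward a batch of column vectors through one trained layer (sigmoid units).
static cv::Mat feedForward(const Layer &l, const cv::Mat &input) {
    cv::Mat e;
    cv::exp(-(l.W * input + cv::repeat(l.b, 1, input.cols)), e);
    return 1.0 / (1.0 + e);
}

int main() {
    // Toy "dataset": each column stands in for one MNIST image vector.
    cv::Mat data = cv::Mat::ones(64, 10, CV_64F);

    // 1) Greedy pre-training, left to right: each layer is trained on the
    //    activations produced by the previous one.
    int hiddenSizes[] = {32, 16};                           // two hidden layers
    std::vector<Layer> stack;
    cv::Mat activations = data;
    for (int hiddenSize : hiddenSizes) {
        stack.push_back(trainSparseAutoencoder(activations, hiddenSize));
        activations = feedForward(stack.back(), activations);
    }

    // 2) A softmax layer would be trained on `activations` here, and
    // 3) the whole stack (autoencoders + softmax) fine-tuned with backprop.
    std::cout << "top-level feature size: " << activations.rows << std::endl;
    return 0;
}
```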

Posted in Algorithm, Machine Learning | 3 Responses

C++ code for reading MNIST dataset

Here’s code for reading the MNIST dataset in C++. The dataset can be found HERE, along with a description of the file format. Using this code, you can read the MNIST dataset into a double vector, an OpenCV Mat, or an Armadillo mat. Feel free to use it for any purpose. (part of this code is stolen from […]
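For reference, a minimal self-contained reader might look like the sketch below, based on the IDX format documented on the MNIST page (big-endian 32-bit headers followed by unsigned-byte pixels). It is not the post’s exact code, and the file path is only an example.

```cpp
// Minimal sketch: read the MNIST image file into a vector of double vectors.
// Header layout per the MNIST page: magic (2051), image count, rows, cols,
// all stored as big-endian 32-bit integers, followed by unsigned-byte pixels.
#include <cstdint>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

// The IDX headers are most-significant byte first, so assemble them manually.
static uint32_t readBigEndianUint32(std::ifstream &in) {
    unsigned char bytes[4];
    in.read(reinterpret_cast<char *>(bytes), 4);
    return (uint32_t(bytes[0]) << 24) | (uint32_t(bytes[1]) << 16) |
           (uint32_t(bytes[2]) << 8) | uint32_t(bytes[3]);
}

// Returns one vector of pixel values (scaled to [0, 1]) per image.
static std::vector<std::vector<double>> readMnistImages(const std::string &path) {
    std::ifstream in(path, std::ios::binary);
    std::vector<std::vector<double>> images;
    if (!in) return images;                        // file not found

    uint32_t magic = readBigEndianUint32(in);      // 2051 for image files
    uint32_t count = readBigEndianUint32(in);
    uint32_t rows  = readBigEndianUint32(in);
    uint32_t cols  = readBigEndianUint32(in);
    if (magic != 2051) return images;

    images.resize(count, std::vector<double>(rows * cols));
    for (uint32_t i = 0; i < count; ++i) {
        for (uint32_t p = 0; p < rows * cols; ++p) {
            unsigned char pixel = 0;
            in.read(reinterpret_cast<char *>(&pixel), 1);
            images[i][p] = pixel / 255.0;
        }
    }
    return images;
}

int main() {
    // Example path; adjust to wherever the dataset was downloaded.
    std::vector<std::vector<double>> images = readMnistImages("train-images-idx3-ubyte");
    std::cout << "read " << images.size() << " images" << std::endl;
    return 0;
}
```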

Posted in Machine Learning, OpenCV | 5 Responses

[UFLDL Exercise] Softmax Regression

I’m learning Prof. Andrew Ng’s Unsupervised Feature Learning and Deep Learning tutorial. This is the 4th exercise, which uses Softmax regression to build a classifier for MNIST handwritten digits. As with my other UFLDL exercise posts, I won’t go through the details of the material; more about this exercise can be found HERE. I’ll re-implement Softmax […]
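As a small illustration of what the classifier computes (not the post’s implementation), the sketch below evaluates the softmax hypothesis with OpenCV and picks the most probable class, subtracting the largest score before exponentiating for numerical stability as the UFLDL notes suggest. The parameter matrix is a toy stand-in for weights learned on MNIST.

```cpp
// Minimal sketch of the softmax regression hypothesis and prediction:
// class scores theta * x are exponentiated and normalized to probabilities;
// the largest score is subtracted first to avoid overflow.
#include <opencv2/core/core.hpp>
#include <iostream>

// Probabilities P(y = k | x) for a single input column vector x.
static cv::Mat softmaxHypothesis(const cv::Mat &theta, const cv::Mat &x) {
    cv::Mat scores = theta * x;                       // numClasses x 1
    double maxScore;
    cv::minMaxLoc(scores, 0, &maxScore);
    cv::Mat e;
    cv::exp(scores - maxScore, e);                    // stable exponentiation
    return e / cv::sum(e)[0];                         // normalize to sum to 1
}

// Predicted class = index of the largest probability.
static int predict(const cv::Mat &theta, const cv::Mat &x) {
    cv::Mat probs = softmaxHypothesis(theta, x);
    cv::Point maxLoc;
    cv::minMaxLoc(probs, 0, 0, 0, &maxLoc);
    return maxLoc.y;                                  // row index of the maximum
}

int main() {
    const int numClasses = 10, inputSize = 4;         // toy sizes, not 28x28 images
    cv::Mat theta = cv::Mat::eye(numClasses, inputSize, CV_64F);   // toy parameters
    cv::Mat x = (cv::Mat_<double>(inputSize, 1) << 0.1, 0.9, 0.2, 0.3);

    std::cout << "predicted digit: " << predict(theta, x) << std::endl;
    return 0;
}
```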

Posted in Algorithm, Machine Learning | 4 Responses