Getting started with MatConvNet

Available at:

My goal is to use this toolbox to classify cars into four classes: Sedan, Minivan, SUV, and Pickup. I already have the data from my prior work.

My plan is to first use transfer learning, i.e., to use one of the deep networks pre-trained on ImageNet data to extract features, and to use an SVM classifier to do the actual classification. I plan to do this with Matlab 2016a on a Windows 10 PC equipped with a GeForce GTX 960. I am aware that Matlab also has deep learning support in its Neural Network Toolbox, but I am going with MatConvNet in hopes that it will stay more cutting edge.
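Before the script below can run, MatConvNet has to be compiled and added to the Matlab path, and the pre-trained model has to be downloaded. A minimal one-time setup sketch follows; the folder name and model URL are my assumptions based on the MatConvNet releases current at the time, so adjust them to your install:

```matlab
% One-time setup (sketch): compile MatConvNet and fetch a pre-trained model.
% The folder name and URL below are assumptions; adjust to your installation.
run('matconvnet-1.0-beta25/matlab/vl_setupnn.m');   % add MatConvNet to the path
vl_compilenn('enableGpu', true);                    % compile MEX files (GPU support optional)

% Download the pre-trained VGG-F model if it is not already present
if ~exist('imagenet-vgg-f.mat', 'file')
    websave('imagenet-vgg-f.mat', ...
        'http://www.vlfeat.org/matconvnet/models/imagenet-vgg-f.mat');
end
```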

The code below is derived from and

clear; close all; clc;
%Load the pre-trained net
net = load('imagenet-vgg-f.mat');
net = vl_simplenn_tidy(net);

%Remove the last layer (softmax layer)
net.layers = net.layers(1 : end - 1);

%% This deals with reading the data and getting the ground truth class labels

%All files are inside the root folder
root = 'C:\Users\ninad\Dropbox\Ninad\Lasagne\data\c200x200\';
Files = dir(fullfile(root, '*.bmp'));

%Load the map which stores the class information
load MakeModels.mat
h = waitbar(0, 'Extracting features...');
for i = 1 : length(Files)
    waitbar(i / length(Files), h);
    % Read the class label from the map, keyed by part of the file name
    Q = Files(i).name(end - 18 : end - 4);
    Qout = MakeModels(Q);
    Files(i).class = Qout.Type;
    % Preprocess the image to match the network's expected input
    im = imread(fullfile(root, Files(i).name));
    im_ = single(im); % note: 0-255 range, not normalized to [0,1]
    im_ = imresize(im_, net.meta.normalization.imageSize(1:2));
    im_ = bsxfun(@minus, im_, net.meta.normalization.averageImage);

    % Run the CNN and keep the output of the last retained layer as the feature vector
    feats = vl_simplenn(net, im_);
    Files(i).feats = squeeze(feats(end).x);
end
close(h);

%% Classifier training

%Select training data fraction

trainFraction = 0.5;
randomsort = randperm(length(Files));
numTrain = round(trainFraction * length(Files));
trainSamples = randomsort(1 : numTrain);
testSamples = randomsort(numTrain + 1 : end);

Labels = [Files.class];
Features = [Files.feats];

trainingFeatures = Features( :, trainSamples);
trainingLabels = Labels( :, trainSamples);

classifier = fitcecoc(trainingFeatures', trainingLabels);
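By default, fitcecoc uses linear SVM binary learners. The choice can be made explicit, and the CNN activations standardized, with a learner template; this is a sketch, and the template parameters are my own choices rather than anything required by the original setup:

```matlab
% Optional: make the binary learners explicit and standardize the
% CNN features, which often helps SVM training on raw activations.
t = templateSVM('Standardize', true, 'KernelFunction', 'linear');
classifier = fitcecoc(trainingFeatures', trainingLabels, 'Learners', t);
```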

%% Carry out the validation with rest of the data
testFeatures = Features( :, testSamples);
testLabels = Labels( :, testSamples);

predictedLabels = predict(classifier, testFeatures');

confMat = confusionmat(testLabels, predictedLabels);

% Convert confusion matrix into percentage form
confMat = bsxfun(@rdivide,confMat,sum(confMat,2))

% Display the mean per-class accuracy
mean(diag(confMat))


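Since the feature-extraction loop above is the slow part, it can be moved to the GPU mentioned earlier (the GTX 960), assuming MatConvNet was compiled with GPU support enabled. A sketch of the per-image changes:

```matlab
% Move the network to the GPU once, before the feature-extraction loop
net = vl_simplenn_move(net, 'gpu');

% Inside the loop: push the image to the GPU, pull the features back
im_ = gpuArray(im_);
feats = vl_simplenn(net, im_);
Files(i).feats = squeeze(gather(feats(end).x));
```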

  1. Hi,
    thanks for your classification code.
    I tried to run the code, but I got this error when loading ‘MakeModels.mat’. The error looks like the following:
    Error using load
    Unable to read file ‘MakeModels.mat’. No such file or directory.

    Error in Untitled5 (line 29)
    load MakeModels.mat

  2. Good code. Can you tell me what MakeModels is?

  3. Hi,
    I am new to Matlab and to deep learning, and I am going to start with MatConvNet.
    As you know, in Matlab 2016 the deep learning toolbox is available, but I don’t know how to start.
    Should I install any dependency, like version 2014 or an earlier version?
    I am so confused.
    I look forward to your answer.
    Kind regards

  4. Hi dear Ninad Thakoor,
    I am new to deep learning and I am going to start with MatConvNet.
    I use Matlab 2016 and I don’t know how to start, or how to run the MatConvNet toolbox. I mean, should I install any dependency, like an earlier version, or not?
    I was wondering if you could help me.
    Kind regards

  5. Dear Ninad,

    I want to train a new model using MatConvNet with my own picture dataset. My pictures are classified into two classes. I have downloaded and installed MatConvNet on Matlab 2016a and my pictures are ready, but I don’t know how to start using MatConvNet at all, and the tutorials and examples are not so clear to me. Would you please help me in this case?


  6. Thank you very much for the nice tutorial.

    Can you please show how to fine-tune a pre-trained network?
