Implementing a Neural Network from Scratch in C++

Long-article warning: 22,727 words in total.

Note: all source links are given at the end of the article.

Suggestion: bookmark this and read it when you have time.

1. Design of the Net Class and Initialization of the Neural Network

Enough talk; let's get straight to it.

Since the network is implemented in C++, it is natural to design a class to represent it, which I call the Net class here. Because such a common class name is likely to conflict with code written by others, and my surname is Liu, all of my code lives in the namespace liu. In my earlier post collecting backpropagation-algorithm resources I listed several good references; readers who are not yet comfortable with the theory but want to learn can consult those first. The rest of this article assumes some familiarity with the basics of neural networks.

Elements of a neural network

Before the coding really begins, it is worth going over the basics of neural networks, since they directly shape how the classes and the program are designed. In short, a neural network consists of several elements:

  • Neuron nodes

  • Layers

  • Weights

  • Bias terms

The two main computations in a neural network are forward propagation and backpropagation. Each layer's forward pass consists of a linear operation (a weighted sum, or a convolution in convolutional networks) followed by a non-linear activation function. Backpropagation uses the BP algorithm to update the weights. There are many more details, but the above is enough for this first article.
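
To make that forward step concrete, here is a small standalone sketch (not taken from the source repository) of a single layer's computation, assuming CV_32FC1 matrices, a column-vector input, and a sigmoid activation; the Net class chooses its actual activation function later:

#include <opencv2/core/core.hpp>

//Illustration only: one layer's forward step, y = sigmoid(W * x + b).
cv::Mat forwardOneLayer(const cv::Mat &W, const cv::Mat &x, const cv::Mat &b)
{
	cv::Mat z = W * x + b;      //linear part: weighted sum plus bias
	cv::Mat e, y;
	cv::exp(-z, e);             //element-wise exp(-z)
	cv::Mat denom = 1.0 + e;
	cv::divide(1.0, denom, y);  //element-wise 1 / (1 + exp(-z)), i.e. the sigmoid
	return y;
}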

Design of the Net Class: Mat-based

Almost all calculations in a neural network can be expressed as matrix operations, which is one reason I use OpenCV's Mat class: it provides a very complete and well-optimized set of matrix operations. The other reason is simply that OpenCV is the library I know best... Many good libraries and frameworks represent the different parts of a neural network with many classes: a Blob class for data, a Layer class for layers, an Optimizer class for the optimization algorithm, and so on. Things are not that complex here (mainly a matter of my limited ability): a single Net class represents the whole network.

Let the code speak for itself. The Net class is declared in Net.h, roughly as follows.

#ifndef NET_H
#define NET_H

#pragma once

#include <iostream>
#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
//#include <iomanip>
#include "Function.h"

namespace liu
{
	class Net
	{
	public:
		std::vector<int> layer_neuron_num;
		std::vector<cv::Mat> layer;
		std::vector<cv::Mat> weights;
		std::vector<cv::Mat> bias;

	public:
		Net() {};
		~Net() {};

		//Initialize net: generate weights matrices, layer matrices and bias matrices
		//bias default all zero
		void initNet(std::vector<int> layer_neuron_num_);

		//Initialise the weights matrices.
		void initWeights(int type = 0, double a = 0., double b = 0.1);

		//Initialise the bias matrices.
		void initBias(const cv::Scalar& bias);

		//Forward
		void forward();

		//Backward
		void backward();

	protected:
		//Initialise one weight matrix. If type == 0, Gaussian; else uniform.
		void initWeight(cv::Mat &dst, int type, double a, double b);

		//Activation function
		cv::Mat activationFunction(cv::Mat &x, std::string func_type);

		//Compute delta error
		void deltaError();

		//Update weights
		void updateWeights();
	};
}

#endif // NET_H

Explanation

The above is not the complete Net class, but a version simplified to match the content of this article; the simplification should make it clearer.

Member Variables and Member Functions

Now the Net class has only four member variables:

  • Number of neurons in each layer (layer_neuron_num)

  • Layers (layer)

  • Weight matrices (weights)

  • Bias terms (bias)

For ease of calculation, each layer and each bias term is also represented as a Mat, and each is a single-column matrix (a column vector).

In addition to the default constructor and destructor, the Net class has the following member functions:

  • initNet(): initializes the neural network

  • initWeights(): initializes the weight matrices; it calls the initWeight() function for each matrix

  • initBias(): initializes the bias terms

  • forward(): performs the forward pass, i.e. the linear operations and non-linear activations, and also computes the error (see the sketch after this list)

  • backward(): performs backpropagation and calls updateWeights() to update the weights
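
As a preview of how these pieces fit together, below is a purely hypothetical sketch of what forward() might look like when it chains the layers with the weights and biases; the real implementation (and the activationFunction() it relies on, here assumed to be a sigmoid) only appears in a later part of this series:

//Hypothetical sketch only, not the actual implementation.
void Net::forward()
{
	for (int i = 0; i < (int)weights.size(); ++i)
	{
		cv::Mat product = weights[i] * layer[i] + bias[i];       //linear part
		layer[i + 1] = activationFunction(product, "sigmoid");   //non-linear part
	}
}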

These functions are the core of the neural network program. The rest will be implemented step by step, adding whatever is needed as we go: cut a path when we meet a mountain, build a bridge when we meet a river.

Neural Network Initialization: the initNet() Function

Let's start with initNet(). It takes a single parameter, the number of neurons in each layer, and uses it to initialize the network. Here, initializing the network means creating a matrix for each layer, each set of weights, and each bias vector. It sounds simple, and it is.

The implementation code is in Net.cpp.

Generating the various matrices presents no difficulty. The only thing to pay attention to is the shape of each weight matrix: the matrix connecting layer i to layer i+1 has as many rows as there are neurons in layer i+1 and as many columns as there are neurons in layer i. For example, with layers {784, 100, 10}, weights[0] is 100x784 and weights[1] is 10x100. It is also worth noting that the biases are set to 0 by default.

//Initialize net
void Net::initNet(std::vector<int> layer_neuron_num_)
{
	layer_neuron_num = layer_neuron_num_;

	//Generate every layer.
	layer.resize(layer_neuron_num.size());
	for (int i = 0; i < layer.size(); i++)
	{
		layer[i].create(layer_neuron_num[i], 1, CV_32FC1);
	}
	std::cout << "Generate layers, successfully!" << std::endl;

	//Generate every weights matrix and bias
	weights.resize(layer.size() - 1);
	bias.resize(layer.size() - 1);
	for (int i = 0; i < (layer.size() - 1); ++i)
	{
		weights[i].create(layer[i + 1].rows, layer[i].rows, CV_32FC1);
		//bias[i].create(layer[i + 1].rows, 1, CV_32FC1);
		bias[i] = cv::Mat::zeros(layer[i + 1].rows, 1, CV_32FC1);
	}
	std::cout << "Generate weights matrices and bias, successfully!" << std::endl;
	std::cout << "Initialise Net, done!" << std::endl;
}
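
As a quick, hypothetical sanity check (not part of the repository), a snippet like the following could print the shape of every weight matrix created by initNet(); for layers {784, 100, 10} it should report 100 x 784 and 10 x 100:

#include "../include/Net.h"
#include <iostream>

int main()
{
	liu::Net net;
	net.initNet({ 784, 100, 10 });
	for (size_t i = 0; i < net.weights.size(); ++i)
	{
		std::cout << "weights[" << i << "]: "
			<< net.weights[i].rows << " x " << net.weights[i].cols << std::endl;
	}
	return 0;
}
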
Weight Initialization: the initWeight() Function

The weight initialization function initWeights() calls initWeight(), which simply fills a single matrix with random values: from a Gaussian distribution if type is 0, and from a uniform distribution otherwise.

//Initialise one weight matrix. If type == 0, Gaussian; else uniform.
void Net::initWeight(cv::Mat &dst, int type, double a, double b)
{
	if (type == 0)
	{
		randn(dst, a, b);
	}
	else
	{
		randu(dst, a, b);
	}
}

//Initialise all the weight matrices.
void Net::initWeights(int type, double a, double b)
{
	//Initialise every weight matrix with the requested distribution and parameters
	for (int i = 0; i < weights.size(); ++i)
	{
		initWeight(weights[i], type, a, b);
	}
}
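
For reference, here is a small standalone example (again, not from the repository) showing what the two modes do: cv::randn fills a matrix with Gaussian noise of the given mean and standard deviation, while cv::randu fills it with uniform noise over the given range:

#include <opencv2/core/core.hpp>
#include <iostream>

int main()
{
	cv::Mat w(3, 4, CV_32FC1);

	cv::randn(w, 0.0, 0.1);    //Gaussian: mean 0, standard deviation 0.1
	std::cout << "Gaussian init:" << std::endl << w << std::endl;

	cv::randu(w, -0.1, 0.1);   //uniform over [-0.1, 0.1)
	std::cout << "Uniform init:" << std::endl << w << std::endl;
	return 0;
}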

Bias initialization assigns the same value to every bias term. A cv::Scalar is used here to assign that value to each matrix.

//Initialise the bias matrices.
void Net::initBias(const cv::Scalar& bias_)
{
	for (int i = 0; i < bias.size(); i++)
	{
		bias[i] = bias_;
	}
}
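
In case the Scalar assignment looks unusual, here is a tiny standalone illustration (not from the repository): assigning a cv::Scalar to a cv::Mat sets every element of the matrix to that value:

#include <opencv2/core/core.hpp>
#include <iostream>

int main()
{
	cv::Mat b = cv::Mat::zeros(3, 1, CV_32FC1);
	b = cv::Scalar(0.05);        //every element of b becomes 0.05
	std::cout << b << std::endl; //prints a 3x1 column of 0.05
	return 0;
}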

So far, all the parts of the neural network that need to be initialized have been initialized.

Initialization Test

We can initialize a neural network with the following code. It doesn't do anything useful yet, but at least it lets us check whether the code so far has any bugs:

#include"../include/Net.h" //<opencv2opencv.hpp> using namespace std; using namespace cv; using namespace liu; int main(int argc, char *argv[]) { //Set neuron number of every layer vector<int> layer_neuron_num = { 784,100,10 }; // Initialise Net and weights Net net; net.initNet(layer_neuron_num); net.initWeights(0, 0., 0.01); net.initBias(Scalar(0.05)); getchar(); return 0; }

Hands-on testing shows no problems.

Source Link

All the code is hosted on GitHub; you can download it if you are interested. The source link is:

https://github.com/LiuXiaolong19920720/simple_net
