The hot term "neural network" is part of machine learning. It follows the same mathematical logic described before, only arranged in a more convoluted manner.

In essence, a single neuron unit can be regarded as one logistic regression unit:
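To make that concrete, here is a minimal sketch of one such neuron: a weighted sum of inputs plus a bias, passed through the logistic (sigmoid) function, which is exactly the hypothesis of logistic regression. The function name and argument layout are my own choices for illustration.

```python
import math

def neuron(x, weights, bias):
    """One neuron = one logistic regression unit:
    a weighted sum of the inputs, squashed by the sigmoid."""
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))
```

The output always lies strictly between 0 and 1, which is what lets us read it as a probability-like activation.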

The convolution occurs in two senses: a hidden layer is composed of multiple neurons, each taking in the features x0, x1, …, xn; and there can be more than one hidden layer, with activations passed on layer by layer up to the final hypothesis output.

A neural network involves forward propagation and back propagation. Here is forward propagation. Even though it looks intimidating, it is just the stacked layers applying the same logistic function over and over, expressed in succinct matrix form:
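The matrix form above can be sketched in a few lines: each layer is a weight matrix and bias vector, and forward propagation is just the sigmoid applied to one matrix-vector product per layer. The layer sizes and random weights here are hypothetical, chosen only to show the mechanics.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, layers):
    """Forward propagation: each layer applies the same logistic
    function to a weighted sum of the previous layer's activations.
    `layers` is a list of (W, b) pairs."""
    a = x
    for W, b in layers:
        a = sigmoid(W @ a + b)
    return a

# A hypothetical 2-input, 3-hidden-unit, 1-output network with random weights.
rng = np.random.default_rng(0)
layers = [(rng.normal(size=(3, 2)), rng.normal(size=3)),
          (rng.normal(size=(1, 3)), rng.normal(size=1))]
h = forward(np.array([1.0, 0.0]), layers)  # final hypothesis output in (0, 1)
```

Each `(W, b)` pair plays the role of one layer's parameters in the matrix formula; stacking more pairs in the list is exactly the "pass on and on" of deeper hidden layers.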

Now the fun part. Once this setup is done, how do we realize all sorts of smart, mimicking algorithms? Let's start with very basic AND and OR logic for the "neurons" to work on.

The simplified setting above assumes there are only two binary features, x1 and x2. Constructing the function sigmoid(-30 + 20x1 + 20x2) successfully realizes the AND logic: only when x1 = x2 = 1 is the output true (1). The OR function below is similar: sigmoid(-10 + 20x1 + 20x2). In the same vein, other logic such as negation, and combinations of these gates, can easily be realized with the same math.
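A quick sketch verifies these weights. The AND and OR gates use exactly the bias and weight values from the text; the NOT gate's values (bias 10, weight -20) are a common textbook choice I've added for the negation case, not taken from the passage.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def AND(x1, x2):
    # sigmoid(-30 + 20*x1 + 20*x2): near 1 only when both inputs are 1
    return sigmoid(-30 + 20 * x1 + 20 * x2)

def OR(x1, x2):
    # sigmoid(-10 + 20*x1 + 20*x2): near 1 when either input is 1
    return sigmoid(-10 + 20 * x1 + 20 * x2)

def NOT(x1):
    # Hypothetical negation gate: large positive bias, large negative weight.
    return sigmoid(10 - 20 * x1)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, round(AND(x1, x2)), round(OR(x1, x2)))
```

Because sigmoid(10) is about 0.99995 and sigmoid(-10) about 0.00005, rounding the outputs reproduces the truth tables exactly.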