Neuron (neural network)
Revision as of 12:25, 22 April 2016
Definition
[Figure: Block diagram of a generic neuron with inputs: [ilmath]I_1,\ldots,I_n[/ilmath]]

A neuron in a neural network has:
- an output domain, [ilmath]\mathcal{O} [/ilmath], typically [ilmath][-1,1]\subseteq\mathbb{R} [/ilmath] or [ilmath][0,1]\subseteq\mathbb{R} [/ilmath]
  - usually [ilmath]\{0,1\} [/ilmath] for input and output neurons
- some inputs, [ilmath]I_i[/ilmath], typically [ilmath]I_i\in\mathbb{R} [/ilmath]
- some weights, one for each input, [ilmath]w_i[/ilmath], again [ilmath]w_i\in\mathbb{R} [/ilmath]
- a way to combine each input with its weight (typically multiplication, [ilmath]I_i\cdot w_i[/ilmath]), creating an "input activation", [ilmath]A_i\in\mathbb{R} [/ilmath]
- a bias, [ilmath]\theta[/ilmath], of the same type as the result of combining an input with a weight; this gives another input activation, [ilmath]A_0[/ilmath]. (The bias can typically be simulated by adding a fixed "on" input and treating the bias as that input's weight)
- a way to combine the input activations, typically: [ilmath]\sum_{j=0}^nA_j=\sum_{j=1}^nI_jw_j+\theta[/ilmath]
- an activation function, [ilmath]\mathcal{A}(\cdot):\mathbb{R}\rightarrow\mathcal{O}\subseteq\mathbb{R} [/ilmath], which maps the combined input activations to an output value
In the example to the right, the output of the neuron would be:
- [ilmath]\mathcal{A}\left(\sum_{i=1}^n(I_iw_i)+\theta\right)[/ilmath]
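The computation above can be sketched in code. This is a minimal illustration, not from the article; the function and parameter names are made up, and `tanh` is used only as one concrete activation function whose output domain is [ilmath][-1,1][/ilmath]:

```python
import math

def neuron_output(inputs, weights, bias, activation=math.tanh):
    """Generic neuron: computes A(sum_i I_i * w_i + theta).

    `inputs` and `weights` are equal-length sequences of reals,
    `bias` is theta, and `activation` maps the combined input
    activation into the output domain (tanh gives O = [-1, 1]).
    """
    # combine each input with its weight, then add the bias
    combined = sum(i * w for i, w in zip(inputs, weights)) + bias
    return activation(combined)

# example: two inputs with tanh activation
out = neuron_output([1.0, 0.5], [0.2, -0.4], 0.1)
```

With the identity as the activation function the neuron simply returns the weighted sum plus the bias.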
Specific models
For an exhaustive list see Category:Types of neuron in a neural network
McCulloch–Pitts neuron
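As a minimal sketch, assuming the standard formulation of the McCulloch–Pitts model (binary inputs and output; the neuron fires exactly when the number of active excitatory inputs meets a fixed threshold, and any active inhibitory input vetoes firing) — the function name and argument layout here are illustrative, not from the article:

```python
def mcculloch_pitts(excitatory, inhibitory, threshold):
    """McCulloch-Pitts neuron with binary inputs and output.

    Returns 1 iff no inhibitory input is active and the number of
    active excitatory inputs is at least `threshold`; else 0.
    """
    if any(inhibitory):  # an active inhibitory input vetoes firing
        return 0
    return 1 if sum(excitatory) >= threshold else 0

# AND of two binary inputs: threshold 2
and_out = mcculloch_pitts([1, 1], [], 2)
# OR of two binary inputs: threshold 1
or_out = mcculloch_pitts([1, 0], [], 1)
```

With a suitable threshold this single unit realises AND, OR, and NOT (via an inhibitory input), which is the model's classical use.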