- Perceptron
- Can learn the AND and OR functions
- but not XOR, because XOR is not linearly separable
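A minimal sketch of a single perceptron computing AND and OR with hand-picked weights (the specific weights and biases below are one valid choice, not taken from these notes):

```python
def perceptron(inputs, weights, bias):
    # Weighted sum of the inputs, plus the bias, through a step activation
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total + bias > 0 else 0

# AND: fires only when both inputs are 1 (weights 1,1 and bias -1.5)
for a in (0, 1):
    for b in (0, 1):
        print("AND", a, b, "->", perceptron((a, b), (1, 1), -1.5))

# OR: fires when at least one input is 1 (same weights, bias -0.5)
for a in (0, 1):
    for b in (0, 1):
        print("OR ", a, b, "->", perceptron((a, b), (1, 1), -0.5))
```

No single choice of weights and bias can make this function behave as XOR, which is why a hidden layer is needed.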
- Hidden layers
- With multiple layers, a network can compute XOR; a single-layer network cannot.
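A sketch of how one hidden layer makes XOR possible, using hand-picked weights (one classic construction: XOR(a, b) = AND(OR(a, b), NAND(a, b)); the weights are illustrative, not from the notes):

```python
def step(x):
    # Step activation: fire (1) when the input is positive
    return 1 if x > 0 else 0

def xor(a, b):
    # Hidden layer with two units
    h1 = step(a + b - 0.5)    # acts as OR(a, b)
    h2 = step(-a - b + 1.5)   # acts as NAND(a, b)
    # Output unit: AND of the two hidden units gives XOR
    return step(h1 + h2 - 1.5)
```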
- Cyclic Redundancy Check
- Propagation (forward pass) - data flows from the input (source) to the output (destination) of the network
- Backpropagation - the error at the output is sent backward through the network during training, so the weights can be adjusted when the results are wrong
- Rectified Linear Unit (ReLU) - an activation function; it replaces negative values with zero and passes positive values through unchanged
- Matrix Multiplication
- dying ReLU - a neuron whose ReLU output is zero for all inputs; its gradient is then also zero, so it stops learning
- Two inputs > Matrix Multiplication > Add Bias > Apply Rectified Linear Unit > Output
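The pipeline above can be sketched as a single layer's forward pass (the input values, weights, and biases below are made up for illustration):

```python
def forward(x, W, b):
    # Matrix multiplication: dot each row of W with the input vector x,
    # add the bias for that row, then apply ReLU (clamp negatives to zero)
    z = [sum(w * xi for w, xi in zip(row, x)) + bi
         for row, bi in zip(W, b)]
    return [max(v, 0.0) for v in z]

x = [1.0, -2.0]                 # two inputs
W = [[0.5, -1.0],
     [2.0,  0.5]]               # hypothetical 2x2 weight matrix
b = [0.1, -0.2]                 # hypothetical biases
print(forward(x, W, b))
```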
- Bias - a constant offset added to the weighted sum, shifting the activation threshold
- Weights - multipliers that scale each input before the inputs are summed