There's more...
Neural networks have been employed in a variety of tasks. These tasks can be broadly classified into two categories: function approximation (regression) and classification. Depending on the task at hand, one activation function may be better than another. Generally, ReLU neurons work well for hidden layers. In the output layer, softmax is normally a better choice for multi-class classification (sigmoid for binary classification), while for regression problems a linear output is typical; sigmoid or hyperbolic tangent can be useful when the target values are bounded.
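As a minimal sketch of the activations mentioned above, the following NumPy functions show ReLU for hidden layers and softmax for a classification output layer (the stability shift inside softmax is a standard numerical trick, not specific to any library):

```python
import numpy as np

def relu(x):
    # Hidden-layer activation: zero out negative values
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes input to (0, 1); used for binary classification outputs
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Multi-class output: subtract the max for numerical stability,
    # then normalize exponentials into a probability distribution
    e = np.exp(x - np.max(x))
    return e / e.sum()

z = np.array([2.0, -1.0, 0.5])
hidden = relu(z)          # negative entries become 0
probs = softmax(z)        # entries sum to 1
```

Note that softmax outputs sum to one by construction, which is why it pairs naturally with a cross-entropy loss in classification, whereas a regression output layer typically applies no activation at all.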