Hands-On Neural Networks

Softmax

The softmax function is a generalization of the sigmoid function. While the sigmoid gives us the probability of a binary outcome, softmax transforms an unnormalized vector of real numbers into a probability distribution over multiple classes. That means the softmax outputs a vector whose values sum to 1, with every value between 0 and 1.
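As a sketch of the idea, softmax exponentiates each element and divides by the sum of all the exponentials, so larger inputs receive larger probabilities. A minimal NumPy implementation (the function name and example logits are illustrative, not from the source) might look like this:

```python
import numpy as np

def softmax(z):
    # Subtract the max before exponentiating: softmax is invariant to
    # shifting all inputs by a constant, and this avoids overflow.
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Example: an unnormalized score vector (logits)
logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)

print(probs)        # every entry lies between 0 and 1
print(probs.sum())  # the entries sum to 1
```

Note that subtracting the maximum input does not change the result, because the shared factor `exp(-max)` cancels in the numerator and denominator; it is the standard trick for keeping the exponentials in a safe numeric range.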