![Deep Learning with Keras](https://wfqqreader-1252317822.image.myqcloud.com/cover/291/36701291/b_36701291.jpg)
Activation function — sigmoid
The sigmoid function is defined as follows:
σ(x) = 1 / (1 + e^(-x))
The function produces outputs in (0, 1) as the input varies over (-∞, ∞), with small output changes for small input changes. Mathematically, the function is continuous. A typical sigmoid function is represented in the following graph:
![](https://epubservercos.yuewen.com/C78C0C/19470410508972506/epubprivate/OEBPS/Images/B06258_01_05.jpg?sign=1738934390-8Y2VaWvsfW6HExAG4DdGH7NNlYR5Zh23-0-af11c912782181efc9e168088ed06b1d)
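As a minimal sketch (not code from the book), the definition above can be evaluated directly with NumPy to confirm that the output always stays inside (0, 1):

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: squashes any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

# Sample the function across a wide input range; every output lies in (0, 1)
inputs = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(sigmoid(inputs))
```

Note that sigmoid(0) is exactly 0.5, and the outputs saturate toward 0 and 1 as the input grows large in magnitude.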
A neuron can use the sigmoid for computing the nonlinear function σ(z) = 1 / (1 + e^(-z)), where z = wx + b. Note that if z is very large and positive, then e^(-z) → 0, so σ(z) → 1, while if z is very large and negative, then e^(-z) → ∞, so σ(z) → 0. In other words, a neuron with sigmoid activation has a behavior similar to the perceptron, but the changes are gradual, and output values such as 0.5539 or 0.123191 are perfectly legitimate. In this sense, a sigmoid neuron can answer maybe.
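A sigmoid neuron along these lines can be sketched as follows (a minimal illustration with made-up weights and inputs, not code from the book):

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_neuron(x, w, b):
    """Compute sigmoid(w . x + b): a perceptron-like unit with gradual output."""
    return sigmoid(np.dot(w, x) + b)

# Hypothetical weights, bias, and input chosen purely for illustration
w = np.array([0.5, -0.6])
b = 0.1
x = np.array([1.0, 0.5])

# Unlike a perceptron's hard 0/1 decision, the output is a value
# strictly between 0 and 1 -- the neuron can answer "maybe"
print(sigmoid_neuron(x, w, b))
```

Pushing z = wx + b to a large positive or negative value drives the output toward 1 or 0 respectively, recovering perceptron-like behavior at the extremes.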