Question:
Activation functions
Author: Christian N

Answer:
(a) is a step function, also called a threshold function.

(b) is a rectified linear unit: ReLU(x) = max(0, x).

The smooth (everywhere-differentiable) version of ReLU is called softplus: softplus(x) = log(1 + e^x).

Changing the bias weight w0,i moves the threshold location.
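A minimal sketch of these activation functions in Python (the function names and the unit_output helper are illustrative, not from the original answer):

```python
import math

def step(x):
    # Hard threshold: outputs 1 once the input reaches 0.
    return 1.0 if x >= 0 else 0.0

def relu(x):
    # Rectified linear unit: max(0, x).
    return max(0.0, x)

def softplus(x):
    # Smooth, everywhere-differentiable version of ReLU: log(1 + e^x).
    return math.log(1.0 + math.exp(x))

def unit_output(x, w=1.0, w0=0.0, g=step):
    # A single unit applies its activation g to the weighted input plus bias.
    # With g = step, the unit fires when w*x + w0 >= 0, i.e. at x = -w0/w,
    # so changing the bias weight w0 moves the threshold location.
    return g(w * x + w0)
```

For example, with w0 = -0.5 the step unit's threshold shifts from x = 0 to x = 0.5: unit_output(0.4, w0=-0.5) is 0, while unit_output(0.6, w0=-0.5) is 1.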