Derivative of Sigmoid and Softmax Explained Visually

Published 2020/09/07
7,072 views
The derivative of the sigmoid function can be understood intuitively by looking at how the denominator of the function transforms the numerator. The derivative of the softmax function, which can be thought of as an extension of the sigmoid function to multiple classes, works in a very similar way, and in this video, I explain that relationship. The sigmoid and softmax are commonly used in neural networks, so having a more intuitive understanding of their derivatives will help us better understand how the gradients propagate through our neural networks during backpropagation.
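The relationship described above can be sketched numerically. This snippet is not from the video; it is a minimal NumPy sketch of the standard identities the video builds intuition for: the sigmoid derivative σ'(x) = σ(x)(1 − σ(x)), and the softmax Jacobian, whose diagonal entries sᵢ(1 − sᵢ) mirror the sigmoid case.

```python
import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid: 1 / (1 + e^(-x)).
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    # The derivative factors through the function's own output: s * (1 - s).
    s = sigmoid(x)
    return s * (1.0 - s)

def softmax(z):
    # Subtract the max for numerical stability; the result is unchanged.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def softmax_jacobian(z):
    # Jacobian J[i, j] = s_i * (delta_ij - s_j) = diag(s) - s s^T.
    # Its diagonal, s_i * (1 - s_i), has the same form as the sigmoid derivative.
    s = softmax(z)
    return np.diag(s) - np.outer(s, s)

# Check the sigmoid derivative against a central finite difference.
x, h = 0.5, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(abs(numeric - sigmoid_derivative(x)))  # ~0: the identity holds

# Each row of the softmax Jacobian sums to zero, since the outputs sum to 1.
J = softmax_jacobian(np.array([1.0, 2.0, 0.5]))
print(J.sum(axis=1))
```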

My previous video, "Why We Use the Sigmoid Function in Neural Networks for Binary Classification":
📼 https://youtu.be/WsFasV46KgQ

My previous video, "Softmax Function Explained In Depth with 3D Visuals":
📼 https://youtu.be/ytbYRIN0N4g

Desmos graph for sigmoid derivative:
📈 https://www.desmos.com/calculator/r8hxsriucw

Desmos graph for softmax derivative:
📈 https://www.desmos.com/calculator/u5r0zgh3jg

Join our Discord community:
💬 https://discord.gg/cdQhRgw

Connect with me:
🐦 Twitter - https://twitter.com/elliotwaite
📷 Instagram - https://www.instagram.com/elliotwaite
👱 Facebook - https://www.facebook.com/elliotwaite
💼 LinkedIn - https://www.linkedin.com/in/elliotwaite

🎵 Kazukii - Return
https://soundcloud.com/ohthatkazuki
https://open.spotify.com/artist/5d07MpiIaNmmEMTq79KAga
https://www.youtube.com/user/OfficialKazuki