
Neural Network Mathematics: Exploring the Math behind Gradient Descent

Skillsoft-issued completion badges are earned by viewing the required percentage of a course or by receiving a passing score when an assessment is required.

Because neural networks comprise thousands of neurons and interconnections, training one involves millions of computations. This is where a general-purpose optimization algorithm called gradient descent comes in. Use this course to gain an intuitive and visual understanding of how gradient descent and the gradient vector work. As you advance, examine three neural network activation functions (ReLU, sigmoid, and hyperbolic tangent) and two variants of the ReLU function (Leaky ReLU and ELU), and learn how those variants help address deep neural network training issues. Finally, implement a neural network from scratch using TensorFlow and basic Python. When you're done, you'll be able to illustrate the mathematical intuition behind neural networks and be prepared to tackle more complex machine learning problems.
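The ideas the course names can be sketched in a few lines of plain Python. The following is a minimal illustration, not the course's TensorFlow implementation: scalar versions of the activation functions it covers, and gradient descent applied to a simple one-variable function f(w) = (w - 3)^2, whose gradient 2(w - 3) is chosen here purely for illustration.

```python
import math

# Scalar sketches of the activation functions covered in the course.
def relu(x):
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # A small negative slope keeps gradients nonzero for x < 0,
    # which is how Leaky ReLU addresses the "dying ReLU" problem.
    return x if x > 0 else alpha * x

def elu(x, alpha=1.0):
    # Smooth saturation toward -alpha for negative inputs instead of a hard zero.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    return math.tanh(x)

# Plain gradient descent: repeatedly step against the gradient.
# Here f(w) = (w - 3)^2, so grad(w) = 2(w - 3) and the minimum is at w = 3.
def gradient_descent(grad, w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

w_min = gradient_descent(lambda w: 2.0 * (w - 3.0), w0=0.0)
print(f"minimum found near w = {w_min:.4f}")  # converges toward w = 3
```

The same update rule, applied to the gradient of a loss with respect to every weight in a network, is what training a neural network amounts to; TensorFlow automates the gradient computation.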

Issued on: January 17, 2023

Expires on: Does not expire