
Tutorial-37: Xavier/Glorot and He Weight Initialization | Deep Learning

535 views · Jul 1, 2025 · 19:24

Connect with us on Social Media!
📸 Instagram: https://www.instagram.com/algorithm_avenue7/?next=%2F
🧵 Threads: https://www.threads.net/@algorithm_avenue7
📘 Facebook: https://www.facebook.com/algorithmavenue7
🎮 Discord: https://discord.com/invite/tbajs47w

CODE LINK: https://colab.research.google.com/drive/1kAp9Nzb-HsVcWpnzJN83Jw1kFpSBfGeZ?usp=sharing

In this video, we dive deep into weight initialization techniques for neural networks, focusing on Xavier (Glorot) and He initialization. These methods are crucial for training deep neural networks efficiently, because they prevent issues like vanishing and exploding gradients.

📌 Topics Covered:
✅ Xavier/Glorot initialization (for sigmoid/tanh activations)
✅ He initialization (for ReLU/Leaky ReLU activations)
✅ Mathematical intuition behind these methods
✅ How to implement them in PyTorch

👉 If you found this useful, don't forget to Like, Share, and Subscribe for more awesome content!

#activationfunctions #relu #sigmoid #tanh #leakyReLU #softmax #deeplearning #machinelearning #neuralnetworks #ai #artificialintelligence #datascience #weightinitialization #biasinitialization #deepneuralnetworks #gradientdescent #backpropagation #vanishinggradients #explodinggradients #xavierinitialization #glorotinitialization #heinitialization #optimization #trainingneuralnetworks #ml #dl #computerscience #airesearch #modeltraining #learningalgorithms #python #tensorflow #pytorch #keras #datasciencetips #aitraining
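As a quick taste of the two schemes covered in the video, here is a minimal NumPy sketch of the underlying formulas (function names and layer sizes are illustrative, not from the video's notebook; in practice PyTorch provides these as `torch.nn.init.xavier_uniform_` and `torch.nn.init.kaiming_normal_`):

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Glorot/Xavier uniform init: sample W ~ U(-a, a) with
    a = sqrt(6 / (fan_in + fan_out)), giving Var(W) = 2 / (fan_in + fan_out).
    Suited to sigmoid/tanh activations."""
    rng = rng or np.random.default_rng(0)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_out, fan_in))

def he_normal(fan_in, fan_out, rng=None):
    """He/Kaiming normal init: sample W ~ N(0, 2 / fan_in).
    The factor 2 compensates for ReLU zeroing half the activations."""
    rng = rng or np.random.default_rng(0)
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_out, fan_in))

# Example: weights for a 256 -> 128 fully connected layer
W1 = xavier_uniform(256, 128)   # bounded by sqrt(6 / 384) ~ 0.125
W2 = he_normal(256, 128)        # std ~ sqrt(2 / 256) ~ 0.088
```

Both rules keep the variance of activations (and gradients) roughly constant across layers, which is exactly what prevents them from vanishing or exploding as depth grows.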

