Have you ever wondered which optimization algorithm to use for your neural network model to get slightly better and faster results when updating model parameters such as the weights and bias values? Should we use Gradient Descent, Stochastic Gradient Descent, or Adam? Before writing this article, I too didn't know the major differences between these optimization strategies or when one is preferable to another.

NOTE: Having good theoretical knowledge is great, but implementing these optimizers in a real deep learning project is a completely different thing. You might get different and unexpected results depending on the problem and the dataset. So, as a bonus, I am also adding links to the various courses that helped me a lot in my journey to learn data science and ML, and to experiment with and compare different optimization strategies, which led me to write this article on comparisons between different optimizers while implementing deep learning models.
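To give a quick taste of what "comparing optimizers" looks like in practice, here is a minimal sketch in Keras. The toy data, the tiny two-layer model, and the learning rates are placeholder assumptions for illustration only, not results or settings from this article; in a real experiment you would hold everything else fixed and tune each optimizer separately.

```python
# A minimal sketch (hypothetical model and data) showing that the choice of
# optimizer is a single line in a typical Keras training setup.
import numpy as np
from tensorflow import keras

# Toy data: 1000 samples, 20 features, binary labels (placeholder values).
X = np.random.rand(1000, 20)
y = np.random.randint(0, 2, size=1000)

def build_model():
    # Small fully connected network; the architecture is illustrative only.
    return keras.Sequential([
        keras.Input(shape=(20,)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])

# Train identical models with each optimizer and compare the final loss.
optimizers = {
    "sgd": keras.optimizers.SGD(learning_rate=0.01),
    "adam": keras.optimizers.Adam(learning_rate=0.001),
}

for name, opt in optimizers.items():
    model = build_model()
    model.compile(optimizer=opt,
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(X, y, epochs=5, batch_size=32, verbose=0)
    print(name, "final loss:", history.history["loss"][-1])
```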
If you are reading this blog, I am sure that we share similar interests and will work in similar industries. I have worked on various projects involving ML/AI/CV and have also written blogs on various platforms to share my knowledge about them. Why I started these blogs: 1) because I'm endlessly obsessed with machine learning, and 2) because I want to help aspiring developers get started and get good at applied ML. Happy learning, folks!