This is Part 1 of my series ‘Machine Learning in plain English’, in which I discuss the intuition behind different Machine Learning algorithms, metrics and approaches. These presentations will not include tiresome math or laborious programming constructs, and will instead focus on just the concepts behind the Machine Learning algorithms. This presentation discusses what Machine Learning is, gradient descent, linear, multivariate and polynomial regression, bias/variance, underfit, good fit and overfit, and finally logistic regression.
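Although the presentation itself steers clear of code, here is a minimal Python sketch of gradient descent for simple linear regression, for readers who want a quick glimpse of the idea in practice (the data, learning rate and iteration count are illustrative choices, not taken from the presentation):

```python
import numpy as np

# Minimal gradient descent for simple linear regression y ≈ w*x + b.
# The data, learning rate and iteration count are illustrative only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

w, b = 0.0, 0.0   # initial parameter guesses
lr = 0.01         # learning rate (step size)

for _ in range(2000):
    error = (w * x + b) - y          # residuals of the current fit
    grad_w = 2 * np.mean(error * x)  # d(MSE)/dw
    grad_b = 2 * np.mean(error)      # d(MSE)/db
    w -= lr * grad_w                 # step downhill along the gradient
    b -= lr * grad_b

print(w, b)  # approaches the data's slope (~2) and intercept (~0)
```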
It is hoped that these presentations will trigger sufficient interest in you to explore this fascinating field further.
To see actual implementations of the most widely used Machine Learning algorithms in R and Python, check out my book ‘Practical Machine Learning with R and Python’ on Amazon.
Also see
1. Practical Machine Learning with R and Python – Part 3
2. R vs Python: Different similarities and similar differences
3. Perils and pitfalls of Big Data
4. Deep Learning from first principles in Python, R and Octave – Part 2
5. Getting started with memcached-libmemcached
To see all posts, see “Index of posts”.
Very well made and informative. Gives good points on ML. But I felt: 1) Some knowledge of maths/statistics would be needed to understand the presentation – plain English alone won’t help :). 2) It would have been nice if the ‘why’ parts were covered in a simple and elaborate way, as for someone without any knowledge of ML it could be a bit tough to understand a bunch of the points – like why an overfit is better than a good fit, why a general fit is considered in ML and what the benefits are. The reason/concept/usage behind the parabolic graph wasn’t clear either.
Would have been nice if the video was a bit longer, with a bit more elaboration of the points.
A good book on ML theory is An Introduction to Statistical Learning by Prof Trevor Hastie, Prof Robert Tibshirani et al. By the way, in my presentation I mention that we should go for a ‘good fit’ over an ‘underfit’ or an ‘overfit’.
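To make the point concrete, here is a small illustrative Python sketch (the data, noise level and polynomial degrees are made up, not taken from the presentation or the book). It fits polynomials of increasing degree to noisy data and compares the error on the training points with the error on held-out points: the underfit does poorly on both, the overfit does very well on the training data but worse on new data, and the good fit balances the two.

```python
import numpy as np

# Illustrative underfit / good fit / overfit comparison on noisy sine data.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)
y = np.sin(np.pi * x) + rng.normal(scale=0.25, size=x.size)

train = np.arange(0, 40, 2)  # even-indexed points used for fitting
test = np.arange(1, 40, 2)   # odd-indexed points held out

for degree in (1, 4, 15):    # underfit, good fit, overfit
    coeffs = np.polyfit(x[train], y[train], degree)
    train_mse = np.mean((np.polyval(coeffs, x[train]) - y[train]) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x[test]) - y[test]) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

Typically the degree-15 model drives its training error far below the others while its held-out error climbs, which is exactly why a ‘good fit’ is preferred.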