Demystifying Machine Learning
Demystifying Machine Learning is a series of blogs in which I'll explain different Machine Learning algorithms, with the behind-the-scenes mathematics and code implementations from scratch. This introductory article covers what you can expect from the series, which algorithms we are going to cover, and what prerequisites you need in order to get the most out of it.
There are a few things you need to be comfortable with so that you can get the best out of the articles.
- Working knowledge of Python and OOP concepts. It'll be great if you have already worked with NumPy before.
- Basic mathematical concepts from linear algebra, calculus and probability.
Don't worry if you're not a pro at maths. This is going to be a beginner-friendly introduction to all the algorithms and the mathematics, explaining every concept briefly.
Below is the list of all the algorithms we are going to cover in this series. The main focus is going to be on supervised and unsupervised learning algorithms.
Deep Learning is a subset of Machine Learning, so we will also build Neural Networks from scratch, but to keep the content consistent, advanced topics like CNNs, sequence models, etc. are out of scope for this series.
- Linear Regression:
  - Linear Regression using the normal equation and regularization
  - Linear Regression using gradient descent and regularization
  - Logistic Regression with regularization
- Support Vector Machines (SVM):
  - Optimal Margin Classifiers
  - Kernels and SVM
- Neural Networks:
  - Shallow Neural Networks
  - Deep Neural Networks
- K-Means Clustering
- Dimensionality Reduction (PCA)
- Anomaly Detection
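To give a taste of the from-scratch style the series aims for, here is a minimal sketch of the first algorithm on the list, linear regression via the normal equation, using only NumPy. The function name is illustrative, not the one the series will actually use:

```python
import numpy as np

def fit_normal_equation(X, y):
    """Closed-form linear regression: theta = (X^T X)^+ X^T y."""
    # Prepend a column of ones so the first parameter acts as the bias.
    Xb = np.c_[np.ones((X.shape[0], 1)), X]
    # pinv (pseudo-inverse) is more robust than a plain inverse here.
    return np.linalg.pinv(Xb.T @ Xb) @ Xb.T @ y

# Tiny usage example on data generated by y = 2x + 1:
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
theta = fit_normal_equation(X, y)  # approximately [1.0, 2.0]
```

The articles themselves will derive where this formula comes from, rather than just handing it to you.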
And the best part: we're not just going to understand each algorithm and implement it in Python. Along the way, we'll build our own Python package containing classes for the algorithms we implement, and later publish it on PyPI so that you can `pip install <package>` it whenever you want. (Just for fun!)
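For the curious, a pip-installable package needs little more than packaging metadata alongside the code. The sketch below is a hypothetical minimal config (the package name `mlscratch` and all other values are illustrative assumptions, not the series' final choices):

```toml
# pyproject.toml -- minimal metadata for building and publishing to PyPI
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "mlscratch"          # hypothetical package name
version = "0.1.0"
description = "ML algorithms implemented from scratch"
dependencies = ["numpy"]
```

With this in place, `python -m build` produces the distribution files and `twine upload dist/*` publishes them, after which the package is pip-installable.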
The articles are going to be long and comprehensive, with plenty of mathematical notation and code, so don't rush to finish one within a specific reading time. Take your time, search the web whenever required, make hand-written notes, and understand what the code is actually doing.
If you're feeling overwhelmed, adopt a "one algorithm a week" policy. This series is designed so that you get to know each algorithm by asking how it works and why it works. A good practice is NOT to skip lines and jump to a random section. If you get stuck somewhere or have any doubts, feel free to contact me via my socials.
All code implementations will be pushed to a GitHub repository whose link will be shared in the respective algorithm's article, so you can also contribute if you have something interesting to add.
That's all for this introductory article. Nothing better than ending your week by learning a new ML algorithm. Make sure to follow me on Twitter for updates on newly published articles. See you next week with Linear Regression; till then, enjoy and have a nice day!