Course Description
Prerequisites: None. Some undergraduate-level mathematics would help but is not mandatory. A working knowledge of Python is helpful if you want to run the source code that is provided.
Taught by a Stanford-educated ex-Googler and an IIT- and IIM-educated ex-Flipkart lead analyst. This team has decades of practical experience in quant trading, analytics and e-commerce.
This course is a down-to-earth, shy but confident take on machine learning techniques that you can put to work today.
Let’s parse that.
The course is down-to-earth: it makes everything as simple as possible – but not simpler.
The course is shy but confident: it is authoritative, drawn from decades of practical experience, but shies away from needlessly complicating things.
You can put ML to work today: if Machine Learning is a car, this course will have you driving it today. It won't tell you what the carburetor is.
The course is very visual: most of the techniques are explained with the help of animations to help you understand better.
This course is practical as well: there are hundreds of lines of commented source code that you can use directly to implement natural language processing and machine learning tasks such as text summarization and text classification in Python.
The course is also quirky. The examples are irreverent. Lots of little touches: repetition, zooming out so we remember the big picture, active learning with plenty of quizzes. There's also a peppy soundtrack and art – all shown by studies to improve cognition and recall.
What’s Covered:
Machine Learning:
Supervised/Unsupervised learning, Classification, Clustering, Association Detection, Anomaly Detection, Dimensionality Reduction, Regression.
Naive Bayes, K-Nearest Neighbours, Support Vector Machines, Artificial Neural Networks, K-Means, Hierarchical clustering, Principal Components Analysis, Linear regression, Logistic regression, Random variables, Bayes' theorem, Bias-variance tradeoff
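To give a taste of the level the course works at, here is a minimal sketch of one technique from the list above – a Naive Bayes text classifier built from scratch. This is an illustrative example, not the course's own source code; the tiny training set is made up.

```python
import math
from collections import Counter

def train(messages):
    """messages: list of (text, label) pairs; label is 'spam' or 'ham'."""
    word_counts = {"spam": Counter(), "ham": Counter()}
    label_counts = Counter()
    for text, label in messages:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    vocab = set(word_counts["spam"]) | set(word_counts["ham"])
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        # log P(label) + sum of log P(word | label), with add-one smoothing
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

data = [("win cash now", "spam"), ("cheap pills win big", "spam"),
        ("lunch tomorrow?", "ham"), ("see you at lunch", "ham")]
wc, lc = train(data)
print(classify("win big cash", wc, lc))   # → spam
```

The course covers the same idea (Bayes' theorem, conditional independence, smoothing) in far more depth, with visuals.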
Natural Language Processing with Python:
Corpora, stopwords, sentence and word parsing, auto-summarization, sentiment analysis (as a special case of classification), TF-IDF, document distance, text summarization, text classification with Naive Bayes and K-Nearest Neighbours, and clustering with K-Means
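Two of the building blocks above – TF-IDF and document distance – fit in a few lines of plain Python. This is a hedged sketch to show the flavour, not the course's own code; the three sample documents are invented.

```python
import math
from collections import Counter

def tf_idf_vectors(docs):
    """Compute a TF-IDF vector (term -> weight dict) for each document."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(docs)
    # document frequency: in how many documents each term appears
    df = Counter(term for tokens in tokenized for term in set(tokens))
    vectors = []
    for tokens in tokenized:
        tf = Counter(tokens)
        vectors.append({t: (tf[t] / len(tokens)) * math.log(n / df[t]) for t in tf})
    return vectors

def cosine_similarity(a, b):
    """Document distance as the cosine of the angle between TF-IDF vectors."""
    norm = lambda v: math.sqrt(sum(w * w for w in v.values()))
    if norm(a) == 0 or norm(b) == 0:
        return 0.0
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    return dot / (norm(a) * norm(b))

docs = ["the cat sat on the mat",
        "the cat chased the mouse",
        "stock markets rallied today"]
vecs = tf_idf_vectors(docs)
# The two cat documents should be closer to each other than to the finance one
print(cosine_similarity(vecs[0], vecs[1]) > cosine_similarity(vecs[0], vecs[2]))  # → True
```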
Sentiment Analysis:
Why it's useful; approaches to solving it – rule-based and ML-based; training; feature extraction; sentiment lexicons; regular expressions; the Twitter API; sentiment analysis of tweets with Python
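The rule-based approach mentioned above can be sketched in a few lines: tokenize with a regular expression and sum scores from a sentiment lexicon. This toy example is not the course's code, and the six-word lexicon is made up – real lexicons such as SentiWordNet score thousands of words.

```python
import re

# A toy sentiment lexicon (hypothetical; real lexicons are far larger)
LEXICON = {"love": 1, "great": 1, "happy": 1,
           "hate": -1, "awful": -1, "sad": -1}

def sentiment(tweet):
    """Rule-based scoring: regex tokenization, then sum of lexicon scores."""
    tokens = re.findall(r"[a-z']+", tweet.lower())
    score = sum(LEXICON.get(tok, 0) for tok in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("I love this great phone!"))   # → positive
print(sentiment("Awful battery, I hate it"))   # → negative
```

The ML-based approach replaces the hand-written lexicon with a trained classifier (Naive Bayes, SVM), which is what the later Twitter lectures build up to.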
A Note on Python: The code-alongs in this class all use Python 2.7. Source code (with copious comments) is attached as a resource to all the code-alongs. The source code is provided for both Python 2 and Python 3 wherever possible.
Mail us about anything – anything! – and we will always reply 🙂
What are the requirements?
No prerequisites; some undergraduate-level mathematics would help but is not mandatory. A working knowledge of Python is helpful if you want to run the source code that is provided.
What am I going to get from this course?
Identify situations that call for the use of Machine Learning
Understand which type of Machine learning problem you are solving and choose the appropriate solution
Use Machine Learning and Natural Language Processing to solve problems like text classification and text summarization in Python
What is the target audience?
Yep! Analytics professionals, modelers, big data professionals who haven’t had exposure to machine learning
Yep! Engineers who want to understand or learn machine learning and apply it to problems they are solving
Yep! Product managers who want to have intelligent conversations with data scientists and engineers about machine learning
Yep! Tech executives and investors who are interested in big data, machine learning or natural language processing
Yep! MBA graduates or business professionals who are looking to move to a heavily quantitative role
Section 1: Introduction  

Lecture 1 
What this course is about

03:17  
Section 2: Jump right in : Machine learning for Spam detection  
Lecture 2 
Machine Learning: Why should you jump on the bandwagon?

16:31  
Lecture 3 
Plunging In – Machine Learning Approaches to Spam Detection

17:01  
Lecture 4 
Spam Detection with Machine Learning Continued

17:04  
Lecture 5 
Get the Lay of the Land : Types of Machine Learning Problems

17:26  
Section 3: Naive Bayes Classifier  
Lecture 6 
Random Variables

20:10  
Lecture 7 
Bayes Theorem

18:36  
Lecture 8 
Naive Bayes Classifier

08:49  
Lecture 9 
Naive Bayes Classifier : An example

14:03  
Section 4: K-Nearest Neighbors  
Lecture 10 
K-Nearest Neighbors

13:09  
Lecture 11 
K-Nearest Neighbors : A few wrinkles

14:47  
Section 5: Support Vector Machines  
Lecture 12 
Support Vector Machines Introduced

08:16  
Lecture 13 
Support Vector Machines : Maximum Margin Hyperplane and Kernel Trick

16:23  
Section 6: Clustering as a form of Unsupervised learning  
Lecture 14 
Clustering : Introduction

19:07  
Lecture 15 
Clustering : K-Means and DBSCAN

13:42  
Section 7: Association Detection  
Lecture 16 
Association Rules Learning

09:12  
Section 8: Dimensionality Reduction  
Lecture 17 
Dimensionality Reduction

10:22  
Lecture 18 
Principal Component Analysis

18:53  
Section 9: Artificial Neural Networks  
Lecture 19 
Artificial Neural Networks : Perceptrons Introduced

11:18  
Section 10: Regression as a form of supervised learning  
Lecture 20 
Regression Introduced : Linear and Logistic Regression

13:54  
Lecture 21 
Bias Variance Tradeoff

10:13  
Section 11: Natural Language Processing and Python  
Lecture 22 
Installing Python – Anaconda and Pip

09:00  
Lecture 23 
Natural Language Processing with NLTK

07:26  
Lecture 24 
Natural Language Processing with NLTK – See it in action

14:14  
Lecture 25 
Web Scraping with BeautifulSoup

18:09  
Lecture 26 
A Serious NLP Application : Text Auto Summarization using Python

11:34  
Lecture 27 
Python Drill : Auto-summarize News Articles I

18:33  
Lecture 28 
Python Drill : Auto-summarize News Articles II

11:28  
Lecture 29 
Python Drill : Auto-summarize News Articles III

10:23  
Lecture 30 
Put it to work : News Article Classification using K-Nearest Neighbors

19:29  
Lecture 31 
Put it to work : News Article Classification using Naive Bayes Classifier

19:24  
Lecture 32 
Python Drill : Scraping News Websites

15:45  
Lecture 33 
Python Drill : Feature Extraction with NLTK

18:51  
Lecture 34 
Python Drill : Classification with KNN

04:15  
Lecture 35 
Python Drill : Classification with Naive Bayes

08:08  
Lecture 36 
Document Distance using TF-IDF

11:03  
Lecture 37 
Put it to work : News Article Clustering with K-Means and TF-IDF

14:32  
Lecture 38 
Python Drill : Clustering with K-Means

08:32  
Section 12: Sentiment Analysis  
Lecture 39 
A Sneak Peek at what’s coming up

02:36  
Lecture 40 
Sentiment Analysis – What’s all the fuss about?

17:17  
Lecture 41 
ML Solutions for Sentiment Analysis – the devil is in the details

19:57  
Lecture 42 
Sentiment Lexicons (with an introduction to WordNet and SentiWordNet)

18:49  
Lecture 43 
Regular Expressions

17:53  
Lecture 44 
Regular Expressions in Python

05:41  
Lecture 45 
Put it to work : Twitter Sentiment Analysis

17:48  
Lecture 46 
Twitter Sentiment Analysis – Work the API

20:00  
Lecture 47 
Twitter Sentiment Analysis – Regular Expressions for Preprocessing
12:24  
Lecture 48 
Twitter Sentiment Analysis – Naive Bayes, SVM and SentiWordNet

19:40  
Section 13: Decision Trees  
Lecture 49 
Planting the seed – What are Decision Trees?

17:00  
Lecture 50 
Growing the Tree – Decision Tree Learning

18:03  
Lecture 51 
Branching out – Information Gain

18:51  
Lecture 52 
Decision Tree Algorithms

07:50  
Lecture 53 
Titanic : Decision Trees predict Survival (Kaggle) – I

19:21  
Lecture 54 
Titanic : Decision Trees predict Survival (Kaggle) – II

14:16  
Lecture 55 
Titanic : Decision Trees predict Survival (Kaggle) – III

13:00  
Section 14: A Few Useful Things to Know About Overfitting  
Lecture 56 
Overfitting – the bane of Machine Learning

19:03  
Lecture 57 
Overfitting Continued

11:19  
Lecture 58 
Cross Validation

18:55  
Lecture 59 
Simplicity is a virtue – Regularization

07:18  
Lecture 60 
The Wisdom of Crowds – Ensemble Learning

16:39  
Lecture 61 
Ensemble Learning continued – Bagging, Boosting and Stacking

18:02  
Section 15: Random Forests  
Lecture 62 
Random Forests – Much more than trees

12:28  
Lecture 63 
Back on the Titanic – Cross Validation and Random Forests

20:03  
Section 16: Recommendation Systems  
Lecture 64 
What do Amazon and Netflix have in common?

16:43  
Lecture 65 
Recommendation Engines – A look inside

10:45  
Lecture 66 
What are you made of? – Content-Based Filtering

13:35  
Lecture 67 
With a little help from friends – Collaborative Filtering

10:26  
Lecture 68 
A Neighbourhood Model for Collaborative Filtering

17:51  
Lecture 69 
Top Picks for You! – Recommendations with Neighbourhood Models

09:41  
Lecture 70 
Discover the Underlying Truth – Latent Factor Collaborative Filtering

20:13  
Lecture 71 
Latent Factor Collaborative Filtering contd.

12:09  
Lecture 72 
Gray Sheep and Shillings – Challenges with Collaborative Filtering

08:12  
Lecture 73 
The Apriori Algorithm for Association Rules

18:31  
Section 17: Recommendation Systems in Python  
Lecture 74 
Back to Basics : Numpy in Python

18:05  
Lecture 75 
Back to Basics : Numpy and Scipy in Python

14:19  
Lecture 76 
Movielens and Pandas

16:45  
Lecture 77 
Code Along – What’s my favorite movie? – Data Analysis with Pandas

06:18  
Lecture 78 
Code Along – Movie Recommendation with Nearest Neighbour CF

18:10  
Lecture 79 
Code Along – Top Movie Picks (Nearest Neighbour CF)

06:16  
Lecture 80 
Code Along – Movie Recommendations with Matrix Factorization

17:55  
Lecture 81 
Code Along – Association Rules with the Apriori Algorithm

09:50  
Section 18: A Taste of Deep Learning and Computer Vision  
Lecture 82 
Computer Vision – An Introduction

18:08  
Lecture 83 
Perceptron Revisited

16:00  
Lecture 84 
Deep Learning Networks Introduced

17:01  
Lecture 85 
Code Along – Handwritten Digit Recognition I

14:29  
Lecture 86 
Code Along – Handwritten Digit Recognition – II

17:35  
Lecture 87 
Code Along – Handwritten Digit Recognition – III

06:01  
Section 19: Quizzes  
Quiz 1 
Machine Learning Jump Right In

1 question  
Quiz 2 
Machine Learning Jump Right In II

1 question  
Quiz 3 
Machine Learning Algorithms

1 question  
Quiz 4 
Types of ML problems

1 question  
Quiz 5 
Random Variables

1 question  
Quiz 6 
Bayes theorem

1 question  
Quiz 7 
Naive Bayes

1 question  
Quiz 8 
Naive Bayes

1 question  
Quiz 9 
Classification

1 question  
Quiz 10 
Naive Bayes

1 question  
Quiz 11 
kNN Algorithm

1 question  
Quiz 12 
kNN Algorithm

1 question  
Quiz 13 
SVM

1 question  
Quiz 14 
SVM

1 question  
Quiz 15 
Clustering

1 question  
Quiz 16 
Association rule learning

1 question  
Quiz 17 
Dimensionality Reduction

1 question  
Quiz 18 
PCA

1 question  
Quiz 19 
Artificial Neural Network

1 question  
Quiz 20 
Artificial Neural Network

1 question  
Quiz 21 
Regression

1 question  
Quiz 22 
Bias Variance Tradeoff

1 question  
Quiz 23 
NLP

1 question  
Quiz 24 
NLP Bayes

1 question  
Quiz 25 
NLP kNN

1 question  
Quiz 26 
TF-IDF

1 question  
Quiz 27 
NLP K-Means

1 question 
Instructor Biography
Loony Corn, a 4-person team; ex-Google; Stanford, IIM Ahmedabad, IIT
Loonycorn is us: Janani Ravi, Vitthal Srinivasan, Swetha Kolalapudi and Navdeep Singh. Between the four of us, we have studied at Stanford, IIM Ahmedabad and the IITs, and have spent years (decades, actually) working in tech in the Bay Area, New York, Singapore and Bangalore.
Janani: 7 years at Google (New York, Singapore); Studied at Stanford; also worked at Flipkart and Microsoft
Vitthal: Also Google (Singapore) and studied at Stanford; Flipkart, Credit Suisse and INSEAD too
Swetha: Early Flipkart employee, IIM Ahmedabad and IIT Madras alum
Navdeep: long-time Flipkart employee too, and IIT Guwahati alum
We think we might have hit upon a neat way of teaching complicated tech courses in a funny, practical, engaging way, which is why we are so excited to be here on Udemy!
We hope you will try our offerings, and think you’ll like them 🙂