From the course: Machine Learning and AI Foundations: Classification Modeling
KNN
- [Instructor] Of all the techniques that we're going to discuss, K-Nearest Neighbors is arguably the most straightforward conceptually. It's actually kind of fun talking about K-Nearest Neighbors, but it can also be quite effective. So K-Nearest Neighbors is a so-called lazy learner, and that makes it quite different from the other choices. No model per se is built. Basically what's happening is that we have a technique that, although memory intensive, is not computationally intensive at model building at all, because it doesn't build a model. It simply memorizes the locations of all the cases. Now at scoring, when you go to deploy this thing, it has to find all the nearest neighbors, and that takes some work. So it kind of turns the typical process on its head. It's virtually instant at model building, but at scoring it can be a little slower than some other techniques. And that's the notion of a "lazy" learner. Also, the notion of nearest is essentially Euclidean…
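To make the lazy-learner idea concrete, here is a minimal from-scratch sketch in Python (not from the course): the "training" step is just memorizing the cases, and all the distance work happens at scoring time. The data, labels, and the choice of k=3 are hypothetical, purely for illustration.

```python
# A minimal K-Nearest Neighbors sketch. The training data, labels, and k
# below are hypothetical examples, not values from the course.
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_new, k=3):
    """Classify x_new by majority vote among its k Euclidean nearest neighbors."""
    # "Model building" was just storing X_train and y_train -- the lazy-learner idea.
    # All the computation happens here, at scoring:
    distances = np.linalg.norm(X_train - x_new, axis=1)  # Euclidean distance to every case
    nearest = np.argsort(distances)[:k]                  # indices of the k closest cases
    votes = Counter(y_train[nearest])                    # tally the neighbors' class labels
    return votes.most_common(1)[0][0]                    # majority class wins

# Hypothetical two-feature training cases with two classes
X_train = np.array([[1.0, 2.0], [1.5, 1.8], [5.0, 8.0], [6.0, 9.0]])
y_train = np.array(["A", "A", "B", "B"])

print(knn_predict(X_train, y_train, np.array([1.2, 1.9]), k=3))  # -> "A"
```

Note how the cost profile matches the transcript: storing the cases is instant, while every prediction scans all stored cases to compute distances.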
Contents
- Overview (2m 10s)
- Discriminant with three categories (5m 44s)
- Discriminant with two categories (5m 2s)
- Stepwise discriminant (1m 3s)
- Logistic regression (10m 54s)
- Stepwise logistic regression (1m 3s)
- Decision Trees (4m 46s)
- KNN (3m 58s)
- Linear SVM (8m 2s)
- Neural nets (7m 57s)
- Bayesian networks (7m 54s)
- Heterogeneous ensembles (3m 22s)
- Bagging and random forest (3m 26s)
- Boosting and XGBoost (1m 57s)