From the course: Machine Learning and AI Foundations: Classification Modeling

KNN

- [Instructor] Of all the techniques that we're going to discuss, K-Nearest Neighbors is arguably the most straightforward conceptually. It's actually kind of fun to talk about, but it can also be quite effective. K-Nearest Neighbors is a so-called lazy learner, which makes it quite different from the other choices. No model per se is built. Basically, what's happening is that we have a technique that, although memory intensive, is not computationally intensive at model building at all, because it doesn't build a model. It simply memorizes the locations of all the cases. At scoring, when you go to deploy this thing, it has to find all the nearest neighbors, and that takes some work. So it kind of turns the typical process on its head: it's virtually instant at model building, but at scoring it can be a little slower than some other techniques. That's the notion of a "lazy" learner. Also, the notion of nearest is essentially Euclidean…
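To make the "lazy learner" idea concrete, here is a minimal sketch in Python using scikit-learn's KNeighborsClassifier and the built-in iris dataset; these are illustrative choices, not the tool or data used in the course. The point is simply that fit() does almost no work, while the comparisons against stored cases happen at scoring time.

```python
# Minimal K-Nearest Neighbors sketch (illustrative only; not the course's tool or data).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Model building" is essentially instant: fit() just memorizes the training cases.
knn = KNeighborsClassifier(n_neighbors=5, metric="euclidean")
knn.fit(X_train, y_train)

# Scoring is where the work happens: each new case is compared against the stored
# cases to find its 5 nearest neighbors (by Euclidean distance), which vote on the class.
print(knn.score(X_test, y_test))
```

Note how the computational cost is shifted: there is no training step to speak of, but every prediction requires distance calculations against the memorized cases.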
