The goal of this lecture is to tackle more advanced topics related to KNN in greater detail. If you haven't read Part 1 yet, I recommend doing so: the key points you need to know about KNN are summarized there.

One of the drawbacks of KNN seen in Part 1 is that it is not built on any probabilistic framework: it provides no posterior probabilities of class membership and no way to infer the number of neighbours or the metric parameters probabilistically.

A probabilistic classifier returns a probability distribution over outputs given an input, the conditional distribution p(y|x; parameters), which is read as the likelihood when viewed as a function of the parameters. When…
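As a point of contrast before the probabilistic treatment, here is a minimal sketch (not from the article) of the simplest way to coax class-membership "probabilities" out of plain KNN: report the fraction of the k nearest neighbours belonging to each class. The function name `knn_predict_proba` and the toy data are illustrative assumptions; note these vote fractions are empirical frequencies, not posteriors derived from a probabilistic model, which is exactly the gap this lecture addresses.

```python
import numpy as np

def knn_predict_proba(X_train, y_train, x, k=3):
    # Euclidean distance from the query point x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k nearest neighbours
    nearest = np.argsort(dists)[:k]
    # Empirical class frequencies among those neighbours
    classes = np.unique(y_train)
    counts = np.array([(y_train[nearest] == c).sum() for c in classes])
    return classes, counts / k

# Toy data: two points per class in 2-D (illustrative, not from the article)
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
classes, probs = knn_predict_proba(X, y, np.array([0.0, 0.1]), k=3)
print(classes, probs)  # class 0 gets 2 of the 3 votes
```

With k = 3 the query's neighbourhood contains both class-0 points and one class-1 point, so the output is (2/3, 1/3). These fractions sum to one but carry no notion of uncertainty about k or the distance metric itself.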

Walid Hadri
