K-NN(K-Nearest Neighbors)

Pradeep Dhote
2 min read · Aug 5, 2020

K-Nearest Neighbors (K-NN) algorithm is a supervised machine learning algorithm that can be used to solve both classification and regression problems.

K-NN is sometimes called a lazy algorithm because it simply memorizes the training data rather than learning a model from it; K-NN requires no explicit training phase.

K-NN captures the idea of similarity (sometimes called distance, proximity, or closeness) with some mathematics we might have learned in our childhood — calculating the distance between points on a graph.

An understanding of how we calculate the distance between points on a graph is necessary before moving on, so here is a quick refresher.

Distance, proximity, or closeness is most commonly measured with the Euclidean distance or the Manhattan distance.
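As a quick illustration, here is a minimal sketch of both distance measures in plain Python (the function names are my own, not part of any library):

```python
import math

def euclidean_distance(p, q):
    # Straight-line distance: square root of the summed squared differences
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def manhattan_distance(p, q):
    # "City block" distance: sum of the absolute differences per coordinate
    return sum(abs(a - b) for a, b in zip(p, q))
```

For example, between the points (0, 0) and (3, 4), the Euclidean distance is 5 while the Manhattan distance is 7.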

The K-NN Algorithm

  1. Load the data
  2. Initialize K, the number of neighbors (K should be an odd number to avoid ties in classification)

3. For each example in the data

3.1 Calculate the distance between the query point and the current point from the data.

3.2 Add the distance and the index to an ordered collection

4. Sort the ordered collection of distances and index from smallest to largest (in ascending order) by the distances

5. Pick the first K entries from the sorted collection

6. Get the labels of the selected K entries

7. If regression, return the mean of the K labels

8. If classification, return the mode of the K labels
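The steps above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation; it assumes the data is a list of `(features, label)` pairs and uses the Euclidean distance from earlier:

```python
import math
from collections import Counter

def knn_predict(data, query, k, mode="classification"):
    # Steps 1-2: data is loaded by the caller; k is given.
    # Step 3: compute the distance from the query to every point,
    # keeping (distance, index) pairs in a collection.
    distances = []
    for index, (features, _) in enumerate(data):
        d = math.sqrt(sum((a - b) ** 2 for a, b in zip(features, query)))
        distances.append((d, index))
    # Step 4: sort the collection by distance, ascending.
    distances.sort(key=lambda pair: pair[0])
    # Steps 5-6: take the first K entries and look up their labels.
    k_labels = [data[index][1] for _, index in distances[:k]]
    # Step 7: regression returns the mean of the K labels.
    if mode == "regression":
        return sum(k_labels) / k
    # Step 8: classification returns the mode (most common label).
    return Counter(k_labels).most_common(1)[0][0]
```

For instance, with three points labeled 'a' clustered near the origin and three labeled 'b' far away, a query near the origin with K = 3 returns 'a'.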
