- 00:00In this unit we want to look at how K-nearest neighbours can be used as a model.
- 00:05K-nearest neighbours is a very simple model approach and is often used as an introduction in standard machine learning textbooks.
- 00:13As a scenario we will look at an example that is also often used to explain K-nearest neighbours.
- 00:19It is a relatively simple example: for an election campaign, we want to predict,
- 00:24for a person who lives in a certain area, how likely it is that they will vote for a particular party.
- 00:30What we will use for this, that is, the model input, will be the person's place of residence.
- 00:36In this scenario we keep things very simple and represent the residence with x and y coordinates.
- 00:41You can think of these as longitude and latitude.
- 00:44The output of the model should then be the person's party. We will look at two parties, red and blue,
- 00:52and for a given person we want to identify which of these parties they are most likely to vote for.
- 00:57As a database we will use a very simple one here, which has only the attributes x-coordinate, y-coordinate, and the party to be predicted.
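As a minimal sketch (not shown in the video), such a database could be represented in Python as a plain list of records; the coordinate values, the party labels, and the name voters are made up for illustration.

    # Hypothetical toy database: each record holds only an x-coordinate,
    # a y-coordinate, and the party that person voted for.
    voters = [
        (1.0, 2.0, "red"),
        (1.5, 1.8, "red"),
        (0.5, 3.0, "red"),
        (3.2, 4.1, "blue"),
        (3.0, 3.5, "blue"),
    ]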
- 01:10Important here, referring again to the no-free-lunch theorem, is that we are making an assumption.
- 01:16That is, we use this model because we believe that neighbouring objects, in this case the persons who vote for a party, share certain similarities.
- 01:27If we could not make this assumption at all, it would perhaps not make sense to use this particular model approach.
- 01:33What does the model ultimately look like?
- 01:37We essentially always ask ourselves the question: what would the neighbours say, namely the K nearest neighbours, where K can be chosen freely.
- 01:44Look at this point, which is coloured grey in the coordinate system.
- 01:49If you now had to say which party this person votes for, which one would you pick?
- 01:54Well, what you could do is ask the neighbours.
- 01:57In this case we could decide to always ask the single nearest neighbour.
- 02:02We select a distance measure, for example how far away the person is as the crow flies, and see that the nearest person is one who votes for the blue party.
- 02:14So if we decide to ask only the nearest neighbour, we would say that this person apparently votes for the blue party.
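A minimal sketch of this nearest-neighbour lookup, assuming the hypothetical voters list from above and using straight-line (Euclidean) distance; the names euclidean and predict_1nn are made up for illustration.

    import math

    def euclidean(p, q):
        # Straight-line ("as the crow flies") distance between two (x, y) points.
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def predict_1nn(voters, query):
        # Copy the party label of the single nearest voter.
        x, y, party = min(voters, key=lambda v: euclidean((v[0], v[1]), query))
        return party

    # Example: for a query point close to the "red" records above,
    # predict_1nn(voters, (1.2, 1.9)) would return "red".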
- 02:20Now, we can choose this K freely; we could, for example, also select K equal to 3.
- 02:26That is, if we now want to determine the party for this person, we do not only ask the nearest neighbour, but the three nearest neighbours.
- 02:34As we can see here, this person would then apparently vote for the red party, because two of the three nearest neighbours vote for the red party.
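The same idea with a majority vote over the k nearest neighbours, again only as a sketch reusing the hypothetical voters list and the euclidean function from above:

    from collections import Counter

    def predict_knn(voters, query, k=3):
        # Take the k voters closest to the query point ...
        nearest = sorted(voters, key=lambda v: euclidean((v[0], v[1]), query))[:k]
        # ... and let them vote: the most common party among them wins.
        votes = Counter(party for _, _, party in nearest)
        return votes.most_common(1)[0][0]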
- 02:44But now we also have scenarios in which the different neighbours are at very different distances.
- 02:51That means we can look at a radius that contains the three nearest neighbours, but the single nearest neighbour is much closer than the other two.
- 03:01What would we do in this case? Well, we can introduce a refinement in which we take the distances into account once again.
- 03:08That is, we say that neighbours who are really close should be given more weight.
- 03:13And neighbours who are further away are not as important to us for the prediction.
- 03:18That means we can also build a model where we ask the three nearest neighbours,
- 03:23and among these three nearest neighbours only one person votes for the red party, but ultimately we still predict that the person in question will also vote for the red party.
- 03:32Simply because the directly adjacent neighbour, who is much closer than the other two, votes for the red party.
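One common way to realise such a weighting, purely as a sketch since the video does not specify a formula, is to let each of the k nearest neighbours vote with a weight of 1 / distance, again reusing the hypothetical voters list and euclidean function from above:

    def predict_weighted_knn(voters, query, k=3):
        # Each of the k nearest neighbours votes with weight 1 / distance,
        # so one very close neighbour can outweigh two more distant ones.
        nearest = sorted(voters, key=lambda v: euclidean((v[0], v[1]), query))[:k]
        weights = {}
        for x, y, party in nearest:
            d = euclidean((x, y), query)
            weights[party] = weights.get(party, 0.0) + 1.0 / (d + 1e-9)  # guard against d == 0
        return max(weights, key=weights.get)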
- 03:40At this point you can also think about which use cases come to your mind.
- 03:46This often helps a lot in understanding the K-nearest-neighbour approach.
- 03:50You are welcome to post them in the forum and let us discuss your use cases.
- 03:54That was it for the unit on K-nearest neighbours.
- 03:58We will look at further model approaches in the next units.