In last week’s blog post, we had a look at artificial neural networks. Those were trained on data for which a developer knew both the input and the correct output. Unlike them, the algorithms we are looking at today require no such supervision.

First, we will be taking a look at Self-Organising Maps (SOMs), also known as Kohonen networks. A SOM uses unsupervised learning: it maps the input data onto clusters, usually arranged on a 2-dimensional grid, with each cluster associated with its own weight vector. Each input is assigned by calculating its Euclidean distance to every cluster and adding it to the closest match. When a cluster receives a new input, it and its neighbours update their weight values.
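The steps above can be sketched in plain Python. This is a minimal illustration, not a production SOM: the grid size, learning rate, neighbourhood radius, and their decay schedules are all assumed parameters chosen for the example.

```python
import math
import random

def train_som(data, grid_w=4, grid_h=4, dim=2, epochs=50,
              lr0=0.5, radius0=2.0, seed=0):
    """Minimal Self-Organising Map sketch (assumed hyperparameters)."""
    rng = random.Random(seed)
    # one weight vector per node on a grid_w x grid_h grid
    nodes = [((x, y), [rng.random() for _ in range(dim)])
             for x in range(grid_w) for y in range(grid_h)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)          # decaying learning rate
        radius = radius0 * (1 - t / epochs)  # shrinking neighbourhood
        for v in data:
            # best matching unit: node with smallest Euclidean distance
            bmu_pos, _ = min(nodes, key=lambda n: math.dist(n[1], v))
            for pos, w in nodes:
                d = math.dist(pos, bmu_pos)  # distance on the grid itself
                if d <= radius:
                    # pull the winner and its neighbours toward the input
                    influence = math.exp(-d * d / (2 * radius * radius + 1e-9))
                    for i in range(dim):
                        w[i] += lr * influence * (v[i] - w[i])
    return nodes
```

After training, nearby grid nodes end up with similar weight vectors, which is what gives the map its "self-organised" topology.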

Hierarchical Clustering builds a hierarchy of clusters until it reaches a cut-off point. Clusters on one level of the hierarchy are connected to the ones directly below and above them. The algorithm begins by finding similarities between pairs of data points and grouping the most similar together at the bottom of the hierarchy.
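A bottom-up (agglomerative) version of this can be sketched as follows; single linkage is an assumed choice of similarity, and the cut-off here is expressed as a target number of clusters.

```python
import math

def agglomerative(points, n_clusters):
    """Agglomerative hierarchical clustering sketch: start with every
    point in its own cluster, repeatedly merge the closest pair
    (single linkage) until the cut-off of n_clusters remains."""
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        # find the pair of clusters with the smallest inter-point distance
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(math.dist(a, b)
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]  # merge the closest pair
        del clusters[j]
    return clusters
```

Each pass through the loop corresponds to one level of the hierarchy: the clusters before a merge sit directly below the merged cluster above them.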

Follow-the-leader Clustering works with cluster centres. The number of clusters emerges automatically from a distance threshold, so no cut-off point is needed. The algorithm stops when the cluster centres become stable and no further updates take place.
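A single pass of the idea can be sketched like this; the threshold value and the choice of updating each centre to the mean of its members are assumptions of the example.

```python
import math

def leader_clustering(points, threshold):
    """Follow-the-leader sketch: the first point starts a cluster; each
    later point joins the nearest centre if it is within the distance
    threshold, otherwise it starts a new cluster. The cluster count
    falls out of the threshold rather than being fixed in advance."""
    centres, members = [], []
    for p in points:
        if centres:
            d, idx = min((math.dist(c, p), i) for i, c in enumerate(centres))
            if d <= threshold:
                members[idx].append(p)
                # update the centre to the mean of its members
                n = len(members[idx])
                centres[idx] = tuple(sum(q[k] for q in members[idx]) / n
                                     for k in range(len(p)))
                continue
        centres.append(tuple(p))  # this point leads a new cluster
        members.append([p])
    return centres, members
```

A larger threshold yields fewer, broader clusters; a smaller one yields more, tighter clusters.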

Finally, K-means Clustering starts from a set number of clusters and their centre points. A centre point is updated each time data is added to its cluster, until that point becomes stable. This repeats until every data point has been assigned to a cluster. There is also a fuzzy version of K-means, which uses fuzzy logic to decide which cluster an input belongs to.
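The assign-then-update loop can be sketched as below; initialising the centres by sampling k data points and capping the iteration count are assumptions of the example.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """K-means sketch: pick k initial centres, assign each point to its
    nearest centre, recompute each centre as its cluster's mean, and
    repeat until the centres stop moving (or iters runs out)."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: math.dist(centres[i], p))
            groups[idx].append(p)
        new_centres = [
            tuple(sum(q[d] for q in g) / len(g) for d in range(len(g[0])))
            if g else centres[i]          # keep an empty cluster's centre
            for i, g in enumerate(groups)
        ]
        if new_centres == centres:        # centres are stable: converged
            break
        centres = new_centres
    return centres, groups
```

The fuzzy variant mentioned above replaces the hard nearest-centre assignment with a degree of membership in every cluster, but the overall update loop has the same shape.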
