Non-Negative Matrix Factorization for Dimensionality Reduction

In previous posts, we have explained how to reduce dimensions by applying other algorithms.

In this post, we will see how we can also apply dimensionality reduction with Non-Negative Matrix Factorization (NMF). We will work with the Eurovision 2016 dataset, as we did in the Hierarchical Clustering post.


Few Words About Non-Negative Matrix Factorization

This is a powerful algorithm with many applications: for example, it can be applied to recommender systems, collaborative filtering, topic modelling, and dimensionality reduction.

In Python, it also works with sparse matrices; the only restriction is that the values must be non-negative.

The logic of dimensionality reduction with NMF is to take our \(m \times n\) data and decompose it into two matrices of dimensions \(m \times features\) and \(features \times n\) respectively. The \(features\) are the reduced dimensions.
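To make the decomposition concrete, here is a minimal sketch (not from the original post) on a small made-up non-negative matrix, using scikit-learn's NMF with two components chosen purely for illustration:

import numpy as np
from sklearn.decomposition import NMF

# a toy 4 x 3 non-negative matrix (values made up for illustration)
V = np.array([[1.0, 0.0, 3.0],
              [2.0, 1.0, 0.0],
              [0.0, 4.0, 1.0],
              [3.0, 2.0, 2.0]])

# decompose V (m x n) into W (m x k) and H (k x n), here with k = 2 features
model = NMF(n_components=2, init='random', random_state=0)
W = model.fit_transform(V)   # m x k: the reduced representation of the rows
H = model.components_        # k x n: how each feature maps back to the original columns

print(W.shape, H.shape)      # (4, 2) (2, 3)
print(np.round(W @ H, 2))    # W @ H approximates the original matrix V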


Dimensionality Reduction in Eurovision Data

Load and Reshape the Data

In our dataset, the rows refer to the countries that voted and the columns to the countries that received votes. The values are the televote rankings.

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
%matplotlib inline 

# load the Eurovision 2016 voting data
eurovision = pd.read_csv("eurovision-2016.csv")

# pivot to a (voting country) x (voted-for country) matrix of televote ranks
televote_Rank = eurovision.pivot(index='From country', columns='To country', values='Televote Rank')

# fill the NaNs (e.g. a country cannot vote for itself) with the minimum rank of each column
televote_Rank.fillna(televote_Rank.min(), inplace=True)
televote_Rank.head()
 
To country    Armenia  Australia  Austria  Azerbaijan  Belgium  Bulgaria  Croatia  Cyprus  Czech Republic  France  ...  Lithuania  Malta  Poland  Russia  Serbia  Spain  Sweden  The Netherlands  Ukraine  United Kingdom
From country
Albania           9.0        1.0     13.0        19.0     14.0       3.0     20.0    12.0            22.0    11.0  ...        7.0   16.0     6.0     4.0    26.0   23.0     8.0             24.0      5.0            18.0
Armenia           1.0       12.0      7.0        25.0     17.0      15.0     22.0     5.0            18.0     4.0  ...       21.0    6.0    10.0     1.0    23.0   13.0     9.0             11.0      2.0            20.0
Australia        12.0        1.0      8.0        22.0      1.0       2.0     18.0    13.0            25.0     4.0  ...       10.0    5.0    15.0     6.0    17.0    9.0    20.0             16.0      3.0             7.0
Austria          12.0        8.0      2.0        24.0     14.0       6.0     19.0    16.0            23.0    10.0  ...       20.0   25.0     1.0     3.0     7.0   15.0     4.0              5.0      2.0            22.0
Azerbaijan       25.0        9.0     11.0         2.0     16.0       3.0     21.0    17.0            20.0     7.0  ...       18.0    6.0     8.0     1.0    24.0   15.0    12.0             19.0      2.0            23.0
televote_Rank.shape
 
(42, 26)

Non-Negative Matrix Factorization

Since the data are now in the right form, we are ready to run the NMF algorithm. We will choose two components because our goal is to reduce the data to two dimensions.

# Import NMF
from sklearn.decomposition import NMF

# Create an NMF instance: model
model = NMF(n_components=2)

# Fit the model to televote_Rank
model.fit(televote_Rank)

# Transform the televote_Rank: nmf_features
nmf_features = model.transform(televote_Rank)

# Print the NMF features
print(nmf_features.shape)

print(model.components_.shape)
 
(42, 2)
(2, 26)

As we can see, we decomposed the data into two matrices of dimensions (42, 2) and (2, 26) respectively. The (42, 2) matrix of NMF features is our two-dimensional representation of the 42 countries.
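As an optional sanity check (not in the original post), we can multiply the two matrices back together and see how well the product approximates the original data; scikit-learn also stores a reconstruction error on the fitted model:

# reconstruct an approximation of the original 42 x 26 matrix from the two factors
approximation = np.dot(nmf_features, model.components_)
print(approximation.shape)   # (42, 26)

# Frobenius norm of the difference between the data and its reconstruction
print(np.linalg.norm(televote_Rank.values - approximation))

# scikit-learn's own measure of the reconstruction error
print(model.reconstruction_err_)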

Plot the 42 Countries in two Dimensions

Let’s look at the scatter plot of the 42 countries in the two NMF dimensions.

plt.figure(figsize=(20,12))
countries = np.array(televote_Rank.index)

# Select the first NMF feature: xs
xs = nmf_features[:,0]

# Select the second NMF feature: ys
ys = nmf_features[:,1]

# Scatter plot
plt.scatter(xs, ys, alpha=0.5)

# Annotate each point with its country name
for x, y, country in zip(xs, ys, countries):
    plt.annotate(country, (x, y), fontsize=10, alpha=0.5)
plt.show()
 

Can you see a pattern?

The 2D graph here is broadly consistent with the dendrogram that we got by applying the linkage distance in the Hierarchical Clustering post. Again, we can see a cluster of the countries of former Yugoslavia, and we can also see that the Baltic countries are close together, as are the Scandinavian countries and the countries around the United Kingdom.
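If you want to check this impression programmatically (this is not part of the original post), one option is to run a quick hierarchical clustering on the two NMF features and compare the groups with the dendrogram from the earlier post. A minimal sketch, where the Ward linkage and the choice of five clusters are arbitrary assumptions:

from scipy.cluster.hierarchy import linkage, fcluster

# hierarchical clustering on the 2-D NMF features (Ward linkage, 5 clusters: arbitrary choices)
Z = linkage(nmf_features, method='ward')
labels = fcluster(Z, t=5, criterion='maxclust')

# list the countries that fall into each cluster
for cluster_id in np.unique(labels):
    print(cluster_id, list(countries[labels == cluster_id]))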
