Understanding Wasserstein Distance: A Powerful Metric in Machine Learning

Moklesur Rahman
4 min read · Jun 18, 2023


In the vast field of machine learning, one often encounters the need to compare and measure the dissimilarity between probability distributions. Traditional metrics like Euclidean distance or Kullback-Leibler divergence can fall short when the distributions have disjoint supports or little overlap. Enter Wasserstein distance, a powerful tool that has gained popularity for its ability to capture nuanced differences between distributions and overcome the limitations of other metrics. In this blog post, we will explore the concept of Wasserstein distance, its applications, and why it has become an indispensable tool in various machine learning domains.

What is Wasserstein Distance?

Wasserstein distance, also known as Earth Mover’s Distance (EMD), is a metric that quantifies the dissimilarity between two probability distributions. Unlike other distance measures that rely on point-wise differences, Wasserstein distance takes into account the underlying structures of the distributions being compared. It measures the minimum “cost” required to transform one distribution into another, where the cost is determined by the amount of “mass” that needs to be moved and the distance it needs to travel.
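To make this concrete, here is a minimal sketch in Python using SciPy's `wasserstein_distance`, which computes the 1-D Wasserstein-1 distance between two empirical samples. The sample sizes, random seed, and distribution parameters below are illustrative choices, not from the original post.

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Two empirical samples drawn from distributions that differ only by a shift
rng = np.random.default_rng(0)
a = rng.normal(loc=0.0, scale=1.0, size=1000)  # samples from N(0, 1)
b = rng.normal(loc=2.0, scale=1.0, size=1000)  # samples from N(2, 1)

# In 1-D, the Wasserstein-1 distance equals the area between the two
# empirical CDFs; moving N(0, 1) onto N(2, 1) costs the mean shift,
# so the result should be close to 2.
d = wasserstein_distance(a, b)
print(d)
```

Intuitively, each sample point is a pile of "mass", and the metric reports the average distance that mass must travel to turn one sample into the other, which is why a pure location shift of 2 yields a distance near 2.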


Written by Moklesur Rahman

PhD student | Computer Science | University of Milan | Data science | AI in Cardiology | Writer | Researcher
