Showing posts with label Python. Show all posts

Thursday, May 2, 2024

Posts History

I delve into a diverse range of topics, spanning programming languages, machine learning, data engineering tools, and DevOps. The articles are enriched with practical code examples, ensuring their applicability in real-world scenarios.

Follow me on LinkedIn

2024

Wednesday, May 1, 2024

K-means on Riemann Manifolds

Target audience: Advanced
Estimated reading time: 7'

Traditional clustering models often fail on the complex datasets commonly found in advanced applications such as medical imaging, 3D shape analysis, and natural language processing, where data points are highly interrelated.
K-means on manifolds respects the intrinsic geometry of the data, such as curvature and metric.


Table of contents

       Setup
       Euclidean space
       Hypersphere

What you will learn: How to apply k-means clustering on a Riemann manifold (hypersphere) using the Geomstats library, contrasted with its implementation in Euclidean space using the scikit-learn library.

Notes

  • Environments: Python 3.10.10, Geomstats 2.7.0, scikit-learn 1.4.2
  • This article assumes that the reader is somewhat familiar with differential and tensor calculus [ref 1]. Please refer to our previous articles related to geometric learning [ref 2, 3, 4].
  • Source code is available at Github.com/patnicolas/Data_Exploration/manifolds
  • To enhance the readability of the algorithm implementations, we have omitted non-essential code elements like error checking, comments, exceptions, validation of class and method arguments, scoping qualifiers, and import statements.

Introduction

The primary goal of studying Riemannian geometry is to understand and analyze the properties of curved spaces that cannot be described adequately by Euclidean geometry alone. Riemannian geometry makes it possible to describe the geometric structure of manifolds equipped with a metric, which defines the notions of distance and angle on these spaces.

This article is the seventh part of our ongoing series focused on geometric learning. In this installment, we utilize the Geomstats Python library [ref 5] to explore the ubiquitous k-means clustering algorithm on the hypersphere manifold. The hypersphere was introduced in a previous piece, Geometric Learning in Python: Manifolds - Hypersphere, and is detailed in the Geomstats API [ref 6].

I highly recommend watching the comprehensive series of 22 YouTube videos, Tensor Calculus - Eigenchris, to familiarize yourself with the fundamental concepts of differential geometry.
Summaries of my earlier articles on this topic can be found in the Appendix.

Clustering data on a manifold offers several benefits for complex datasets [ref 7]:
  • Grouping of dense, continuous, non-linear data depends on the 'shape' of the data
  • Projection to Euclidean space may introduce distortion
  • Loss and distances are better assessed and computed through geodesics than through Euclidean metrics (e.g., on a sphere)

K-means


Among the array of unsupervised learning algorithms, K-means stands out as one of the most well-known. This algorithm has a straightforward goal: to divide the data space so that data points within the same cluster are as similar as possible (intra-cluster similarity), and data points in different clusters are as dissimilar as possible (inter-cluster similarity). K-means aims to identify a predetermined number of clusters in an unlabeled dataset. It employs an iterative approach to finalize the clustering, which depends on the number of clusters specified by the user (denoted by the variable K).

Given K clusters Ck, each with a centroid mk, the input data points xi are distributed across the clusters so as to minimize the reconstruction error:\[R_{err}(K)=\min_{C_k}\sum_{k=1}^{K}\sum_{x_i\in C_k}^{}\left \| x_i-m_k\right \|^2\]
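
To make this objective concrete, here is a minimal NumPy sketch (the helper reconstruction_error is our own illustration, not part of the article's source code) that evaluates the reconstruction error for a given assignment of points to centroids:

import numpy as np

def reconstruction_error(x: np.array, centroids: np.array, labels: np.array) -> float:
    # Sum of squared Euclidean distances between each data point
    # and the centroid of the cluster it is assigned to
    return float(sum(np.sum((x[labels == k] - m) ** 2)
                     for k, m in enumerate(centroids)))

K-means iteratively re-assigns each point to its nearest centroid and recomputes the centroids until this error no longer decreases significantly.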

Clustering data on a manifold


To assess and contrast the K-means model in both Euclidean space and on a hypersphere, it's necessary to create clustered data. This involves a two-step process:
  1. Generate a template cluster by employing a random generator on the manifold.
  2. Generate 4 clusters from the template using a special orthogonal Lie group in 3-dimensional space, SO(3).

Randomly generated manifold data

Let's evaluate and compare the following random generators for data points on the hypersphere, introduced in a previous article [ref 8]:
  • Uniform distribution
  • Uniform distribution with constraints
  • von Mises-Fisher distribution

Uniform distribution
We start with the basic uniform random generator over the interval [0, 1]:\[r=rand_{[0,1]}(x)\]The data points for the 4 clusters are visualized in the following plot.
4-Cluster random generation using uniform distribution


Constrained uniform random generator
In this scenario, we constrain the random values r on each of the 3 dimensions (or axes) within a sub-interval [ai, bi]:\[r=rand_{[0,1]}(x)\ \ \ a_i < r_i < b_i\]

4-Cluster random generation using constrained uniform distribution

von Mises-Fisher random generator
This approach relies on a generative mixture-model approach to clustering directional data based on the von Mises-Fisher distribution [ref 9].
Given a d-dimensional unit random vector x on a hypersphere of dimension d-1, the d-variate von Mises-Fisher distribution is defined by the following probability density function:\[f(x|\mu , \kappa )=C_d(\kappa).e^{\kappa \mu^Tx} \ \ \  \ C_d(\kappa)=\frac{\kappa^{\frac{d}{2}-1}}{(2\pi)^{\frac{d}{2}}I_{\frac{d}{2}-1}(\kappa)}\]where Id is the modified Bessel function of the first kind and Cd is the normalization factor.
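
As a sanity check of this formula, the density can be evaluated directly with NumPy and SciPy. The following sketch is our own illustration (the function von_mises_fisher_pdf is not part of Geomstats); it relies on scipy.special.iv, the modified Bessel function of the first kind:

import numpy as np
from scipy.special import iv  # Modified Bessel function of the first kind

def von_mises_fisher_pdf(x: np.array, mu: np.array, kappa: float) -> float:
    # d-variate von Mises-Fisher density f(x | mu, kappa) for unit vectors x, mu
    d = len(x)
    # Normalization factor C_d(kappa)
    c_d = kappa ** (0.5 * d - 1) / ((2 * np.pi) ** (0.5 * d) * iv(0.5 * d - 1, kappa))
    return float(c_d * np.exp(kappa * mu @ x))

Higher values of the concentration parameter kappa produce tighter clusters around the mean direction mu, which is why an arbitrary value kappa=60 is selected in the implementation below.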

4-Cluster random generation using von Mises-Fisher distribution

As anticipated, using a pure uniform random generator distributes data evenly across the hypersphere, rendering it ineffective for evaluating KMeans.
Instead, we will employ the von Mises-Fisher distribution and a constrained uniform random generator to more effectively analyze the performance of KMeans on a Riemann manifold.

Synthetic clusters using SO(3)

We leverage the SO(3) Lie group to replicate the randomly generated template cluster.

Although the discussion of Lie groups and the special orthogonal group in 3-dimensional space is beyond the scope of this article, here is a short summary:
In differential geometry, Lie groups play a crucial role by connecting the concepts of algebra and geometry. A Lie group is a mathematical structure that is both a group and a differentiable manifold. This means that the group operations of multiplication and taking inverses are smooth (differentiable), and it allows the application of calculus within the group structure.

The special orthogonal Lie group in 3-dimensional space, SO(3), is simply the group of 3 x 3 orthogonal matrices with determinant 1. These matrices represent rotations in 3-dimensional space and form a compact Lie group.
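
As a quick illustration of these two defining properties, the following sketch draws a random rotation with Geomstats (assuming the matrix representation returned by SpecialOrthogonal.random_uniform, as used in the implementation below) and verifies orthogonality and unit determinant:

import numpy as np
from geomstats.geometry.special_orthogonal import SpecialOrthogonal

so3_lie_group = SpecialOrthogonal(3, equip=False)
rotation = so3_lie_group.random_uniform()  # Random 3 x 3 rotation matrix

# Defining properties of SO(3): R^T.R = I and det(R) = 1
assert np.allclose(rotation.T @ rotation, np.identity(3), atol=1e-6)
assert np.isclose(np.linalg.det(rotation), 1.0, atol=1e-6)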

Implementation

Setup 

Let's wrap the random generators and the k-means training methods into a class, KMeansOnManifold.
The von Mises-Fisher generator for the data in the initial cluster is initialized with a mean _mu and an arbitrary kappa value. The constrained uniform random generator accepts random values in each dimension: x in [-1, -0.35], y in [0.3, 1], and z in [-1, -0.4].
The SO(3) Lie group is initialized without a metric (equip=False) to generate the 4 synthetic clusters.

from typing import AnyStr
import numpy as np
from geomstats.geometry.hypersphere import Hypersphere
from geomstats.geometry.special_orthogonal import SpecialOrthogonal

class KMeansOnManifold(object):

   def __init__(self, num_samples: int, num_clusters: int, random_gen: AnyStr):
     # Step 1: Initialize the manifold
     self.hypersphere = Hypersphere(dim=2, equip=True)

     # Step 2: Generate a single cluster with random data points on the hypersphere
     match random_gen:
        case 'random_von_mises_fisher':
             # Select a pivot or mean value
             _mu = self.hypersphere.random_uniform(n_samples=1)
             # Generate the cluster
             cluster = self.hypersphere.random_von_mises_fisher(
                      mu=_mu[0],
                      kappa=60,
                      n_samples=num_samples,
                      max_iter=200)

        case 'random_riemann_normal':
             cluster = self.hypersphere.random_riemannian_normal(n_samples=num_samples, max_iter=300)

        case 'random_uniform':
            cluster = self.hypersphere.random_uniform(n_samples=num_samples)

        case 'constrained_random_uniform':
            # Generate random values with constraints on each dimension
            y = [x for x in self.hypersphere.random_uniform(n_samples=100000)
                 if x[0] <= -0.35 and x[1] >= 0.3 and x[2] <= -0.40]
            cluster = np.array(y)[0:num_samples]

        case _:
            raise ValueError(f'{random_gen} generator is not supported')

     # Step 3: Generate the other clusters by applying random SO(3) rotations to the template
     so3_lie_group = SpecialOrthogonal(3, equip=False)
     self.clusters = [cluster @ so3_lie_group.random_uniform() for _ in range(num_clusters)]



Data in Euclidean space

The data class KMeansResult encapsulates the output (centroid and label) of training the k-means algorithm on the synthetic clustered data.

from dataclasses import dataclass

@dataclass
class KMeansResult:
    center: np.array    # Centroid of the cluster
    label: np.array     # Index/label assigned to the cluster

We rely on the k-means implementation of the scikit-learn library (class KMeans) [ref 10] to identify the clusters in Euclidean space, selecting the elkan algorithm and the k-means++ initialization.

def euclidean_clustering(self) -> List[KMeansResult]:
   from sklearn.cluster import KMeans

   kmeans = KMeans(
       n_clusters=len(self.clusters), 
       init='k-means++', 
       algorithm='elkan',
       max_iter=140)
  
   # Create a data set from points in clusters
   data = np.concatenate(self.clusters, axis=0)
   kmeans.fit(data)

   # Extract centroids and labels
   centers = kmeans.cluster_centers_
   labels = kmeans.labels_

   return [KMeansResult(center, label) for center, label in zip(centers, labels)]

Output:
Cluster Center: [ 0.56035023 -0.4030522   0.70054776], Label: 0
Cluster Center: [-0.1997325  -0.38496744  0.8826764 ], Label: 2
Cluster Center: [0.04443849 0.86749237 0.46118632], Label: 3
Cluster Center: [-0.83876485 -0.45621187  0.23570083], Label: 1

Clearly, k-means was not able to identify the proper clusters.

Data on hypersphere

The method to train k-means on the hypersphere uses the same semantics as its scikit-learn counterpart. It leverages the Geomstats RiemannianKMeans class.

def riemannian_clustering(self) -> List[KMeansResult]:
    from geomstats.learning.kmeans import RiemannianKMeans

    # Invoke the Geomstats Riemannian k-means
    kmeans = RiemannianKMeans(space=self.hypersphere, n_clusters=len(self.clusters))

    # Build the data set from the clustered data points
    data = gs.concatenate(self.clusters, axis=0)

    kmeans.fit(data)

    # Extract centroids and labels
    centers = kmeans.centroids_
    labels = kmeans.labels_

    return [KMeansResult(center, label) for center, label in zip(centers, labels)]


Similar to k-means in Euclidean space, we identify the centroids for 4 clusters using 500 randomly generated samples.

num_samples = 500
num_clusters = 4
kmeans = KMeansOnManifold(num_samples, num_clusters, 'random_von_mises_fisher')
kmeans_result = kmeans.riemannian_clustering()


Output:
500 random samples on 4 clusters with von Mises-Fisher distribution
Cluster Center: [ 0.17772496 -0.36363422  0.91443097], Label: 2
Cluster Center: [ 0.44403679  0.06735507 -0.89347335], Label: 0
Cluster Center: [ 0.85407911 -0.50905801  0.10681211], Label: 3
Cluster Center: [ 0.90899637  0.02635062 -0.41597025], Label: 1

500 random samples on 4 clusters with constrained uniform distribution
Cluster Center: [-0.05344069 -0.91613807  0.3972847 ], Label: 1
Cluster Center: [ 0.6796575   0.39400079 -0.61873181], Label: 2
Cluster Center: [ 0.51799972 -0.67116261 -0.530299 ], Label: 0
Cluster Center: [ 0.49290501 -0.45790221 -0.73984473], Label: 3

Note: The labels are arbitrary indices assigned to each cluster for the purpose of visualization and validation against true labels.
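
Since these indices are arbitrary, validating the clustering against true labels requires matching predicted indices to true ones. Here is a minimal sketch of one common approach, the Hungarian algorithm from SciPy; the helper align_labels is hypothetical and assumes the true index of each sample is known from the SO(3) replication step:

import numpy as np
from scipy.optimize import linear_sum_assignment

def align_labels(true_labels: np.array, pred_labels: np.array) -> np.array:
    # Confusion matrix between true and predicted cluster indices
    n = int(max(true_labels.max(), pred_labels.max())) + 1
    cost = np.zeros((n, n), dtype=int)
    for t, p in zip(true_labels, pred_labels):
        cost[t, p] += 1
    # Permutation of the predicted indices that maximizes agreement
    row_ind, col_ind = linear_sum_assignment(-cost)
    remap = np.empty(n, dtype=int)
    remap[col_ind] = row_ind
    return remap[pred_labels]

# Accuracy after alignment:
# np.mean(align_labels(true_labels, predicted_labels) == true_labels)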

References



--------------------------------------
Patrick Nicolas has over 25 years of experience in software and data engineering, architecture design and end-to-end deployment and support with extensive knowledge in machine learning. 
He has been director of data engineering at Aideo Technologies since 2017 and he is the author of "Scala for Machine Learning", Packt Publishing ISBN 978-1-78712-238-3

Appendix

Here is the list of published articles related to geometric learning:
Geometric Learning in Python: Basics introduced differential geometry as applied to machine learning and its basic components.
Geometric Learning in Python: Manifolds described manifold components such as tangent vectors and geodesics, with implementations in Python for the hypersphere using the Geomstats library.
Geometric Learning in Python: Intrinsic Representation reviewed the various coordinate systems using extrinsic and intrinsic representations.
Geometric Learning in Python: Vector and Covector fields described vector and covector fields with Python implementations in 2- and 3-dimensional spaces.
Geometric Learning in Python: Vector Operators illustrated the differential operators gradient, divergence, curl, and Laplacian using the SymPy library.
Geometric Learning in Python: Functional Data Analysis described the key elements of non-linear functional data analysis to analyze curves, images, or functions in very high-dimensional spaces.
Geometric Learning in Python: Riemann Metric & Connection reviewed the Riemannian metric tensor, Levi-Civita connection, and parallel transport for the hypersphere.
Riemann Curvature in Python described the intricacies of the Riemannian curvature tensor and its implementation in Python using the Geomstats library.

#geometriclearning #riemanngeometry #manifold #ai #python #geomstats #Liegroups #kmeans

 

Thursday, April 18, 2024

Riemann Curvature in Python

Target audience: Advanced
Estimated reading time: 7'

Beyond theoretical physics, the Riemann curvature tensor is essential for understanding motion planning in robotics, object recognition in computer vision, and the visualization and training of physics-informed neural networks.
This article describes the curvature tensor, its derivative and its implementation in Python.


Table of contents
       Curvature tensor

What you will learn: The intricacies of Riemannian metric curvature tensor and its implementation in Python using Geomstats library.

Notes

  • Environments: Python 3.10.10, Geomstats 2.7.0
  • This article assumes that the reader is somewhat familiar with differential and tensor calculus [ref 1, 2]. Please refer to the previous articles related to geometric learning [ref 4, 5, 6, 7].
  • Source code is available at Github.com/patnicolas/Data_Exploration/manifolds
  • To enhance the readability of the algorithm implementations, we have omitted non-essential code elements like error checking, comments, exceptions, validation of class and method arguments, scoping qualifiers, and import statements.


Introduction

Geometric learning tackles the challenges posed by scarce data, high-dimensional spaces, and the requirement for independent representations in the creation of advanced machine learning models. The fundamental aim of studying Riemannian geometry is to comprehend and scrutinize the characteristics of curved spaces, which are not sufficiently explained by Euclidean geometry alone. 

In this article, we delve into the Riemann curvature tensors, which are based on parallel transport and the covariant derivative discussed in earlier articles.

This article is the eighth part of our ongoing series focused on geometric learning. In this installment, we utilize the Geomstats Python library [ref 3] and explore the hypersphere manifold, which was introduced in a previous piece, Geometric Learning in Python: Manifolds - Hypersphere, and is detailed in the Geomstats API [ref 8].

I highly recommend watching the comprehensive series of 22 YouTube videos, Tensor Calculus - Eigenchris, to familiarize yourself with the fundamental concepts of differential geometry. Summaries of my earlier articles on this topic can be found in the Appendix.

Riemann curvature

Curvature tensor

Curvature measures the extent to which a geometric object like a curve strays from being straight or how a surface diverges from being flat. It can also be defined as a vector that incorporates both the direction and the magnitude of the curve's deviation. 

The Riemann curvature tensor is a widely used method for quantifying the curvature of Riemannian manifolds. It is calculated from the metric tensor and provides a tensor field at each point on the manifold.

Given a Riemann manifold M with a Levi-Civita connection and a tensor metric g, the Riemann curvature tensor related to vector fields X, Y and Z can be written:
\[\begin{matrix} R: \mathbf{\Omega (M)}\ \times  \ \mathbf{\Omega (M)} \ \times  \ \mathbf{\Omega (M)} \rightarrow \mathbf{\Omega (M)} \ \ \ \ \ \ \ \ \ \ \ \\ X, Y, Z \mapsto R(X, Y)Z \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \\ R(X, Y)Z= \triangledown_{X} \triangledown_{Y}Z-\triangledown_{Y}\triangledown_{X}Z - \triangledown_{[X, Y]}Z \\ \left [ X, Y \right ]^{j} = X^{i}\frac{\partial Y^{j}}{\partial x^{i}} - Y^{i}\frac{\partial X^{j}}{\partial x^{i}} \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \end{matrix}\] where [X, Y] is the Lie bracket and Ω(M) denotes the space of vector fields on the manifold M.
The Riemann curvature tensor can be written using the Christoffel symbols as: \[R_{jkl}^{i}=\frac{\partial \Gamma _{jl}^{i} } {\partial x^k} - \frac{\partial \Gamma _{jk}^{i} } {\partial x^l} + \Gamma_{rk}^{i} \Gamma_{jl}^{r} - \Gamma_{rl}^{i} \Gamma_{jk}^{r}\] This formula is complex and generates a large number of components for the curvature tensor (see the SymPy sketch after the note below). As the curvature has 4 indices, the number of curvature components is dim^4:
  • For a 2-dimensional manifold (i.e., plane, sphere): 2^4 = 16 components
  • For a 3-dimensional manifold: 3^4 = 81 components
  • For the 4-dimensional space-time manifold: 4^4 = 256 components

In the case of the Levi-Civita connection, the properties of torsion-freeness and metric compatibility produce symmetries that zero out many of these components. These symmetries are known as the 1-2 indices symmetry, the Bianchi identity, the 3-4 indices symmetry, and the flip symmetry [ref 9].

Note: As expected, all 16 components of the curvature tensor in 2 dimensions are 0 for a flat space.
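
To make the Christoffel-symbol formula above concrete, here is a small SymPy sketch, independent of Geomstats, that computes curvature components directly from a metric. Applied to the intrinsic metric of the unit sphere, it recovers the single non-zero component discussed in the implementation section below; replacing the metric with the identity matrix yields all-zero components, consistent with the note above.

import sympy as sp

u, v = sp.symbols('u v')
coords = [u, v]
n = 2
# Intrinsic metric of the unit 2-sphere: ds^2 = du^2 + sin(u)^2 dv^2
g = sp.Matrix([[1, 0], [0, sp.sin(u) ** 2]])
g_inv = g.inv()

# Christoffel symbols: Gamma^i_jk = 1/2 g^ir (d_j g_rk + d_k g_rj - d_r g_jk)
gamma = [[[sp.simplify(sum(g_inv[i, r] * (sp.diff(g[r, k], coords[j])
                                          + sp.diff(g[r, j], coords[k])
                                          - sp.diff(g[j, k], coords[r]))
                           for r in range(n)) / 2)
           for k in range(n)]
          for j in range(n)]
         for i in range(n)]

# Curvature components: R^i_jkl = d_k Gamma^i_jl - d_l Gamma^i_jk
#                                + Gamma^i_rk Gamma^r_jl - Gamma^i_rl Gamma^r_jk
def riemann(i: int, j: int, k: int, l: int):
    return sp.simplify(sp.diff(gamma[i][j][l], coords[k])
                       - sp.diff(gamma[i][j][k], coords[l])
                       + sum(gamma[i][r][k] * gamma[r][j][l]
                             - gamma[i][r][l] * gamma[r][j][k] for r in range(n)))

print(riemann(0, 1, 0, 1))  # sin(u)**2, i.e. R^1_212 in 1-based index notation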

Geodesic deviation

A key challenge is determining whether geodesics in a curved space converge or diverge, a concept known as geodesic deviation, as depicted in the accompanying diagram.

Fig 1 Illustration of divergence of geodesics

The convergence or divergence of geodesics is computed using the Riemann curvature tensor established in the previous section, more specifically the sign of the inner product of R(X, Y)Y with the deviation vector X orthogonal to the tangent vector Y along the geodesic:\[\left \langle R(X, Y)Y, X \right \rangle \ \ \ > 0 \ convergence, \ \ \ <0 \ divergence\]The geodesic deviation depends on the selection of the tangent vectors. This dependency is eliminated through a normalization known as the sectional curvature.
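
Here is a minimal Geomstats sketch of this test on the unit sphere; the tangent vectors are arbitrary illustrative values, projected onto the tangent space with Hypersphere.to_tangent, and we assume Geomstats follows the sign convention for R(X, Y)Z stated in the previous section:

import numpy as np
from geomstats.geometry.hypersphere import Hypersphere

sphere = Hypersphere(dim=2, equip=True)
base_pt = sphere.random_uniform(n_samples=1)[0]

# Project two arbitrary vectors onto the tangent space at the base point
X = sphere.to_tangent(np.array([0.4, 0.1, 0.8]), base_pt)
Y = sphere.to_tangent(np.array([0.5, 0.1, -0.2]), base_pt)

# Sign of <R(X, Y)Y, X>: positive on the sphere, where geodesics converge
r_xyy = sphere.metric.curvature(X, Y, Y, base_pt)
sign = sphere.metric.inner_product(r_xyy, X, base_pt)
print('converging' if sign > 0 else 'diverging')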

Sectional curvature tensor

The curvature tensor is undoubtedly a complex concept accompanied by intimidating formulas. However, a simpler expression of the curvature tensor can be obtained through sectional curvature. Essentially, sectional curvature refers to the curvature of two-dimensional sections of our manifold.

As outlined in the previous section, it is necessary to normalize the geodesic deviation to eliminate any dependency on the basis vectors of the manifold.
Given a point p ∈ M, a tangent 2-plane at p spanned by tangent vectors X and Y, and a curvature tensor R, the sectional curvature K(X, Y) is the real number\[K(X, Y)= \frac{\left \langle R(X,Y)Y, X \right \rangle}{\left \langle X, X \right \rangle \left \langle Y, Y \right \rangle - \left \langle X, Y \right \rangle^{2}}\]Note: The sectional curvature is a prerequisite to the computation of the Ricci tensor, which will be described in a future article.
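
The normalization can be checked numerically on the unit sphere, whose sectional curvature is 1 everywhere under the convention above. The following sketch, with arbitrary illustrative tangent vectors, computes K(X, Y) from the formula and cross-checks it against the sectional_curvature method of the Geomstats RiemannianMetric class:

import numpy as np
from geomstats.geometry.hypersphere import Hypersphere

sphere = Hypersphere(dim=2, equip=True)
metric = sphere.metric
base_pt = sphere.random_uniform(n_samples=1)[0]
X = sphere.to_tangent(np.array([0.4, 0.1, 0.8]), base_pt)
Y = sphere.to_tangent(np.array([0.5, 0.1, -0.2]), base_pt)

# Numerator <R(X, Y)Y, X> and Gram determinant denominator
num = metric.inner_product(metric.curvature(X, Y, Y, base_pt), X, base_pt)
den = (metric.inner_product(X, X, base_pt) * metric.inner_product(Y, Y, base_pt)
       - metric.inner_product(X, Y, base_pt) ** 2)

print(num / den)                                  # Expected ~1.0 for the unit sphere
print(metric.sectional_curvature(X, Y, base_pt))  # Cross-check with Geomstats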

Implementation

As in the previous article on Riemann geometry, we rely on the Geomstats library [ref 3] to evaluate the curvature tensor.

Hypersphere curvature

For a hypersphere (unit sphere), the Riemann curvature tensor can be expressed as: \[R_{ijkl}=g_{ik}g_{jl} - g_{il}g_{jk} \ \ \ \  \ R_{ikl}^{j} = \frac{R_{ijkl}}{g_{ij}}\]After taking into account the symmetry properties of the Levi-Civita connection, only one of the 16 components of the curvature tensor is non-zero. For the intrinsic coordinates u, v: \[R_{212}^{1}=\sin^{2}(u)\]The computation of the curvature tensor is implemented by the method curvature_tensor, added to the class RiemannianConnection. It leverages the curvature method of the Geomstats class RiemannianMetric.
The computation requires 3 tangent unit vectors and a base point on the manifold.

def curvature_tensor(self, tgt_vectors: List[np.array], base_pt: np.array) -> np.array:
  
  if len(tgt_vectors) != 3 or any(vec is None for vec in tgt_vectors):
     raise GeometricException(f'Tangent vectors for the curvature {str(tgt_vectors)} are not properly defined')

  return self.riemannian_metric.curvature(
            tgt_vectors[0], 
            tgt_vectors[1], 
            tgt_vectors[2], 
            base_pt)

Let's apply this method to the computation of the Riemann curvature for a sphere.

hypersphere = Hypersphere(dim=2, equip=True, default_coords_type='extrinsic')
riemann_connection = RiemannianConnection(hypersphere, 'HyperSphere')
base_pt = np.array([1.5, 2.0, 1.6])

X = np.array([0.4, 0.1, 0.8])
Y = np.array([0.5, 0.1, -0.2])
Z = np.array([0.4, 0.9, 0.0])

curvature = riemann_connection.curvature_tensor([X, Y, Z], base_pt)
print(f'Curvature: {curvature}')

Output:
Curvature: [ 0.009 -0.004 -0.282]

Sectional curvature of hypersphere

The method sectional_curvature_tensor added to the class RiemannianConnection implements the computation of the sectional curvature tensor that requires two tangent unit vectors and a base point on the sphere. It invokes the sectional_curvature method of the Geomstats class RiemannianMetric.

def sectional_curvature_tensor(self,
                               tgt_vec1: np.array,
                               tgt_vec2: np.array,
                               base_pt: Optional[np.array] = None) -> np.array:

   if len(tgt_vec1) != len(tgt_vec2):
      raise GeometricException(f'Dimension of tangent vectors for sectional curvature {len(tgt_vec1)} '
                               f'and {len(tgt_vec2)} should be identical')

   return self.riemannian_metric.sectional_curvature(tgt_vec1, tgt_vec2, base_pt)

Our evaluation has 3 test cases for the tangent vectors:
  • V(x, y, z) & V(x, y, -z)
  • V(x, y, z) & -V(x, y, z)
  • V(x, y, z) & 2V(x, y, z)
hypersphere = Hypersphere(dim=2, equip=True, default_coords_type='intrinsic')
riemann_connection = RiemannianConnection(hypersphere, 'HyperSphere')
base_pt = np.array([1.5, 2.0, 1.6])

# Case 1 (X, Y=X with inverse on Z component)
X = np.array([0.4, 0.1, 0.8])
Y = np.array([0.4, 0.1, -0.8])
sec_curvature = riemann_connection.sectional_curvature_tensor(X, Y, base_pt)
print(f'Sectional curvature 1: {sec_curvature}')

# Case 2: (X, Y=-X)
X = np.array([0.4, 0.1, 0.8])
Y = np.array([-0.4, -0.1, -0.8])
sec_curvature = riemann_connection.sectional_curvature_tensor(X, Y, base_pt)
print(f'Sectional curvature 2: {sec_curvature}')

# Case 3:  (X, Y = 2.X)
X = np.array([0.4, 0.1, 0.8])
Y = np.array([0.8, 0.2, 1.6])
sec_curvature = riemann_connection.sectional_curvature_tensor(X, Y, base_pt)
print(f'Sectional curvature 3: {sec_curvature}')

Output:
Sectional curvature 1: -1.0
Sectional curvature 2: 0.0
Sectional curvature 3: 0.0

References


-------------
Patrick Nicolas has over 25 years of experience in software and data engineering, architecture design and end-to-end deployment and support with extensive knowledge in machine learning. 
He has been director of data engineering at Aideo Technologies since 2017 and he is the author of "Scala for Machine Learning", Packt Publishing ISBN 978-1-78712-238-3

Appendix

Here is the list of published articles related to geometric learning:
Geometric Learning in Python: Basics introduced differential geometry as applied to machine learning and its basic components.
Geometric Learning in Python: Manifolds described manifold components such as tangent vectors and geodesics, with implementations in Python for the hypersphere using the Geomstats library.
Geometric Learning in Python: Intrinsic Representation reviewed the various coordinate systems using extrinsic and intrinsic representations.
Geometric Learning in Python: Vector and Covector fields described vector and covector fields with Python implementations in 2- and 3-dimensional spaces.
Geometric Learning in Python: Vector Operators illustrated the differential operators gradient, divergence, curl, and Laplacian using the SymPy library.
Geometric Learning in Python: Functional Data Analysis described the key elements of non-linear functional data analysis to analyze curves, images, or functions in very high-dimensional spaces.
Geometric Learning in Python: Riemann Metric & Connection reviewed the Riemannian metric tensor, Levi-Civita connection, and parallel transport for the hypersphere.

#geometriclearning #riemanngeometry #manifold #differentialgeometry #ai #python #geomstats