
Large Scale Correlation Using Matrix Operations

9 Mar, 2015

Computing the correlation between two vectors is a standard operation, and the functionality is included in statistical packages spanning, I would venture, all common programming languages.

The basic idea: given two vectors v1 and v2, what is the mutual relationship between the two? A classic measure of “mutual relationship” is Pearson’s correlation coefficient, which is simply the covariance of the two vectors divided by the product of their standard deviations: \frac{cov(v1,v2)}{\sigma_{v1} \times \sigma_{v2}}.

The problem can also be translated into a set of vector operations: computing the mean of each vector, the Sum of Squares (SSQ) of each vector, and the Sum of Products (SP) between the two vectors. (If readers are interested, we can dig deeper here.)

[Figure: Pearson’s correlation rewritten in terms of the means, SSQs, and the Sum of Products]
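For the curious, that decomposition is the standard identity (for vectors of length m):

r = \frac{SP(v1,v2) - m \, \bar{v1} \, \bar{v2}}{\sqrt{\left(SSQ(v1) - m \, \bar{v1}^2\right)\left(SSQ(v2) - m \, \bar{v2}^2\right)}}

which involves only sums, products, and the two means.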

But what if you have an n \times m matrix M? Can you efficiently compute the full n \times n correlation matrix of its rows?

First off, let’s define a few helpers:

  1. the Sum of Products matrix: SP = MM^{T}; and
  2. the diagonal of SP as a vector.

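Both helpers are one-liners in NumPy. As a sketch (the names M, SP, and d are mine, and M is assumed to hold one vector per row):

    import numpy as np

    M = np.random.randn(10, 1000)   # example n x m matrix, one vector per row

    SP = M @ M.T       # helper 1: the Sum of Products matrix, n x n
    d = np.diag(SP)    # helper 2: the diagonal of SP, as a length-n vector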

Next, using the Sum of Products matrix SP, an n-column version of its diagonal (DiExt), and its transpose (DiExt^{T}), we can compute all of the pairwise Pearson’s correlation coefficients at once.

Pearson’s correlation = \frac{SP}{\sqrt{DiExt \circ DiExt^{T}}}, where the product \circ, the square root, and the division are all taken elementwise.
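Continuing the NumPy sketch, DiExt is just d tiled across n columns, and np.outer(d, d) computes DiExt \circ DiExt^{T} in a single call:

    n = SP.shape[0]
    DiExt = np.tile(d[:, None], (1, n))     # diagonal repeated across n columns
    pcorr = SP / np.sqrt(DiExt * DiExt.T)   # elementwise multiply, sqrt, divide
    # equivalent shortcut: pcorr = SP / np.sqrt(np.outer(d, d))
    # Note: this equals Pearson's correlation only when each row of M has been
    # mean-centered first; otherwise it is the cosine-similarity matrix.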


Here’s a Python function that returns that matrix:
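A minimal NumPy version might look like the following (the name row_corr and the explicit row-centering step, which the Pearson formula requires, are choices of this sketch rather than details taken from the original snippet):

    import numpy as np

    def row_corr(M):
        """Return the n x n Pearson correlation matrix of the rows
        of an n x m matrix M, using only matrix operations."""
        # Center each row so the Sum of Products becomes a (scaled) covariance.
        Mc = M - M.mean(axis=1, keepdims=True)
        SP = Mc @ Mc.T                        # Sum of Products matrix
        d = np.diag(SP)                       # its diagonal, as a vector
        return SP / np.sqrt(np.outer(d, d))   # elementwise normalization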

And now for some sample code using that function:
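A driver along these lines reproduces the two experiments reported below (the seed and the printed summary are my own additions):

    import numpy as np

    rng = np.random.default_rng(42)

    for n in (10, 100):
        M = rng.standard_normal((n, 1000))   # n random normal vectors of length 1000
        C = row_corr(M)
        off_diag = C[~np.eye(n, dtype=bool)]  # everything except the diagonal
        print(n, off_diag.mean(), np.abs(off_diag).max())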

So here are some results:

1. 10 vectors of length 1000:

[Figure: resulting 10 \times 10 correlation matrix]

2. 100 vectors of length 1000:

[Figure: resulting 100 \times 100 correlation matrix]

As expected with random normal draws, each vector shows low correlation with every other vector (and, trivially, perfect correlation with itself).
