One a Day One Liners with Python — Week 3

Matrix and Vector Operations

Jeremy Brown
Python in Plain English


Photo by Vlado Paunovic on Unsplash

I began my journey into the field of Machine Learning around 2013. Having little background in mathematics, I quickly realized that I had a lot of foundational knowledge to acquire. At the top of my list of things to learn were linear algebra and calculus.

“Coding the Matrix” from Brown University and a Dover book called “Matrices and Linear Algebra” were my two main resources. The class was brutally challenging, but very rewarding. In it, we used Python to solve a wide variety of linear algebra problems. Everything was coded from scratch, and we didn’t use NumPy at all!

As an homage to those formative days, the theme for this week’s One Liners is Matrix and Vector operations.

Jan 21, 2023

Calculate the Frobenius norm of a matrix √

from math import sqrt

a = [
    [0, 1, 2, 3],
    [4, 5, 6, 7]
]

f_norm = sqrt(sum([sum([j**2 for j in i]) for i in a]))

Discussion

The Frobenius norm is the square root of the sum of the squares of every entry in the matrix, which is the same as the Euclidean norm of the matrix flattened into one long vector. And with that, our final post on Matrix and Vector operations! Maybe we’ll revisit the topic in another week. It’s quite a rich area to explore and lends itself nicely to the One Liner lexicon.
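If you’d like to sanity-check the one-liner, NumPy’s numpy.linalg.norm returns the Frobenius norm of a 2-D array by default. A minimal check, assuming NumPy is installed (it isn’t used anywhere in the original one-liners):

import numpy as np
from math import sqrt, isclose

a = [
    [0, 1, 2, 3],
    [4, 5, 6, 7]
]

f_norm = sqrt(sum([sum([j**2 for j in i]) for i in a]))

# For a 2-D array, np.linalg.norm defaults to the Frobenius norm
assert isclose(f_norm, np.linalg.norm(np.array(a)))
print(f_norm)  # sqrt(140), roughly 11.83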


Jan 20, 2023

Calculate the cosine similarity of two vectors ← ↑ →

from math import sqrt
a = [1, 3, -5]
b = [4, -2, -1]
sim = sum([x*y for x,y in zip(a,b)]) / (sqrt(sum([i**2 for i in a])) * (sqrt(sum([i**2 for i in b]))))

Discussion

Cosine similarity measures how similar the directions of two vectors are, without regard for their magnitudes.

Following from our matrix multiplication One Liner from a few days ago, we take the dot product of the two vectors and divide it by the product of their magnitudes. This calculation returns a value on the interval [-1, 1], where -1 indicates opposite directions, 0 indicates orthogonality, and 1 indicates that the angles of the vectors are equal.
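A quick worked check of the arithmetic for the vectors above (my own addition, not part of the original post):

from math import isclose, sqrt

a = [1, 3, -5]
b = [4, -2, -1]
sim = sum([x*y for x,y in zip(a,b)]) / (sqrt(sum([i**2 for i in a])) * sqrt(sum([i**2 for i in b])))

# Dot product: 1*4 + 3*(-2) + (-5)*(-1) = 3
# Magnitudes: sqrt(35) and sqrt(21)
assert isclose(sim, 3 / sqrt(35 * 21))
print(round(sim, 4))  # 0.1107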

As a bonus, you can easily calculate the angle (in radians) between the vectors using the arccosine of the similarity.

from math import acos
theta = acos(sim)
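Continuing from the snippet above, math.degrees converts the angle if radians feel less intuitive (my addition):

from math import degrees
print(degrees(theta))  # roughly 83.65 degrees for the vectors above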


Jan 19, 2023

Compute the direct sum of two matrices ⊕

a = [
    [1, 1, 1, 1],
    [2, 2, 2, 2]
]

b = [
    [3, 3, 3, 3, 3, 3],
    [4, 4, 4, 4, 4, 4]
]

ds = [row+[0]*len(b[0]) for row in a] + [[0]*len(a[0])+row for row in b]

Discussion

The direct sum of two matrices A and B is the block diagonal matrix with A in the upper-left block, B in the lower-right block, and zeros in the remaining blocks. The result of this routine looks like this:

[
    [1, 1, 1, 1, 0, 0, 0, 0, 0, 0],
    [2, 2, 2, 2, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 3, 3, 3, 3, 3, 3],
    [0, 0, 0, 0, 4, 4, 4, 4, 4, 4]
]

On the left side of this One Liner, we effectively extend the rows of a with len(b[0]) zeros. On the right side, we do the opposite: we fill the first len(a[0]) indices of each row with zeros and then concatenate the corresponding row of b. Then we concatenate both halves to create the new matrix.

Notice this little trick: [0] * len(b[0]). This creates a list of len(b[0]) zeros. Originally, I used another list comprehension to build the list of zeros, but this syntactic sugar saves us a few keystrokes.
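If SciPy happens to be available, scipy.linalg.block_diag builds the same block diagonal structure and makes a handy cross-check (my addition; the original deliberately sticks to the standard library):

import numpy as np
from scipy.linalg import block_diag

a = [
    [1, 1, 1, 1],
    [2, 2, 2, 2]
]

b = [
    [3, 3, 3, 3, 3, 3],
    [4, 4, 4, 4, 4, 4]
]

ds = [row+[0]*len(b[0]) for row in a] + [[0]*len(a[0])+row for row in b]

# block_diag places a and b on the diagonal and fills the off-diagonal blocks with zeros
assert (np.array(ds) == block_diag(a, b)).all()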

Jan 18, 2023

Partition a matrix into a sub-matrix 🍕

a = [
    [0, 1, 2, 3, 4, 5],
    [5, 4, 3, 2, 1, 0],
    [2, 4, 6, 8, 10, 12],
    [3, 6, 9, 12, 15, 18]
]

x, y = [0, 2], [1, 4]
sub = [m[y[0]:y[1]] for m in a[x[0]:x[1]]]

Discussion

Occasionally, you might need to take a slice out of a matrix. The variables x and y hold the start and end indices of the row and column slices, respectively. In this case, the partition, or “slice”, is:

[
    [1, 2, 3],
    [4, 3, 2]
]
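For comparison, NumPy can slice rows and columns in a single expression, which makes for a handy cross-check (my addition, assuming NumPy; the one-liner itself needs nothing beyond the standard library):

import numpy as np

a = [
    [0, 1, 2, 3, 4, 5],
    [5, 4, 3, 2, 1, 0],
    [2, 4, 6, 8, 10, 12],
    [3, 6, 9, 12, 15, 18]
]

x, y = [0, 2], [1, 4]
sub = [m[y[0]:y[1]] for m in a[x[0]:x[1]]]

# NumPy's 2-D indexing slices rows and columns at once
assert (np.array(a)[x[0]:x[1], y[0]:y[1]] == np.array(sub)).all()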

Jan 17, 2023

Multiply two matrices 🅰️✖🅱️

a = [
    [1, 2, 3],
    [4, 5, 6]
]

b = [
    [7, 8, 0],
    [9, 10, 0],
    [11, 12, 0]
]

result = [[sum([x*y for x,y in zip(row, col)]) for col in zip(*b)] for row in a]

Discussion

This One Liner is quite clever. IMO, it’s easier to read it from right to left. Doing so, we see that to start, we’re simply iterating over the rows of matrix a. Then we iterate over the columns of b. This is where it starts to get interesting. The zip function returns an iterator over tuples of elements that sit at the same index in different iterables. However, we can’t just call zip directly on b. Instead, we must “unpack” b first, and that is what the asterisk does. Doing so gives us the columns of b.

Next we take the dot product of each row of a and column of b. Once again, the zip function proves quite useful. Zipping a row and column gives us pairs of values, which we then multiply and finally sum.
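For the matrices above, the one-liner produces [[58, 64, 0], [139, 154, 0]]. A quick cross-check against NumPy’s matrix multiplication operator (my addition, not part of the original):

import numpy as np

a = [
    [1, 2, 3],
    [4, 5, 6]
]

b = [
    [7, 8, 0],
    [9, 10, 0],
    [11, 12, 0]
]

result = [[sum([x*y for x,y in zip(row, col)]) for col in zip(*b)] for row in a]

# The @ operator performs matrix multiplication on NumPy arrays
assert (np.array(result) == np.array(a) @ np.array(b)).all()
print(result)  # [[58, 64, 0], [139, 154, 0]]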

Jan 16, 2023

Find the transpose of a square matrix 🙃

matrix = [
    [0, 1, 2],
    [3, 4, 5],
    [6, 7, 8]
]

transpose = [[matrix[n][m] for n in range(len(i))] for m, i in enumerate(matrix)]

Discussion

The transpose can be thought of as an operation that flips a matrix over its diagonal. The tricky part in this One Liner is where we flip the indices m and n around to access elements of the original matrix. A shorter zip-based alternative is sketched after the result below.

Here is the result of this operation on the matrix above:

[
    [0, 3, 6],
    [1, 4, 7],
    [2, 5, 8]
]
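As an aside, the same unpacking trick we used to get the columns of b in the matrix multiplication One Liner also gives a shorter transpose. This is an alternative of my own, not the original solution:

matrix = [
    [0, 1, 2],
    [3, 4, 5],
    [6, 7, 8]
]

# zip(*matrix) yields the columns of matrix as tuples; list() turns each back into a list
transpose = [list(col) for col in zip(*matrix)]
print(transpose)  # [[0, 3, 6], [1, 4, 7], [2, 5, 8]]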

Jan 15, 2023

Create an identity matrix 🪪

n = 5
matrix = [[1 if i == j else 0 for j in range(n)] for i in range(n)]

Discussion

Yet another nested list comprehension! An identity matrix is an n by n matrix with ones on the diagonal and zeros elsewhere. It is the matrix analog of the multiplicative identity 1. That is, any square matrix A multiplied by an identity matrix of the same shape is equal to A. Additionally, a square matrix multiplied by its inverse is equal to the identity matrix.
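Tying this back to the matrix multiplication One Liner from Jan 17, here is a quick check (my addition) that multiplying a square matrix by the identity gives the matrix back:

n = 3
identity = [[1 if i == j else 0 for j in range(n)] for i in range(n)]

a = [
    [0, 1, 2],
    [3, 4, 5],
    [6, 7, 8]
]

# Reuse the multiplication one-liner: a times the identity should equal a
product = [[sum([x*y for x,y in zip(row, col)]) for col in zip(*identity)] for row in a]
assert product == a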

Join in…

Feel free to leave comments here or clone the repo on GitHub and make a pull request if you think you’ve got a better solution. Benchmarks are welcome too!
