
I'm taking a numerical linear algebra course and I've chosen Python as my language of choice (I want to be employable). Is there a way to evaluate (AB)C vs A(BC), where A, B, C are conformable matrices? I want to check CPU time and operation count for each. In addition, is there a way to force Python to calculate AB as a sum of outer products, and also as a matrix whose entries are the inner products of the rows of A with the columns of B? I'm new to Python and haven't had any luck with a Google search, which is rare for me. I'm using Python 3.5, which uses @ for matrix multiplication. I searched for resources where numerical linear algebra is done using Python but haven't found anything useful. Thanks for your help.

1 Answer


The answer to the first question is trivial: write (A @ B) @ C versus A @ (B @ C). CPython is smart enough to know that it is too dumb to try to rewrite expressions. The language definition does not require that the special methods (such as __add__ for +) have any particular algebraic properties. (In fact, float addition is not associative.)

>>> from dis import dis
>>> dis('(a + b) + c')
  1           0 LOAD_NAME                0 (a)
              3 LOAD_NAME                1 (b)
              6 BINARY_ADD
              7 LOAD_NAME                2 (c)
             10 BINARY_ADD
             11 RETURN_VALUE
>>> dis('a + (b + c)')
  1           0 LOAD_NAME                0 (a)
              3 LOAD_NAME                1 (b)
              6 LOAD_NAME                2 (c)
              9 BINARY_ADD
             10 BINARY_ADD
             11 RETURN_VALUE

Replace + with @ and the output is the same, with BINARY_ADD changed to BINARY_MATRIX_MULTIPLY.
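To actually compare the two orders, a rough timing sketch (assuming NumPy is installed; the shapes below are arbitrary and chosen so the two parenthesizations do very different amounts of work) could look like this:

import timeit
import numpy as np

# A is (m, n), B is (n, p), C is (p, q)
m, n, p, q = 1000, 1000, 10, 1000
A = np.random.rand(m, n)
B = np.random.rand(n, p)
C = np.random.rand(p, q)

# Rough operation counts:
#   (A @ B) @ C costs about 2*m*n*p + 2*m*p*q flops
#   A @ (B @ C) costs about 2*n*p*q + 2*m*n*q flops
t1 = timeit.timeit(lambda: (A @ B) @ C, number=10)
t2 = timeit.timeit(lambda: A @ (B @ C), number=10)
print("(A @ B) @ C: {:.3f} s".format(t1))
print("A @ (B @ C): {:.3f} s".format(t2))

With these shapes the first order needs roughly 4e7 flops per product chain and the second roughly 2e9, so the timing difference should be obvious.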

I do not understand the details of the second question, but you should be able to define your own Matrix class with a __matmul__ method that does what you want. It could either be 'pure' Python or built on numpy.
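For the 'pure' Python route, a minimal sketch (the class and attribute names here are only illustrative) might count operations while forming each entry as the inner product of a row of the left factor with a column of the right factor:

class Matrix:
    def __init__(self, rows):
        self.rows = rows      # row-major list of lists
        self.mults = 0        # number of scalar multiplications performed

    def __matmul__(self, other):
        m, n = len(self.rows), len(self.rows[0])
        p = len(other.rows[0])
        result = Matrix([[0] * p for _ in range(m)])
        for i in range(m):
            for j in range(p):
                s = 0
                for k in range(n):
                    s += self.rows[i][k] * other.rows[k][j]
                result.rows[i][j] = s
        result.mults = m * n * p   # inner-product form: m*n*p multiplications
        return result

A = Matrix([[1, 2], [3, 4]])
B = Matrix([[5, 6], [7, 8]])
C = A @ B
print(C.rows, C.mults)   # [[19, 22], [43, 50]] 8

A __matmul__ that instead accumulates a sum of outer products would produce the same entries but with a different loop order and memory access pattern, which is exactly the comparison the question is after.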

Terry Jan Reedy
  • Thank you Terry, this is exactly what I was looking for. I wasn't sure if Python would evaluate it the way I specify, because this affects the number of operations for large matrices. For my second question: if you look at A, which is an (m,n) matrix, as a row vector of n entries where each entry is an (m,1) column vector, and matrix B, which is (n,p), as a column vector with n entries where each entry is a (1,p) row vector, then AB looks like an inner product in the big picture; and in the small picture, if I multiply the first entry of A by the first entry of B, I get an (m,1)(1,p) = (m,p) matrix. I – William Hardy Jan 11 '16 at 01:14
  • do this for all entries in A and B, yielding a sum of n (m,p) matrices, which equals AB. I want to mimic this in Python. – William Hardy Jan 11 '16 at 01:19
  • If I switch the roles of A and B, then in the big picture A times B looks like an outer product where each entry is an inner product. When I time it in Python, I'm expecting the latter case to be much faster, since the first case is much more memory intensive and requires more operations. – William Hardy Jan 11 '16 at 01:32
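To illustrate the two views described in these comments, here is a hedged NumPy sketch (shapes are arbitrary) that forms AB once entrywise, as inner products of rows of A with columns of B, and once as a sum of n rank-one outer products, then checks both against A @ B. Wrapping either construction in timeit gives the timing comparison the comments ask about:

import numpy as np

m, n, p = 300, 400, 500
A = np.random.rand(m, n)
B = np.random.rand(n, p)

# Inner-product view: entry (i, j) is row i of A dotted with column j of B.
inner = np.array([[A[i, :] @ B[:, j] for j in range(p)] for i in range(m)])

# Outer-product view: AB is the sum of n rank-one (m, p) matrices,
# one for each column of A paired with the corresponding row of B.
outer = sum(np.outer(A[:, k], B[k, :]) for k in range(n))

print(np.allclose(inner, A @ B), np.allclose(outer, A @ B))   # True True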