Further Limitations of the Known Approaches for Matrix Multiplication

Bibliographic Details
Main Authors: Alman, Josh; Williams, Virginia Vassilevska
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: English
Published: 2021
Online Access: https://hdl.handle.net/1721.1/137754
Description
Summary: © Josh Alman and Virginia V. Williams. We consider the techniques behind the current best algorithms for matrix multiplication. Our results are threefold. (1) We provide a unifying framework, showing that all known matrix multiplication running times since 1986 can be achieved from a single very natural tensor: the structural tensor T_q of addition modulo an integer q. (2) We show that if one applies a generalization of the known techniques (arbitrary zeroing out of tensor powers to obtain independent matrix products in order to use the asymptotic sum inequality of Schönhage) to an arbitrary monomial degeneration of T_q, then there is an explicit lower bound, depending on q, on the bound on the matrix multiplication exponent ω that one can achieve. We also show upper bounds on the value α that one can achieve, where α is such that n × n^α × n matrix multiplication can be computed in n^{2+o(1)} time. (3) We show that our lower bound on ω approaches 2 as q goes to infinity. This suggests a promising approach to improving the bound on ω: for variable q, find a monomial degeneration of T_q which, using the known techniques, produces an upper bound on ω as a function of q. Then, take q to infinity. It is not ruled out, and hence possible, that one can obtain ω = 2 in this way.
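
As an illustrative aside (not part of the record itself): the structural tensor T_q referred to above is the q × q × q 0/1 tensor whose (i, j, k) entry is 1 exactly when i + j ≡ k (mod q), i.e. T_q = Σ_{i,j} x_i y_j z_{(i+j) mod q}. A minimal sketch constructing it, assuming NumPy; the function name structural_tensor is a placeholder for illustration:

    import numpy as np

    def structural_tensor(q: int) -> np.ndarray:
        # T_q[i, j, k] = 1 iff i + j = k (mod q); all other entries are 0.
        T = np.zeros((q, q, q), dtype=int)
        for i in range(q):
            for j in range(q):
                T[i, j, (i + j) % q] = 1
        return T

    # T_q has exactly q^2 nonzero entries, one for each pair (i, j).
    assert structural_tensor(5).sum() == 25

Note that the paper works with T_q symbolically, via monomial degenerations and zeroing out of its tensor powers, rather than with explicit arrays; the sketch only makes the definition concrete.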