Oblivious sketching has emerged as a powerful approach to speeding up numerical linear algebra over the past decade, but oblivious sketching solutions for kernel matrices have remained quite limited, suffering from an exponential dependence on input parameters. Our main contribution is a general method for applying sketching solutions developed in numerical linear algebra to a tensoring of data points without forming the tensoring explicitly. This yields the first oblivious sketch for the polynomial kernel whose target dimension depends only polynomially on the degree of the kernel function, as well as the first oblivious sketch for the Gaussian kernel on bounded datasets that does not suffer from an exponential dependence on the dimensionality of the input data points.
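The core idea of sketching a tensoring without materializing it can be illustrated with the classical TensorSketch construction (Pham and Pagh): apply an independent CountSketch to each tensor factor and combine them by circular convolution via the FFT, which equals a CountSketch of the full Kronecker product. The sketch below is an illustrative toy version, not the paper's improved construction; the dimensions `d`, `m`, and degree `p` are arbitrary assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

d, m, p = 8, 64, 3  # input dimension, sketch dimension, kernel degree (assumed values)

# One independent CountSketch per tensor factor: a random bucket and sign per coordinate.
hashes = [rng.integers(0, m, size=d) for _ in range(p)]
signs = [rng.choice([-1.0, 1.0], size=d) for _ in range(p)]

def count_sketch(x, h, s):
    """CountSketch of x: each coordinate x[i] is added to bucket h[i] with sign s[i]."""
    out = np.zeros(m)
    np.add.at(out, h, s * x)
    return out

def tensor_sketch(x):
    """Sketch the p-fold self-tensor x ⊗ ... ⊗ x (d**p entries) in O(p(d + m log m)) time.

    Multiplying the FFTs of the p CountSketches performs their circular
    convolution, which is exactly a CountSketch of the Kronecker product
    whose bucket is the sum of the factor buckets mod m and whose sign is
    the product of the factor signs -- so the tensor is never formed.
    """
    fft_prod = np.ones(m, dtype=complex)
    for h, s in zip(hashes, signs):
        fft_prod *= np.fft.fft(count_sketch(x, h, s))
    return np.real(np.fft.ifft(fft_prod))

x = rng.standard_normal(d)
y = rng.standard_normal(d)
# Inner products of sketches approximate the degree-p polynomial kernel <x, y>**p.
approx = tensor_sketch(x) @ tensor_sketch(y)
exact = (x @ y) ** p
```

As the paper's abstract indicates, the target dimension `m` required by this simple construction degrades with the degree `p`; the contribution of the paper is a sketch whose dependence on `p` is only polynomial.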
|Title||SODA '20: Proceedings of the Thirty-First Annual ACM-SIAM Symposium on Discrete Algorithms|
|Publisher||Association for Computing Machinery|
|Status||Published - 2020|