Title:
A FASTER MULTIPOLE LEGENDRE-CHEBYSHEV TRANSFORM.
Source:
SIAM Journal on Scientific Computing; 2024, Vol. 46 Issue 6, pA3803-A3826, 24p
Database:
Complementary Index

Abstract:

This paper describes a fast algorithm for transforming Legendre coefficients into Chebyshev coefficients, and vice versa. The algorithm is based on the fast multipole method and is similar to the approach described by Alpert and Rokhlin [SIAM J. Sci. Comput., 12 (1991), pp. 158--179]. The main difference is that we utilize a modal Galerkin approach with Chebyshev basis functions instead of a nodal approach with a Lagrange basis. Part of the algorithm is a novel method that facilitates faster spreading of intermediate results through neighboring levels of hierarchical matrices. This enhancement leads to a method that is approximately 20 percent faster to execute, due to fewer floating-point operations. We also describe an efficient initialization algorithm that, for the Lagrange basis, is roughly 5 times faster than the original method for large input arrays. The described method has both a planning stage and an execution stage, each of which asymptotically requires O(N) flops. The algorithm is simple enough that it can be implemented in 100 lines of vectorized Python code. Moreover, its efficiency is such that a single-threaded C implementation can transform 1,000,000 coefficients in approximately 20 milliseconds on a new MacBook Pro M3, about 3 times the execution time of a well-planned (single-threaded) type 2 discrete cosine transform from FFTW (www.fftw.org). Planning for these coefficients requires approximately 50 milliseconds. [ABSTRACT FROM AUTHOR]
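
For context only (this sketch is not taken from the paper): the transform being accelerated is the classical Legendre-to-Chebyshev connection-coefficient relation from Alpert and Rokhlin, b_k = (2 - delta_{k0})/pi * sum over n >= k with n - k even of Lambda((n-k)/2) Lambda((n+k)/2) a_n, where Lambda(z) = Gamma(z+1/2)/Gamma(z+1). The minimal Python below evaluates this relation with a dense O(N^2) double loop and checks it against NumPy's leg2cheb; the helper names lam and leg2cheb_direct are illustrative, and this double loop is exactly the cost that the paper's hierarchical multipole algorithm reduces to O(N).

```python
# Dense O(N^2) reference for the Legendre-to-Chebyshev connection coefficients
# (Alpert & Rokhlin, 1991). Illustrative only; not the paper's fast algorithm.
import numpy as np
from math import lgamma, pi

def lam(z):
    """Lambda(z) = Gamma(z + 1/2) / Gamma(z + 1), evaluated stably via lgamma."""
    return np.exp(lgamma(z + 0.5) - lgamma(z + 1.0))

def leg2cheb_direct(a):
    """Convert Legendre coefficients a_n to Chebyshev coefficients b_k (dense)."""
    N = len(a)
    b = np.zeros(N)
    for k in range(N):
        for n in range(k, N, 2):            # only n with n - k even contribute
            scale = (1.0 if k == 0 else 2.0) / pi
            b[k] += scale * lam((n - k) / 2) * lam((n + k) / 2) * a[n]
    return b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.standard_normal(64)
    # Cross-check against NumPy's dense conversion routine.
    assert np.allclose(leg2cheb_direct(a), np.polynomial.legendre.leg2cheb(a))
    print("direct connection-coefficient transform matches numpy leg2cheb")
```

For example, the single Legendre coefficient a = [0, 0, 1] (i.e., P_2(x)) maps to Chebyshev coefficients [1/4, 0, 3/4], consistent with P_2 = (1/4) T_0 + (3/4) T_2.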

Copyright of SIAM Journal on Scientific Computing is the property of Society for Industrial & Applied Mathematics and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)