Thomas Lux, Layne T. Watson, Tyler Chang


Increases in the quantity of available data have allowed all fields of science to generate more accurate models of multivariate phenomena. Regression and interpolation become challenging when the dimension of the data is large, especially while maintaining tractable computational complexity. Regression is a popular approach to solving high-dimensional approximation problems; however, interpolation often offers advantages. This paper presents a novel and insightful error bound for (piecewise) linear interpolation in arbitrary dimension and contrasts the performance of several interpolation techniques with popular regression techniques. Empirical results demonstrate the viability of interpolation for moderately high-dimensional approximation problems and encourage broader application of interpolants to multivariate approximation in science.
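The contrast drawn above between interpolation and regression can be illustrated with a minimal one-dimensional sketch (this is not the paper's algorithm; the sample data and query point are illustrative assumptions). A piecewise linear interpolant reproduces every data point exactly, while a least-squares regression model minimizes aggregate error and generally passes through none of them.

```python
import numpy as np

# Known data points sampled from an underlying function (y = x**2 at the nodes).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 1.0, 4.0, 9.0, 16.0])

x_query = 2.5

# Piecewise linear interpolation: exact at every node, linear between
# neighboring nodes (2, 4) and (3, 9).
y_interp = np.interp(x_query, x, y)

# Degree-1 least-squares regression: minimizes overall squared error,
# but need not match any individual data point.
slope, intercept = np.polyfit(x, y, deg=1)
y_regress = slope * x_query + intercept

print(y_interp)   # 6.5, the midpoint of the segment
print(y_regress)  # 8.0, from the global linear fit y = 4x - 2
```

The interpolant's error at the query point is bounded by the local curvature of the underlying function over one segment, whereas the regression error depends on how well the chosen model class fits the data globally; the paper's contribution is an error bound of the former kind for piecewise linear interpolation in arbitrary dimension.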



Publication Details

Date of publication: November 13, 2020
Journal: Numerical Algorithms
Page number(s): 281-313
Publication note:

Thomas C. H. Lux, Layne T. Watson, Tyler H. Chang, Yili Hong, Kirk W. Cameron: Interpolation of sparse high-dimensional data. Numer. Algorithms 88(1): 281-313 (2021)