TY - JOUR
T1 - Linear Regression to Minimize the Total Error of the Numerical Differentiation
AU - Jengnan Tzeng
JO - East Asian Journal on Applied Mathematics
VL - 4
SP - 810
EP - 826
PY - 2018
DA - 2018/02
SN - 7
DO - http://doi.org/10.4208/eajam.161016.300517a
UR - https://global-sci.org/intro/article_detail/eajam/10722.html
KW - Truncation error, leading coefficient, asymptotic constant, rounding error.
AB -
It is well known that a numerical derivative contains two types of error: truncation error and rounding error. By evaluating the variables subject to rounding error, together with the step size and the unknown coefficient of the truncation error, the total error can be determined. The step size strongly affects the truncation error, especially when the step size is large. On the other hand, rounding error dominates the numerical error when the step size is too small. Thus, choosing a suitable step size is an important task in computing a numerical derivative; to obtain an accurate numerical difference, one should estimate the best step size. The order of the truncation error can be analysed by Taylor expansion and is usually expressed in big-O notation, that is, $E(h)=Ch^k$. Since the leading coefficient $C$ contains the factor $f^{(k)}(\xi)$ for a high order $k$ and an unknown $\xi$, the truncation error is often estimated only by a rough upper bound. If we try to estimate the high-order difference $f^{(k)}(\xi)$ directly, this term usually contains an even larger error. Hence, the uncertainty of $\xi$ and the rounding errors hinder an accurate numerical derivative.
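The tradeoff described above is easy to observe numerically. The following minimal sketch (the function `exp`, the point $x=1$, and the sampled step sizes are illustrative choices, not taken from the paper) evaluates a forward difference at a large, a moderate, and a tiny step size: the large step suffers from truncation error, the tiny step from rounding error, and a step near $\sqrt{\varepsilon_{\text{machine}}}$ gives the smallest total error.

```python
import math

def fwd_diff(f, x, h):
    """Forward-difference approximation of f'(x) with step size h."""
    return (f(x + h) - f(x)) / h

# True derivative of exp at x = 1 is e, so the error is directly observable.
x, true = 1.0, math.e
errors = {h: abs(fwd_diff(math.exp, x, h) - true) for h in (1e-1, 1e-8, 1e-13)}

# h = 1e-1 : truncation error ~ (h/2)|f''(xi)| dominates.
# h = 1e-13: rounding error ~ 2*eps*|f(x)|/h dominates.
# h = 1e-8 : near sqrt(eps), the total error is smallest.
```

Neither extreme step size is safe; this is why estimating the best step size matters.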
We introduce a statistical process into the traditional numerical difference. The new method estimates the truncation error and the rounding error at the same time for a given step size. When these two types of error are estimated successfully, much better corrected results can be reached. We also propose a genetic approach to obtain a confident numerical derivative.