6

Given the Lagrange basis polynomial as:

$L_i(x)= \prod_{m=0, m \neq i}^n \frac{x-x_m}{x_i-x_m} $

Is there a generic equation for the first derivative ${L_i}'(x)$ for any order, that is, for any $n$?

3 Answers

11

By the "logarithmic derivative" method, $$\frac{L'_i(x)}{L_i(x)}=\sum_{m=0,\ m\neq i}^n\frac1{x-x_m}.$$

  • I know this question and answer are already a year old, but I'm having some problems understanding the formula. By construction, $L_i(x_i)=1$ is an extremum of the $i$-th basis function (isn't it?). But then $L_i'(x_i)$ should be zero, which isn't the case when I use the above formula. I understand the derivation but still can't resolve this contradiction. Could you help? – Thomas Jan 14 '15 at 12:39
  • "By construction $L_i (x_i )=1$ is an extremum of the $i^{th}$ basis function": not at all. –  Jan 14 '15 at 13:47
  • Well, then this has to be due to the specific choice of my basis points; I'll look into that. Thank you! – Thomas Jan 15 '15 at 09:20
  • Could you maybe take a look at the question I formulated? Thanks in advance! – Thomas Jan 15 '15 at 09:49
4

$$L_{j}(x) = \prod_{i\neq j} \frac{x-x_{i}}{x_{j}-x_{i}} $$

Taking logarithms, $$ \ln\Big(L_{j}(x)\Big) = \ln\Big(\prod_{i\neq j} \frac{x-x_{i}}{x_{j}-x_{i}} \Big) = \sum_{i \neq j } \ln\Big( \frac{x-x_{i}}{x_{j}-x_{i}} \Big). $$

Differentiating both sides, we have:

$$ \frac{L'_{j}(x)}{L_{j}(x)} =\sum_{i \neq j} \frac{\frac{1}{x_{j}-x_{i}}}{\frac{x-x_{i}}{x_{j}-x_{i}}} = \sum_{i \neq j} \frac{1}{x-x_{i}} $$ then

$$ L'_{j}(x) = L_{j}(x) \Big( \sum_{i \neq j} \frac{1}{x-x_{i}} \Big) $$

Using the product rule for derivatives, we find:

$$ \begin{aligned} L''_{j}(x) &= L'_{j}(x) \Big( \sum_{i \neq j} \frac{1}{x-x_{i}} \Big) + L_{j}(x) \Big( \sum_{i \neq j} \frac{1}{x-x_{i}} \Big)' \\ &= L'_{j}(x) \Big( \sum_{i \neq j} \frac{1}{x-x_{i}} \Big) - L_{j}(x) \sum_{i \neq j} \frac{1}{(x-x_{i})^2}. \end{aligned} $$ Substituting the expression for $L'_{j}(x)$ found above, $$ \begin{aligned} L''_{j}(x) &= L_{j}(x) \Big( \sum_{i \neq j} \frac{1}{x-x_{i}} \Big)^2 - L_{j}(x) \sum_{i \neq j} \frac{1}{(x-x_{i})^2} \\ &= L_{j}(x) \Big\{ \Big( \sum_{i \neq j} \frac{1}{x-x_{i}} \Big)^2 - \sum_{i \neq j} \frac{1}{(x-x_{i})^2} \Big\}. \end{aligned} $$
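
A short, self-contained Python sketch (function and variable names are illustrative) packaging $L_j$, $L_j'$, and $L_j''$ from the identities above; like the logarithmic derivative itself, it assumes $x$ is not one of the nodes:

```python
import numpy as np

def basis_with_derivatives(j, x, nodes):
    """Return L_j(x), L_j'(x), L_j''(x) via the sums above (x must not be a node)."""
    nodes = np.asarray(nodes, dtype=float)
    others = np.delete(nodes, j)                 # all x_i with i != j
    L = np.prod((x - others) / (nodes[j] - others))
    s1 = np.sum(1.0 / (x - others))              # sum_i 1/(x - x_i)
    s2 = np.sum(1.0 / (x - others) ** 2)         # sum_i 1/(x - x_i)^2
    return L, L * s1, L * (s1 ** 2 - s2)

print(basis_with_derivatives(1, 0.2, [0.0, 0.5, 1.5, 3.0]))
```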

tnt235711
  • 421
2

Let me suggest an alternative approach. You can find the coefficients of the Lagrange interpolation polynomial, or of any of its derivatives, relatively easily if you use the matrix form of Lagrange interpolation presented in "Beginner's guide to mapping simplexes affinely", section "Lagrange interpolation". The general formula for the polynomial is $$ f(x) = (-1) \frac{ \det \begin{pmatrix} 0 & f_0 & f_1 & \cdots & f_n \\ x^n & x_0^n & x_1^n & \cdots & x_n^n \\ x^{n-1} & x_0^{n-1} & x_1^{n-1} & \cdots & x_n^{n-1} \\ \cdots & \cdots & \cdots & \cdots & \cdots \\ x & x_0 & x_1 & \cdots & x_n \\ 1 & 1 & 1 & \cdots & 1 \\ \end{pmatrix} }{ \det \begin{pmatrix} x_0^n & x_1^n & \cdots & x_n^n \\ x_0^{n-1} & x_1^{n-1} & \cdots & x_n^{n-1} \\ \cdots & \cdots & \cdots & \cdots \\ x_0 & x_1 & \cdots & x_n \\ 1 & 1 & \cdots & 1 \\ \end{pmatrix} }. $$ Here $(x_0;f_0)$, $\dots$, $(x_n;f_n)$ are the points the polynomial passes through. Using Laplace expansion along the first column, you can get expressions for the coefficients of $x^i$. If instead we perform Laplace expansion along the first row, we get a sum containing the terms $f_i\,L_i(x)$, where $L_i(x)$ is a basis polynomial.

If we take the derivative of the expression above, it acts only on the first column of the matrix in the numerator (the only column containing $x$). One can prove this by expanding the determinant in the numerator along the first column, taking the derivative, and then collecting everything back. As a result, the first derivative is $$ f'(x) = (-1) \frac{ \det \begin{pmatrix} 0 & f_0 & f_1 & \cdots & f_n \\ n x^{n-1} & x_0^n & x_1^n & \cdots & x_n^n \\ (n-1) x^{n-2} & x_0^{n-1} & x_1^{n-1} & \cdots & x_n^{n-1} \\ \cdots & \cdots & \cdots & \cdots & \cdots \\ 1 & x_0 & x_1 & \cdots & x_n \\ 0 & 1 & 1 & \cdots & 1 \\ \end{pmatrix} }{ \det \begin{pmatrix} x_0^n & x_1^n & \cdots & x_n^n \\ x_0^{n-1} & x_1^{n-1} & \cdots & x_n^{n-1} \\ \cdots & \cdots & \cdots & \cdots \\ x_0 & x_1 & \cdots & x_n \\ 1 & 1 & \cdots & 1 \\ \end{pmatrix} }. $$ Higher-order derivatives can be calculated the same way: differentiate the first column again; everything else stays the same.
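
A small Python sketch of this determinant formula for $f'(x)$ (the helper name `interp_derivative` and the sample data are illustrative); it builds the two matrices explicitly with `numpy` and is intended only for modest $n$, since these Vandermonde-type matrices become ill-conditioned quickly:

```python
import numpy as np

def interp_derivative(x, xs, fs):
    """f'(x) of the interpolating polynomial through (xs[k], fs[k]), via determinants."""
    xs, fs = np.asarray(xs, dtype=float), np.asarray(fs, dtype=float)
    n = len(xs) - 1
    powers = np.arange(n, -1, -1)                        # n, n-1, ..., 1, 0
    V = np.vstack([xs ** p for p in powers])             # denominator matrix
    dcol = powers * x ** np.clip(powers - 1, 0, None)    # d/dx of x^n, ..., x, 1
    num = np.zeros((n + 2, n + 2))
    num[0, 1:] = fs                                      # first row: 0, f_0, ..., f_n
    num[1:, 0] = dcol                                    # first column: derivatives
    num[1:, 1:] = V
    return -np.linalg.det(num) / np.linalg.det(V)

# check on f(x) = x^2 sampled at three nodes: f'(0.7) should be 1.4
print(interp_derivative(0.7, [0.0, 1.0, 3.0], [0.0, 1.0, 9.0]))
```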

If you want the derivative of a single basis polynomial only, you can do it either way: take only the appropriate cofactor (remove the first row and the column that corresponds to the index $i$ you are interested in), or place formal orthonormal vectors in the first row and read off $L_i'$ as the coefficient of $\vec{e}_i$. The second approach gives $$ L'_0(x) \vec{e}_0 + L'_1(x) \vec{e}_1 + \dots + L'_n(x) \vec{e}_n = (-1) \frac{ \det \begin{pmatrix} 0 & \vec{e}_0 & \vec{e}_1 & \cdots & \vec{e}_n \\ n x^{n-1} & x_0^n & x_1^n & \cdots & x_n^n \\ (n-1) x^{n-2} & x_0^{n-1} & x_1^{n-1} & \cdots & x_n^{n-1} \\ \cdots & \cdots & \cdots & \cdots & \cdots \\ 1 & x_0 & x_1 & \cdots & x_n \\ 0 & 1 & 1 & \cdots & 1 \\ \end{pmatrix} }{ \det \begin{pmatrix} x_0^n & x_1^n & \cdots & x_n^n \\ x_0^{n-1} & x_1^{n-1} & \cdots & x_n^{n-1} \\ \cdots & \cdots & \cdots & \cdots \\ x_0 & x_1 & \cdots & x_n \\ 1 & 1 & \cdots & 1 \\ \end{pmatrix} }. $$
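
A Python sketch of the first (cofactor) option, with illustrative names: expanding the numerator along the formal first row gives $L_i'(x) = (-1)^i \det M_{0,\,i+1} / \det V$, where $M_{0,\,i+1}$ is the minor obtained by deleting the first row and the $(i+1)$-th column and $V$ is the denominator matrix; the result is cross-checked against the logarithmic-derivative identity from the first answer above.

```python
import numpy as np

def basis_derivative_via_cofactor(i, x, xs):
    """L_i'(x) from the signed minor of the numerator matrix (illustrative helper)."""
    xs = np.asarray(xs, dtype=float)
    n = len(xs) - 1
    powers = np.arange(n, -1, -1)                        # n, n-1, ..., 1, 0
    V = np.vstack([xs ** p for p in powers])             # denominator matrix
    dcol = powers * x ** np.clip(powers - 1, 0, None)    # derivative column
    block = np.hstack([dcol[:, None], V])                # rows below the formal first row
    minor = np.delete(block, i + 1, axis=1)              # drop the column of node i
    return (-1) ** i * np.linalg.det(minor) / np.linalg.det(V)

# cross-check against L_i'(x) = L_i(x) * sum_{m != i} 1/(x - x_m)
xs, x, i = [0.0, 1.0, 3.0], 0.7, 1
others = np.delete(np.asarray(xs), i)
log_form = np.prod((x - others) / (xs[i] - others)) * np.sum(1.0 / (x - others))
print(basis_derivative_via_cofactor(i, x, xs), log_form)  # both should give 0.8
```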

guest
  • 1,814