

Taylor's Theorem for Multivariable Functions

Theorem1

Let $f : \mathbb{R}^{n} \to \mathbb{R}$ be a $C^{k}$ function, and let $\mathbf{a} = (a_{1}, \dots, a_{n}) \in \mathbb{R}^{n}$. Then there exist $C^{k-2}$ functions $h_{ij}$ such that the following holds.

$$ f(\mathbf{x}) = f(\mathbf{a}) + \sum_{i} (x_{i} - a_{i})\dfrac{\partial f}{\partial x_{i}}(\mathbf{a}) + \sum_{i,j}h_{ij}(\mathbf{x})(x_{i} - a_{i}) (x_{j} - a_{j}) $$

Explanation

This is the generalization of Taylor's theorem to multivariable functions.
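As a quick illustration (this particular function is chosen purely for demonstration), take $n = 2$, $f(x_{1}, x_{2}) = x_{1}^{2} + x_{1}x_{2}$, and $\mathbf{a} = (0, 0)$. Both first-order partial derivatives vanish at $\mathbf{a}$, and

$$ f(\mathbf{x}) = f(\mathbf{a}) + 0 + 1 \cdot x_{1} x_{1} + 1 \cdot x_{1} x_{2}, $$

so the constant functions $h_{11} = h_{12} = 1$, $h_{21} = h_{22} = 0$ already satisfy the conclusion; in general the $h_{ij}$ are not unique.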

  • second-order

    $$ \begin{align*} f(\mathbf{x}) &= f(\mathbf{a}) + \sum\limits_{i=1}^{n} (x_{i} - a_{i}) \dfrac{\partial f}{\partial x_{i}}(\mathbf{a}) + \dfrac{1}{2!}\sum\limits_{i,j=1}^{n} (x_{i} - a_{i})(x_{j} - a_{j}) \dfrac{\partial^{2} f}{\partial x_{i} \partial x_{j}}(\mathbf{a}) + \text{Remainder} \\ &= f(\mathbf{a}) + (\mathbf{x} - \mathbf{a})^{T} \nabla f (\mathbf{a}) + \dfrac{1}{2!}(\mathbf{x} - \mathbf{a})^{T} H(\mathbf{a}) (\mathbf{x} - \mathbf{a}) + \text{Remainder} \end{align*} $$

Here, $\nabla f$ is the gradient of $f$ and $H$ is the Hessian of $f$.
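A minimal numerical sketch of this second-order form is the following; the test function, expansion point, and step are arbitrary choices for illustration and do not come from the source.

```python
import numpy as np

def f(x):
    # arbitrary smooth test function of two variables (illustration only)
    return np.exp(x[0]) * np.sin(x[1])

def grad_f(x):
    # analytic gradient of f
    return np.array([np.exp(x[0]) * np.sin(x[1]),
                     np.exp(x[0]) * np.cos(x[1])])

def hess_f(x):
    # analytic Hessian of f
    return np.array([[np.exp(x[0]) * np.sin(x[1]),  np.exp(x[0]) * np.cos(x[1])],
                     [np.exp(x[0]) * np.cos(x[1]), -np.exp(x[0]) * np.sin(x[1])]])

a = np.array([0.3, 0.7])       # expansion point a
p = np.array([1e-2, -2e-2])    # displacement x - a
x = a + p

# second-order Taylor approximation of f(x) about a
approx = f(a) + p @ grad_f(a) + 0.5 * p @ hess_f(a) @ p
print(f(x), approx)            # the gap should be on the order of |p|^3
```

Halving $\mathbf{p}$ should shrink the discrepancy by roughly a factor of eight, consistent with a cubic remainder.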

For the remainder term, the following forms are also frequently useful.

$$ f(\mathbf{x} + \mathbf{p}) = f(\mathbf{x}) + \mathbf{p}^{T}\nabla f(\mathbf{x} + t \mathbf{p}) \quad \text{for some } t \in (0,1) $$

$$ f(\mathbf{x} + \mathbf{p}) = f(\mathbf{x}) + \mathbf{p}^{T}\nabla f(\mathbf{x}) + \dfrac{1}{2!}\mathbf{p}^{T} H(\mathbf{x} + t \mathbf{p}) \mathbf{p} \quad \text{for some } t \in (0,1) $$

$$ f(\mathbf{x} + \mathbf{p}) = f(\mathbf{x}) + \int_{0}^{1}\mathbf{p}^{T}\nabla f (\mathbf{x} + t\mathbf{p}) dt $$
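Each of these forms can be obtained by applying the one-variable Taylor theorem (with Lagrange or integral remainder) to $\phi(t) = f(\mathbf{x} + t\mathbf{p})$ on $[0, 1]$, using $\phi^{\prime}(t) = \mathbf{p}^{T}\nabla f(\mathbf{x} + t\mathbf{p})$ and $\phi^{\prime\prime}(t) = \mathbf{p}^{T} H(\mathbf{x} + t\mathbf{p}) \mathbf{p}$.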

Proof

By the fundamental theorem of calculus applied to $t \mapsto f(t(\mathbf{x} - \mathbf{a}) + \mathbf{a})$,

$$ \begin{align*} f(\mathbf{x}) - f(\mathbf{a}) =&\ \int_{0}^{1} \dfrac{d}{dt} \left[ f(t(\mathbf{x} - \mathbf{a}) + \mathbf{a}) \right] dt \\ =&\ \int_{0}^{1} \left( \sum_{i} \dfrac{\partial f}{\partial x_{i}}\left( t(\mathbf{x} - \mathbf{a}) + \mathbf{a} \right)(x_{i}-a_{i}) \right) dt & \text{by } \href{https://freshrimpsushi.github.io/posts/3134}{\text{chain rule}} \\ =&\ \sum_{i}(x_{i} - a_{i}) \int_{0}^{1} \dfrac{\partial f}{\partial x_{i}}\left( t(\mathbf{x} - \mathbf{a}) + \mathbf{a} \right) dt \end{align*} $$

Denote the integral factor by $g_{i}(\mathbf{x})$, that is, $g_{i}(\mathbf{x}) = \displaystyle \int_{0}^{1} \dfrac{\partial f}{\partial x_{i}}\left( t(\mathbf{x} - \mathbf{a}) + \mathbf{a} \right) dt$. Then,

$$ \begin{equation} f(\mathbf{x}) - f(\mathbf{a}) = \sum_{i}(x_{i} - a_{i}) \int_{0}^{1} \dfrac{\partial f}{\partial x_{i}}\left( t(\mathbf{x} - \mathbf{a}) + \mathbf{a} \right) dt = \sum_{i} g_{i}(\mathbf{x}) (x_{i} - a_{i}) \end{equation} $$

The value of $g_{i}(\mathbf{a})$ is as follows.

$$ g_{i}(\mathbf{a}) = \int_{0}^{1} \dfrac{\partial f}{\partial x_{i}} \left(t(\mathbf{a} - \mathbf{a}) + \mathbf{a} \right) dt = \int_{0}^{1} \dfrac{\partial f}{\partial x_{i}}\left( \mathbf{a} \right) dt = \dfrac{\partial f}{\partial x_{i}}\left( \mathbf{a} \right) $$

Then, by the same argument used to derive $(1)$, now applied to each $g_{i}$, we obtain the following.

$$ g_{i}(\mathbf{x}) - g_{i}(\mathbf{a}) = \sum_{j} h_{ij}(\mathbf{x}) (x_{j}-a_{j}) $$
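Concretely, this step produces $h_{ij}(\mathbf{x}) = \displaystyle \int_{0}^{1} \dfrac{\partial g_{i}}{\partial x_{j}}\left( t(\mathbf{x} - \mathbf{a}) + \mathbf{a} \right) dt$. Since $g_{i}$ is defined by integrating the $C^{k-1}$ function $\partial f / \partial x_{i}$, differentiation under the integral sign shows that $g_{i}$ is $C^{k-1}$, and therefore each $h_{ij}$ is $C^{k-2}$, as claimed in the theorem.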

Putting everything together,

$$ \begin{align*} f(\mathbf{x}) =&\ f(\mathbf{a}) + \sum_{i}g_{i}(\mathbf{x})(x_{i}-a_{i}) \\ =&\ f(\mathbf{a}) + \sum_{i}\left( g_{i}(\mathbf{a}) + \sum_{j} h_{ij}(\mathbf{x}) (x_{j}-a_{j}) \right)(x_{i}-a_{i}) \\ =&\ f(\mathbf{a}) + \sum_{i} g_{i}(\mathbf{a})(x_{i}-a_{i}) + \sum_{i,j} h_{ij}(\mathbf{x})(x_{i}-a_{i})(x_{j}-a_{j}) \\ =&\ f(\mathbf{a}) + \sum_{i} \dfrac{\partial f}{\partial x_{i}}\left( \mathbf{a} \right)(x_{i}-a_{i}) + \sum_{i,j} h_{ij}(\mathbf{x})(x_{i}-a_{i})(x_{j}-a_{j}) \end{align*} $$

See Also


  1. Richard S. Millman and George D. Parker, Elements of Differential Geometry (1977), pp. 213-214 ↩︎