In calculus, the Product Rule (or Leibniz rule[1] or Leibniz product rule) is a formula used to find the derivatives of products of two or more functions. For two functions, it may be stated in Lagrange’s notation as
{\displaystyle (u\cdot v)'=u'\cdot v+u\cdot v'}
or in Leibniz’s notation as
{\displaystyle {\dfrac {d}{dx}}(u\cdot v)={\dfrac {du}{dx}}\cdot v+u\cdot {\dfrac {dv}{dx}}.}
The rule may be extended or generalized to products of three or more functions, to a rule for higher-order derivatives of a product, and to other contexts.
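The statement can be checked symbolically with a computer algebra system; the following minimal sketch (an illustration, using Python's SymPy with arbitrary differentiable functions u and v) is not part of the rule itself.

```python
import sympy as sp

x = sp.symbols('x')
u = sp.Function('u')(x)
v = sp.Function('v')(x)

# d/dx (u*v) should equal u'*v + u*v'
lhs = sp.diff(u * v, x)
rhs = sp.diff(u, x) * v + u * sp.diff(v, x)
assert sp.simplify(lhs - rhs) == 0
```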
Discovery
Discovery of this rule is credited to Gottfried Leibniz, who demonstrated it using differentials.[2] (However, J. M. Child, a translator of Leibniz’s papers,[3] argues that it is due to Isaac Barrow.) Here is Leibniz’s argument: Let u(x) and v(x) be two differentiable functions of x. Then the differential of uv is
{\displaystyle {\begin{aligned}d(u\cdot v)&{}=(u+du)\cdot (v+dv)-u\cdot v\\&{}=u\cdot dv+v\cdot du+du\cdot dv.\end{aligned}}}
Since the term du·dv is “negligible” (compared to du and dv), Leibniz concluded that
{\displaystyle d(u\cdot v)=v\cdot du+u\cdot dv}
and this is indeed the differential form of the product rule. If we divide through by the differential dx, we obtain
{\displaystyle {\frac {d}{dx}}(u\cdot v)=v\cdot {\frac {du}{dx}}+u\cdot {\frac {dv}{dx}}}
which can also be written in Lagrange’s notation as
{\displaystyle (u\cdot v)'=v\cdot u'+u\cdot v'.}
Examples
- Suppose we want to differentiate f(x) = x² sin(x). By using the product rule, one gets the derivative f′(x) = 2x sin(x) + x² cos(x), since the derivative of x² is 2x and the derivative of the sine function is the cosine function. (A symbolic check of this example appears after this list.)
- One special case of the product rule is the constant multiple rule, which states: if c is a number and f(x) is a differentiable function, then cf(x) is also differentiable, and its derivative is (cf)′(x) = c f′(x). This follows from the product rule since the derivative of any constant is zero. This, combined with the sum rule for derivatives, shows that differentiation is linear.
- The rule for integration by parts is derived from the product rule, as is (a weak version of) the quotient rule. (It is a “weak” version in that it does not prove that the quotient is differentiable, but only says what its derivative is if it is differentiable.)
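Both the worked example and the constant multiple rule above can be verified symbolically; a short SymPy sketch:

```python
import sympy as sp

x, c = sp.symbols('x c')
f = x**2 * sp.sin(x)

# f(x) = x^2 sin(x): the product rule gives 2x sin(x) + x^2 cos(x)
assert sp.simplify(sp.diff(f, x) - (2*x*sp.sin(x) + x**2*sp.cos(x))) == 0

# Constant multiple rule: (c*f)' = c*f' for a symbolic constant c
assert sp.simplify(sp.diff(c * f, x) - c * sp.diff(f, x)) == 0
```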
Proofs
Proof by factoring (from first principles)
Let h(x) = f(x)g(x) and suppose that f and g are each differentiable at x. We want to prove that h is differentiable at x and that its derivative, h′(x), is given by f′(x)g(x) + f(x)g′(x). To do this,
{\displaystyle f(x)g(x+\Delta x)-f(x)g(x+\Delta x)}
(which is zero, and thus does not change the value) is added to the numerator to permit its factoring, and then properties of limits are used.
{\displaystyle {\begin{aligned}h'(x)&=\lim _{\Delta x\to 0}{\frac {h(x+\Delta x)-h(x)}{\Delta x}}\\[5pt]&=\lim _{\Delta x\to 0}{\frac {f(x+\Delta x)g(x+\Delta x)-f(x)g(x)}{\Delta x}}\\[5pt]&=\lim _{\Delta x\to 0}{\frac {f(x+\Delta x)g(x+\Delta x)-f(x)g(x+\Delta x)+f(x)g(x+\Delta x)-f(x)g(x)}{\Delta x}}\\[5pt]&=\lim _{\Delta x\to 0}{\frac {{\big [}f(x+\Delta x)-f(x){\big ]}\cdot g(x+\Delta x)+f(x)\cdot {\big [}g(x+\Delta x)-g(x){\big ]}}{\Delta x}}\\[5pt]&=\lim _{\Delta x\to 0}{\frac {f(x+\Delta x)-f(x)}{\Delta x}}\cdot \underbrace {\lim _{\Delta x\to 0}g(x+\Delta x)} _{\text{See the note below.}}+\lim _{\Delta x\to 0}f(x)\cdot \lim _{\Delta x\to 0}{\frac {g(x+\Delta x)-g(x)}{\Delta x}}\\[5pt]&=f'(x)g(x)+f(x)g'(x).\end{aligned}}}
The fact that
{\displaystyle \lim _{\Delta x\to 0}g(x+\Delta x)=g(x)}
is deduced from a theorem that states that differentiable functions are continuous.
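The limit computation can also be illustrated numerically: a difference quotient of f·g approaches f′(x)g(x) + f(x)g′(x). The sketch below uses f = exp and g = sin as arbitrary test functions; the step size and tolerance are illustrative choices.

```python
import math

def product_diff_quotient(f, g, x, dx=1e-6):
    """Symmetric difference quotient of h(t) = f(t) * g(t) at x."""
    h = lambda t: f(t) * g(t)
    return (h(x + dx) - h(x - dx)) / (2 * dx)

# Test at x = 1.2 with f = exp (so f' = exp) and g = sin (so g' = cos)
x0 = 1.2
numeric = product_diff_quotient(math.exp, math.sin, x0)
exact = math.exp(x0) * math.sin(x0) + math.exp(x0) * math.cos(x0)
assert abs(numeric - exact) < 1e-6
```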
Brief proof
By definition, if f, g : ℝ → ℝ are differentiable at x, then we can write
{\displaystyle f(x+h)=f(x)+f'(x)h+\psi _{1}(h)\qquad \qquad g(x+h)=g(x)+g'(x)h+\psi _{2}(h)}
such that {\displaystyle \lim _{h\to 0}{\frac {\psi _{1}(h)}{h}}=\lim _{h\to 0}{\frac {\psi _{2}(h)}{h}}=0,} also written {\displaystyle \psi _{1},\psi _{2}\sim o(h)}. Then:
{\displaystyle {\begin{aligned}fg(x+h)-fg(x)&=(f(x)+f'(x)h+\psi _{1}(h))(g(x)+g'(x)h+\psi _{2}(h))-fg(x)\\&=f(x)g(x)+f'(x)g(x)h+f(x)g'(x)h-fg(x)+{\text{other terms}}\\&=f'(x)g(x)h+f(x)g'(x)h+o(h)\\[12pt]\end{aligned}}}
The “other terms” consist of items such as
{\displaystyle f(x)\psi _{2}(h),\ f'(x)g'(x)h^{2}} and {\displaystyle hf'(x)\psi _{1}(h).}
It is not difficult to show that they are all o(h). Dividing by h and taking the limit as h → 0 gives the result.
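The o(h) behaviour of the remainder can be observed numerically; for a concrete differentiable function such as sin, the ratio ψ(h)/h shrinks roughly in proportion to h. The function and point below are arbitrary choices for the illustration.

```python
import math

# psi(h) = f(x+h) - f(x) - f'(x) h for f = sin at x = 0.7
x0 = 0.7
psi = lambda h: math.sin(x0 + h) - math.sin(x0) - math.cos(x0) * h

for h in (1e-1, 1e-2, 1e-3, 1e-4):
    print(h, psi(h) / h)   # the ratio tends to 0 as h -> 0, i.e. psi is o(h)
```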
Quarter squares
There is a proof using quarter square multiplication which relies on the chain rule and on the properties of the quarter square function (shown here as q, i.e., with
{\displaystyle q(x)={\tfrac {x^{2}}{4}}}):
Let f = uv. Since {\displaystyle q(u+v)-q(u-v)={\tfrac {(u+v)^{2}-(u-v)^{2}}{4}}=uv,} we may write
{\displaystyle f=q(u+v)-q(u-v).}
Differentiating both sides:
{\displaystyle {\begin{aligned}f'&=q'(u+v)(u'+v')-q'(u-v)(u'-v')\\[4pt]&=\left({1 \over 2}(u+v)(u'+v')\right)-\left({1 \over 2}(u-v)(u'-v')\right)\\[4pt]&={1 \over 2}(uu'+vu'+uv'+vv')-{1 \over 2}(uu'-vu'-uv'+vv')\\[4pt]&=vu'+uv'\\[4pt]&=uv'+u'v\end{aligned}}}
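The identity underlying this proof, q(u+v) − q(u−v) = uv, is easy to confirm symbolically; a SymPy sketch:

```python
import sympy as sp

u, v = sp.symbols('u v')
q = lambda t: t**2 / 4   # the quarter square function

# q(u+v) - q(u-v) = ((u+v)^2 - (u-v)^2) / 4 = uv
assert sp.expand(q(u + v) - q(u - v)) == u * v
```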
Chain rule
The product rule can be considered a special case of the chain rule for several variables, applied to the multiplication function (a, b) ↦ ab:
{\displaystyle {d(ab) \over dx}={\frac {\partial (ab)}{\partial a}}{\frac {da}{dx}}+{\frac {\partial (ab)}{\partial b}}{\frac {db}{dx}}=b{\frac {da}{dx}}+a{\frac {db}{dx}}.}
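The sketch below carries out this computation in SymPy: the partial derivatives of the two-variable function (a, b) ↦ ab are combined with da/dx and db/dx exactly as in the multivariable chain rule.

```python
import sympy as sp

x, a, b = sp.symbols('x a b')
ax = sp.Function('a')(x)   # a as a function of x
bx = sp.Function('b')(x)   # b as a function of x

F = a * b
# Chain rule: dF/dx = (dF/da) * da/dx + (dF/db) * db/dx
chain = (sp.diff(F, a).subs({a: ax, b: bx}) * sp.diff(ax, x)
         + sp.diff(F, b).subs({a: ax, b: bx}) * sp.diff(bx, x))
assert sp.simplify(chain - sp.diff(ax * bx, x)) == 0
```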
Non-standard analysis
Let u and v be continuous functions in x, and let dx, du and dv be infinitesimals within the framework of non-standard analysis, specifically the hyperreal numbers. Using st to denote the standard part function that associates to a finite hyperreal number the real infinitely close to it, this gives
{\displaystyle {\begin{aligned}{\frac {d(uv)}{dx}}&=\operatorname {st} \left({\frac {(u+du)(v+dv)-uv}{dx}}\right)\\[4pt]&=\operatorname {st} \left({\frac {uv+u\cdot dv+v\cdot du+dv\cdot du-uv}{dx}}\right)\\[4pt]&=\operatorname {st} \left({\frac {u\cdot dv+(v+dv)\cdot du}{dx}}\right)\\[4pt]&=u{\frac {dv}{dx}}+v{\frac {du}{dx}}.\end{aligned}}}
This was essentially Leibniz’s proof exploiting the transcendental law of homogeneity (in place of the standard part above).
Smooth infinitesimal analysis
In the context of Lawvere’s approach to infinitesimals, let dx be a nilsquare infinitesimal. Then du = u′ dx and dv = v′ dx, so that
{\displaystyle {\begin{aligned}d(uv)&=(u+du)(v+dv)-uv\\&=uv+u\cdot dv+v\cdot du+du\cdot dv-uv\\&=u\cdot dv+v\cdot du+du\cdot dv\\&=u\cdot dv+v\cdot du\,\!\end{aligned}}}
since
{\displaystyle du\,dv=u'v'(dx)^{2}=0.}
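Dual numbers, pairs a + b·ε with ε² = 0, give a computational analogue of a nilsquare infinitesimal; the sketch below is an illustration of the same algebra, not Lawvere's framework itself. In the product of u + u′ε and v + v′ε, the ε-coefficient is exactly u′v + uv′.

```python
class Dual:
    """Dual number a + b*eps with eps**2 = 0 (a nilsquare 'infinitesimal')."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def __mul__(self, other):
        # (a1 + b1 eps)(a2 + b2 eps) = a1*a2 + (a1*b2 + b1*a2) eps, since eps**2 = 0
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

# (u + u' eps)(v + v' eps): the eps-part is u'v + uv', the product rule
u, du, v, dv = 3.0, 5.0, 7.0, 2.0
prod = Dual(u, du) * Dual(v, dv)
assert prod.b == du * v + u * dv
```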
Generalizations
Product of more than two factors
The product rule can be generalized to products of more than two factors. For example, for three factors we have
{\displaystyle {\frac {d(uvw)}{dx}}={\frac {du}{dx}}vw+u{\frac {dv}{dx}}w+uv{\frac {dw}{dx}}.}
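The three-factor case can be checked directly in SymPy:

```python
import sympy as sp

x = sp.symbols('x')
u, v, w = (sp.Function(name)(x) for name in ('u', 'v', 'w'))

# (uvw)' = u'vw + uv'w + uvw'
rule = sp.diff(u, x) * v * w + u * sp.diff(v, x) * w + u * v * sp.diff(w, x)
assert sp.simplify(sp.diff(u * v * w, x) - rule) == 0
```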
For a collection of functions f₁, …, fₖ, we have
{\displaystyle {\frac {d}{dx}}\left[\prod _{i=1}^{k}f_{i}(x)\right]=\sum _{i=1}^{k}\left(\left({\frac {d}{dx}}f_{i}(x)\right)\prod _{j=1,j\neq i}^{k}f_{j}(x)\right)=\left(\prod _{i=1}^{k}f_{i}(x)\right)\left(\sum _{i=1}^{k}{\frac {f'_{i}(x)}{f_{i}(x)}}\right).}
The logarithmic derivative provides a simpler expression of the last form, as well as a direct proof that does not involve any recursion. The logarithmic derivative of a function f, denoted here Logder(f), is the derivative of the logarithm of the function. It follows that
{\displaystyle \operatorname {Logder} (f)={\frac {f'}{f}}.}
Using that the logarithm of a product is the sum of the logarithms of the factors, the sum rule for derivatives gives immediately
{\displaystyle \operatorname {Logder} (f_{1}\cdots f_{k})=\sum _{i=1}^{k}\operatorname {Logder} (f_{i}).}
The last expression above for the derivative of a product is obtained by multiplying both sides of this equation by the product of the fᵢ.
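The logarithmic-derivative identity can be verified symbolically; the sketch below checks it for three factors.

```python
import sympy as sp

x = sp.symbols('x')
logder = lambda f: sp.diff(f, x) / f   # Logder(f) = f'/f

f1, f2, f3 = (sp.Function(name)(x) for name in ('f1', 'f2', 'f3'))
# Logder of a product equals the sum of the Logders of the factors
assert sp.simplify(logder(f1 * f2 * f3)
                   - (logder(f1) + logder(f2) + logder(f3))) == 0
```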
Higher derivatives
The product rule can also be generalized to the general Leibniz rule for the nth derivative of a product of two factors, by symbolically expanding according to the binomial theorem:
{\displaystyle d^{n}(uv)=\sum _{k=0}^{n}{n \choose k}\cdot d^{(n-k)}(u)\cdot d^{(k)}(v).}
Applied at a specific point x, the above formula gives:
{\displaystyle (uv)^{(n)}(x)=\sum _{k=0}^{n}{n \choose k}\cdot u^{(n-k)}(x)\cdot v^{(k)}(x).}
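For instance, with n = 3 the general Leibniz rule can be verified term by term:

```python
import sympy as sp
from math import comb

x = sp.symbols('x')
u = sp.Function('u')(x)
v = sp.Function('v')(x)

n = 3
# Sum over k of C(n, k) * u^(n-k) * v^(k)
leibniz = sum(comb(n, k) * sp.diff(u, x, n - k) * sp.diff(v, x, k)
              for k in range(n + 1))
assert sp.simplify(sp.diff(u * v, x, n) - leibniz) == 0
```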
Furthermore, for the nth derivative of an arbitrary number of factors:
{\displaystyle \left(\prod _{i=1}^{k}f_{i}\right)^{(n)}=\sum _{j_{1}+j_{2}+\cdots +j_{k}=n}{n \choose j_{1},j_{2},\ldots ,j_{k}}\prod _{i=1}^{k}f_{i}^{(j_{i})}.}
Higher partial derivatives
For partial derivatives, we have
{\displaystyle {\partial ^{n} \over \partial x_{1}\,\cdots \,\partial x_{n}}(uv)=\sum _{S}{\partial ^{|S|}u \over \prod _{i\in S}\partial x_{i}}\cdot {\partial ^{n-|S|}v \over \prod _{i\not \in S}\partial x_{i}}}
where the index S runs through all 2ⁿ subsets of {1, …, n}, and |S| is the cardinality of S. For example, when n = 3,
{\displaystyle {\begin{aligned}&{\partial ^{3} \over \partial x_{1}\,\partial x_{2}\,\partial x_{3}}(uv)\\[6pt]={}&u\cdot {\partial ^{3}v \over \partial x_{1}\,\partial x_{2}\,\partial x_{3}}+{\partial u \over \partial x_{1}}\cdot {\partial ^{2}v \over \partial x_{2}\,\partial x_{3}}+{\partial u \over \partial x_{2}}\cdot {\partial ^{2}v \over \partial x_{1}\,\partial x_{3}}+{\partial u \over \partial x_{3}}\cdot {\partial ^{2}v \over \partial x_{1}\,\partial x_{2}}\\[6pt]&+{\partial ^{2}u \over \partial x_{1}\,\partial x_{2}}\cdot {\partial v \over \partial x_{3}}+{\partial ^{2}u \over \partial x_{1}\,\partial x_{3}}\cdot {\partial v \over \partial x_{2}}+{\partial ^{2}u \over \partial x_{2}\,\partial x_{3}}\cdot {\partial v \over \partial x_{1}}+{\partial ^{3}u \over \partial x_{1}\,\partial x_{2}\,\partial x_{3}}\cdot v.\end{aligned}}}
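The n = 3 expansion above can be reproduced by summing over all subsets programmatically; the following SymPy sketch checks it.

```python
import sympy as sp
from itertools import combinations

x1, x2, x3 = sp.symbols('x1 x2 x3')
u = sp.Function('u')(x1, x2, x3)
v = sp.Function('v')(x1, x2, x3)
vars_ = (x1, x2, x3)

def d(f, S):
    """Mixed partial of f with respect to the variables in S (f itself if S is empty)."""
    return sp.diff(f, *S) if S else f

# Sum over all subsets S of {x1, x2, x3}: (partials of u over S) * (partials of v over the complement)
total = sum(d(u, S) * d(v, tuple(t for t in vars_ if t not in S))
            for r in range(4) for S in combinations(vars_, r))
assert sp.simplify(sp.diff(u * v, x1, x2, x3) - total) == 0
```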
Banach space
Suppose X, Y, and Z are Banach spaces (which include Euclidean spaces) and B : X × Y → Z is a continuous bilinear operator. Then B is differentiable, and its derivative at the point (x,y) in X × Y is the linear map D(x,y)B : X × Y → Z given by
{\displaystyle (D_{\left(x,y\right)}\,B)\left(u,v\right)=B\left(u,y\right)+B\left(x,v\right)\qquad \forall (u,v)\in X\times Y.}
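In finite dimensions this can be illustrated numerically with the bilinear form B(x, y) = xᵀAy; the matrix A, the test vectors, and the tolerance below are arbitrary choices for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 5))      # B(x, y) = x^T A y is bilinear: R^4 x R^5 -> R
B = lambda x, y: x @ A @ y

x, y = rng.standard_normal(4), rng.standard_normal(5)
u, v = rng.standard_normal(4), rng.standard_normal(5)

# Directional derivative of B at (x, y) along (u, v) vs. B(u, y) + B(x, v)
t = 1e-6
numeric = (B(x + t * u, y + t * v) - B(x, y)) / t
assert abs(numeric - (B(u, y) + B(x, v))) < 1e-4
```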
Derivations in abstract algebra
In abstract algebra, the product rule is used to define what is called a derivation, not vice versa.
In vector calculus
The product rule extends to scalar multiplication, dot products, and cross products of vector functions, as follows.[5]
Scalar multiplication:
{\displaystyle (f\cdot \mathbf {g} )'=f'\cdot \mathbf {g} +f\cdot \mathbf {g} '}
For dot products:
{\displaystyle (\mathbf {f} \cdot \mathbf {g} )'=\mathbf {f} '\cdot \mathbf {g} +\mathbf {f} \cdot \mathbf {g} '}
For cross products:
{\displaystyle (\mathbf {f} \times \mathbf {g} )'=\mathbf {f} '\times \mathbf {g} +\mathbf {f} \times \mathbf {g} '}
There are also analogues for other derivative operators: if f and g are scalar fields, then there is a product rule with the gradient:
{\displaystyle \nabla (f\cdot g)=\nabla f\cdot g+f\cdot \nabla g}
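The dot and cross product rules can be verified componentwise; a SymPy sketch:

```python
import sympy as sp

t = sp.symbols('t')
f = sp.Matrix([sp.Function(f'f{i}')(t) for i in (1, 2, 3)])
g = sp.Matrix([sp.Function(f'g{i}')(t) for i in (1, 2, 3)])

# Dot product: (f . g)' = f' . g + f . g'
assert sp.simplify(sp.diff(f.dot(g), t)
                   - (sp.diff(f, t).dot(g) + f.dot(sp.diff(g, t)))) == 0

# Cross product: (f x g)' = f' x g + f x g'
diff_cross = sp.diff(f.cross(g), t) - (sp.diff(f, t).cross(g) + f.cross(sp.diff(g, t)))
assert sp.simplify(diff_cross) == sp.zeros(3, 1)
```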
Applications
Among the applications of the product rule is a proof that
{\displaystyle {d \over dx}x^{n}=nx^{n-1}}
when n is a positive integer (this rule is true even if n is not positive or is not an integer, but the proof of that must rely on other methods). The proof is by mathematical induction on the exponent n. If n = 0 then xⁿ is constant and nxⁿ⁻¹ = 0. The rule holds in that case because the derivative of a constant function is 0. If the rule holds for any particular exponent n, then for the next value, n + 1, we have
{\displaystyle {\begin{aligned}{d \over dx}x^{n+1}&{}={d \over dx}\left(x^{n}\cdot x\right)\\[12pt]&{}=x{d \over dx}x^{n}+x^{n}{d \over dx}x\qquad {\text{(the product rule is used here)}}\\[12pt]&{}=x\left(nx^{n-1}\right)+x^{n}\cdot 1\qquad {\text{(the induction hypothesis is used here)}}\\[12pt]&{}=(n+1)x^{n}.\end{aligned}}}
Therefore, if the proposition is true for n, it is true also for n + 1, and therefore for all natural n.
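The induction argument can be spot-checked for small exponents:

```python
import sympy as sp

x = sp.symbols('x')
for n in range(1, 6):
    # d/dx x^n = n x^(n-1), checked for n = 1, ..., 5
    assert sp.simplify(sp.diff(x**n, x) - n * x**(n - 1)) == 0
```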