Newton's and Maclaurin's Inequalities
Symmetric functions and averages
Symmetric (or elementary symmetric) polynomials arise in Viète's formulas for the roots of polynomial equations. The formulas relate the coefficients of the polynomial to its roots. An $n\text{-th}$ degree polynomial has $n$ roots and $n+1$ coefficients. A polynomial is said to be monic when its leading coefficient equals $1.$
For a set $X_n=\{x_1,x_2,\ldots,x_n\}$ of $n$ variables, denote by $X_{n}^{k}$ the set of all products of the $n$ variables taken $k$ at a time. There are $n\choose k$ such products: $\left|X_{n}^{k}\right|={n\choose k}.$
The symmetric functions in $n$ variables are defined as$\displaystyle\begin{align} e_0(x_1,x_2,\ldots ,x_n) &= 1,\\ e_1(x_1,x_2,\ldots ,x_n) &= \sum_{t\in X_{n}^{1}}t = x_1+x_2+\ldots+x_n,\\ e_2(x_1,x_2,\ldots ,x_n) &= \sum_{t\in X_{n}^{2}}t = x_1x_2+x_1x_3+\ldots+x_{n-1}x_n,\\ &\cdots \cdots\\ e_n(x_1,x_2,\ldots ,x_n) &= \sum_{t\in X_{n}^{n}}t = x_1x_2\cdots x_n, \end{align}$
and their averages as
$\displaystyle\begin{align} E_0(X) &= E_0(x_1,x_2,\ldots ,x_n) = \frac{1}{{n\choose 0}}e_0(x_1,x_2,\ldots ,x_n),\\ E_1(X) &= E_1(x_1,x_2,\ldots ,x_n) = \frac{1}{{n\choose 1}}e_1(x_1,x_2,\ldots ,x_n),\\ E_2(X) &= E_2(x_1,x_2,\ldots ,x_n) = \frac{1}{{n\choose 2}}e_2(x_1,x_2,\ldots ,x_n),\\ &\cdots \cdots\\ E_n(X) &= E_n(x_1,x_2,\ldots ,x_n) = \frac{1}{{n\choose n}}e_n(x_1,x_2,\ldots ,x_n). \end{align}$
Newton's and Maclaurin's inequalities deal with those averages, although they can be rewritten in terms of the symmetric functions themselves.
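For concreteness, here is a minimal Python sketch (my illustration, not part of the original article) that computes the $e_k$ and $E_k$ straight from the definitions; the function names `elementary_symmetric` and `average_E` are ad hoc.

```python
from itertools import combinations
from math import comb, prod

def elementary_symmetric(xs, k):
    # e_k: the sum of all products of the variables taken k at a time
    return sum(prod(c) for c in combinations(xs, k))

def average_E(xs, k):
    # E_k: e_k divided by the number C(n, k) of terms in the sum
    return elementary_symmetric(xs, k) / comb(len(xs), k)

xs = [1, 2, 3, 4]
print([elementary_symmetric(xs, k) for k in range(len(xs) + 1)])  # [1, 10, 35, 50, 24]
print([average_E(xs, k) for k in range(len(xs) + 1)])             # [1.0, 2.5, ...]
```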
Newton's and Maclaurin's Inequalities
To formulate both Newton's inequality and Maclaurin's inequality we assume that all $x_i,$ $i=1,\ldots,n$ are non-negative.
Newton's inequality
$\left(E_k(X)\right)^2\ge E_{k-1}(X)E_{k+1}(X),$ $1\le k\le n-1.$
Maclaurin's inequality
$E_1(X)\ge \left(E_{2}(X)\right)^\frac{1}{2}\ge \left(E_{3}(X)\right)^\frac{1}{3}\ge\ldots\ge \left(E_{n}(X)\right)^\frac{1}{n}.$
The inequalities become equalities only when all $x_i$ are equal.
Note that $E_1(X)\ge \left(E_{n}(X)\right)^\frac{1}{n}$ is none other than the better-known AM-GM inequality.
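As a quick numerical sanity check (again my addition; it reuses the ad hoc helper `average_E` from the sketch above), both chains can be tested on random non-negative tuples:

```python
import random

def check_newton_maclaurin(xs):
    n = len(xs)
    E = [average_E(xs, k) for k in range(n + 1)]
    # Newton: E_k^2 >= E_{k-1} E_{k+1} for 1 <= k <= n-1
    newton = all(E[k]**2 >= E[k-1] * E[k+1] for k in range(1, n))
    # Maclaurin: E_1 >= E_2^(1/2) >= ... >= E_n^(1/n)
    roots = [E[k]**(1.0 / k) for k in range(1, n + 1)]
    maclaurin = all(roots[i] >= roots[i + 1] for i in range(n - 1))
    return newton and maclaurin

print(all(check_newton_maclaurin([random.uniform(0, 10) for _ in range(5)])
          for _ in range(1000)))  # expected: True
```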
Newton's inequality implies that of Maclaurin
Indeed, raise the $i\text{-th}$ of Newton's inequalities to the power $i$ and multiply the results for $i=1,2,\ldots,k$:
$\displaystyle (E_0E_2)(E_1E_3)^2(E_2E_4)^3\cdots (E_{k-1}E_{k+1})^k\le (E_1)^2(E_2)^4\cdots (E_k)^{2k},$
which reduces to $(E_k)^{k+1}\ge (E_{k+1})^k,$ or $\displaystyle (E_k)^{\frac{1}{k}}\ge (E_{k+1})^\frac{1}{k+1}.$
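To spell out the last reduction (this bookkeeping is not in the original text, but follows directly from it): on the left-hand side each $E_m$ with $1\le m\le k-1$ appears with total exponent $2m,$ so that
$\displaystyle \prod_{i=1}^{k}\left(E_{i-1}E_{i+1}\right)^{i}=E_0\,(E_k)^{k-1}(E_{k+1})^{k}\prod_{m=1}^{k-1}(E_m)^{2m}.$
Cancelling the factors $(E_m)^{2m},$ $1\le m\le k-1,$ common to both sides, and recalling that $E_0=1,$ leaves $(E_k)^{k-1}(E_{k+1})^{k}\le(E_k)^{2k},$ that is, $(E_{k+1})^{k}\le(E_k)^{k+1}.$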
Preliminaries for a proof of Newton's inequality
We'll prove a somewhat more general statement, following Niculescu (see the reference below):
Suppose that $\alpha,\beta\gt 0$ are real numbers and $j,k\in\{0,1,\ldots,n\}$ are integers such that
$\alpha+\beta=1$ and $j\alpha+k\beta\in\{0,1,\ldots,n\}.$
Then
$E_{j\alpha+k\beta}(X)\ge(E_j(X))^{\alpha}(E_k(X))^{\beta},$
for every $n$-tuple $X$ of non-negative real numbers. Moreover, equality occurs if and only if all $x_i$ are equal.
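As a numerical spot-check of this more general claim (my illustration; `average_E` is the ad hoc helper defined earlier), take $n=4,$ $j=1,$ $k=4,$ $\alpha=\frac{2}{3},$ $\beta=\frac{1}{3},$ so that $j\alpha+k\beta=2$ and the claim reads $E_2\ge E_1^{2/3}E_4^{1/3}:$

```python
import random

def check_general(xs, j, k, alpha, beta):
    # assumes alpha + beta = 1 and j*alpha + k*beta is an integer in {0, ..., n}
    m = round(j * alpha + k * beta)
    return average_E(xs, m) >= average_E(xs, j)**alpha * average_E(xs, k)**beta

print(all(check_general([random.uniform(0, 10) for _ in range(4)], 1, 4, 2/3, 1/3)
          for _ in range(1000)))  # expected: True
```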
The proof is by induction on the number of variables $n.$
According to Rolle's theorem, if all roots of a polynomial $P(x)$ are real (respectively, real and distinct), then the same is true of its derivative $P'(x).$ This remark will be applied to the polynomial
$\displaystyle P_X(x)=\prod_{i=1}^{n}(x-x_i)=\sum_{k=0}^{n}(-1)^{k}e_{k}(x_1,x_2,\ldots,x_n)x^{n-k}.$
If $y_1,y_2,\ldots,y_{n-1}$ are the roots of the derivative $P_X'(x),$ then the $(n-1)\text{-tuple}$ $X'=\{y_1,y_2,\ldots,y_{n-1}\}$ is said to be derived from the $n\text{-tuple}$ $X.$
Observe that
$\displaystyle\prod_{i=1}^{n-1}(x-y_i)=\sum_{k=0}^{n-1}(-1)^ke_{k}(y_1,\ldots,y_{n-1})x^{n-1-k}$
and also that
$\displaystyle\begin{align} \prod_{i=1}^{n-1}(x-y_i) &=\frac{1}{n}\frac{dP_X(x)}{dx}\\ &= \sum_{k=0}^{n-1}(-1)^k\frac{n-k}{n}e_{k}(x_1,\ldots,x_{n})x^{n-1-k}. \end{align}$
By comparing the two expressions we conclude that $\displaystyle e_{k}(y_1,\ldots,y_{n-1})=\frac{n-k}{n}e_{k}(x_1,\ldots,x_{n}).$ Since $\displaystyle{n\choose k}=\frac{n}{n-k}{n-1\choose k},$ dividing both sides by $\displaystyle{n-1\choose k}$ gives the following
Lemma 1
$E_k(X)=E_k(X'),$ for every $k\in\{0,1,\ldots,|X|-1\}.$
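Lemma 1 is easy to illustrate numerically (my addition, using NumPy's root finder and the ad hoc `average_E` from the sketch above); the roots of $P_X'$ are real for real $x_i,$ so any tiny imaginary parts returned by the solver are discarded:

```python
import numpy as np

def derived_tuple(xs):
    coeffs = np.poly(xs)                     # coefficients of P_X, highest degree first
    return list(np.roots(np.polyder(coeffs)).real)

xs = [1.0, 2.0, 4.0, 8.0]
ys = derived_tuple(xs)
for k in range(len(ys) + 1):                 # k = 0, 1, ..., |X| - 1
    print(k, average_E(xs, k), average_E(ys, k))   # the two columns agree up to rounding
```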
Another simple and useful result is the following
Lemma 2
Assume that $X$ is an $n-\text{tuple}$ of real numbers that does not contain $0.$ Define $\displaystyle X^{-1}=\{\frac{1}{a}:\;a\in X\}.$ Then
$\displaystyle E_k(X^{-1})=\frac{E_{n-k}(X)}{E_n(X)},$
for every $k\in\{0,1,\ldots,n\}.$
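Lemma 2 admits the same kind of numerical check (my addition; `average_E` as before):

```python
xs = [1.0, 2.0, 4.0, 8.0]
inv = [1.0 / a for a in xs]
n = len(xs)
for k in range(n + 1):
    print(k, average_E(inv, k), average_E(xs, n - k) / average_E(xs, n))  # should agree
```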
The induction
Now, for $|X|=2,$ we have to prove just one inequality:
$\displaystyle x_1x_2\le\left(\frac{x_1+x_2}{2}\right)^2,$
which is the AM-GM inequality for $n=2,$ with equality only when $x_1=x_2.$
Assume that the assertion holds for all $m\text{-tuples}$ with $m\le n-1.$ Let $|X|=n\ge 3,$ with all $x_i$ non-negative, let $j,k\in\{0,1,\ldots,n\}$ be integers, and let $\alpha,\beta$ be positive real numbers such that
$\alpha+\beta=1$ and $j\alpha+k\beta\in\{0,1,\ldots,n\}.$
According to Lemma 1, combined with the induction hypothesis applied to the derived $(n-1)\text{-tuple}$ $X',$
$E_{j\alpha+k\beta}(X)\ge\left(E_j(X)\right)^{\alpha}\cdot\left(E_k(X)\right)^{\beta},$
except possibly for the case where $j\lt k=n,$ or $k\lt j=n.$ Assume, for example, that $j\lt k=n;$ then necessarily $j\alpha+n\beta\lt n.$ We have to show that
$E_{j\alpha+n\beta}(X)\ge\left(E_j(X)\right)^{\alpha}\cdot\left(E_n(X)\right)^{\beta}.$
If $0\in X,$ then $E_n(X)=0,$ and the inequality is obvious. The equality occurs only if $E_{j\alpha+n\beta}(X')=E_{j\alpha+n\beta}(X)=0,$ i.e. (according to the induction hypothesis), when all entries of $X$ coincide.
If none of the $x_i$ is $0,$ then, by Lemma 2, we have to prove that
$E_{n-j\alpha-n\beta}(X^{-1})\ge\left(E_{n-j}(X^{-1})\right)^{\alpha},$
or equivalently (due to Lemma 1),
$E_{n-j\alpha-n\beta}((X^{-1})')\ge\left(E_{n-j}((X^{-1})')\right)^{\alpha}.$
Since $n-j\alpha-n\beta=(n-j)\alpha+0\cdot\beta$ and $E_0=1,$ the latter is the inequality of our statement with $j,k$ replaced by $n-j,0,$ applied to the $(n-1)\text{-tuple}$ $(X^{-1})',$ and so it holds by virtue of our induction hypothesis.
References
Constantin P. Niculescu, A New Look at Newton's Inequalities, Journal of Inequalities in Pure and Applied Mathematics, Volume 1, Issue 2, Article 17, 2000

