Linear Programming Duality
Buildup
For $x_{1} , x_{2} \ge 0$, let’s say we have the following linear programming problem. $$ \begin{matrix} \text{Maximize} & & 2x_{1} & + & 3x_{2} \\ \text{subject to} & & 4x_{1} & + & 8x_{2} & \le & 12 \\ & & 2x_{1} & + & x_{2} & \le & 3 \\ & & 3x_{1} & + & 2x_{2} & \le & 4 \end{matrix} $$
Our goal is to find the optimal solution $x_{1}^{\ast}, x_{2}^{\ast}$ that maximizes the objective function $\zeta = 2x_{1} + 3x_{2}$ under the given constraints. However, even without solving the problem, the constraints themselves reveal an upper bound on the maximum value. Since $x_{1}, x_{2} \ge 0$, the first constraint gives $$ 2x_{1} + 3x_{2} \le 4x_{1} + 8x_{2} \le 12 $$ and if both sides of $4x_{1} + 8x_{2} \le 12$ are instead divided by $2$, $$ 2x_{1} + 3x_{2} \le 2x_{1} + 4x_{2} \le 6 $$ so the value of the objective function cannot exceed $6$. Moreover, combining the first and second constraints, $$ 2x_{1} + 3x_{2} = {{ 1 } \over { 3 }} \left( 4x_{1} + 8x_{2} + 2x_{1} + x_{2} \right) \le {{ 1 } \over { 3 }} (12 + 3) = 5 $$ the upper bound can be reduced even further. In general, whenever the constraints yield an inequality of the form $$ d_{1} x_{1} + d_{2} x_{2} \le h $$ with $d_{1} \ge 2$ and $d_{2} \ge 3$, the value $h$ is an upper bound on the objective function, and we can try to push $h$ as low as possible. Since the objective function is bounded, finding the smallest such upper bound is itself another optimization problem. Intuitively, whether we solve the original maximization problem or the problem of minimizing the upper bound, we should arrive at the same conclusion.
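As a quick sanity check of the hand-derived bounds, we can solve the small example numerically. The sketch below assumes SciPy is available and uses `scipy.optimize.linprog`, which minimizes by default, so the objective is negated.

```python
# Solve the buildup example numerically (SciPy assumed available).
import numpy as np
from scipy.optimize import linprog

# Primal: maximize 2x1 + 3x2  ->  minimize -(2x1 + 3x2)
c = np.array([2.0, 3.0])
A = np.array([[4.0, 8.0],
              [2.0, 1.0],
              [3.0, 2.0]])
b = np.array([12.0, 3.0, 4.0])

res = linprog(-c, A_ub=A, b_ub=b,
              bounds=[(0, None), (0, None)], method="highs")
zeta_star = -res.fun  # undo the sign flip to recover the maximum

print(zeta_star)  # 4.75, safely below both hand-derived bounds 6 and 5
print(res.x)      # optimal point (0.5, 1.25)
```

The true optimum $4.75$ sits below both bounds $6$ and $5$ found above, which shows the bounds are valid but not yet tight.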
In particular, the inequality $d_{1} x_{1} + d_{2} x_{2} \le h$ has exactly the form of a constraint of ‘another linear programming problem.’ Therefore, this ‘other linear programming problem’ is defined as follows.
Definition 1
$$ \begin{matrix} \text{Maximize} & \mathbf{c}^{T} \mathbf{x} \\ \text{subject to} & A \mathbf{x} \le \mathbf{b} \\ & \mathbf{x} \ge \mathbf{0} \end{matrix} $$
Suppose a linear programming problem is given as above for a matrix $A \in \mathbb{R}^{m \times n}$ and vectors $\mathbf{b} \in \mathbb{R}^{m}$ and $\mathbf{c} \in \mathbb{R}^{n}$.
- The linear programming problem to find the corresponding minimum upper bound is called the Dual Linear Program. $$ \begin{matrix} \text{Minimize} & \mathbf{b}^{T} \mathbf{y} \\ \text{subject to} & A^{T} \mathbf{y} \ge \mathbf{c} \\ & \mathbf{y} \ge \mathbf{0} \end{matrix} $$
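For the example in the buildup, $A$ is the $3 \times 2$ matrix of constraint coefficients, $\mathbf{b} = (12, 3, 4)^{T}$, and $\mathbf{c} = (2, 3)^{T}$, so the dual problem reads $$ \begin{matrix} \text{Minimize} & & 12y_{1} & + & 3y_{2} & + & 4y_{3} \\ \text{subject to} & & 4y_{1} & + & 2y_{2} & + & 3y_{3} & \ge & 2 \\ & & 8y_{1} & + & y_{2} & + & 2y_{3} & \ge & 3 \\ & & & & & & y_{1}, y_{2}, y_{3} & \ge & 0 \end{matrix} $$ Note that $\mathbf{y} = \left( {{1} \over {3}}, {{1} \over {3}}, 0 \right)^{T}$ is feasible here with objective value $12 \cdot {{1} \over {3}} + 3 \cdot {{1} \over {3}} = 5$, which is exactly the upper bound obtained by hand in the buildup.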
- As an expression opposed to the dual linear programming problem, the original maximization problem is called the Primal Linear Program.
Explanation 2
Visually, one can picture the primal and dual problems approaching the same value from opposite sides: the primal problem pushes the objective value up from below, whereas the dual problem pushes the upper bound down from above. When the two exactly align, we can unequivocally accept that common value as the optimum.
What is Duality?
Duality appears throughout mathematics and, like symmetry, names a property at a higher level of abstraction; examples include the dual space in functional analysis and dual graphs in geometry.
See Also
Duality in Linear Programming
If one is studying optimization as a single subject rather than as a major, knowing the simplex method and duality in linear programming essentially covers the field. Of course, that doesn’t mean the process is easy.
While the simplex method is a concrete computational procedure, duality is a theoretical, purely mathematical statement: every primal problem has a dual problem, and their optimal values coincide. This is guaranteed by the following duality theorems.
Weak Duality Theorem
If $\left( x_{1}^{\ast}, \dots, x_{n}^{\ast} \right)$ is feasible for the primal problem and $\left( y_{1}^{\ast}, \dots, y_{m}^{\ast} \right)$ is feasible for the dual problem, then $$ \sum_{j=1}^{n} c_{j} x_{j}^{\ast} \le \sum_{i=1}^{m} b_{i} y_{i}^{\ast} $$
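In matrix form, the proof of weak duality is a two-step chain that uses nothing beyond feasibility: $$ \mathbf{c}^{T} \mathbf{x}^{\ast} \le \left( A^{T} \mathbf{y}^{\ast} \right)^{T} \mathbf{x}^{\ast} = \left( \mathbf{y}^{\ast} \right)^{T} A \mathbf{x}^{\ast} \le \left( \mathbf{y}^{\ast} \right)^{T} \mathbf{b} $$ The first inequality holds because $A^{T} \mathbf{y}^{\ast} \ge \mathbf{c}$ and $\mathbf{x}^{\ast} \ge \mathbf{0}$, and the second because $A \mathbf{x}^{\ast} \le \mathbf{b}$ and $\mathbf{y}^{\ast} \ge \mathbf{0}$.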
Strong Duality Theorem
If the primal problem has an optimal solution $\left( x_{1}^{\ast}, \dots, x_{n}^{\ast} \right)$, then the dual problem has an optimal solution $\left( y_{1}^{\ast}, \dots, y_{m}^{\ast} \right)$ and $$ \sum_{j=1}^{n} c_{j} x_{j}^{\ast} = \sum_{i=1}^{m} b_{i} y_{i}^{\ast} $$
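Strong duality can be checked numerically on the buildup example by solving both problems and comparing their optimal values. The sketch below assumes SciPy is available; since `linprog` only handles $\le$ constraints, the dual's $A^{T} \mathbf{y} \ge \mathbf{c}$ is rewritten as $-A^{T} \mathbf{y} \le -\mathbf{c}$.

```python
# Numerical check of strong duality on the buildup example (SciPy assumed).
import numpy as np
from scipy.optimize import linprog

A = np.array([[4.0, 8.0],
              [2.0, 1.0],
              [3.0, 2.0]])
b = np.array([12.0, 3.0, 4.0])
c = np.array([2.0, 3.0])

# Primal: max c^T x  s.t.  Ax <= b, x >= 0 (negate c for scipy's minimizer)
primal = linprog(-c, A_ub=A, b_ub=b, method="highs")

# Dual: min b^T y  s.t.  A^T y >= c, y >= 0 (flip signs to get <= form)
dual = linprog(b, A_ub=-A.T, b_ub=-c, method="highs")

print(-primal.fun, dual.fun)  # both 4.75: the optimal values coincide
```

The common value $4.75$ is the minimum upper bound that the buildup was narrowing down toward, improving on the hand-found bound of $5$.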