
Proof of the Existence of an Optimal Solution in the Equation Form of Linear Programming Problems

Theorem 1

$$
\begin{matrix} \text{Maximize} & \mathbf{c}^{T} \mathbf{x} \\ \text{subject to} & A \mathbf{x} = \mathbf{b} \\ & \mathbf{x} \ge \mathbf{0} \end{matrix}
$$

Let the linear programming problem be given in equation form with matrices $A \in \mathbb{R}^{m \times n}$, $\mathbf{b} \in \mathbb{R}^{m \times 1}$, and $\mathbf{c} \in \mathbb{R}^{n}$. If at least one feasible solution exists and the objective function $\mathbf{c}^{T} \mathbf{x}$ is bounded above on the set of feasible solutions, then an optimal solution exists.


  • $\mathbf{c}^{T}$ denotes the transpose.
  • A feasible solution is a solution that satisfies the constraints, regardless of optimality.
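
As a quick numerical check of this setting, here is a minimal sketch that solves a small equation-form LP with SciPy's `linprog` (the data $A$, $\mathbf{b}$, $\mathbf{c}$ are made up for illustration; `linprog` minimizes, so we negate $\mathbf{c}$ to maximize):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical equation-form data: m = 2 constraints, n = 4 variables
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [2.0, 1.0, 0.0, 1.0]])
b = np.array([4.0, 5.0])
c = np.array([3.0, 2.0, 0.0, 0.0])

# Maximize c^T x  <=>  minimize -c^T x; bounds=(0, None) encodes x >= 0
res = linprog(-c, A_eq=A, b_eq=b, bounds=(0, None), method="highs")
print(res.status)          # 0 = optimal found, 2 = infeasible, 3 = unbounded
print(res.x, -res.fun)     # an optimal x and the maximum value of c^T x
```

If the problem were infeasible or unbounded above, `res.status` would flag it; the theorem says that excluding exactly these two cases guarantees an optimum.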

Proof

Strategy: This proof is essentially the same as the proof of the theorem that if an optimal solution exists, then one of them is a basic feasible solution.

Definition of basic feasible solution: A feasible solution $\mathbf{x} = \left( x_{1} , \cdots , x_{n} \right) \in \mathbb{R}^{n}$ is called a basic feasible solution if there exists a set $B \subseteq [n]$ with cardinality $m$ such that $A_{B}^{-1}$ exists and $x_{j} = 0$ for all $j \notin B$. This set $B$ is called the basis.
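
Since the proof leans on there being only finitely many basic feasible solutions, a brute-force enumeration makes the definition concrete. A sketch, assuming $A$ has full row rank and reusing the hypothetical data from above:

```python
from itertools import combinations
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 0.0],
              [2.0, 1.0, 0.0, 1.0]])
b = np.array([4.0, 5.0])
m, n = A.shape

for B in combinations(range(n), m):        # candidate bases B ⊆ [n], |B| = m
    cols = list(B)
    A_B = A[:, cols]
    if np.linalg.matrix_rank(A_B) < m:     # skip: A_B^{-1} must exist
        continue
    x = np.zeros(n)
    x[cols] = np.linalg.solve(A_B, b)      # x_j = 0 for every j ∉ B
    if np.all(x >= -1e-9):                 # keep only the feasible ones, x >= 0
        print(B, np.round(x, 6))
```

At most $\binom{n}{m}$ candidate bases exist, which is exactly why the number of basic feasible solutions is finite.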

For any feasible solution $\mathbf{x}_{0}$, among the feasible solutions $\mathbf{x}$ that satisfy
$$\mathbf{c}^{T} \mathbf{x}_{0} \le \mathbf{c}^{T} \mathbf{x},$$
let $\tilde{\mathbf{x}}$ denote one with the greatest number of zero components.

  • If $\tilde{\mathbf{x}}$ is shown to be a basic feasible solution, then the existence of an optimal solution follows: there are only finitely many basic feasible solutions, and the one with the largest objective value dominates every feasible solution.
  • In turn, since every optimal solution is first of all a feasible solution, there exists an (optimal) basic feasible solution with the same objective function value; therefore at least one optimal solution must be a basic feasible solution.

Now, define the index set
$$K = \left\{ k \in [n] : \left( \tilde{\mathbf{x}} \right)_{k} > 0 \right\}$$
and consider whether the column vectors of $A_{K}$, the submatrix of $A$ consisting of the columns indexed by $K$, are linearly independent or not.
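
Deciding between the two cases below amounts to a rank computation. A sketch with a hypothetical feasible $\tilde{\mathbf{x}}$:

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 0.0],
              [2.0, 1.0, 0.0, 1.0]])
x_tilde = np.array([1.0, 3.0, 0.0, 0.0])   # feasible: A @ x_tilde = (4, 5)

K = np.flatnonzero(x_tilde > 0)            # indices of positive components
A_K = A[:, K]                              # columns of A indexed by K
print(K, np.linalg.matrix_rank(A_K) == len(K))  # True -> Case 1, False -> Case 2
```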


Case 1. Linear Independence

If the columns of $A_{K}$ are linearly independent, then $|K| \le m$, and (using the standard assumption that $A$ has full row rank $m$ in equation form) $K$ can be extended to a set $B$ of cardinality $m$ for which $A_{B}^{-1}$ exists. By the definition of a basic feasible solution, $\tilde{\mathbf{x}}$ is then itself a basic feasible solution, and there is nothing further to prove.
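
The extension of $K$ to a full basis can be done greedily: keep adjoining columns of $A$ that preserve linear independence until $m$ columns are reached. A sketch under the full-row-rank assumption (the helper name `extend_to_basis` is made up):

```python
import numpy as np

def extend_to_basis(A, K):
    """Greedily extend an independent column set K to a basis B with |B| = m."""
    m = A.shape[0]
    B = list(K)
    for j in range(A.shape[1]):
        if len(B) == m:
            break
        if j not in B and np.linalg.matrix_rank(A[:, B + [j]]) == len(B) + 1:
            B.append(j)                    # column j is independent of A_B
    return sorted(B)

A = np.array([[1.0, 1.0, 1.0, 0.0],
              [2.0, 1.0, 0.0, 1.0]])
print(extend_to_basis(A, [2]))             # e.g. [0, 2]: A_B is invertible
```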


Case 2. Linear Dependence

If they are linearly dependent rather than linearly independent, then there exists some nonzero $|K|$-dimensional vector $\mathbf{v} \in \mathbb{R}^{|K|}$ that satisfies the following.
$$A_{K} \mathbf{v} = \mathbf{0}$$
We can extend it to a vector $\mathbf{w} \in \mathbb{R}^{n}$ that satisfies $\mathbf{w}_{K} = \mathbf{v}$ and $A \mathbf{w} = A_{K} \mathbf{v} = \mathbf{0}$. For example, if $K = \left\{ 2,3,5 \right\}$ and $\mathbf{v} = (-2, 1, 7)$, then $\mathbf{w} = (0,-2,1,0,7)$ is the vector whose $2,3,5$-th components are those of $\mathbf{v}$ and whose remaining components are $0$. Moreover, by flipping the sign of $\mathbf{v}$ if necessary, we may assume
$$\mathbf{c}^{T} \mathbf{w} \ge 0.$$
From now on, let $\mathbf{w}$ denote a vector satisfying the above.
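
In code, such a $\mathbf{v}$ can be read off from the null space of $A_{K}$; a sketch with hypothetical data whose columns in $K$ are dependent (`scipy.linalg.null_space` is assumed available):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 1.0, 2.0, 0.0],
              [2.0, 1.0, 3.0, 1.0]])       # columns 0, 1, 2 are dependent
c = np.array([3.0, 2.0, 0.0, 0.0])
K = np.array([0, 1, 2])

v = null_space(A[:, K])[:, 0]              # nonzero v with A_K v = 0
w = np.zeros(A.shape[1])
w[K] = v                                   # w_K = v, zeros elsewhere
if c @ w < 0:
    w = -w                                 # flip sign so that c^T w >= 0
print(np.round(w, 6), A @ w)               # A w = 0 as required
```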

Now, consider the following vector $\mathbf{x}(t)$ depending on $t \ge 0$.
$$\mathbf{x} (t) = \tilde{\mathbf{x}} + t \mathbf{w}$$
First, $\tilde{\mathbf{x}}$ is a feasible solution, and since $\mathbf{w}$ satisfies $A \mathbf{w} = A_{K} \mathbf{v} = \mathbf{0}$, for all $t \ge 0$
$$\begin{align*} A \mathbf{x} (t) =& A \tilde{\mathbf{x}} + t A \mathbf{w} \\ =& A \tilde{\mathbf{x}} \\ =& \mathbf{b} \end{align*}$$
is satisfied. In other words, $\mathbf{x} (t)$ satisfies all the constraints except possibly $\mathbf{x}(t) \ge \mathbf{0}$. Meanwhile, writing the $j$-th component of $\mathbf{x} (t)$ as $x_{j}(t) = \tilde{x}_{j} + t w_{j}$, we may assume at least one $w_{j} < 0$. Indeed, if $w_{j} \ge 0$ for all $j$, then $\mathbf{x} (t) = \tilde{\mathbf{x}} + t \mathbf{w} \ge \mathbf{0}$ for every $t \ge 0$, making all $\mathbf{x} (t)$ feasible; if $\mathbf{c}^{T} \mathbf{w} > 0$, this contradicts the assumption that the objective function is bounded above, and if $\mathbf{c}^{T} \mathbf{w} = 0$, we may replace $\mathbf{w}$ by $-\mathbf{w}$, which has a negative component since $\mathbf{v} \ne \mathbf{0}$. So let $w_{j_{0}} < 0$ for some $j_{0}$. Then, as we increase $t$ from $0$, there is a first $t_{0} \ge 0$ at which some component vanishes:
$$x_{j_{0}} \left( t_{0} \right) = \tilde{x}_{j_{0}} + t_{0} w_{j_{0}} = 0$$
Then $\mathbf{x} \left( t_{0} \right)$ is still feasible and has at least one more zero component than $\tilde{\mathbf{x}}$ (the components that were already zero remain zero, since $w_{j} = 0$ for $j \notin K$), while it also satisfies
$$\begin{align*} \mathbf{c}^{T} \mathbf{x} \left( t_{0} \right) =& \mathbf{c}^{T} \tilde{\mathbf{x}} + t_{0} \mathbf{c}^{T} \mathbf{w} \\ \ge & \mathbf{c}^{T} \mathbf{x}_{0} + t_{0} \mathbf{c}^{T} \mathbf{w} \\ \ge & \mathbf{c}^{T} \mathbf{x}_{0} \end{align*}$$
This contradicts the definition of $\tilde{\mathbf{x}}$, which was chosen to have the greatest number of zero components among the feasible solutions satisfying $\mathbf{c}^{T} \mathbf{x}_{0} \le \mathbf{c}^{T} \mathbf{x}$.
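
The choice of $t_{0}$ is the classic ratio test; a sketch continuing the hypothetical data above, with a feasible $\tilde{\mathbf{x}}$ for the same $A$ (so $\mathbf{b} = A\tilde{\mathbf{x}} = (4, 6)$):

```python
import numpy as np

x_tilde = np.array([1.0, 1.0, 1.0, 0.0])   # feasible point, all of K positive
w = np.array([1.0, 1.0, -1.0, 0.0])        # A w = 0 and c^T w >= 0

neg = w < 0                                # components that shrink as t grows
t0 = np.min(x_tilde[neg] / -w[neg])        # first t where a component hits 0
print(t0, x_tilde + t0 * w)                # x(t0) = (2, 2, 0, 0): one more zero
```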


Uniqueness of the basic feasible solution: The basic feasible solution is uniquely determined by the basis $B$.

Consequently, the columns of $A_{K}$ cannot be linearly dependent, so $\tilde{\mathbf{x}}$ must be a basic feasible solution. Since the basic feasible solution determined by each choice of basis is unique and there are only finitely many possible bases, there are only finitely many basic feasible solutions; the one with the largest objective function value then dominates every feasible solution, so an optimal solution exists. Moreover, since every optimal solution is also a feasible solution, for any optimal solution there exists a basic feasible solution with the same objective function value, which is therefore optimal as well; hence at least one optimal solution must be a basic feasible solution.

See Also

Basic Theorem of Linear Programming

This theorem is used in the proof of the Basic Theorem of Linear Programming.


  1. Matousek. (2007). Understanding and Using Linear Programming: p46.