# Complementary slackness
```{theorem, label="weak-duality-cs"}
Let (P) and (D) denote a primal-dual pair of
linear programming problems in generic form as defined
[previously](#primal-dual).
Let \(\vec{x}^*\) be a feasible solution to (P)
and let \(\vec{y}^*\) be a feasible solution to (D). Then
the following hold:
1. \(\vec{c}^\T\vec{x}^* \geq \vec{y^*}^\T\vec{b}\).
2. \(\vec{x}^*\) and \(\vec{y}^*\) are optimal solutions to
the respective problems if and only if the following
conditions (known as the **complementary slackness conditions**)
hold:
\[
\begin{array}{rcll}
x^*_j = 0 & \text{ or } & \vec{y^*}^\T \mm{A}_j = c_j & \text{ for }
j = 1,\ldots, n \\
y^*_i = 0 & \text{ or } & {\vec{a}^{(i)}}^\T
\vec{x^*} = b_i & \text{ for } i = 1,\ldots, m
\end{array}
\]
```
Part 1 of the theorem is known as **weak duality**. Part 2 of the
theorem is often called the **Complementary Slackness Theorem**.
*Proof* of Theorem \@ref(thm:weak-duality-cs).
Note that if \(x^*_j\) is constrained to be nonnegative,
its corresponding
dual constraint is \(\vec{y^*}^\T \mm{A}_j \leq c_j\).
Hence, \((c_j - \vec{y^*}^\T \mm{A}_j)x^*_j \geq 0\) with
equality if and only if \(x^*_j = 0\) or
\(\vec{y^*}^\T \mm{A}_j = c_j\) (or both).
If \(x^*_j\) is constrained to be nonpositive,
its corresponding
dual constraint is \(\vec{y^*}^\T \mm{A}_j \geq c_j\).
Hence, \((c_j - \vec{y^*}^\T \mm{A}_j)x^*_j \geq 0\) with
equality if and only if \(x^*_j = 0\) or
\(\vec{y^*}^\T \mm{A}_j = c_j\) (or both).
If \(x^*_j\) is free, its corresponding
dual constraint is \(\vec{y^*}^\T \mm{A}_j = c_j\).
Hence, \((c_j - \vec{y^*}^\T \mm{A}_j)x^*_j = 0\).
We can combine these three cases and obtain that
\((\vec{c}^\T - \vec{y^*}^\T \mm{A})\vec{x^*}
=
\displaystyle\sum_{j = 1}^n (c_j - \vec{y^*}^\T \mm{A}_j) x^*_j
\geq 0\) with equality if and only if
for each \(j = 1,\ldots, n\),
\[x^*_j = 0 \text{ or } \vec{y^*}^\T \mm{A}_j = c_j.\]
(Here, the usage of “or” is not exclusive.)
Similarly,
\(\vec{y^*}^\T(\mm{A}\vec{x^*} - \vec{b})
=
\displaystyle\sum_{i = 1}^m y^*_i({\vec{a}^{(i)}}^\T \vec{x^*} - b_i)
\geq 0\) with equality if and only if
for each \(i = 1,\ldots, m\),
\[y^*_i = 0 \text{ or } {\vec{a}^{(i)}}^\T \vec{x^*} = b_i.\]
(Again, the usage of “or” is not exclusive.)
Adding the inequalities
\((\vec{c}^\T - \vec{y^*}^\T \mm{A})\vec{x^*} \geq 0\) and
\(\vec{y^*}^\T(\mm{A}\vec{x^*} - \vec{b}) \geq 0\),
and noting that the \(\vec{y^*}^\T\mm{A}\vec{x^*}\) terms cancel,
we obtain
\(\vec{c}^\T\vec{x}^* - \vec{y^*}^\T\vec{b} \geq 0\)
with equality if and only if the complementary slackness conditions hold.
By strong duality, \(\vec{x}^*\) is optimal for (P)
and \(\vec{y}^*\) is optimal for (D) if and only
if \(\vec{c}^\T \vec{x}^* = \vec{y^*}^\T\vec{b}.\) The result now follows.
\(\qed\)
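The decomposition used in the proof lends itself to a direct numerical
check. The following R chunk is a minimal sketch (the helper
`duality_gap_terms` and its argument names are illustrative, not part of
the text): given the data of a primal-dual pair in generic form and
feasible \(\vec{x}\) and \(\vec{y}\), it computes the componentwise
products appearing above, whose total equals the duality gap
\(\vec{c}^\T\vec{x} - \vec{y}^\T\vec{b}\).
```{r duality-gap-terms}
# Sketch: split the duality gap c^T x - y^T b of a primal-dual feasible
# pair into the componentwise products used in the proof. Under primal
# and dual feasibility every entry of both vectors is nonnegative, and
# all entries vanish exactly when complementary slackness holds.
duality_gap_terms <- function(A, b, cost, x, y) {
  primal_terms <- drop(cost - y %*% A) * x  # (c_j - y^T A_j) x_j
  dual_terms   <- y * drop(A %*% x - b)     # y_i (a^(i)T x - b_i)
  list(primal = primal_terms, dual = dual_terms,
       gap = sum(primal_terms) + sum(dual_terms))
}
```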
The complementary slackness conditions give a characterization of
optimality that can be useful in solving certain problems,
as illustrated by the following example.
```{example}
Let (P) denote the following linear programming problem:
\[\begin{array}{rrcrcrlll}
\mbox{min} & 2 x_1 & +& 4x_2 & + & 2x_3 \\
\mbox{s.t.}
& x_1 & + & x_2 & + & 3x_3 & \leq & 1 \\
& -x_1 & + & 2x_2 & + & x_3 & \geq & 1 \\
& & & 3x_2 & - & 6x_3 & = & 0 \\
& x_1 & , & & & x_3 & \geq & 0 \\
& & & x_2 & & & & \mbox{free.}
\end{array}\]
Is \(\vec{x}^*
=\left[\begin{array}{c} 0\\\frac{2}{5}\\ \frac{1}{5} \end{array}\right]\)
an optimal solution to (P)?
One could answer this question by solving (P) and checking
whether the objective function value of \(\vec{x}^*\), assuming that its
feasibility has already been verified, equals the optimal
value. However, there is a way to make use of the given information to
save some work.
Let (D) denote the dual problem of (P):
\[\begin{array}{rrcrcrlll}
\max & y_1 & +& y_2 & & \\
\mbox{s.t.}
& y_1 & - & y_2 & & & \leq & 2 \\
& y_1 & + & 2y_2 & + & 3y_3 & = & 4 \\
& 3y_1 & + & y_2 & - & 6y_3 & \leq & 2 \\
& y_1 & & & & & \leq & 0 \\
& & & y_2 & & & \geq & 0 \\
& & & & & y_3 & & \mbox{free.}
\end{array}\]
One can check that \(\vec{x}^*\) is a feasible solution to (P).
If \(\vec{x}^*\) is optimal, then there must exist a feasible
solution \(\vec{y}^*\) to (D) that, together with \(\vec{x}^*\),
satisfies the complementary slackness conditions:
\[
\begin{array}{llr}
y_1^* = 0 & \mbox{ or } & x_1^* + x_2^* + 3x_3^* = 1 \\
y_2^* = 0 & \mbox{ or } & -x_1^* + 2x_2^* + x_3^* = 1 \\
y_3^* = 0 & \mbox{ or } & 3x_2^* - 6x_3^* = 0 \\
x_1^* = 0 & \mbox{ or } & y_1^* - y_2^* = 2 \\
x_2^* = 0 & \mbox{ or } & y_1^* + 2y_2^* + 3y_3^* = 4 \\
x_3^* = 0 & \mbox{ or } & 3y_1^* + y_2^* - 6y_3^* = 2. \\
\end{array}
\]
As \(x_2^*, x_3^* > 0\), satisfying the above conditions requires that
\begin{align*}
y_1^* + 2y_2^* + 3y_3^* &= 4 \\
3y_1^* + y_2^* - 6y_3^* &= 2.
\end{align*}
Solving for \(y_2^*\) and \(y_3^*\) in terms of \(y_1^*\) gives
\(y_2^* = 2 - y_1^*\), \(y_3^* = \frac{1}{3}y_1^*\).
To make \(\vec{y}^*\) feasible for (D),
we can set \(y_1^* = 0\) to obtain the feasible solution
\(y_1^* = 0, y_2^* = 2\), \(y_3^* = 0\).
We can check that this \(\vec{y}^*\) satisfies the complementary
slackness conditions with \(\vec{x}^*\).
Hence, \(\vec{x}^*\) is an optimal solution to (P)
by Theorem \@ref(thm:weak-duality-cs), part 2.
```
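As a sanity check, the arithmetic in this example can be verified
numerically. The chunk below is a minimal sketch in base R, reusing the
illustrative `duality_gap_terms` helper defined after the proof, with
the data of (P) and the pair \(\vec{x}^*\), \(\vec{y}^*\) found above.
```{r cs-example-check}
A <- matrix(c( 1, 1,  3,
              -1, 2,  1,
               0, 3, -6), nrow = 3, byrow = TRUE)
b     <- c(1, 1, 0)
cost  <- c(2, 4, 2)
xstar <- c(0, 2/5, 1/5)
ystar <- c(0, 2, 0)

cs <- duality_gap_terms(A, b, cost, xstar, ystar)
cs$primal  # (c_j - y*^T A_j) x_j*: all zero up to rounding
cs$dual    # y_i* (a^(i)T x* - b_i): all zero up to rounding
abs(sum(cost * xstar) - sum(ystar * b)) < 1e-9  # both objective values equal 2
```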