Sensitivity Analysis and LP Duality in Optimization Methods

Optimization Methods in Energy and Power Systems

Lecture 4: Sensitivity Analysis & LP Duality

Mahdi Pourakbari Kasmaei, 2019
Thursday, 21 March 2019
 
LP Problems (Sensitivity Analysis & LP Duality)

Marginal Values (Shadow Prices and Reduced Costs) and Sensitivity Ranges are used in conducting Sensitivity Analysis.

The Simplex method (used in solvers) provides more information than the optimal solution alone: it also provides the Marginal Values.
 
LP Problems (Sensitivity Analysis)

Marginal values give information on the variability of the Optimal Solution due to changes in the Input Data.

Shadow Prices are associated with the right-hand side (RHS) of the constraints.

Reduced Costs are associated with the decision variables and their corresponding bounds.
 
LP Problems (Sensitivity Analysis)

Only use the concepts of Marginal Values for LP Problems.
 
LP Problems (Shadow Prices)

A one-unit change in the RHS of a constraint results in a change in the Objective Function:

A Positive Shadow Price means that the objective function will increase when the RHS is increased by one unit.

A Negative Shadow Price means that the objective function will decrease when the RHS is increased by one unit.
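Stated compactly (standard LP sensitivity notation, not spelled out on the slide): if z*(b) denotes the optimal objective value as a function of the RHS vector b, the shadow price of constraint i is its marginal effect on z*,

    lambda_i = d z*(b) / d b_i,   so   delta z* ≈ lambda_i * delta b_i

for changes small enough that the same constraints remain binding.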
 
LP Problems (Shadow Prices)

If a constraint is non-binding in the LP problem, changing its RHS results in no change in the Objective Function: its Shadow Price is Zero.

Binding Constraints: the constraints that restrict the optimal solution (the lines intersecting at the optimal point).

Non-Binding Constraints: the constraints that do not affect the Optimal Solution.
LP Problems (Shadow Prices)

The example LP, whose feasible region is plotted on the slide with one Non-binding Constraint indicated:

    min  z = 2*x1 + x2
    s.t. x1 + x2 >= 62
         (2/3)*x1 + x2 <= 68
         2*x1 + 0.3*x2 >= 50
         x1 <= 50
         x2 <= 40
         x1, x2 >= 0
 
LP Problems (Shadow Prices)

Binding or Non-Binding Constraints could become Non-Binding or Binding ones as the data change.
 
LP Problems (Shadow Prices)

How can we take advantage of Shadow Prices?

The objective function is improved by weakening (relaxing) a binding constraint, i.e., enlarging the feasible region.

For Minimization and Maximization problems (in canonical form, with >= and <= constraints respectively), improving means decreasing and increasing the RHS, respectively.
 
LP Problems (Shadow Prices)

Enlarging the feasible region happens by changing the RHS: for a >= constraint the RHS should be decreased, while for a <= constraint the RHS should be increased. Therefore, the signs of the Shadow Prices of binding >= and <= constraints are opposite.
 
LP Problems (Shadow Prices)

As discussed, any Equality Constraint can be transformed into two inequality constraints; in this case, at most one of them will have a nonzero Shadow Price.
 
Solving LP Problems (Shadow Prices)

The sign of the Shadow Price tells how the objective reacts to a change in the RHS of a binding constraint:

Considering a Minimization Problem, if the Shadow Price of a binding Equality Constraint is Negative, it means that the Objective Function will decrease (improve) if the RHS increases.
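As a minimal illustrative sketch (a toy model of my own, not from the lecture; TinyObj, TinyFix and TinyEq are placeholder names), a GAMS minimization with an equality constraint whose marginal comes out negative:

VARIABLES Z;
POSITIVE VARIABLES x1;
EQUATIONS TinyObj, TinyFix;
TinyObj.. Z =e= -x1;
* Fixing x1 through an equality constraint
TinyFix.. x1 =e= 5;
MODEL TinyEq /ALL/;
Solve TinyEq using LP Minimizing Z;
* Expected: Z = -5 and the marginal of TinyFix is -1, i.e. raising the RHS
* from 5 to 6 lowers (improves) the objective to -6, as stated above.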
 
LP Problems (Shadow Prices)

Example 4.1

VARIABLES Z;
POSITIVE VARIABLES x1, x2;
EQUATIONS EqObj, Eq1, Eq2, Eq3, Eq4, Eq5;
EqObj.. z =e= 2*x1 + x2;
Eq1.. x1 + x2 =g= 62;
Eq2.. (2/3)*x1 + x2 =l= 68;
Eq3.. 2*x1 + 0.3*x2 =g= 50;
Eq4.. x1 =l= 50;
Eq5.. x2 =l= 40;
MODEL LP_ED /ALL/;
Solve LP_ED using LP Minimizing Z;

                 LOWER     LEVEL     UPPER    MARGINAL
---- VAR Z       -INF      84.000    +INF        .
---- VAR x1        .       22.000    +INF        .
---- VAR x2        .       40.000    +INF        .

                 LOWER     LEVEL     UPPER    MARGINAL
---- EQU Eq1     62.000    62.000    +INF       2.000
---- EQU Eq2     -INF      54.667    68.000     0.000
---- EQU Eq3     50.000    56.000    +INF       0.000
---- EQU Eq4     -INF      22.000    50.000     0.000
---- EQU Eq5     -INF      40.000    40.000    -1.000

(In the GAMS listing, "." denotes zero; the MARGINAL column gives the shadow price of each equation and the reduced cost of each variable.)
LP Problems (Shadow Prices)

Example 4.1 (continued): in the plot of the feasible region, increasing the RHS of the binding >= constraint Eq1 (marginal 2.000) moves its boundary up and tightens the Feasible Region, while increasing the RHS of the binding <= constraint Eq5 (marginal -1.000) moves its boundary up and enlarges the Feasible Region.
 
LP Problems (Shadow Prices)

Sufficiently Small Changes in the RHS of Non-binding Constraints have no effect on the Objective Function, while a sufficiently small change in the RHS of Binding Constraints results in a change in the Objective Function.

Big Changes may change the nature of Binding and Non-Binding Constraints.
LP Problems (Shadow Prices)

Example 4.2: Let us change the RHSs (Eq1: 62 -> 63).

VARIABLES Z;
POSITIVE VARIABLES x1, x2;
EQUATIONS EqObj, Eq1, Eq2, Eq3, Eq4, Eq5;
EqObj.. z =e= 2*x1 + x2;
Eq1.. x1 + x2 =g= 63;
Eq2.. (2/3)*x1 + x2 =l= 68;
Eq3.. 2*x1 + 0.3*x2 =g= 50;
Eq4.. x1 =l= 50;
Eq5.. x2 =l= 40;
MODEL LP_ED /ALL/;
Solve LP_ED using LP Minimizing Z;

                 LOWER     LEVEL     UPPER    MARGINAL
---- VAR Z       -INF      86.000    +INF        .
---- VAR x1        .       23.000    +INF        .
---- VAR x2        .       40.000    +INF        .

The new objective value is 84 + 2 = 86, as predicted by the shadow price of Eq1 (2.000) in Example 4.1, whose marginals are repeated below for reference:

                 LOWER     LEVEL     UPPER    MARGINAL
---- EQU Eq1     62.000    62.000    +INF       2.000
---- EQU Eq2     -INF      54.667    68.000     0.000
---- EQU Eq3     50.000    56.000    +INF       0.000
---- EQU Eq4     -INF      22.000    50.000     0.000
---- EQU Eq5     -INF      40.000    40.000    -1.000
LP Problems (Shadow Prices)

Example 4.3: Let us change the RHSs (Eq5: 40 -> 41).

VARIABLES Z;
POSITIVE VARIABLES x1, x2;
EQUATIONS EqObj, Eq1, Eq2, Eq3, Eq4, Eq5;
EqObj.. z =e= 2*x1 + x2;
Eq1.. x1 + x2 =g= 62;
Eq2.. (2/3)*x1 + x2 =l= 68;
Eq3.. 2*x1 + 0.3*x2 =g= 50;
Eq4.. x1 =l= 50;
Eq5.. x2 =l= 41;
MODEL LP_ED /ALL/;
Solve LP_ED using LP Minimizing Z;

                 LOWER     LEVEL     UPPER    MARGINAL
---- VAR Z       -INF      83.000    +INF        .
---- VAR x1        .       21.000    +INF        .
---- VAR x2        .       41.000    +INF        .

The new objective value is 84 - 1 = 83, as predicted by the shadow price of Eq5 (-1.000) in Example 4.1, whose marginals are repeated below for reference:

                 LOWER     LEVEL     UPPER    MARGINAL
---- EQU Eq1     62.000    62.000    +INF       2.000
---- EQU Eq2     -INF      54.667    68.000     0.000
---- EQU Eq3     50.000    56.000    +INF       0.000
---- EQU Eq4     -INF      22.000    50.000     0.000
---- EQU Eq5     -INF      40.000    40.000    -1.000
LP Problems (Shadow Prices)

Example 4.4: Let us change the RHSs (Eq4: 50 -> 51).

VARIABLES Z;
POSITIVE VARIABLES x1, x2;
EQUATIONS EqObj, Eq1, Eq2, Eq3, Eq4, Eq5;
EqObj.. z =e= 2*x1 + x2;
Eq1.. x1 + x2 =g= 62;
Eq2.. (2/3)*x1 + x2 =l= 68;
Eq3.. 2*x1 + 0.3*x2 =g= 50;
Eq4.. x1 =l= 51;
Eq5.. x2 =l= 40;
MODEL LP_ED /ALL/;
Solve LP_ED using LP Minimizing Z;

                 LOWER     LEVEL     UPPER    MARGINAL
---- VAR Z       -INF      84.000    +INF        .
---- VAR x1        .       22.000    +INF        .
---- VAR x2        .       40.000    +INF        .

The objective value is 84 + 0 = 84: Eq4 is non-binding, so its shadow price (0.000 in Example 4.1, repeated below for reference) predicts no change:

                 LOWER     LEVEL     UPPER    MARGINAL
---- EQU Eq1     62.000    62.000    +INF       2.000
---- EQU Eq2     -INF      54.667    68.000     0.000
---- EQU Eq3     50.000    56.000    +INF       0.000
---- EQU Eq4     -INF      22.000    50.000     0.000
---- EQU Eq5     -INF      40.000    40.000    -1.000
 
LP Problems (Reduced Costs)

The Marginal Value corresponding to a decision variable is interpreted as its Reduced Cost.

The Reduced Cost stands for the rate of change of the Objective Function for a one-unit increase in the Bound of a Variable.
 
LP Problems (Reduced Costs)

If a non-basic variable has a Positive Reduced Cost, the Objective Function will increase with a one-unit increase in the binding bound, while if a non-basic variable has a Negative Reduced Cost, the Objective Function will decrease.

The Reduced Cost of a basic variable is Zero if its bounds are nonbinding and therefore do not constrain the optimal solution.
 
LP Problems (Reduced Costs)

The Reduced Cost of a variable is non-zero if the variable is at its bound.

How to define bounds in GAMS?

.LO is used to define the lower bound (x1.lo = c1)
.UP is used to define the upper bound (x2.up = c2)
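As a small usage sketch (standard GAMS variable and equation attributes, not shown on the slides), the marginals discussed here can be inspected directly after any of the solves above, for instance after Example 4.1:

* .L holds a level (optimal value); .M holds a marginal, i.e. the shadow
* price of an equation or the reduced cost of a variable.
Display "levels:",        x1.l, x2.l,
        "reduced costs:", x1.m, x2.m,
        "shadow prices:", Eq1.m, Eq2.m, Eq3.m, Eq4.m, Eq5.m;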
LP Problems (Shadow Prices)

Example 4.5: Let us replace the constraints Eq4 and Eq5 with variable bounds.

VARIABLES Z;
POSITIVE VARIABLES x1, x2;
EQUATIONS EqObj, Eq1, Eq2, Eq3;
EqObj.. z =e= 2*x1 + x2;
Eq1.. x1 + x2 =g= 62;
Eq2.. (2/3)*x1 + x2 =l= 68;
Eq3.. 2*x1 + 0.3*x2 =g= 50;
x1.up = 50;
x2.up = 40;
MODEL LP_ED /ALL/;
Solve LP_ED using LP Minimizing Z;

                 LOWER     LEVEL     UPPER    MARGINAL
---- VAR Z       -INF      84.000    +INF        .
---- VAR x1        .       22.000    50          .
---- VAR x2        .       40.000    40        -1.00
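A quick hand check of where the -1.00 comes from (using the data above): raising the upper bound of x2 by one unit lets x2 go from 40 to 41; with Eq1 (x1 + x2 >= 62) still binding, x1 drops from 22 to 21, so the objective changes by

    delta z = 2*(-1) + 1*(+1) = -1,

which is exactly the marginal reported for x2 at its bound.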
LP Problems (Shadow Prices)

Example 4.6: Let us increase the upper bound of x2 by one unit.

VARIABLES Z;
POSITIVE VARIABLES x1, x2;
EQUATIONS EqObj, Eq1, Eq2, Eq3;
EqObj.. z =e= 2*x1 + x2;
Eq1.. x1 + x2 =g= 62;
Eq2.. (2/3)*x1 + x2 =l= 68;
Eq3.. 2*x1 + 0.3*x2 =g= 50;
x1.up = 50;
x2.up = 41;
MODEL LP_ED /ALL/;
Solve LP_ED using LP Minimizing Z;

Result of Example 4.5 (x2.up = 40):

                 LOWER     LEVEL     UPPER    MARGINAL
---- VAR Z       -INF      84.000    +INF        .
---- VAR x1        .       22.000    50          .
---- VAR x2        .       40.000    40        -1.00

With x2.up = 41 the optimal objective becomes 83.000, i.e. 84 - 1, as predicted by the reduced cost of x2 (-1.00).
 
LP Problems (Reduced Costs)

In a Minimization Problem, it is favorable to have a variable with a Negative Reduced Cost.

In a Maximization Problem, it is favorable to have a variable with a Positive Reduced Cost.
 
LP Problems (Sensitivity Analysis)

Analyses based on Marginal Values are only valid for Small Changes.
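To make "small" concrete, here is a hand calculation with the data of Example 4.1 (a sketch, worth verifying in GAMS): for 62 <= b1 <= 82, where b1 is the RHS of Eq1, the same constraints (Eq1 and Eq5) stay binding and

    z*(b1) = 2*(b1 - 40) + 40 = 2*b1 - 40,

so the shadow price 2.000 is valid over that whole range. Beyond b1 = 82, Eq2 becomes binding and the shadow price of Eq1 jumps to 4; past b1 = 68 + 50/3 (roughly 84.7) the problem becomes infeasible, because x1 would have to exceed its limit of 50.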
 
LP Problems (Sensitivity Ranges)

Sensitivity Ranges with a constant Objective Function Value:

The ranges of the Decision Variables under this condition depend on the Objective Function. There can be a unique Optimal Point, or a range of variable values corresponding to the same Optimal Solution.
 
LP Problems (Sensitivity Ranges)

For the red Objective Function, a unique optimal point exists; however, for the green Objective Contour (parallel to an edge of the feasible region), the ranges of the variables are:

    x1 in [22, 50],  x2 in [12, 40]
 
LP Problems (Sensitivity Ranges)

Sensitivity ranges with a Constant Basis:

The range of the Objective Function (its coefficients) is determined by changing the slope of the Objective Function until the Optimal Solution changes (it moves to another Extreme Point).
LP Problems (Sensitivity Ranges)

The ranges of the Objective Function.
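As a sketch of how such a range can be computed for Example 4.1 (a hand calculation, assuming the cost vector c = (2, 1) and the optimum x = (22, 40)): the optimal vertex lies on the edges x2 = 40 and x1 + x2 = 62. Moving along x2 = 40 in direction (+1, 0) changes the objective by c1, and moving along x1 + x2 = 62 in direction (+1, -1) changes it by c1 - c2, so the vertex stays optimal as long as

    c1 >= 0   and   c1 - c2 >= 0.

With c2 = 1 fixed, the basis is therefore unchanged for c1 >= 1. At c1 = c2 the whole edge between (22, 40) and (50, 12) becomes optimal, which matches the variable ranges x1 in [22, 50], x2 in [12, 40] shown earlier; below c1 = 1 the optimum jumps to the extreme point (50, 12).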
 
 
 
LP Problems: Duality
 
 
LP Problems (Duality in Canonical Form)

Primal:  min  z = c^T x
         s.t. A x >= b     (dual variable: y)
              x >= 0

Dual:    max  z_D = b^T y
         s.t. A^T y <= c   (primal variable: x)
              y >= 0
 
 
 
LP Problems (Duality in Canonical Form)

Each constraint of the Primal is associated with a Variable of the Dual, and each Variable of the Primal is associated with a Constraint of the Dual.

If the Primal has m constraints and n variables, the Dual has m variables and n constraints.
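As a minimal GAMS sketch (my own illustration, not part of the lecture; LP_DUAL and the y variables are placeholder names), the dual of Example 4.1 written in canonical form — the <= rows Eq2, Eq4 and Eq5 are multiplied by -1 before dualizing, hence the negative coefficients:

VARIABLES ZD;
POSITIVE VARIABLES y1, y2, y3, y4, y5;
EQUATIONS EqObjD, Dx1, Dx2;
* Dual objective: RHS values of the (sign-adjusted) primal constraints
EqObjD.. ZD =e= 62*y1 - 68*y2 + 50*y3 - 50*y4 - 40*y5;
* One dual constraint per primal variable (columns of the primal)
Dx1.. y1 - (2/3)*y2 + 2*y3 - y4 =l= 2;
Dx2.. y1 - y2 + 0.3*y3 - y5 =l= 1;
MODEL LP_DUAL /EqObjD, Dx1, Dx2/;
Solve LP_DUAL using LP Maximizing ZD;
* Expected: ZD = 84 (the primal optimum), with y1 = 2 and y5 = 1, i.e. the
* absolute values of the Eq1 and Eq5 marginals reported for Example 4.1.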
 
 
 
LP Problems (Duality in Canonical Form)

Example 4.7: Finding the Dual

Primal:  min  z = 6*x1 + 8*x2
         s.t. 3*x1 + x2 >= 4      (y1)
              5*x1 + 2*x2 >= 7    (y2)
              x1, x2 >= 0

Dual:    max  z_D = 4*y1 + 7*y2
         s.t. 3*y1 + 5*y2 <= 6    (x1)
              y1 + 2*y2 <= 8      (x2)
              y1, y2 >= 0
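A quick consistency check (a hand calculation, assuming the data reconstructed above): the primal optimum is x = (7/5, 0) with z = 42/5 = 8.4, and the dual optimum is y = (0, 6/5) with z_D = 42/5 = 8.4, so the two optimal values coincide, as duality requires.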
 
 
LP Problems (Duality in Standard Form)

Primal:  min  z = c^T x
         s.t. A x = b      (dual variable: y)
              x >= 0

Dual:    max  z_D = b^T y
         s.t. A^T y <= c   (primal variable: x)
              y free
 
 
 
LP Problems (Duality in Standard Form)

Example 4.8: Finding the Dual (Example 4.7 rewritten in standard form with surplus variables x3 and x4)

Primal:  min  z = 6*x1 + 8*x2
         s.t. 3*x1 + x2 - x3 = 4     (y1)
              5*x1 + 2*x2 - x4 = 7   (y2)
              x1, x2, x3, x4 >= 0

Dual:    max  z_D = 4*y1 + 7*y2
         s.t. 3*y1 + 5*y2 <= 6   (x1)
              y1 + 2*y2 <= 8     (x2)
              -y1 <= 0           (x3)
              -y2 <= 0           (x4)

Note: although y1 and y2 are free in the standard-form dual, the constraints coming from the surplus variables force y1 >= 0 and y2 >= 0, so the dual of Example 4.7 is recovered.
 
 
 
LP Problems (Duality in General Form)

Primal:  min  z = c1*x1 + c2*x2 + c3*x3
         s.t. a11*x1 + a12*x2 + a13*x3 >= b1   (y1)
              a21*x1 + a22*x2 + a23*x3 <= b2   (y2)
              a31*x1 + a32*x2 + a33*x3  = b3   (y3)
              x1 >= 0,  x2 <= 0,  x3 free

Dual:    max  z_D = b1*y1 + b2*y2 + b3*y3
         s.t. a11*y1 + a21*y2 + a31*y3 <= c1   (x1)
              a12*y1 + a22*y2 + a32*y3 >= c2   (x2)
              a13*y1 + a23*y2 + a33*y3  = c3   (x3)
              y1 >= 0,  y2 <= 0,  y3 free
 
 
 
LP Problems (Duality in General Form)

How to obtain the Dual of a Problem in General Form:

Step 1: Transform the problem into the Canonical Form of a Minimization Problem.

Step 2: Obtain the Dual of the transformed problem from Step 1.
 
 
 
LP Problems (Duality in General Form)

The following relations are inferred from the transformation and from finding the dual (previous slide):

Min Problem                    Max Problem
Constraint >=       <->        Variable >= 0
Constraint <=       <->        Variable <= 0
Constraint  =       <->        Variable free
Variable >= 0       <->        Constraint <=
Variable <= 0       <->        Constraint >=
Variable free       <->        Constraint  =
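To see the table in action, here is a small made-up primal-dual pair (illustrative only, not from the slides):

    Primal:  min  z = x1 + 2*x2
             s.t. x1 + x2 >= 3    (y1)
                  x1 - x2 <= 1    (y2)
                  x1, x2 >= 0

By the table, the >= row gives y1 >= 0, the <= row gives y2 <= 0, and each nonnegative x_j gives a <= dual constraint:

    Dual:    max  z_D = 3*y1 + y2
             s.t. y1 + y2 <= 1    (x1)
                  y1 - y2 <= 2    (x2)
                  y1 >= 0,  y2 <= 0

Both optimal values equal 4, attained at x = (2, 1) and y = (3/2, -1/2).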
 
 
LP Problems (Duality in General Form)

Example 4.9: Finding the Dual
 
 
 
Thanks!
Summary

Sensitivity analysis and LP duality play crucial roles in optimization methods for energy and power systems. Marginal values, shadow prices, and reduced costs provide valuable insights into the variability of the optimal solution and the impact of changes in input data. Understanding shadow prices helps determine the effect of changing constraints on the objective function, with positive shadow prices indicating an increase and negative shadow prices indicating a decrease. Additionally, the distinction between binding and non-binding constraints is essential in optimizing LP problems. Utilizing concepts like marginal values helps in analyzing sensitivity and making informed decisions in energy and power system optimization.




