Mixed-integer programming is used when some of the decision variables must be integer-valued or, more importantly, when decisions must be made as part of the solution procedure, and these "internal decisions" can only be represented using discrete (typically binary) decision variables.
| Model type | Abbreviation | Description |
| --- | --- | --- |
| Mixed-integer program | MIP | NLP + some integer variables |
| Mixed-integer linear program | MILP | LP + some integer variables |
| Integer linear program | ILP | LP + all integer variables |
| Binary integer program | BIP | LP + all binary variables |
In the following, the focus will be on MILP because it has the widest range of applications. Due to the difficulty of implementing effective solution procedures, MIP with nonlinear models is usually restricted to a handful of special NLP forms like quadratic programming (QP).
$ \begin{eqnarray} \mbox{Maximize} \quad 6x_1 + 8x_2 \\ \mbox{subject to} \quad 2x_1 + 3x_2 &\le& 11 \\ 2x_1 &\le& 7 \\ x_1, x_2 &\ge& 0 \mbox{, integer} \end{eqnarray} $
For a maximization problem, solving the problem as an LP without the integer restrictions (the LP relaxation) provides an upper bound (UB) on the optimal integer-restricted solution. Initially, the lower bound (LB) is zero because the variables are restricted to being nonnegative. Once the first integer-feasible solution is found (termed the incumbent solution), its objective value becomes the current lower bound. The difference between the UB and LB defines the gap, and the search continues through the branch and bound tree until the gap falls below a threshold. Nodes that are infeasible or that cannot improve on the incumbent are fathomed, that is, eliminated from the search tree.
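In symbols, if $z^{*}$ denotes the optimal integer-restricted objective value, then for a maximization the search maintains

$ LB \le z^{*} \le UB, \qquad \mbox{gap} = UB - LB, \qquad \mbox{relative gap} = \frac{UB - LB}{|UB|} $

where the relative gap shown is one common definition; solvers differ slightly in the denominator they use.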
First, solve as an LP (branch node 0 in tree):
# Node 0: Initial LP
using JuMP, GLPK
m = Model(GLPK.Optimizer)
@variable(m, 0 <= x₁ ) # Continuous variable
@variable(m, 0 <= x₂ )
@objective(m, Max, 6x₁ + 8x₂ )
@constraint(m, 2x₁ + 3x₂ <= 11 )
@constraint(m, 2x₁ <= 7 )
print(m)
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(m)
UB = objective_value(m)
LB = 0
println("\nObj: ", objective_value(m), ", x₁: ", value(x₁), ", x₂: ", value(x₂))
println(" UB: ", UB, ", LB: ", LB, ", Gap: ", UB - LB, "\n")
GLPK Simplex Optimizer 5.0 2 rows, 2 columns, 3 non-zeros * 0: obj = -0.000000000e+000 inf = 0.000e+000 (2) * 2: obj = 3.166666667e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND Obj: 31.666666666666664, x₁: 3.5, x₂: 1.3333333333333333 UB: 31.666666666666664, LB: 0, Gap: 31.666666666666664
Next, select one of the fractional decision variables and branch on it, adding two nodes to the tree, each with one added integer-bound constraint. The branching variable can be selected as the most fractional, the least fractional, or by many other criteria; the most-fractional rule is sketched below.
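As a quick illustration (not part of the branching steps that follow), the most-fractional rule can be applied to the node-0 values computed above by measuring each variable's distance from its nearest integer:

# Most-fractional selection using the node-0 relaxation values
xvals = [value(x₁), value(x₂)]        # [3.5, 1.333...]
frac  = abs.(xvals .- round.(xvals))  # distance to nearest integer: [0.5, 0.333...]
argmax(frac)                          # 1, so branch on x₁ first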
# Node 1
@constraint(m, c1, x₁ <= 3)
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(m)
UB = objective_value(m)
println("\nObj: ", objective_value(m), ", x₁: ", value(x₁), ", x₂: ", value(x₂))
println(" UB: ", UB, ", LB: ", LB, ", Gap: ", UB - LB, "\n")
GLPK Simplex Optimizer 5.0 3 rows, 2 columns, 4 non-zeros 2: obj = 3.166666667e+001 inf = 5.000e-001 (1) 3: obj = 3.133333333e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND Obj: 31.333333333333336, x₁: 3.0, x₂: 1.6666666666666667 UB: 31.333333333333336, LB: 0, Gap: 31.333333333333336
# Node 2: Incumbent (integer-feasible solution) found
@constraint(m, c2, x₂ <= 1)
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(m)
LB = objective_value(m)
println("\nObj: ", objective_value(m), ", x₁: ", value(x₁), ", x₂: ", value(x₂))
println(" UB: ", UB, ", LB: ", LB, ", Gap: ", UB - LB, "\n")
GLPK Simplex Optimizer 5.0 4 rows, 2 columns, 5 non-zeros 3: obj = 3.133333333e+001 inf = 6.667e-001 (1) 4: obj = 2.600000000e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND Obj: 26.0, x₁: 3.0, x₂: 1.0000000000000002 UB: 31.333333333333336, LB: 26.0, Gap: 5.333333333333336
# Node 3
delete(m, c2)
@constraint(m, c3, x₂ >= 2)
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(m)
UB = objective_value(m)
println("\nObj: ", objective_value(m), ", x₁: ", value(x₁), ", x₂: ", value(x₂))
println(" UB: ", UB, ", LB: ", LB, ", Gap: ", UB - LB, "\n")
GLPK Simplex Optimizer 5.0 4 rows, 2 columns, 5 non-zeros 4: obj = -0.000000000e+000 inf = 2.000e+000 (1) 5: obj = 1.600000000e+001 inf = 0.000e+000 (0) * 7: obj = 3.100000000e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND Obj: 31.0, x₁: 2.5, x₂: 2.0 UB: 31.0, LB: 26.0, Gap: 5.0
# Node 4
@constraint(m, c4, x₁ <= 2)
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(m)
UB = objective_value(m)
println("\nObj: ", objective_value(m), ", x₁: ", value(x₁), ", x₂: ", value(x₂))
println(" UB: ", UB, ", LB: ", LB, ", Gap: ", UB - LB, "\n")
GLPK Simplex Optimizer 5.0 5 rows, 2 columns, 6 non-zeros 7: obj = 3.100000000e+001 inf = 5.000e-001 (1) 8: obj = 3.066666667e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND Obj: 30.666666666666668, x₁: 2.0, x₂: 2.3333333333333335 UB: 30.666666666666668, LB: 26.0, Gap: 4.666666666666668
# Node 5: New incumbent found
@constraint(m, c5, x₂ <= 2)
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(m)
LB = objective_value(m)
println("\nObj: ", objective_value(m), ", x₁: ", value(x₁), ", x₂: ", value(x₂))
println(" UB: ", UB, ", LB: ", LB, ", Gap: ", UB - LB, "\n")
GLPK Simplex Optimizer 5.0 6 rows, 2 columns, 7 non-zeros 8: obj = 3.066666667e+001 inf = 3.333e-001 (1) 9: obj = 2.800000000e+001 inf = 0.000e+000 (0) * 10: obj = 2.800000000e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND Obj: 28.0, x₁: 2.0, x₂: 2.0 UB: 30.666666666666668, LB: 28.0, Gap: 2.666666666666668
# Node 6: New incumbent found, could stop since absolute gap < 1
delete(m, c5)
@constraint(m, c6, x₂ >= 3)
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(m)
LB = objective_value(m)
println("\nObj: ", objective_value(m), ", x₁: ", value(x₁), ", x₂: ", value(x₂))
println(" UB: ", UB, ", LB: ", LB, ", Gap: ", UB - LB, "\n")
GLPK Simplex Optimizer 5.0 6 rows, 2 columns, 7 non-zeros 10: obj = -0.000000000e+000 inf = 5.000e+000 (2) 12: obj = 2.400000000e+001 inf = 0.000e+000 (0) * 14: obj = 3.000000000e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND Obj: 30.0, x₁: 1.0, x₂: 3.0 UB: 30.666666666666668, LB: 30.0, Gap: 0.6666666666666679
# Node 7: Fathomed (node deleted from tree) since infeasible
delete(m, [c4, c6])
@constraint(m, c7, x₁ >= 3)
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(m)
GLPK Simplex Optimizer 5.0 5 rows, 2 columns, 6 non-zeros 14: obj = -0.000000000e+000 inf = 5.000e+000 (2) 17: obj = 3.133333333e+001 inf = 3.333e-001 (1) LP HAS NO PRIMAL FEASIBLE SOLUTION
# Node 8: Fathomed, infeasible; can stop since the entire tree has been searched
delete(m, [c1, c3, c7])
@constraint(m, c8, x₁ >= 4)
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(m)
GLPK Simplex Optimizer 5.0 3 rows, 2 columns, 4 non-zeros 19: obj = -0.000000000e+000 inf = 4.000e+000 (1) 20: obj = 2.100000000e+001 inf = 5.000e-001 (1) LP HAS NO PRIMAL FEASIBLE SOLUTION
At node 6, the absolute value of the gap was less than one; since all of the objective coefficients are integers, no integer solution can improve on the incumbent by less than one, so the procedure could stop. In practice, usually only the relative percentage gap is monitored, and the search continues until the gap approaches zero or the complete tree has been searched; as a result, in practice, nodes 7 and 8 might be evaluated before being fathomed. Since the branch and bound tree can grow exponentially, in many applications the search is stopped once the relative gap falls below some value, 1% for example (see the sketch below).
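For reference, the relative gap at this point can be computed from the UB and LB above, and a 1% stopping tolerance can be requested from GLPK through its mip_gap option; mip_gap is a field of GLPK's glp_iocp control structure and is assumed here to be accepted as a raw parameter in the same way as msg_lev:

# Relative gap using the bounds from node 6: (30.67 - 30.0)/30.67, about 2.2%
rel_gap = (UB - LB) / abs(UB)
# Ask GLPK to stop its own search once the relative MIP gap is below 1%
# (mip_gap assumed to be passed through as a raw parameter, like msg_lev)
set_optimizer_attribute(m, "mip_gap", 0.01)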
Solve as an ILP (integer linear program) directly:
m = Model(GLPK.Optimizer)
@variable(m, 0 <= x₁, Int ) # Integer variable
@variable(m, 0 <= x₂, Int )
@objective(m, Max, 6x₁ + 8x₂ )
@constraint(m, 2x₁ + 3x₂ <= 11 )
@constraint(m, 2x₁ <= 7 )
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(m)
println("\nObj: ", objective_value(m), ", x₁: ", value(x₁), ", x₂: ", value(x₂), "\n")
GLPK Simplex Optimizer 5.0 2 rows, 2 columns, 3 non-zeros * 0: obj = -0.000000000e+000 inf = 0.000e+000 (2) * 2: obj = 3.166666667e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND GLPK Integer Optimizer 5.0 2 rows, 2 columns, 3 non-zeros 2 integer variables, none of which are binary Integer optimization begins... Long-step dual simplex will be used + 2: mip = not found yet <= +inf (1; 0) Solution found by heuristic: 28 + 3: >>>>> 3.000000000e+001 <= 3.000000000e+001 0.0% (2; 0) + 3: mip = 3.000000000e+001 <= tree is empty 0.0% (0; 3) INTEGER OPTIMAL SOLUTION FOUND Obj: 30.0, x₁: 1.0, x₂: 3.0
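Before reading off the values from a solver-driven run like this one, it is good practice to confirm that the search actually finished at a proven optimum; JuMP reports this through its status functions:

# Confirm the MIP search terminated with an optimal, queryable solution
termination_status(m)   # MOI.OPTIMAL
primal_status(m)        # MOI.FEASIBLE_POINT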
Here, the GLPK message level GLPK.GLP_MSG_DBG is used instead of GLPK.GLP_MSG_ALL so that the details of the branch and bound procedure are output:
m = Model(GLPK.Optimizer)
@variable(m, 0 <= x₁, Int ) # Integer variable
@variable(m, 0 <= x₂, Int )
@objective(m, Max, 6x₁ + 8x₂ )
@constraint(m, 2x₁ + 3x₂ <= 11 )
@constraint(m, 2x₁ <= 7 )
set_optimizer_attribute(m, "msg_lev", GLPK.GLP_MSG_DBG)
optimize!(m)
println("\nObj: ", objective_value(m), ", x₁: ", value(x₁), ", x₂: ", value(x₂), "\n")
GLPK Simplex Optimizer 5.0 2 rows, 2 columns, 3 non-zeros * 0: obj = -0.000000000e+000 inf = 0.000e+000 (2) * 2: obj = 3.166666667e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND GLPK Integer Optimizer 5.0 2 rows, 2 columns, 3 non-zeros 2 integer variables, none of which are binary Integer optimization begins... Long-step dual simplex will be used ------------------------------------------------------------------------ Processing node 1 at level 0 + 2: mip = not found yet <= +inf (1; 0) Solving LP relaxation... GLPK Simplex Optimizer 5.0 2 rows, 2 columns, 3 non-zeros # 2: obj = 3.166666667e+001 inf = 0.000e+000 (1) # 3: obj = 3.133333333e+001 inf = 0.000e+000 (0) 0% OPTIMAL LP SOLUTION FOUND Found optimal solution to LP relaxation Local bound is 3.000000000e+001 There is one fractional column, integer infeasibility is 3.333e-001 branch_drtom: column 2 chosen to branch on branch_drtom: down-branch bound is 2.600000000e+001 branch_drtom: up-branch bound is 3.066666667e+001 Branching on column 2, primal value is 1.666666667e+000 Node 2 begins down branch, node 3 begins up branch ------------------------------------------------------------------------ Processing node 3 at level 1 + 3: mip = not found yet <= 3.000000000e+001 (2; 0) Solving LP relaxation... GLPK Simplex Optimizer 5.0 2 rows, 2 columns, 3 non-zeros # 3: obj = 3.066666667e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND Found optimal solution to LP relaxation Local bound is 3.000000000e+001 There is one fractional column, integer infeasibility is 3.333e-001 Solution found by heuristic: 28 branch_drtom: column 2 chosen to branch on branch_drtom: down-branch bound is 2.800000000e+001 branch_drtom: up-branch bound is 3.000000000e+001 Down-branch is hopeless + 3: mip = 2.800000000e+001 <= 3.000000000e+001 7.1% (2; 0) Solving LP relaxation... GLPK Simplex Optimizer 5.0 2 rows, 2 columns, 3 non-zeros # 3: obj = 3.000000000e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND Found optimal solution to LP relaxation Local bound is 3.000000000e+001 There are no fractional columns New integer feasible solution found + 3: >>>>> 3.000000000e+001 <= 3.000000000e+001 0.0% (2; 0) Node 3 fathomed One hopeless branch has been pruned Active list is empty! + 3: mip = 3.000000000e+001 <= tree is empty 0.0% (0; 3) INTEGER OPTIMAL SOLUTION FOUND Obj: 30.0, x₁: 1.0, x₂: 3.0
JuMP supports a large number of solvers (full list). The following table lists some of the most well-known commercial and open-source MIP solvers:
| Solver | Notes |
| --- | --- |
| Cplex | IBM, first commercial solver |
| Gurobi | Developed by Robert Bixby |
| FICO Xpress | Used by LLamasoft |
| SAS/OR | Part of SAS system (not supported in JuMP) |
| Cbc | COIN-OR open-source solver |
| GLPK | Free Software Foundation GNU open-source solver |
Presolve: Eliminate redundant variables and constraints; e.g.,
$2x_1 + 2x_2 \le 1, x_1, x_2 \ge 0 \mbox{ and integer} \implies x_1 = x_2 = 0$, since any $x_i \ge 1$ would make the left-hand side at least 2
Cutting planes: Add constraints that keep all integer solutions but cut off fractional LP solutions (e.g., Gomory cuts), as illustrated below
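As a small illustration of the rounding idea behind such cuts (a Chvátal-Gomory-style cut, not necessarily the cut GLPK itself would generate), the constraint $2x_1 \le 7$ together with $x_1$ integer gives $x_1 \le 3.5$ and hence the valid cut $x_1 \le 3$; adding it to the LP relaxation of the earlier example (rebuilt here as a fresh model mc with variables y₁, y₂, names chosen just for this sketch) tightens the bound from 31.67 to 31.33 without removing any integer solution:

# LP relaxation of the earlier example with one rounding cut added
using JuMP, GLPK
mc = Model(GLPK.Optimizer)
@variable(mc, 0 <= y₁ )
@variable(mc, 0 <= y₂ )
@objective(mc, Max, 6y₁ + 8y₂ )
@constraint(mc, 2y₁ + 3y₂ <= 11 )
@constraint(mc, 2y₁ <= 7 )
@constraint(mc, y₁ <= 3 )     # cut: 2y₁ <= 7 and y₁ integer imply y₁ <= floor(3.5) = 3
optimize!(mc)
objective_value(mc)           # 31.33..., versus 31.67 without the cut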
In an LP, no decision variable can be restricted to being integer-valued, and no decisions can be made (internally) as part of the solution procedure. In practice, what usually necessitates representing a problem as a MILP is that one or more discrete decisions need to be made as part of the solution procedure. This leads to the following uses of discrete decision variables in a MILP:
In a MILP, some of the decision variables can be restricted to being integer-valued so that they can better represent model entities that are inherently discrete.
In Ex 4 in LP 3, although the number of employees is inherently a discrete value, the total number of employees was high enough that just rounding the continuous LP solution was reasonable. In this example, the number of employees is small enough that rounding introduces significant error in the solution, so instead the number of employees is treated as an integer value:
using JuMP, GLPK
b = [15, 16, 11, 17, 13, 15, 19] # Only 10% of values used in Ex 4 in LP 3
model = Model(GLPK.Optimizer)
@variable(model, 0 <= x[1:7] ) # Continuous-valued variables
@objective(model, Min, sum(x[i] for i = 1:7) )
for i = 1:7
    # Day i is covered by employees starting on days i+3 through i+7 (indices wrap around to 1-7)
    @constraint(model, sum(x[Int(j - 7*floor((j - 1)/7))] for j = i+3:i+7) >= b[i] )
end
set_optimizer_attribute(model, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(model)
Nᵒ = objective_value(model)
xᵒ = value.(x)
Nᵒ, xᵒ
GLPK Simplex Optimizer 5.0 7 rows, 7 columns, 35 non-zeros 0: obj = 0.000000000e+000 inf = 1.060e+002 (7) 7: obj = 2.600000000e+001 inf = 0.000e+000 (0) * 9: obj = 2.233333333e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND
(22.33333333333333, [0.0, 3.3333333333333335, 0.33333333333333326, 5.999999999999998, 5.333333333333332, 0.0, 7.333333333333334])
sum(ceil.(xᵒ)) # Round-up variables (not optimal)
25.0
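Rounding down instead of up is not an option here: a quick check (using Julia's mod1 for the same wraparound indexing as the constraint loop) shows that the rounded-down schedule violates at least one day's requirement, which is why the round-up value of 25 is the natural, but not optimal, rounded schedule:

# Rounding down violates a coverage constraint (day 2 is short-staffed), so this returns false
xflo = floor.(xᵒ)
all(sum(xflo[mod1(j, 7)] for j = i+3:i+7) >= b[i] for i = 1:7)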
model = Model(GLPK.Optimizer)
@variable(model, 0 <= x[1:7], Int ) # Integer-valued variables
@objective(model, Min, sum(x[i] for i = 1:7) )
for i = 1:7
    @constraint(model, sum(x[Int(j - 7*floor((j - 1)/7))] for j = i+3:i+7) >= b[i] )
end
set_optimizer_attribute(model, "msg_lev", GLPK.GLP_MSG_ALL)
optimize!(model)
Nᵒ = objective_value(model)
xᵒ = value.(x)
Nᵒ, xᵒ
GLPK Simplex Optimizer 5.0 7 rows, 7 columns, 35 non-zeros 0: obj = 0.000000000e+000 inf = 1.060e+002 (7) 7: obj = 2.600000000e+001 inf = 0.000e+000 (0) * 9: obj = 2.233333333e+001 inf = 0.000e+000 (0) OPTIMAL LP SOLUTION FOUND GLPK Integer Optimizer 5.0 7 rows, 7 columns, 35 non-zeros 7 integer variables, none of which are binary Integer optimization begins... Long-step dual simplex will be used + 9: mip = not found yet >= -inf (1; 0) Solution found by heuristic: 24 + 26: >>>>> 2.300000000e+001 >= 2.300000000e+001 0.0% (16; 0) + 26: mip = 2.300000000e+001 >= tree is empty 0.0% (0; 31) INTEGER OPTIMAL SOLUTION FOUND
(23.0, [0.0, 2.0, 6.0, 1.0, 5.0, 1.0, 8.0])
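Note that because the objective $\sum_i x_i$ is integer-valued for any integer-feasible solution, the LP relaxation value of 22.33 already implies that the integer optimum can be no smaller than $\lceil 22.33 \rceil = 23$; the MILP attains this bound, so the schedule above is provably optimal, two employees fewer than the rounded-up LP solution.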
In Ex 1 in LP 2, the sizes of individual coffee beans and peppers are small enough relative to the size of the knapsack that representing the quantities of both as continuous variables seems reasonable:
using JuMP, GLPK
model = Model(GLPK.Optimizer)
@variable(model, 0 <= x₁ )
@variable(model, 0 <= x₂ )
@objective(model, Max, 4x₁ + 9x₂ ) # $4 and $9 profit per pound
@constraint(model, 2x₁ + 6x₂ <= 120 ) # $2 and $6 cost per pound
@constraint(model, x₁ + x₂ <= 30 )
optimize!(model)
Πᵒ, x₁ᵒ, x₂ᵒ = objective_value(model), value(x₁), value(x₂)
(195.0, 14.999999999999998, 15.0)
If only a few units of each item are being considered, then just rounding a continuous variable may lead to a nonoptimal result; instead, integer-valued variables can be used to represent the quantity of each item. For example, if the coffee and peppers could only be purchased in 4-pound bags, then rounding the continuous result to get the number of bags would give a result that exceeds the 30-pound weight limit:
round(x₁ᵒ/4), round(x₂ᵒ/4),
4*round(x₁ᵒ/4) + 4*round(x₂ᵒ/4),
4*round(x₁ᵒ/4) + 4*round(x₂ᵒ/4) <= 30
(4.0, 4.0, 32.0, false)
In order to correctly account for the 4-pound bags, the model can include integer restrictions on the $x_i$, which makes it a MILP, where the quantities $x_i$ now represent the numbers of bags instead of pounds purchased; this reduces the total profit from \$195 to \$192:
model = Model(GLPK.Optimizer)
@variable(model, 0 <= x₁, Int ) # Integer-valued variable
@variable(model, 0 <= x₂, Int ) # Integer-valued variable
@objective(model, Max, 4*4x₁ + 4*9x₂ ) # $16 and $36 profit per bag
@constraint(model, 4*2x₁ + 4*6x₂ <= 120 ) # $8 and $24 cost per bag
@constraint(model, 4x₁ + 4x₂ <= 30 )
optimize!(model)
Πᵒ, x₁ᵒ, x₂ᵒ = objective_value(model), value(x₁), value(x₂)
(192.0, 3.0, 4.0)
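As a quick sanity check in the style of the rounding check above, the bag solution can be verified against the weight limit directly (using the x₁ᵒ, x₂ᵒ values just computed):

# Total pounds purchased, weight limit check, and profit: (28.0, true, 192.0)
4x₁ᵒ + 4x₂ᵒ,
4x₁ᵒ + 4x₂ᵒ <= 30,
4*4x₁ᵒ + 4*9x₂ᵒ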
Gurobi is a commercial MIP solver that is free for students and is able to solve some MIPs that might be difficult for the open-source GLPK solver used above. You will not need Gurobi for any of the assignments in this class, but you may want to use it if you find that GLPK cannot solve your MILP model.
To install, you will need to download a copy of the Individual Academic License of Gurobi to your local machine. You will need to create an account using your @ncsu.edu email and then access your free named-user academic license while connected to the NCSU network on campus. If you are off-campus and want to connect to the NCSU network using the AnyConnect VPN, you will need to first request full tunnel VPN access via this form - https://ncsu.service-now.com/oit_help?id=sc_cat_item_no_crumbs&sys_id=d58ac9b9db47ef00564ef9051d9619b8 (the normal VPN is split tunnel). On Windows, Gurobi can be installed in the default c:\gurobi912 folder, and the license file gurobi.lic should be stored in this location. After registering the software with the license, you will be able to use Gurobi anywhere (you only need to be on the NCSU network for the initial download).
After closing the window, you can then run add Gurobi and build Gurobi from the Julia package REPL in order to access Gurobi in Jupyter and allow JuMP to use it as its solver. You can test your installation by running the following:
using JuMP, Gurobi
model = Model(Gurobi.Optimizer)
@variable(model, 0 <= x₁ )
@variable(model, 0 <= x₂ )
@objective(model, Max, 4x₁ + 9x₂ )
@constraint(model, 2x₁ + 6x₂ <= 120 )
@constraint(model, x₁ + x₂ <= 30 )
set_optimizer_attribute(model, "OutputFlag", 1) # Set 0 for minimal output
optimize!(model)
Πᵒ, x₁ᵒ, x₂ᵒ = objective_value(model), value(x₁), value(x₂)
Academic license - for non-commercial use only - expires 2022-07-31 Gurobi Optimizer version 9.1.2 build v9.1.2rc0 (win64) Thread count: 4 physical cores, 8 logical processors, using up to 8 threads Optimize a model with 2 rows, 2 columns and 4 nonzeros Model fingerprint: 0xb0d5dff6 Coefficient statistics: Matrix range [1e+00, 6e+00] Objective range [4e+00, 9e+00] Bounds range [0e+00, 0e+00] RHS range [3e+01, 1e+02] Presolve time: 0.00s Presolved: 2 rows, 2 columns, 4 nonzeros Iteration Objective Primal Inf. Dual Inf. Time 0 1.3000000e+31 3.000000e+30 1.300000e+01 0s 2 1.9500000e+02 0.000000e+00 0.000000e+00 0s Solved in 2 iterations and 0.00 seconds Optimal objective 1.950000000e+02 User-callback calls 32, time in user-callback 0.01 sec
(195.0, 15.0, 15.0)