How to extract optimization problem matrices A,b,c using JuMP in Julia

Asked by 孤人 on 2020-07-22 06:00:17

Question


I create an optimization model in Julia-JuMP using symbolic variables and constraints, e.g. as below:

using JuMP
using CPLEX

# model
Mod = Model(CPLEX.Optimizer) 

# sets
I = 1:2;

# Variables
x = @variable( Mod , [I] , base_name = "x" ) 
y = @variable( Mod , [I] , base_name = "y" )  

# constraints
Con1 = @constraint( Mod , [i in I] , 2 * x[i] + 3 * y[i] <= 100 )

# objective
ObjFun = @objective( Mod , Max , sum( x[i] + 2 * y[i] for i in I) ) ;

# solve 
optimize!(Mod)

I guess JuMP creates the problem in the form minimize c'*x subject to A*x <= b before it is passed to the solver CPLEX. I want to extract the matrices/vectors A, b, c. For the above example I would expect something like:

A
2×4 Array{Int64,2}:
 2  0  3  0
 0  2  0  3

b
2-element Array{Int64,1}:
 100
 100

c
4-element Array{Int64,1}:
 1
 1
 2
 2

In MATLAB, the function prob2struct can do this: https://www.mathworks.com/help/optim/ug/optim.problemdef.optimizationproblem.prob2struct.html

Is there a JuMP function that can do this?


Answer 1:


This is not easily possible as far as I am aware.

The problem is stored in the underlying MathOptInterface (MOI)-specific data structures. For example, constraints are always stored as an MOI.AbstractFunction-in-MOI.AbstractSet pair. The same is true for the MOI.ObjectiveFunction. (See the MOI documentation: https://jump.dev/MathOptInterface.jl/dev/apimanual/#Functions-1)

You can, however, try to recompute the objective function terms and the constraints in matrix-vector form.

For example, assuming you still have your JuMP.Model Mod, you can examine the objective function more closely by typing:

using MathOptInterface
const MOI = MathOptInterface

# this only works if you have a linear objective function (the model has a ScalarAffineFunction as its objective)
obj = MOI.get(Mod, MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}())

# take a look at the terms 
obj.terms
# from this you could extract your vector c
c = zeros(4)
for term in obj.terms
    c[term.variable_index.value] = term.coefficient
end
@show(c)

This indeed gives c = [1.0, 1.0, 2.0, 2.0].

You can do something similar for the underlying MOI constraints.

# list all the constraints present in the model
cons = MOI.get(Mod, MOI.ListOfConstraints())
@show(cons)

In this case we only have one type of constraint, i.e. (MOI.ScalarAffineFunction{Float64} in MOI.LessThan{Float64}).

# get the constraint indices for this combination of F(unction) in S(et)
F = cons[1][1]
S = cons[1][2]
ci = MOI.get(Mod, MOI.ListOfConstraintIndices{F,S}())

You get two constraint indices (stored in the array ci) because there are two constraints for this combination F-in-S. Let's examine the first one more closely:

ci1 = ci[1]
# to get the function and set corresponding to this constraint (index):
moi_backend = backend(Mod)
f = MOI.get(moi_backend, MOI.ConstraintFunction(), ci1)

f is again of type MOI.ScalarAffineFunction, which corresponds to one row a1 of your A = [a1; ...; am] matrix. The row is given by:

a1 = zeros(4)
for term in f.terms
    a1[term.variable_index.value] = term.coefficient
end
@show(a1) # gives [2.0 0 3.0 0] (the first row of your A matrix)

To get the corresponding first entry b1 of your b = [b1; ...; bm] vector, you have to look at the constraint set of that same constraint index ci1:

s = MOI.get(moi_backend, MOI.ConstraintSet(), ci1)
@show(s) # MathOptInterface.LessThan{Float64}(100.0)
b1 = s.upper

I hope this gives you some intuition on how the data is stored in MathOptInterface format.

You would have to do this for all constraints and all constraint types and stack them as rows in your constraint matrix A and vector b.
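
Putting the pieces together, here is a minimal sketch of such a loop for the example model Mod. It assumes, as in the example, that all constraints are of type ScalarAffineFunction-in-LessThan{Float64} and that the columns follow the MOI variable index values (here x[1], x[2], y[1], y[2]); other function/set combinations (equalities, greater-than constraints, variable bounds) would need analogous loops:

# sketch: stack all ScalarAffineFunction-in-LessThan{Float64} constraints
# of Mod into a dense matrix A and a vector b
n = MOI.get(Mod, MOI.NumberOfVariables())
moi_backend = backend(Mod)

F = MOI.ScalarAffineFunction{Float64}
S = MOI.LessThan{Float64}
ci = MOI.get(Mod, MOI.ListOfConstraintIndices{F,S}())

A = zeros(length(ci), n)
b = zeros(length(ci))

for (row, c_index) in enumerate(ci)
    f = MOI.get(moi_backend, MOI.ConstraintFunction(), c_index)
    s = MOI.get(moi_backend, MOI.ConstraintSet(), c_index)
    for term in f.terms
        A[row, term.variable_index.value] = term.coefficient
    end
    b[row] = s.upper
end

@show(A)  # expected: [2.0 0.0 3.0 0.0; 0.0 2.0 0.0 3.0]
@show(b)  # expected: [100.0, 100.0]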




Answer 2:


I didn't try it myself, but the MathProgBase package seems to be able to provide A, b, and c in matrix form.
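
Note that MathProgBase is the legacy, pre-MOI interface, so it only works together with the old JuMP 0.18-style API (Model(solver = CplexSolver())), not with the Model(CPLEX.Optimizer) syntax used in the question. A rough, untested sketch of what that could look like (function names taken from the MathProgBase LinearQuadratic solver interface):

# untested sketch using the legacy JuMP 0.18 / MathProgBase stack
using JuMP, MathProgBase, CPLEX

m = Model(solver = CplexSolver())
I = 1:2
@variable(m, x[I])
@variable(m, y[I])
@constraint(m, [i in I], 2 * x[i] + 3 * y[i] <= 100)
@objective(m, Max, sum(x[i] + 2 * y[i] for i in I))

JuMP.build(m)               # load the problem into the solver
inner = internalmodel(m)    # underlying MathProgBase model

A  = MathProgBase.getconstrmatrix(inner)   # (sparse) constraint matrix
lb = MathProgBase.getconstrLB(inner)       # row lower bounds
ub = MathProgBase.getconstrUB(inner)       # row upper bounds (your b)
c  = MathProgBase.getobj(inner)            # objective coefficient vector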



Source: https://stackoverflow.com/questions/62800012/how-to-extract-optimization-problem-matrices-a-b-c-using-jump-in-julia
