Add JuMP interface #20
Conversation
Thanks! This sounds like a great idea. I know Yufan is on an internship right now, but we'll revisit in more detail in a few weeks!

No worries, there is no rush :)

Hi @blegat, sorry for the late reply. Thanks for the great work; this is an interesting idea. Is this still ongoing, or is it ready?

Hi @luotuoqingshan, yes, I have rebased and it passes the tests, so it is ready for review :)
@time sdplr(C, As, b, 1, maxmajoriter = 20);

# SDPLRPlus does not support sparse factor
As = [SymLowRankMatrix(Diagonal(ones(1)), hcat(e_i(Float64, i, n, sparse = false))) for i in 1:n]
Running this bench.jl gives me the following error on this line.
ERROR: UndefKeywordError: keyword argument `vector` not assigned
Stacktrace:
[1] (::var"#24#25")(i::Int64)
@ Main ./none:0
[2] iterate
@ ./generator.jl:47 [inlined]
[3] collect(itr::Base.Generator{UnitRange{Int64}, var"#24#25"})
@ Base ./array.jl:834
[4] top-level scope
@ /u/subspace_s4/huan1754/SDPLRPlus.jl/exps/bench.jl:19
I saw that in LowRankOpt.jl the `vector` keyword of the function `e_i` does not have a default value; is this the cause?
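For what it's worth, a local helper along these lines (purely a sketch, not LowRankOpt's `e_i`) would sidestep the required keyword in bench.jl:

using LinearAlgebra

# Hypothetical local replacement for the dense case of e_i:
# builds the i-th standard basis vector of length n as a dense column.
function e_i_dense(::Type{T}, i::Integer, n::Integer) where {T}
    v = zeros(T, n)
    v[i] = one(T)
    return v
end

# Mirrors the construction quoted above (n as defined in bench.jl),
# without relying on e_i's keyword arguments.
As = [SymLowRankMatrix(Diagonal(ones(1)), hcat(e_i_dense(Float64, i, n))) for i in 1:n]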
| set_attribute(model, "ranks", [1]) | ||
| set_attribute(model, "maxmajoriter", 0) | ||
| set_attribute(model, "printlevel", 3) | ||
| @profview optimize!(model) |
On this line I got an MKLSparseError, and I'm quite confused about it; do you have any clue? The same error keeps showing up in the benchmarking code that follows.
ERROR: MKLSparseError(SPARSE_STATUS_INVALID_VALUE)
Stacktrace:
[1] check_status
@ ~/.julia/packages/MKLSparse/EWj0I/src/types.jl:237 [inlined]
[2] #MKLSparseMatrix#9
@ ~/.julia/packages/MKLSparse/EWj0I/src/mklsparsematrix.jl:173 [inlined]
[3] MKLSparseMatrix
@ ~/.julia/packages/MKLSparse/EWj0I/src/mklsparsematrix.jl:167 [inlined]
[4] mv!(transA::Char, alpha::Float64, A::SparseMatrixCSC{…}, descr::MKLSparse.matrix_descr, x::SubArray{…}, beta::Float64, y::SubArray{…})
@ MKLSparse ~/.julia/packages/MKLSparse/EWj0I/src/generic.jl:33
[5] mul!
@ ~/.julia/packages/MKLSparse/EWj0I/src/interface.jl:63 [inlined]
[6] mul!
@ ~/.julia/packages/MKLSparse/EWj0I/src/interface.jl:107 [inlined]
[7] jprod!
@ ~/.julia/packages/LowRankOpt/UcxnZ/src/model.jl:216 [inlined]
[8] jprod!(model::LowRankOpt.Model{…}, x::LowRankOpt.BurerMonteiro.Solution{…}, v::LowRankOpt.BurerMonteiro.Solution{…}, Jv::SubArray{…})
@ LowRankOpt ~/.julia/packages/LowRankOpt/UcxnZ/src/model.jl:27
[9] cons!
@ ~/.julia/packages/LowRankOpt/UcxnZ/src/BurerMonteiro/model.jl:109 [inlined]
[10] 𝒜!(𝒜_UUt::Vector{…}, model::LowRankOpt.BurerMonteiro.Model{…}, x::Vector{…})
@ SDPLRPlus /u/subspace_s4/huan1754/SDPLRPlus.jl/src/lowrankopt.jl:75
[11] f!(data::LowRankOpt.BurerMonteiro.Model{…}, var::SDPLRPlus.SolverVars{…}, aux::LowRankOpt.BurerMonteiro.Model{…})
@ SDPLRPlus /u/subspace_s4/huan1754/SDPLRPlus.jl/src/coreop.jl:16
[12] macro expansion
@ /u/subspace_s4/huan1754/SDPLRPlus.jl/src/coreop.jl:347 [inlined]
[13] macro expansion
@ ./timing.jl:395 [inlined]
[14] fg!(data::LowRankOpt.BurerMonteiro.Model{…}, var::SDPLRPlus.SolverVars{…}, aux::LowRankOpt.BurerMonteiro.Model{…}, normC::Float64, normb::Float64)
@ SDPLRPlus /u/subspace_s4/huan1754/SDPLRPlus.jl/src/coreop.jl:346
[15] _sdplr(data::LowRankOpt.BurerMonteiro.Model{…}, var::SDPLRPlus.SolverVars{…}, aux::LowRankOpt.BurerMonteiro.Model{…}, stats::SDPLRPlus.SolverStats{…}, config::SDPLRPlus.BurerMonteiroConfig{…})
@ SDPLRPlus /u/subspace_s4/huan1754/SDPLRPlus.jl/src/sdplr.jl:143
[16] solve!(solver::SDPLRPlus.Solver, model::LowRankOpt.BurerMonteiro.Model{…}, stats::SolverCore.GenericExecutionStats{…}; kwargs::@Kwargs{…})
@ SDPLRPlus /u/subspace_s4/huan1754/SDPLRPlus.jl/src/lowrankopt.jl:40
[17] solve!
@ /u/subspace_s4/huan1754/SDPLRPlus.jl/src/lowrankopt.jl:26 [inlined]
[18] #solve!#7
@ ~/.julia/packages/LowRankOpt/UcxnZ/src/BurerMonteiro/solver.jl:25 [inlined]
[19] optimize!(model::LowRankOpt.Optimizer{Float64})
@ LowRankOpt ~/.julia/packages/LowRankOpt/UcxnZ/src/MOI_wrapper.jl:142
[20] optimize!
@ ~/.julia/packages/MathOptInterface/zq9bo/src/MathOptInterface.jl:122 [inlined]
[21] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{…})
@ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/zq9bo/src/Utilities/cachingoptimizer.jl:370
[22] optimize!
@ ~/.julia/packages/MathOptInterface/zq9bo/src/Bridges/bridge_optimizer.jl:367 [inlined]
[23] optimize!
@ ~/.julia/packages/Dualization/ihzlf/src/MOI_wrapper.jl:255 [inlined]
[24] optimize!
@ ~/.julia/packages/MathOptInterface/zq9bo/src/MathOptInterface.jl:122 [inlined]
[25] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{…})
@ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/zq9bo/src/Utilities/cachingoptimizer.jl:370
[26] optimize!
@ ~/.julia/packages/MathOptInterface/zq9bo/src/Bridges/bridge_optimizer.jl:367 [inlined]
[27] optimize!
@ ~/.julia/packages/MathOptInterface/zq9bo/src/MathOptInterface.jl:122 [inlined]
[28] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{…})
@ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/zq9bo/src/Utilities/cachingoptimizer.jl:370
[29] optimize!(model::Model; ignore_optimize_hook::Bool, _differentiation_backend::MathOptInterface.Nonlinear.SparseReverseMode, kwargs::@Kwargs{})
@ JuMP ~/.julia/packages/JuMP/N7h14/src/optimizer_interface.jl:609
[30] optimize!(model::Model)
@ JuMP ~/.julia/packages/JuMP/N7h14/src/optimizer_interface.jl:560
[31] macro expansion
@ /u/subspace_s4/software/julia-1.10.1/share/julia/stdlib/v1.10/Profile/src/Profile.jl:27 [inlined]
[32] macro expansion
@ ~/.cursor-server/extensions/julialang.language-julia-1.149.2-universal/scripts/packages/VSCodeServer/src/profiler.jl:141 [inlined]
[33] top-level scope
@ /u/subspace_s4/huan1754/SDPLRPlus.jl/exps/bench.jl:31
Some type information was truncated. Use `show(err)` to see complete types.
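In case it helps to narrow this down, here is a minimal sketch of the dispatch path shown in the stack trace (a SparseMatrixCSC multiplied into SubArray views through mul!, which MKLSparse intercepts once it is loaded); whether this small case reproduces the SPARSE_STATUS_INVALID_VALUE is only a guess.

using SparseArrays, LinearAlgebra
using MKLSparse  # assumed to be in the environment; overloads mul! for sparse matrices

A = sprand(10, 5, 0.3)     # sparse matrix, as in the jprod! call
x = view(rand(5), :)       # SubArray input, matching the stack trace
y = view(zeros(10), :)     # SubArray output
mul!(y, A, x, 1.0, 0.0)    # dispatches to MKLSparse.mv! when MKLSparse is loaded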
luotuoqingshan left a comment
Thanks for the great work. I took a look, and the changes regarding the low-level interfaces look great. I ran into a few problems when testing the lowrankopt.jl interface, in particular when running the bench.jl file, and left a few comments.
I am developing a JuMP interface for Burer-Monteiro in https://github.com/blegat/LowRankOpt.jl/. It is a work in progress, but the plan is to optimize it for a variety of different types of models (low-rank, sparse, etc.), as you have done in this package.
I only needed minor changes to your code to make it work.
My plan is to keep your current low-level interface and to have your solver work through both that interface and the JuMP interface.
Through the JuMP interface it should also be possible to support multiple SDPs as well as inequality constraints, etc., without needing to change much of your code (e.g., for Lanczos, we would probably need to loop over all PSD matrices instead of assuming there is only one).
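A rough sketch of what driving the solver through the JuMP interface could look like, based on the attributes exercised in bench.jl above; the optimizer constructor and attribute names here are assumptions rather than the final API:

using JuMP, LinearAlgebra
import LowRankOpt

# Constructor is an assumption based on the LowRankOpt.Optimizer{Float64}
# seen in the stack trace; the final API may differ.
model = Model(LowRankOpt.Optimizer)

# Small test SDP: minimize tr(X) subject to X ⪰ 0 and X[1, 1] = 1.
@variable(model, X[1:3, 1:3], PSD)
@objective(model, Min, tr(X))
@constraint(model, X[1, 1] == 1)

# Solver attributes mirroring the set_attribute calls quoted above.
set_attribute(model, "ranks", [1])
set_attribute(model, "maxmajoriter", 20)
set_attribute(model, "printlevel", 3)

optimize!(model)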