1 change: 1 addition & 0 deletions docs/src/reference/models.md
@@ -37,6 +37,7 @@ copy_to
```@docs
AbstractModelAttribute
Name
CoefficientType
ObjectiveFunction
ObjectiveFunctionType
ObjectiveSense
4 changes: 4 additions & 0 deletions src/Bridges/bridge_optimizer.jl
@@ -803,6 +803,10 @@ function MOI.get(b::AbstractBridgeOptimizer, attr::MOI.ListOfModelAttributesSet)
return unbridged_function(b, list)
end

function MOI.get(b::AbstractBridgeOptimizer, attr::MOI.CoefficientType)
return MOI.get(b.model, attr)
end

function MOI.get(
b::AbstractBridgeOptimizer,
attr::Union{MOI.AbstractModelAttribute,MOI.AbstractOptimizerAttribute},
233 changes: 183 additions & 50 deletions src/Utilities/cachingoptimizer.jl
@@ -18,19 +18,21 @@
and links it with an optimizer. It supports incremental model
construction and modification even when the optimizer doesn't.

A `CachingOptimizer` may be in one of three possible states (`CachingOptimizerState`):
A `CachingOptimizer` may be in one of three possible states
(`CachingOptimizerState`):

* `NO_OPTIMIZER`: The CachingOptimizer does not have any optimizer.
* `EMPTY_OPTIMIZER`: The CachingOptimizer has an empty optimizer.
The optimizer is not synchronized with the cached model.
* `ATTACHED_OPTIMIZER`: The CachingOptimizer has an optimizer, and it is synchronized with the cached model.
* `ATTACHED_OPTIMIZER`: The CachingOptimizer has an optimizer, and it is
synchronized with the cached model.

A `CachingOptimizer` has two modes of operation (`CachingOptimizerMode`):

* `MANUAL`: The only methods that change the state of the `CachingOptimizer`
are [`Utilities.reset_optimizer`](@ref), [`Utilities.drop_optimizer`](@ref),
and [`Utilities.attach_optimizer`](@ref).
Attempting to perform an operation in the incorrect state results in an error.
and [`Utilities.attach_optimizer`](@ref). Attempting to perform an operation
in the incorrect state results in an error.
* `AUTOMATIC`: The `CachingOptimizer` changes its state when necessary. For
example, `optimize!` will automatically call `attach_optimizer` (an
optimizer must have been previously set). Attempting to add a constraint or
@@ -45,25 +47,82 @@ mutable struct CachingOptimizer{OptimizerType,ModelType<:MOI.ModelLike} <:
mode::CachingOptimizerMode
model_to_optimizer_map::IndexMap
optimizer_to_model_map::IndexMap
# CachingOptimizer externally uses the same variable and constraint indices
# as the model_cache. model_to_optimizer_map maps from the model_cache indices to the
# optimizer indices.
auto_bridge::Bool
end

"""
CachingOptimizer(
model_cache::MOI.ModelLike,
optimizer::Union{Nothing,MOI.AbstractOptimizer} = nothing;
mode::CachingOptimizerMode = AUTOMATIC,
state::CachingOptimizerState =
optimizer === nothing ? NO_OPTIMIZER : EMPTY_OPTIMIZER,
auto_bridge::Bool = false,
)

Creates a `CachingOptimizer` using `model_cache` and `optimizer`.

## Notes

* If `auto_bridge == true`, when the caching optimizer encounters a constraint
or objective function that is not supported by `optimizer`, it automatically
adds a bridging layer to `optimizer`.
* If `auto_bridge == true`, and an optimizer is provided, the state is forced
to `EMPTY_OPTIMIZER`.
* If an `optimizer` is passed, the returned CachingOptimizer does not support
the function `reset_optimizer(model, new_optimizer)` if the type of
`new_optimizer` is different from the type of `optimizer`.

## Examples

```julia
model = MOI.Utilities.CachingOptimizer(
MOI.Utilities.Model{Float64}(),
GLPK.Optimizer(),
)
```

```julia
model = MOI.Utilities.CachingOptimizer(
MOI.Utilities.Model{Float64}(),
auto_bridge = true,
)
MOI.Utilities.reset_optimizer(model, GLPK.Optimizer())
```
"""
function CachingOptimizer(
model_cache::MOI.ModelLike,
mode::CachingOptimizerMode,
optimizer::Union{Nothing,MOI.AbstractOptimizer} = nothing;
mode::CachingOptimizerMode = AUTOMATIC,
state::CachingOptimizerState = optimizer === nothing ? NO_OPTIMIZER :
EMPTY_OPTIMIZER,
auto_bridge::Bool = false,
)
return CachingOptimizer{MOI.AbstractOptimizer,typeof(model_cache)}(
nothing,
T = optimizer !== nothing ? typeof(optimizer) : MOI.AbstractOptimizer
if optimizer !== nothing
@assert MOI.is_empty(model_cache)
@assert MOI.is_empty(optimizer)
if auto_bridge
state = EMPTY_OPTIMIZER
T = MOI.AbstractOptimizer

Review thread:

Member: The type is not concrete, so it will be inefficient.

Member: We could use a union of the type with and without bridges.

Member Author: It's not sufficient to do this because the bridges may need a caching layer below. The abstract type has worked fine for JuMP, and having a complicated union may not actually help things.

Member: I don't agree: we get an allocation every time we need to access the field, and we have all these `_moi` function barriers in JuMP to avoid this. If we have the same issue in the CachingOptimizer, then it doesn't make sense.

Member Author: This is an opt-in feature for users of CachingOptimizer. JuMP will opt in with no change to performance because it already has the abstract type. If others opt in, they should check performance and/or implement function barriers.

Overall, this is a big win for JuMP with minimal impact on other users. It's simple to implement, and there are no edge cases.

Member: I think there will be a change in perf even for JuMP. Now there are two fields of abstract type: we don't know what `backend` is, so we get one hit; then you figure out it's a CachingOptimizer, so you make a call to it, and then the `optimizer` field is abstract again, so you get a second hit.

Member Author: There is no change in the JuMP behavior:

julia> model = Model(Clp.Optimizer);

julia> backend(model)
MOIU.CachingOptimizer{MOI.AbstractOptimizer,MOIU.UniversalFallback{MOIU.Model{Float64}}}
in state EMPTY_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.UniversalFallback{MOIU.Model{Float64}}
  fallback for MOIU.Model{Float64}
with optimizer MOIB.LazyBridgeOptimizer{MOIU.CachingOptimizer{Clp.Optimizer,MOIU.UniversalFallback{MOIU.Model{Float64}}}}
  with 0 variable bridges
  with 0 constraint bridges
  with 0 objective bridges
  with inner model MOIU.CachingOptimizer{Clp.Optimizer,MOIU.UniversalFallback{MOIU.Model{Float64}}}
    in state ATTACHED_OPTIMIZER
    in mode AUTOMATIC
    with model cache MOIU.UniversalFallback{MOIU.Model{Float64}}
      fallback for MOIU.Model{Float64}
    with optimizer Clp.Optimizer

julia> model2 = Model(Clp.Optimizer; auto_bridge = true);

julia> backend(model2)
MOIU.CachingOptimizer{MOI.AbstractOptimizer,MOIU.UniversalFallback{MOIU.Model{Float64}}}
in state EMPTY_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.UniversalFallback{MOIU.Model{Float64}}
  fallback for MOIU.Model{Float64}
with optimizer Clp.Optimizer

julia> @variable(model2, x[1:2] in MOI.Nonnegatives(2))
2-element Array{VariableRef,1}:
 x[1]
 x[2]

julia> backend(model2)
MOIU.CachingOptimizer{MOI.AbstractOptimizer,MOIU.UniversalFallback{MOIU.Model{Float64}}}
in state EMPTY_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.UniversalFallback{MOIU.Model{Float64}}
  fallback for MOIU.Model{Float64}
with optimizer MOIB.LazyBridgeOptimizer{MOIU.CachingOptimizer{Clp.Optimizer,MOIU.UniversalFallback{MOIU.Model{Float64}}}}
  with 0 variable bridges
  with 0 constraint bridges
  with 0 objective bridges
  with inner model MOIU.CachingOptimizer{Clp.Optimizer,MOIU.UniversalFallback{MOIU.Model{Float64}}}
    in state ATTACHED_OPTIMIZER
    in mode AUTOMATIC
    with model cache MOIU.UniversalFallback{MOIU.Model{Float64}}
      fallback for MOIU.Model{Float64}
    with optimizer Clp.Optimizer

Member Author: If bridges are applied, we get back to the current JuMP behavior. If bridges are not applied, then we skip all the issues with bridges, but the backend type that JuMP sees is still the same.

Member Author: Can we have a call to discuss instead of this back-and-forth?

Member: Yes, I think we've reached the point where messages across opposite time zones make this process long and inefficient :D

end
end
return CachingOptimizer{T,typeof(model_cache)}(
optimizer,
model_cache,
NO_OPTIMIZER,
state,
mode,
IndexMap(),
IndexMap(),
auto_bridge,
)
end
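
For readers unfamiliar with the function-barrier pattern mentioned in the review thread above, here is a minimal, self-contained sketch. It is illustrative only and not part of this PR; the `Cache` struct and helper names are hypothetical.

```julia
# A field typed with an abstract type (here `Any`) forces dynamic dispatch each
# time it is used. A "function barrier" passes the field to a helper once, so
# Julia compiles a specialized method for the concrete runtime type.
struct Cache
    inner::Any   # mirrors an abstractly typed field such as `optimizer`
end

# Dispatch on `c.inner` is resolved dynamically inside the loop.
slow_sum(c::Cache) = sum(c.inner[i] for i in eachindex(c.inner))

# Dispatch happens once at the call to `_sum`; the loop body is specialized.
fast_sum(c::Cache) = _sum(c.inner)
_sum(x) = sum(x[i] for i in eachindex(x))

c = Cache(rand(1_000))
@assert slow_sum(c) ≈ fast_sum(c)
```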

# Added for compatibility with MOI 0.9.20
function CachingOptimizer(cache::MOI.ModelLike, mode::CachingOptimizerMode)
return CachingOptimizer(cache; mode = mode)
end

function Base.show(io::IO, C::CachingOptimizer)
indent = " "^get(io, :indent, 0)
MOIU.print_with_acronym(io, summary(C))
@@ -75,31 +134,12 @@ function Base.show(io::IO, C::CachingOptimizer)
return show(IOContext(io, :indent => get(io, :indent, 0) + 2), C.optimizer)
end

"""
CachingOptimizer(model_cache::MOI.ModelLike, optimizer::AbstractOptimizer)

Creates a `CachingOptimizer` in `AUTOMATIC` mode, with the optimizer
`optimizer`.

The type of the optimizer returned is `CachingOptimizer{typeof(optimizer),
typeof(model_cache)}` so it does not support the function
`reset_optimizer(::CachingOptimizer, new_optimizer)` if the type of
`new_optimizer` is different from the type of `optimizer`.
"""
function CachingOptimizer(
model_cache::MOI.ModelLike,
optimizer::MOI.AbstractOptimizer,
)
@assert MOI.is_empty(model_cache)
@assert MOI.is_empty(optimizer)
return CachingOptimizer{typeof(optimizer),typeof(model_cache)}(
optimizer,
model_cache,
EMPTY_OPTIMIZER,
AUTOMATIC,
IndexMap(),
IndexMap(),
)
function MOI.get(model::CachingOptimizer, attr::MOI.CoefficientType)
if state(model) == NO_OPTIMIZER
return MOI.get(model.model_cache, attr)
else
return MOI.get(model.optimizer, attr)
end
end

## Methods for managing the state of CachingOptimizer.
@@ -137,8 +177,8 @@ function reset_optimizer(m::CachingOptimizer, optimizer::MOI.AbstractOptimizer)
if attr isa MOI.RawOptimizerAttribute
# Even if the optimizer claims to `supports` `attr`, the value
# might have a different meaning (e.g., two solvers with `logLevel`
# as a RawOptimizerAttribute). To be on the safe side, just skip all raw
# parameters.
# as a RawOptimizerAttribute). To be on the safe side, just skip all
# raw parameters.
continue
elseif !MOI.is_copyable(attr) || !MOI.supports(m.optimizer, attr)::Bool
continue
@@ -344,19 +384,70 @@ function MOI.add_variables(m::CachingOptimizer, n)
return vindices
end

"""
_bridge_if_needed(
f::Function,
m::CachingOptimizer;
add::Bool = false,
)

Return whether `f(m)` holds, assuming that the `.optimizer` field of `m` would
be wrapped in a `LazyBridgeOptimizer` whenever `f(m)` is currently false and
wrapping would make `f(m)` hold. The `.optimizer` field is only modified if
`add == true`.

`f` is a function that takes `m` as a single argument. It is typically a call
like `f(m) = MOI.supports_constraint(m, F, S)` for some `F` and `S`.
"""
function _bridge_if_needed(
f::Function,
model::CachingOptimizer;
add::Bool = false,
)
if !f(model.model_cache)
# If the cache doesn't, we don't.
return false
elseif model.state == NO_OPTIMIZER
# The cache does, and there is no optimizer, so we do.
return true
elseif f(model.optimizer)
# There is an optimizer, and it does.
return true
elseif !model.auto_bridge
# There is an optimizer, it doesn't, and we aren't bridging.
return false
end
reset_optimizer(model)
T = MOI.get(model, MOI.CoefficientType())
bridge = MOI.instantiate(model.optimizer; with_bridge_type = T)
if f(bridge)
if add
model.optimizer = bridge
end
return true # We bridged, and now we support.
end
return false # Everything fails.
end
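
As a hedged illustration (not part of the diff), callers invoke the helper with a do-block, mirroring the methods that follow; `model`, `F`, and `S` below are assumed placeholders.

```julia
# Sketch: would F-in-S constraints be supported, possibly after adding bridges?
F = MOI.ScalarAffineFunction{Float64}
S = MOI.Interval{Float64}
would_support = _bridge_if_needed(model; add = false) do m
    return MOI.supports_constraint(m, F, S)
end
```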

function MOI.supports_add_constrained_variable(
m::CachingOptimizer,
S::Type{<:MOI.AbstractScalarSet},
)
return MOI.supports_add_constrained_variable(m.model_cache, S) && (
m.state == NO_OPTIMIZER ||
MOI.supports_add_constrained_variable(m.optimizer, S)::Bool
)
return _bridge_if_needed(m) do model
return MOI.supports_add_constrained_variable(model, S)
end
end

function MOI.add_constrained_variable(
m::CachingOptimizer,
set::S,
) where {S<:MOI.AbstractScalarSet}
supports = _bridge_if_needed(m; add = true) do model
return MOI.supports_add_constrained_variable(model, S)
end
if !supports && state(m) == ATTACHED_OPTIMIZER
throw(MOI.UnsupportedConstraint{MOI.SingleVariable,S}())
end
if m.state == MOIU.ATTACHED_OPTIMIZER
if m.mode == MOIU.AUTOMATIC
try
@@ -399,10 +490,9 @@ function _supports_add_constrained_variables(
m::CachingOptimizer,
S::Type{<:MOI.AbstractVectorSet},
)
return MOI.supports_add_constrained_variables(m.model_cache, S) && (
m.state == NO_OPTIMIZER ||
MOI.supports_add_constrained_variables(m.optimizer, S)::Bool
)
return _bridge_if_needed(m) do model
return MOI.supports_add_constrained_variables(model, S)
end
end

# Split in two to solve ambiguity
@@ -424,6 +514,12 @@ function MOI.add_constrained_variables(
m::CachingOptimizer,
set::S,
) where {S<:MOI.AbstractVectorSet}
supports = _bridge_if_needed(m; add = true) do model
return MOI.supports_add_constrained_variables(model, S)
end
if !supports && state(m) == ATTACHED_OPTIMIZER
throw(MOI.UnsupportedConstraint{MOI.VectorOfVariables,S}())
end
if m.state == ATTACHED_OPTIMIZER
if m.mode == AUTOMATIC
try
@@ -470,17 +566,22 @@ function MOI.supports_constraint(
F::Type{<:MOI.AbstractFunction},
S::Type{<:MOI.AbstractSet},
)
return MOI.supports_constraint(m.model_cache, F, S) && (
m.state == NO_OPTIMIZER ||
MOI.supports_constraint(m.optimizer, F, S)::Bool
)
return _bridge_if_needed(m) do model
return MOI.supports_constraint(model, F, S)
end
end

function MOI.add_constraint(
m::CachingOptimizer,
func::F,
set::S,
) where {F<:MOI.AbstractFunction,S<:MOI.AbstractSet}
supports = _bridge_if_needed(m; add = true) do model
return MOI.supports_constraint(model, F, S)
end
if !supports && state(m) == ATTACHED_OPTIMIZER
throw(MOI.UnsupportedConstraint{F,S}())
end
if m.state == ATTACHED_OPTIMIZER
if m.mode == AUTOMATIC
try
@@ -710,6 +811,32 @@ end
# they are sent to the optimizer and when they are returned from the optimizer.
# As a result, values of attributes must implement `map_indices`.

function MOI.set(m::CachingOptimizer, attr::MOI.ObjectiveFunction, value)
supports = _bridge_if_needed(m; add = true) do model
return MOI.supports(model, attr)
end
if !supports && state(m) == ATTACHED_OPTIMIZER
throw(MOI.UnsupportedAttribute(attr))
end
if m.state == ATTACHED_OPTIMIZER
optimizer_value = map_indices(m.model_to_optimizer_map, value)
if m.mode == AUTOMATIC
try
MOI.set(m.optimizer, attr, optimizer_value)
catch err
if err isa MOI.NotAllowedError
reset_optimizer(m)
else
rethrow(err)
end
end
else
MOI.set(m.optimizer, attr, optimizer_value)
end
end
return MOI.set(m.model_cache, attr, value)
end

function MOI.set(m::CachingOptimizer, attr::MOI.AbstractModelAttribute, value)
if m.state == ATTACHED_OPTIMIZER
optimizer_value = map_indices(m.model_to_optimizer_map, value)
@@ -777,6 +904,12 @@ function MOI.supports(
(m.state == NO_OPTIMIZER || MOI.supports(m.optimizer, attr)::Bool)
end

function MOI.supports(m::CachingOptimizer, attr::MOI.ObjectiveFunction)
return _bridge_if_needed(m) do model
return MOI.supports(model, attr)
end
end

function MOI.get(model::CachingOptimizer, attr::MOI.AbstractModelAttribute)
if MOI.is_set_by_optimize(attr)
if state(model) == NO_OPTIMIZER
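
Taken together, the changes to cachingoptimizer.jl are intended to enable a workflow like the following hedged sketch, where `LPOnly.Optimizer` is a hypothetical solver that does not natively support `SingleVariable`-in-`Interval` constraints.

```julia
model = MOI.Utilities.CachingOptimizer(
    MOI.Utilities.Model{Float64}(),
    LPOnly.Optimizer(),   # hypothetical solver, for illustration only
    auto_bridge = true,
)
x = MOI.add_variable(model)
# With `auto_bridge = false`, this call could throw `UnsupportedConstraint` once
# the optimizer is attached; with `auto_bridge = true`, the caching optimizer
# instead wraps the solver in a `LazyBridgeOptimizer` and bridges the constraint.
MOI.add_constraint(model, MOI.SingleVariable(x), MOI.Interval(0.0, 1.0))
```
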
4 changes: 4 additions & 0 deletions src/Utilities/mockoptimizer.jl
@@ -335,6 +335,10 @@ function MOI.get(mock::MockOptimizer, attr::MOI.AbstractModelAttribute)
end
end

function MOI.get(mock::MockOptimizer, attr::MOI.CoefficientType)
return MOI.get(mock.inner_model, attr)
end

#####
##### Names
#####
2 changes: 2 additions & 0 deletions src/Utilities/model.jl
@@ -5,6 +5,8 @@ abstract type AbstractModelLike{T} <: MOI.ModelLike end
abstract type AbstractOptimizer{T} <: MOI.AbstractOptimizer end
const AbstractModel{T} = Union{AbstractModelLike{T},AbstractOptimizer{T}}

MOI.get(::AbstractModel{T}, ::MOI.CoefficientType) where {T} = T

# Variables
function MOI.get(model::AbstractModel, ::MOI.NumberOfVariables)::Int64
if model.variable_indices === nothing
11 changes: 11 additions & 0 deletions src/attributes.jl
@@ -921,6 +921,17 @@ attr = MOI.get(model, MOI.ObjectiveFunctionType())
"""
struct ObjectiveFunctionType <: AbstractModelAttribute end

"""
CoefficientType()

Return the coefficient type of a model.

Defaults to `Float64`.
"""
struct CoefficientType <: AbstractModelAttribute end

get(::ModelLike, ::CoefficientType) = Float64
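
A brief usage sketch (not part of the diff; the `Float32` cache is only an assumed example) of how the attribute resolves through the layers touched by this PR:

```julia
# A Utilities model reports its type parameter:
MOI.get(MOI.Utilities.Model{Float64}(), MOI.CoefficientType())          # Float64

# A CachingOptimizer with no optimizer attached defers to its cache:
cache = MOI.Utilities.Model{Float32}()
MOI.get(MOI.Utilities.CachingOptimizer(cache), MOI.CoefficientType())   # Float32
```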

## Optimizer attributes

"""