
Optimization.LBFGS() cannot compute the gradient #744

Open
enigne opened this issue May 14, 2024 · 5 comments
Labels
bug Something isn't working

Comments

@enigne

enigne commented May 14, 2024

Hi, I'm using Optimization.jl in my package DJUICE.jl to optimize a cost function; the example is here.
After the optimization, the solution is the same as my initial guess. I computed the gradient directly using Enzyme here.

But when I compare with the evaluation from Optimization.jl, sol.cache.f.grad(∂J_∂α, prob.u0, prob.p), I get error messages.
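For context, a minimal sketch of the kind of setup being described. The Rosenbrock stand-in, the initial point, and the parameter vector are illustrative assumptions, not the DJUICE code; only the sol.cache.f.grad call mirrors the report:

```julia
using Optimization
import Enzyme  # the Enzyme AD extension needs Enzyme loaded

# Stand-in objective; DJUICE.costfunction(α, femmodel) plays this role in the report.
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

optf = OptimizationFunction(rosenbrock, Optimization.AutoEnzyme())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, Optimization.LBFGS())

# The cached gradient closure can then be called directly, as in the report:
∂J = zeros(2)
sol.cache.f.grad(∂J, prob.u0, prob.p)
```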

@enigne added the bug label May 14, 2024
@ChrisRackauckas
Member

SciML/OptimizationBase.jl#43 is the solution. Maybe try that branch and see?
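For reference, a branch like that can be tried via Pkg; the URL and branch name below are taken from the Manifest snippet later in this thread:

```julia
using Pkg
Pkg.add(url = "https://github.com/SciML/OptimizationBase.jl", rev = "ChrisRackauckas-patch-1")
```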

@enigne
Author

enigne commented May 14, 2024

Thanks, Chris. I got an error:

ERROR: LoadError: Function to differentiate `MethodInstance for OptimizationEnzymeExt.firstapply(::typeof(DJUICE.costfunction), ::Vector{Float64}, ::DJUICE.FemModel, ::DJUICE.FemModel)` is guaranteed to return an error and doesn't make sense to autodiff. Giving up
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] macro expansion
   @ ~/.julia/packages/Enzyme/srACB/src/compiler.jl:5845 [inlined]
 [3] macro expansion
   @ ./none:0 [inlined]
 [4] thunk(::Val{…}, ::Type{…}, ::Type{…}, tt::Type{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Type{…})
   @ Enzyme.Compiler ./none:0
 [5] autodiff
   @ ~/.julia/packages/Enzyme/srACB/src/Enzyme.jl:234 [inlined]
 [6] (::OptimizationEnzymeExt.var"#grad#50"{OptimizationFunction{…}, DJUICE.FemModel})(res::Vector{Float64}, θ::Vector{Float64}, args::DJUICE.FemModel)
   @ OptimizationEnzymeExt ~/.julia/packages/OptimizationBase/kgHps/ext/OptimizationEnzymeExt.jl:165
 [7] top-level scope
   @ ~/Dartmouth/DJUICE/test/testoptimization.jl:46
 [8] include(fname::String)
   @ Base.MainInclude ./client.jl:489
 [9] top-level scope
   @ REPL[4]:1
in expression starting at /Users/gongcheng/Dartmouth/DJUICE/test/testoptimization.jl:46

These are the packages I'm using:

  [7da242da] Enzyme v0.12.6
  [7f7a1694] Optimization v3.25.0
  [bca83a33] OptimizationBase v0.0.7 `https://github.com/SciML/OptimizationBase.jl#ChrisRackauckas-patch-1`
  [36348300] OptimizationOptimJL v0.2.3

@ChrisRackauckas
Member

Well, that's progress. Why DJUICE.FemModel shows up twice in that signature is a good question.

@enigne
Author

enigne commented May 14, 2024

Interesting.
Enzyme.autodiff(Enzyme.Reverse, DJUICE.costfunction, Duplicated(α, ∂J_∂α), Duplicated(femmodel, dfemmodel)) works.

sol.cache.f.f(prob.u0, prob.p) also works for me.

The error only occurs when I call
sol.cache.f.grad(∂J_∂α, prob.u0, prob.p)

And the solution from the optimization still does not change.
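As a side check on the unchanged solution: once the grad call stops erroring, it can be compared against a crude finite-difference probe built from the objective evaluation that already works. A sketch under those assumptions, reusing prob and sol from above:

```julia
using LinearAlgebra

# Hypothetical diagnostic, not from the issue: a (near-)zero gradient here
# would explain why the optimizer never moves off the initial guess.
g = zeros(length(prob.u0))
sol.cache.f.grad(g, prob.u0, prob.p)   # the call that currently errors

# One-sided finite difference along a random direction v.
h = 1e-6
v = randn(length(prob.u0))
fd = (sol.cache.f.f(prob.u0 .+ h .* v, prob.p) - sol.cache.f.f(prob.u0, prob.p)) / h
@show fd dot(g, v)   # the two numbers should roughly agree
```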

@ChrisRackauckas
Member

I think the next step in improving Enzyme support here is finishing up the DifferentiationInterface integration. We're working with @gdalle on this; I'm thinking it may not take more than two more weeks. Once that's done, DI will be used as the AD system within Optimization.jl, which makes isolating this bug simpler: it narrows it down to how DI handles Enzyme. If there is still an error at that point, this becomes an Enzyme+DI issue, which is something we can solve there.
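Once that integration lands, a failure like this one could be reproduced outside Optimization.jl entirely. A sketch, assuming DifferentiationInterface.jl's gradient(f, backend, x) entry point; note it closes over femmodel as a constant, which is weaker than the Duplicated annotation above if the model is mutated during the cost evaluation:

```julia
using DifferentiationInterface
using ADTypes: AutoEnzyme
import Enzyme  # the backend package must be loaded for AutoEnzyme to be usable

# Differentiate only in α, holding the FemModel fixed.
J(α) = DJUICE.costfunction(α, femmodel)
g = DifferentiationInterface.gradient(J, AutoEnzyme(), α)
```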
