Constraints for AutoFiniteDiff #318
Conversation
Codecov Report

```
@@            Coverage Diff             @@
##           master     #318      +/-   ##
==========================================
+ Coverage   74.68%   79.37%   +4.68%
==========================================
  Files           9        9
  Lines         241      257      +16
==========================================
+ Hits          180      204      +24
+ Misses         61       53       -8
```
For a non-square Jacobian, you just need to provide the cache with the output arrays so it knows how they are sized:

```julia
FiniteDiff.JacobianCache(
    x1,
    fx,
    fx1,
    fdtype     :: Type{T1} = Val{:central},
    returntype :: Type{T2} = eltype(fx),
    colorvec = 1:length(x),
    sparsity = nothing)
```

i.e. provide it with the `fx` and `fx1` output arrays.
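For reference, a minimal sketch of a cache for a rectangular Jacobian (the in-place function `g!` and the sizes here are hypothetical, not from the PR):

```julia
using FiniteDiff

# Hypothetical in-place function with 2 outputs and 3 inputs, so J is 2x3.
g!(fx, x) = (fx[1] = x[1]^2 + x[2]; fx[2] = x[2] * x[3])

x  = zeros(3)
fx = zeros(2)   # the output array is what tells the cache the Jacobian is 2x3
cache = FiniteDiff.JacobianCache(x, fx)

J = zeros(2, 3)
FiniteDiff.finite_difference_jacobian!(J, g!, [1.0, 2.0, 3.0], cache)
```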
It needs access to the batch data. Dig through it one step at a time: why does it get the batch data as null parameters instead of the empty tuple?
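For reference, a sketch of the distinction being asked about, assuming the standard SciMLBase defaults:

```julia
using SciMLBase

rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock)

# With no parameters given, p defaults to SciMLBase.NullParameters(),
# a sentinel type, which is not the same thing as an empty tuple ().
prob = OptimizationProblem(optf, [0.3, 0.5])
prob.p isa SciMLBase.NullParameters   # true
prob.p === ()                         # false
```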
BFGS doesn't support constraints. Try IPNewton.
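A minimal sketch of that suggestion, assuming the in-place `cons(res, x, p)` constraint interface and the OptimizationOptimJL wrapper (the constraint and bounds here are illustrative):

```julia
using Optimization, OptimizationOptimJL

rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
cons(res, x, p) = (res .= [x[1]^2 + x[2]^2])   # one inequality constraint

optf = OptimizationFunction(rosenbrock, Optimization.AutoFiniteDiff(); cons = cons)
prob = OptimizationProblem(optf, [0.3, 0.5], lcons = [0.0], ucons = [1.0])
sol = solve(prob, IPNewton())   # interior-point Newton handles the constraints
```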
This, for a reason that I'm not sure of, seems to just be a problem with how I'm working with this repo, I think. When I use

Fixed now, thank you, I didn't realise.

Doesn't seem to work, still. (The extra

I will have to come back to this a bit later. All the tests do pass now though, except for this error.
Also: Should we be using the prototype/colorvec options from
Tested some problems now with IPNewton. I think this is all working now, no more changes incoming :).
test/ADtests.jl (Outdated)

```julia
@test sol1.u ≈ sol2.u

optf1 = OptimizationFunction(rosenbrock, Optimization.AutoFiniteDiff(); cons = consf)
prob1 = OptimizationProblem(optf, [0.3, 0.5], lb = [0.2, 0.4], ub = [0.6, 0.8], lcons = [0.2], ucons = [0.55])
```
The constraint bounds here need to be different for the two elements; this is the current failure on CI.
What do you mean?
I'll fix that. Forgot that `con2_c` has two constraints in it when making this loop.
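For context, a hypothetical sketch of a two-constraint function in that style; the point is that `lcons` and `ucons` then need one entry per constraint:

```julia
# Hypothetical two-constraint function: res gets one entry per constraint,
# so lcons and ucons must each supply a bound for both elements.
con2_c(res, x, p) = (res .= [x[1]^2 + x[2]^2, x[2] * sin(x[1]) - x[1]])

lcons = [0.2, -0.5]    # distinct lower bounds, one per constraint
ucons = [0.55, 0.5]    # distinct upper bounds, one per constraint
```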
Oh I see.
Co-authored-by: Vaibhav Kumar Dixit <[email protected]>
```julia
if cons !== nothing && f.cons_h === nothing
    cons_h = function (res, θ)
        for i in 1:num_cons # note: colorvecs not yet supported by FiniteDiff for Hessians
```
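A sketch of what each iteration of that loop computes (the names `cons_h_sketch!` and `fncs` are hypothetical, not the PR's code):

```julia
using FiniteDiff

# Sketch: res[i] is a preallocated Hessian matrix and fncs[i] is a closure
# θ -> i-th constraint value; no coloring is applied, per the note above.
function cons_h_sketch!(res, θ, fncs, num_cons)
    for i in 1:num_cons
        FiniteDiff.finite_difference_hessian!(res[i], fncs[i], θ)
    end
    return res
end
```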
We should start integrating/testing JuliaDiff/SparseDiffTools.jl#190 and then do the same on the FiniteDiff side (cc @ElOceanografo).
I think the test issue is the only thing left. If it passes with those bounds, then I think this is good.
I don't really understand why it's still failing? It works for me locally...
Check that the dependencies are the same. It looks like it got the FiniteDiff patch this time, but was there another patch you had somewhere?
Seemed to be related: it kept giving me the old version without any of my changes and I didn't realise. Still learning how to use other packages like this 😅 Also, I forgot to use the correct `optf`s when I changed to the for loop, and it seems to work now... hopefully the CI does too. Well, it fails, but at least it's not erroring. I'll have to come back to this a bit later and see why.
Should be all good now, finally.
I'm not waiting on the DiffEqFlux downstream test since it doesn't use AutoFiniteDiff anyway. 🎉
```julia
optf2 = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = consf)
prob2 = OptimizationProblem(optf2, [0.3, 0.5], lb = [0.2, 0.4], ub = [0.6, 0.8], lcons = lcons, ucons = ucons)
sol2 = solve(prob2, IPNewton())
@test sol1.minimum ≈ sol2.minimum
sol2 = solve(prob2, Optim.SAMIN(), maxiters = 10000)
```
`SAMIN` doesn't use the constraints though?
But it doesn't error if constraints are given? Optim just lets them through?
I thought it did, based on what is given here: http://optimization.sciml.ai/stable/optimization_packages/optim/#With-Constraint-Equations ("The following method in Optim is performing global optimization on problems with constraint equation").
I did miss that SAMIN is derivative-free though (PSO too), so we should probably mark that in the docs.
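For reference, a sketch of how SAMIN is typically called through this interface: it is derivative-free and only uses the box constraints `lb`/`ub`, not `lcons`/`ucons` (this call pattern is assumed from the test above, not from the PR's diff):

```julia
using Optimization, OptimizationOptimJL, Optim

rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock)   # derivative-free: no AD backend needed
prob = OptimizationProblem(optf, [0.3, 0.5], lb = [0.2, 0.4], ub = [0.6, 0.8])
sol = solve(prob, Optim.SAMIN(), maxiters = 10000)
```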
This defines a method for generating Jacobians and Hessians of the constraints when using `AutoFiniteDiff` (see also #130). Hopefully it's not too bad; I've left some comments in the code where I was unsure. I also have some comments below about issues I haven't been able to fix, which should probably be fixed before any action is taken on this pull request.

- `finitediff.jl`: I had to take `args...` outside of the `f.f` call or else things were breaking. Any ideas why? You can run the `AutoFiniteDiff` section of the tests with `args` included to see the errors (see the sketch after this list).
- Line 86 of `finitediff.jl`: I'm not sure I understand the way I should be using `FiniteDiff.JacobianCache` here. It was for some reason completely incompatible with the rectangular Jacobian tested in Lines 166--180 of `ADtests.jl`, giving either an issue with `m, n = size(J)` not working for a vector `J`, or some parent error with `_color = reshape(colorvec, axes(x)...)`.
- Finally, I tried solving some problems, e.g. the one at the end of `ADtests.jl`, but `optf` was for some reason not recognising that the gradient is now defined. I figured I shouldn't tinker any further with this last problem since it's defined outside of `Optimization.jl`. Maybe there's a syntax error I just haven't recognised for whatever reason.
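A sketch of the two call patterns the first bullet contrasts, using hypothetical stand-ins for the `f.f(θ, p, args...)` convention (this is illustrative only, not the PR's actual diff):

```julia
using FiniteDiff

# Stand-ins for Optimization.jl's f.f(θ, p, args...) convention (hypothetical).
f(θ, p, args...) = sum(abs2, θ) + p
p = 1.0
args = ()

# Capturing p and args... outside the closure means FiniteDiff only ever sees
# a function of θ; splatting args... inside the differentiated call was the
# pattern that broke.
_f = θ -> f(θ, p, args...)
grad = FiniteDiff.finite_difference_gradient(_f, [1.0, 2.0])
```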