Closed
Hi there @odow and friends!
Looking at our work on sparse autodiff with @adrhill and @amontoison, I've been wondering how it could be useful to JuMP. We have developed a combination of three new packages:
- for sparsity detection (SparseConnectivityTracer.jl)
- for matrix colorings and decompression (SparseMatrixColorings.jl)
- for backend-independent gradients, (sparse) Jacobians and (sparse) Hessians (DifferentiationInterface.jl)
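To make the division of labor concrete, here is a minimal sketch (not JuMP-specific) of how the three packages combine to compute a sparse Jacobian, assuming the `AutoSparse` wrapper and `jacobian` entry point documented in the DifferentiationInterface tutorials; the test function `f` is a made-up example.

```julia
using DifferentiationInterface  # re-exports the AutoSparse / AutoForwardDiff backend types
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm
import ForwardDiff  # the dense backend that AutoSparse wraps here

# Hypothetical vector-valued function with a sparse Jacobian
f(x) = [x[1]^2, x[2] * x[3], x[1] + x[3]]

# AutoSparse bundles a dense AD backend with a sparsity detector
# (SparseConnectivityTracer) and a coloring algorithm (SparseMatrixColorings)
backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector = TracerSparsityDetector(),
    coloring_algorithm = GreedyColoringAlgorithm(),
)

x = rand(3)
J = jacobian(f, backend, x)  # sparse matrix, computed via compressed columns
```

The point is that each concern (detection, coloring, differentiation) is swappable independently, which is why testing alternative backends is cheap.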
From what I understand, JuMP's current sparse AD engine is `Nonlinear.ReverseAD`, but there are also experiments going on in MathOptSymbolicAD.jl. Did I miss anything?
I'm not suggesting that DifferentiationInterface and friends should replace your default AD solution. But since the docs mention that testing other AD backends would be nice, maybe there's an angle there?
In any case, perhaps DI could be a nice addition to the docs page on autodiff of user-defined operators?