Create ode.md #924
base: master
Conversation
Added documentation for MOO in BBO
MOO docs update.
MOO docs update.
Updated Project.toml for the docs.
Added compat for BBO.
Added required packages for MOO docs.
added required packages for MOO
Corrected function names for MOO docs.
Removed unnecessary ForwardDiff function.
Added the package for the algorithms.
Added evolutionary to the package.
updated algorithm call.
Corrected by changing tuple to vector.
corrected algorithm calls.
Adding argument mapping for num_dimensions and fitness_scheme.
Syntax change for passing num_dimensions and fitness_scheme in solve() (see the sketch after this commit list).
Docs for first draft of OptimizationODE.jl
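To make the num_dimensions / fitness_scheme commits above concrete, here is a minimal, hedged sketch of a multi-objective BlackBoxOptim solve. The keyword names follow the commit messages; `MultiObjectiveOptimizationFunction`, `BBO_borg_moea`, and `ParetoFitnessScheme` are the assumed entry points, and exact names or signatures may differ in the released packages.

```julia
# Hedged sketch only: keyword names (num_dimensions, fitness_scheme) follow the
# commit messages above and may differ in released versions of OptimizationBBO.
using Optimization, OptimizationBBO
using BlackBoxOptim: ParetoFitnessScheme

# Two objectives returned as a vector (the commits mention changing a tuple to a vector)
function multi_obj(x, p)
    f1 = (x[1] - 1)^2 + x[2]^2
    f2 = x[1]^2 + (x[2] - 1)^2
    return [f1, f2]
end

mof = MultiObjectiveOptimizationFunction(multi_obj)
prob = OptimizationProblem(mof, [0.5, 0.5]; lb = [0.0, 0.0], ub = [2.0, 2.0])

sol = solve(prob, BBO_borg_moea();
    num_dimensions = 2,
    fitness_scheme = ParetoFitnessScheme{2}(is_minimizing = true))
```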
* `HighOrderDescent()` — uses the Vern7 high-order Runge-Kutta method.
Missing a description of the generic one, and where to find more solvers
Describe them all as gradient-based local optimizers.
Co-authored-by: Christopher Rackauckas <[email protected]>
Method descriptions for the solvers added.
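For concreteness, a hedged sketch of the kind of description the review above asks for: all of these wrappers are gradient-based local optimizers that integrate the gradient-flow ODE du/dt = -∇f(u), differing only in which OrdinaryDiffEq integrator is used, and the generic `ODEOptimizer` shown in the diff further down is assumed to accept any such integrator.

```julia
# Hedged sketch: the predefined names wrap specific OrdinaryDiffEq solvers, and
# the generic ODEOptimizer (from the diff further down) is assumed to wrap any
# other integrator; the DifferentialEquations.jl solver docs list the full set.
using OptimizationODE, OrdinaryDiffEq

opt_generic = ODEOptimizer(BS3())      # generic wrapper around an OrdinaryDiffEq integrator
opt_cheby   = RKChebyshevDescent()     # ROCK2, stabilized explicit (Runge-Kutta-Chebyshev) method
opt_fast    = RKAccelerated()          # Tsit5, adaptive 5th-order Runge-Kutta
opt_high    = HighOrderDescent()       # Vern7, high-order Runge-Kutta
```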
Looks good, needs to bump after registration.
function finite_difference_jacobian(f, x; ϵ = 1e-8)
No, just use the jac function provided by the initialization on f.
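A hedged sketch of what the review is pointing at (not the PR's implementation): when the OptimizationFunction is constructed with an AD backend, the solve-time initialization attaches gradient/Jacobian callbacks to `f`, and the solver should call those rather than a hand-rolled finite-difference Jacobian.

```julia
# Hedged sketch: build the problem with an AD backend so initialization supplies
# f.grad / f.jac; no finite_difference_jacobian is needed inside the solver.
using Optimization, ForwardDiff

rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2))
# Inside the ODE-based solver, the instantiated `f.grad` (or `f.jac`) is then
# the function to call for derivatives.
```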
RKChebyshevDescent() = ODEOptimizer(ROCK2())
RKAccelerated() = ODEOptimizer(Tsit5())
HighOrderDescent() = ODEOptimizer(Vern7())
DAEMassMatrix() = DAEOptimizer(Rodas5())
DAE is a different PR; do not merge different PRs together.
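And a hedged usage sketch for the constructors in the diff above, leaving out `DAEMassMatrix()` since the review asks for the DAE path to go in its own PR; the `dt` and `maxiters` keywords are assumptions about the solve interface, not a confirmed API.

```julia
# Hedged usage sketch; `dt` and `maxiters` are assumed solve keywords.
using Optimization, OptimizationODE, ForwardDiff

f(x, p) = (x[1] - 3)^2 + (x[2] + 1)^2
optf = OptimizationFunction(f, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2))

sol = solve(prob, HighOrderDescent(); dt = 0.01, maxiters = 10_000)
sol.u   # should approach [3.0, -1.0]
```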
Updated docs.
updated docs
Docs for first draft of OptimizationODE.jl
Checklist
The new code follows the contributor guidelines, in particular the SciML Style Guide and COLPRAC.
Additional context
Add any other context about the problem here.