
Create ode.md #924


Open · wants to merge 31 commits into master

Conversation

ParasPuneetSingh (Collaborator)

Docs for first draft of OptimizationODE.jl

Checklist

  • Appropriate tests were added
  • Any code changes were done in a way that does not break public API
  • All documentation related to code changes was updated
  • The new code follows the contributor guidelines, in particular the SciML Style Guide and COLPRAC.
  • Any new documentation only uses public API


ParasPuneetSingh and others added 21 commits September 7, 2024 17:16
Added documentation for MOO in BBO
MOO docs update.
MOO docs update.
updated Project.toml for the docs.
Added compat for BBO.
Added required packages for MOO docs.
added required packages for MOO
Corrected function names for MOO docs.
Removed unnecessary ForwardDiff function.
Added the package for the algorithms.
Added evolutionary to the package.
updated algorithm call.
Correction of changing tuple to vector.
corrected algorithm calls.
Adding argument mapping for num_dimensions and fitness_scheme.
syntax change for num_dimensions and fitness_scheme passing in solve().
Docs for first draft of OptimizationODE.jl
Comment on lines 43 to 44
* `HighOrderDescent()` — uses the Vern7 high-order Runge-Kutta method.

Member

Missing a description of the generic one, and where to find more solvers.

Member

Describe them all as gradient-based local optimizers.
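
To make that framing concrete, here is a minimal usage sketch of the solvers under review. It assumes the preset constructors from this PR (`RKChebyshevDescent`, `RKAccelerated`, `HighOrderDescent`) and the standard Optimization.jl problem interface; keyword handling inside OptimizationODE may differ from what is shown.

```julia
using Optimization, OptimizationODE, ForwardDiff

# Gradient-flow view: each preset integrates x'(t) = -∇f(x(t)) to a
# steady state, so they are all gradient-based local optimizers that
# differ only in the ODE integrator driving the descent.
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

x0 = zeros(2)
p  = [1.0, 100.0]

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, x0, p)

# RKAccelerated() wraps Tsit5(); RKChebyshevDescent() (ROCK2) and
# HighOrderDescent() (Vern7) slot in the same way.
sol = solve(prob, RKAccelerated(); maxiters = 10_000)
```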

ParasPuneetSingh and others added 2 commits June 2, 2025 16:41
Added method descriptions for the solvers.
@ChrisRackauckas (Member)

Looks good, needs a version bump after registration.


```julia
function finite_difference_jacobian(f, x; ϵ = 1e-8)
```
Member

No, just use the `jac` function provided by the initialization on `f`.
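
A hedged sketch of the suggested change, continuing the example above: instead of hand-rolled finite differences, the solver reuses the derivative closures that initialization attaches to the objective. The entry point shown (`Optimization.instantiate_function`) and the field names are internals as I understand them and may not match the PR exactly.

```julia
using Optimization, ForwardDiff

# After initialization, the OptimizationFunction carries grad/jac/hess
# closures built from the user's chosen AD backend, so the solver can
# call those rather than re-deriving a Jacobian with a hand-picked ϵ.
f = Optimization.instantiate_function(optf, x0, Optimization.AutoForwardDiff(), p)

G = similar(x0)
f.grad(G, x0)   # in-place gradient from the chosen backend
```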

```julia
RKChebyshevDescent() = ODEOptimizer(ROCK2())
RKAccelerated() = ODEOptimizer(Tsit5())
HighOrderDescent() = ODEOptimizer(Vern7())
DAEMassMatrix() = DAEOptimizer(Rodas5())
```
Member

DAE is a different PR; do not merge different PRs together.
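
For context on the presets above: `ODEOptimizer` is the generic constructor, so any OrdinaryDiffEq integrator can drive the descent; the names in the diff are one-line presets. A sketch with a hypothetical preset name:

```julia
using OptimizationODE, OrdinaryDiffEq

# Hypothetical user-defined preset: same pattern as the diff, using a
# stabilized integrator suited to stiff gradient flows.
StiffDescent() = ODEOptimizer(ROCK4())

sol = solve(prob, StiffDescent(); maxiters = 10_000)
```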
