Conversation

@torfjelde
Member

After TuringLang/DynamicPPL.jl#452, we can properly preserve the context associated with the model.

For example, at the moment something like

```julia
sample(DynamicPPL.contextualize(model, DynamicPPL.PriorContext()), NUTS(), 1000)
```

won't work because `LogDensityFunction` is constructed with `DefaultContext`, which then replaces the leaf context in `model` when we call `evaluate!!`.

After this PR, stuff like the above works as intended.
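
For reference, a minimal end-to-end sketch of the intended usage (the `demo` model below is a made-up toy model, not part of this PR):

```julia
using Turing, DynamicPPL

# Toy model for illustration only.
@model function demo()
    m ~ Normal(0, 1)
    x ~ Normal(m, 1)
end

model = demo()

# Swap the model's leaf context for PriorContext so that only the log-prior is
# accumulated; after this PR that context survives the construction of the
# LogDensityFunction inside `sample`.
prior_model = DynamicPPL.contextualize(model, DynamicPPL.PriorContext())
chain = sample(prior_model, NUTS(), 1000)
```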

@coveralls

coveralls commented Jan 31, 2023

Pull Request Test Coverage Report for Build 4740709796

  • 0 of 1 (0.0%) changed or added relevant lines in 1 file are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage remained the same at 0.0%

| Changes Missing Coverage | Covered Lines | Changed/Added Lines | % |
| --- | --- | --- | --- |
| src/inference/hmc.jl | 0 | 1 | 0.0% |

Totals Coverage Status:
  • Change from base Build 4604993505: 0.0%
  • Covered Lines: 0
  • Relevant Lines: 1422

💛 - Coveralls

@codecov

codecov bot commented Jan 31, 2023

Codecov Report

Patch and project coverage have no change.

Comparison is base (35a1280) 0.00% compared to head (8cfee33) 0.00%.

Additional details and impacted files
```
@@          Coverage Diff           @@
##           master   #1943   +/-   ##
======================================
  Coverage    0.00%   0.00%           
======================================
  Files          21      21           
  Lines        1422    1422           
======================================
  Misses       1422    1422           
```
| Impacted Files | Coverage Δ |
| --- | --- |
| src/essential/ad.jl | 0.00% <0.00%> (ø) |
| src/inference/Inference.jl | 0.00% <0.00%> (ø) |
| src/inference/hmc.jl | 0.00% <0.00%> (ø) |
| src/inference/mh.jl | 0.00% <ø> (ø) |
| src/variational/advi.jl | 0.00% <0.00%> (ø) |


☔ View full report in Codecov by Sentry.

@devmotion devmotion left a comment
Member

Looks good to me, I would just suggest adding a few additional tests to cover all changes. Feel free to merge afterwards.

```diff
  # Get the initial log pdf and gradient functions.
  ∂logπ∂θ = gen_∂logπ∂θ(vi, spl, model)
- logπ = Turing.LogDensityFunction(vi, model, spl, DynamicPPL.DefaultContext())
+ logπ = Turing.LogDensityFunction(
```
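
For context, a sketch of what the context-preserving construction might look like (this is an assumption rather than the exact code merged in this PR; `DynamicPPL.leafcontext` and the `model.context` field are used here only to illustrate reusing the model's own leaf context instead of a hard-coded `DefaultContext()`):

```julia
# Assumed sketch only; the merged code may differ. The point is that the
# log-density callable reuses the context carried by `model` rather than
# unconditionally falling back to DefaultContext().
logπ = Turing.LogDensityFunction(
    vi,
    model,
    spl,
    DynamicPPL.leafcontext(model.context),
)
```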
Member

Seems this is not tested (codecov complains)?

Member Author

Tests have been added 👍

```diff
  # Make a new transition.
  densitymodel = AMH.DensityModel(
-     Base.Fix1(LogDensityProblems.logdensity, Turing.LogDensityFunction(vi, model, DynamicPPL.SamplingContext(rng, spl)))
+     Base.Fix1(
```
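
Similarly for MH, a sketch of the analogous context-preserving construction (again an assumption, not necessarily the merged code; note that `Base.Fix1(f, x)` just produces the callable `y -> f(x, y)` that AdvancedMH expects):

```julia
# Assumed sketch only; the merged code may differ. SamplingContext is built
# around the model's own leaf context instead of the default one, and
# Base.Fix1 turns `logdensity(ldf, θ)` into a single-argument density.
densitymodel = AMH.DensityModel(
    Base.Fix1(
        LogDensityProblems.logdensity,
        Turing.LogDensityFunction(
            vi,
            model,
            DynamicPPL.SamplingContext(rng, spl, DynamicPPL.leafcontext(model.context)),
        ),
    ),
)
```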
Member

It seems MH sampler should be tested as well?

Member Author

Done! Though MH seems to really struggle here, even with the prior (despite sampling from the prior; I'm really confused about this...)

@torfjelde torfjelde merged commit 918f6f4 into master Apr 20, 2023
@delete-merged-branch delete-merged-branch bot deleted the torfjelde/preserve-context-in-logdensityfunction branch April 20, 2023 12:06