Merge changes in dev branch into master #158
Merged
Conversation
Define model generator exactly as specified by the user
The changes in this PR try to enforce the use of bors to a larger extent. PRs are no longer tested automatically but only when someone comments `bors try` (which does not merge afterwards) or `bors r+` (which merges automatically afterwards). Additionally, the configuration requires at least one approval from a project member and squashes commits before merging. I'm not sure about building the master branch as well: in my experience it leads to redundant tests (since the commits are already tested in the staging branch) but works a bit better with the display of the GitHub check status and the badges.
This PR tries to address the inconvenience of users having to know about the internal variable `_varinfo` and write `Turing.acclogp!(_varinfo, myvalue)` to modify the accumulated joint log probability, by adding an `@addlogprob!` macro which allows one to simply write `@addlogprob!(myvalue)`. In particular, it addresses TuringLang/Turing.jl#1332 (comment) (partly, the documentation still has to be updated) and might have avoided the discussion in TuringLang/Turing.jl#1328.

BTW, this macro is quite different from the removed `@logprob` and `@varinfo` macros since it is a "proper" Julia macro that is not replaced or touched by the DynamicPPL compiler. It might seem natural to call it `@acclogp!` (since it just adds a call to `acclogp!`), but IMO a more descriptive name (such as `@addlogprob!`?) might be more intuitive for users.
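For illustration, a minimal sketch of how the macro might be used inside a model; the model `demo`, its data `x`, and the extra likelihood term are hypothetical and only serve to show the pattern described above:

```julia
using DynamicPPL, Distributions

@model demo(x) = begin
    m ~ Normal()
    # Previously one had to know about the internal `_varinfo` and write
    #   Turing.acclogp!(_varinfo, loglikelihood(Normal(m, 1), x))
    # With this PR, the same effect is achieved with the macro:
    @addlogprob!(loglikelihood(Normal(m, 1), x))
end
```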
This PR removes `ModelGen` completely. The main motivation for it were the issues that @itsdfish and I experienced when working with multiple processes. The following MWE

```julia
using Distributed
addprocs(4)
@everywhere using DynamicPPL
@everywhere @model model() = begin end
pmap(x -> model(), 1:4)
```

fails intermittently

> if not all of these [`@model`] evaluations generate evaluator and generator functions of the same name (i.e., these `var"###evaluator#253"` and `var"###generator#254"` functions). I assume one could maybe provoke the error by defining another model first on the main process before calling `@everywhere @model ...`.

(copied from the discussion on Slack)

With the changes in this PR, `@model model() = ...` generates only a single function `model` on all workers, and hence there are no issues anymore with the names of the generators and evaluators. The evaluator is created as a closure inside of `model`, which can be serialized and deserialized properly by Julia. So far I haven't been able to reproduce the issues above with this PR.

The only user-facing change of the removal of `ModelGen` (apart from the fact that one never has to construct it, which simplifies the docs and the example @denainjs asked about) is that the `logprob` macro now requires specifying `model = m` where `m = model()` instead of `model = model` (since `model` is now just a regular function from which the default arguments etc. of the resulting model can't be extracted). It feels slightly weird that the evaluation is not based "exactly" on the specified `Model` instance but that the other parts of `logprob` modify it (which I guess was the reason for using the model generator here before), but on the other hand this weird behaviour already exists when specifying `logprob` with a `Chains` object. (BTW I'm not sure if we should actually use string macros here, maybe regular functions would be nicer.)

Additionally, I assume (though I haven't tested it) that getting rid of the separate evaluator and generator functions will not only simplify serialization and deserialization when working with multiple processes but also when saving models and chains (see e.g. TuringLang/Turing.jl#1091).

Co-authored-by: David Widmann <[email protected]>
Co-authored-by: Tor Erlend Fjelde <[email protected]>
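For illustration, a hedged sketch of the user-facing `logprob` change described above; the model `demo` and the queried variables are made up, and the string-macro syntax is assumed to match the prob macros shipped with DynamicPPL/Turing at the time:

```julia
using DynamicPPL, Distributions

@model demo(x) = begin
    s ~ InverseGamma(2, 3)
    x ~ Normal(0, sqrt(s))
end

# Before this PR, the model generator itself was passed:
#   logprob"x = 1.0 | model = demo, s = 1.0"
# After this PR, a `Model` instance has to be passed instead,
# since `demo` is now just a regular function:
m = demo(1.0)
logprob"x = 1.0 | model = m, s = 1.0"
```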
Fix prob macros
This PR removes the variables `Turing.DEBUG` and `DynamicPPL.DEBUG`. Instead, one can just set the environment variable `JULIA_DEBUG` to `"Turing"` or `"DynamicPPL"`, if desired (see https://docs.julialang.org/en/v1/stdlib/Logging/#Environment-variables-1). By default, debug logging is not enabled and the messages are filtered immediately without being evaluated (see https://docs.julialang.org/en/v1/stdlib/Logging/#Early-filtering-and-message-handling-1), so the additional check of `Turing.DEBUG` or `DynamicPPL.DEBUG` is not needed even from a performance perspective.
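As a usage note (not part of this PR's diff), enabling debug output for one of the packages then looks like this; the module name is whichever package you want to inspect:

```julia
# Enable debug-level logging for DynamicPPL only. Alternatively, set the
# variable in the shell before starting Julia: JULIA_DEBUG=DynamicPPL julia
ENV["JULIA_DEBUG"] = "DynamicPPL"

using DynamicPPL
# `@debug` messages emitted by DynamicPPL are now displayed; for all other
# modules they are filtered early and their arguments are never evaluated.
```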
Try to avoid `pmap` test issues
…153)

This PR replaces some occurrences of `logpdf` with `loglikelihood` since the former is intended for a single sample only, whereas we are interested in the log probability of both individual and multiple samples. JuliaStats/Distributions.jl#1144 allows us to use `loglikelihood` even for individual samples and arrays of samples (everything that can be sampled by `rand` should support the computation of `loglikelihood`). Currently, `logpdf` is misused (also in DistributionsAD) to compute arrays of log densities for multiple samples which are summed afterwards. Usually, this intermediate step can be avoided by summing the log densities directly (which is the default implementation in Distributions). Similar issues and possible optimizations exist for `Bijectors.logpdf_with_trans` (see TuringLang/Bijectors.jl#120).

Co-authored-by: Hong Ge <[email protected]>
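A minimal sketch of the pattern this change targets; the distribution and data below are placeholders:

```julia
using Distributions

d = Normal(0.0, 1.0)
xs = rand(d, 10)

# Pattern replaced by this PR: materialize an array of log densities, then sum it.
sum(logpdf.(d, xs))

# Preferred: `loglikelihood` sums the log densities directly and, after
# JuliaStats/Distributions.jl#1144, also accepts a single sample.
loglikelihood(d, xs)
loglikelihood(d, xs[1])
```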
Member (Author) commented:

bors try
devmotion approved these changes on Aug 24, 2020 and left a comment:
LGTM, we should just keep in mind to update the Turing docs accordingly when we add support for the new version.
Member (Author) replied:

Maybe create an issue for updating docs?