[WIP] PR for issue #634 #750
Conversation
These are likely bugs in the vectorisation implementation (related to #476).
I'm not sure I understand the issue here. Can you clarify a bit?
Have an API for manipulating `logp`.

We increase `logp` inside the compiler; I suspect this is a bug.

Places where we modify `logp`:
- inside the compiler
- inside PG

UPDATE: Actually, the following line "cancels" the increment of `logp`: Turing.jl/src/inference/pgibbs.jl, line 176 (at e585eea).
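To make the accumulate-then-cancel pattern above concrete, here is a minimal, self-contained sketch. The types and function names (`ToyVarInfo`, `assume!`, `observe!`) are made up for illustration and are not Turing's actual internals: both tilde-style operations add to a running `logp`, and a particle-Gibbs-style step that overwrites the accumulator discards those increments.

```julia
# Toy sketch only: how assume/observe can accumulate a log density,
# and how a later reset "cancels" the increments. Names are illustrative.
using Distributions

mutable struct ToyVarInfo
    logp::Float64
end

# Draw a latent variable and accumulate its log prior.
function assume!(vi::ToyVarInfo, dist::Distribution)
    x = rand(dist)
    vi.logp += logpdf(dist, x)
    return x
end

# Accumulate the log likelihood of an observation.
function observe!(vi::ToyVarInfo, dist::Distribution, value)
    vi.logp += logpdf(dist, value)
    return nothing
end

vi = ToyVarInfo(0.0)
x = assume!(vi, Normal(0, 1))     # logp += log p(x)
observe!(vi, Normal(x, 1), 0.5)   # logp += log p(y = 0.5 | x)

# A particle-Gibbs-style step that only keeps per-site weights may simply
# overwrite the accumulator, discarding ("cancelling") the increments above.
vi.logp = 0.0
```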
It seems like a very good idea to refactor that bit of the code base.
Suggested change: replace the comment `# Compute the log joint Runner. #` with `# Compute the particle filtering Runner. #`.
Thanks Martin for this PR. I suggest we keep the vectorization issues out of this, since that's a totally different beast.

About this PR and its underlying issue: while I am not exactly against them, I just don't get the appeal of introducing a plethora of new types to do exactly the same thing we are doing now without these types at all. The abstractions we are introducing here don't add much functional value for now, as far as I can see. If anything they are a bit confusing; for example, the relation between the new types and the existing ones is not obvious to me.

I think what we need to do is work from the use-cases backwards. So first, we specify the API functions that we want to work, e.g. `logjoint(m::Model, v::VarInfo)`.

I may just be failing to see the value of this restructuring, so please explain it to me if I got it all wrong. Sorry for the late night rant :)
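As a purely hypothetical illustration of the use-case-first approach, one could write down the desired entry point against a toy model type before committing to any internal type hierarchy. `ToyModel` and this `logjoint` are stand-ins, not Turing's API:

```julia
# Hypothetical sketch: specify the user-facing call first, design internals later.
using Distributions

struct ToyModel
    f::Function    # maps a Dict of latent values to the log joint density
end

# Toy model: x ~ Normal(0, 1);  y ~ Normal(x, 1), with y observed as 0.5.
m = ToyModel(v -> logpdf(Normal(0, 1), v[:x]) + logpdf(Normal(v[:x], 1), 0.5))

# The API function we want to "just work", independent of how inference is run.
logjoint(m::ToyModel, v::Dict{Symbol,Float64}) = m.f(v)

logjoint(m, Dict(:x => 0.2))    # log p(x = 0.2, y = 0.5)
```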
This is meant to avoid direct dispatch like `assume(spl::Union{HMC, NUTS, SGLD}, ...)`, which looks a bit ugly and doesn't support plug-and-play inference. Introducing an intermediate runner type avoids this.
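A rough sketch of the contrast being described; the type and function names here (including `HamiltonianRunner`) are illustrative rather than the actual Turing definitions. It contrasts dispatching on a hard-coded union of sampler types with dispatching on an intermediate abstract type that new samplers can subtype:

```julia
# Illustrative only: hard-coded union vs. intermediate abstract type.
using Distributions

struct HMC end
struct NUTS end
struct SGLD end

# Direct dispatch: one method tied to an explicit list of sampler types.
assume(spl::Union{HMC,NUTS,SGLD}, dist) = rand(dist)

# With an intermediate type, any new gradient-based sampler opts in by
# subtyping, and `assume` is written once ("plug-and-play").
abstract type HamiltonianRunner end
struct MyNewSampler <: HamiltonianRunner end
assume(spl::HamiltonianRunner, dist) = rand(dist)

assume(HMC(), Normal(0, 1))
assume(MyNewSampler(), Normal(0, 1))
```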
This is meant to reduce the number of functions the Turing compiler has to generate, by lowering the model into an IR and letting users overload `assume` and `observe`.
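A hedged sketch of that idea; all names here (`PriorRunner`, `DensityRunner`, `AbstractRunner`) are invented for illustration. The point is that the compiler emits a single model body whose tilde statements become calls to `assume`/`observe`, so supporting a new behaviour only requires new methods of those two functions, not newly generated model code:

```julia
# Illustrative lowering: one generated model body, overloadable assume/observe.
using Distributions

abstract type AbstractRunner end
struct PriorRunner <: AbstractRunner end          # draw latents from the prior
struct DensityRunner <: AbstractRunner            # evaluate at fixed latent values
    values::Dict{Symbol,Float64}
end

assume(::PriorRunner, dist, name)    = (x = rand(dist); (x, logpdf(dist, x)))
assume(r::DensityRunner, dist, name) = (x = r.values[name]; (x, logpdf(dist, x)))
observe(::PriorRunner, dist, value)   = logpdf(dist, value)
observe(::DensityRunner, dist, value) = logpdf(dist, value)

# Roughly what the compiler might emit for:  x ~ Normal(0, 1);  0.5 ~ Normal(x, 1)
function model(runner::AbstractRunner)
    logp = 0.0
    x, lp = assume(runner, Normal(0, 1), :x);  logp += lp
    logp += observe(runner, Normal(x, 1), 0.5)
    return logp
end

model(PriorRunner())                       # forward simulation + its log density
model(DensityRunner(Dict(:x => 0.2)))      # log joint at x = 0.2
```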
It seems that the issue is not introducing these new runner types, but the fact that these types are currently used to construct `Sampler` objects.
I agree. I'll create a new PR in which I will try to find a solution that doesn't require constructing a `Sampler` from these types.
What's the plan for this issue?
Once the inference changeover is merged, I'll work on a new PR that contains the main features of this PR.
I suspect that a lot of the features planned in this PR have now been added in #965.
This is a work-in-progress PR refactoring the `assume` and `observe` interface (#634). Do not merge!
Todo:
- [ ] `logjoint(m::Model, v::VarInfo)`.
- [ ] `logpdf(m::Model, v::VarInfo)`.
- [ ] `SampleFromPrior`.
- [ ] `assume` and `observe`.
- [ ] Fix vectorisation issues: "Assume in vectorisation of HMC bug" (#760); "Observe vectorisation issue" (#761).
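To tie these items together, here is a hedged sketch using toy stand-ins rather than the real Turing types or signatures listed above: sampling from the prior produces a set of variable values at which the log joint (and the likelihood term alone) can then be evaluated.

```julia
# Toy stand-ins only; not the actual Turing definitions from the todo list.
using Distributions

struct SampleFromPrior end                  # placeholder sampler type

# Toy model: x ~ Normal(0, 1);  y ~ Normal(x, 1), with y observed as 0.5.
sample_latents(::SampleFromPrior) = Dict(:x => rand(Normal(0, 1)))

logjoint(v) = logpdf(Normal(0, 1), v[:x]) + logpdf(Normal(v[:x], 1), 0.5)
loglik(v)   = logpdf(Normal(v[:x], 1), 0.5)

v = sample_latents(SampleFromPrior())
(logjoint(v), loglik(v))
```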