Make FullyBayesian ABC more flexible again #2872
Conversation
cc @sdaulton
This makes sense to me. An alternative to passing …
I am passing in a small neural network to have a fully Bayesian deep kernel model, and then I hijack the training function to train it. Hence the greater flexibility was useful.
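For readers following along, here is a minimal sketch (not the commenter's actual code) of the kind of customization described above: a hypothetical `DeepKernelSaasPyroModel` that needs a feature extractor at construction time, trained by running NUTS directly, mirroring what `fit_fully_bayesian_model_nuts` does internally. Needing to construct the pyro model with extra arguments like this is exactly what the extra flexibility is for.

```python
# Illustrative sketch only: the class and its constructor argument are
# hypothetical, and the network weights are held fixed for brevity (a fully
# Bayesian treatment would sample them inside `sample()` as well).
import torch
import torch.nn as nn
from botorch.models.fully_bayesian import SaasPyroModel
from pyro.infer.mcmc import MCMC, NUTS


class DeepKernelSaasPyroModel(SaasPyroModel):
    """SAAS pyro model that embeds inputs with a user-supplied network."""

    def __init__(self, feature_extractor: nn.Module):
        super().__init__()
        self.feature_extractor = feature_extractor

    def set_inputs(self, train_X, train_Y, train_Yvar=None):
        # Run MCMC on embedded features rather than the raw inputs. The GP
        # wrapper and training code must be adapted accordingly ("hijacking
        # the training function", as described above).
        with torch.no_grad():
            train_X = self.feature_extractor(train_X)
        super().set_inputs(train_X, train_Y, train_Yvar)


# "Hijacked" training loop: run NUTS directly on the custom pyro model.
train_X = torch.rand(16, 4, dtype=torch.float64)
train_Y = train_X.sin().sum(dim=-1, keepdim=True)
net = nn.Sequential(nn.Linear(4, 8), nn.Tanh(), nn.Linear(8, 2)).double()

pyro_model = DeepKernelSaasPyroModel(feature_extractor=net)
pyro_model.set_inputs(train_X, train_Y)
mcmc = MCMC(NUTS(pyro_model.sample), num_samples=16, warmup_steps=32)
mcmc.run()
samples = pyro_model.postprocess_mcmc_samples(mcmc.get_samples())
# `samples` would then be loaded into an appropriately adapted GP wrapper.
```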
@sdaulton what would you suggest as the next steps here? Are you happy with the catch-all kwargs, and should I work on fixing the tests?
Just wanted to add that this would be very useful for me! I intend to use a tree-based GP kernel that we sample with MCMC; I will convert it to a Pyro model and then use it in a FullyBayesian BoTorch model.
Sorry for the delay here. I lost track of this issue. A couple of thoughts:
If one is implementing a custom …
Motivation
In the old ABC we could pass an already initialized `pyro_model`, which allowed fairly complicated `pyro_model`s to be used within the available framework. The refactor prevents us from passing arbitrary kwargs to the `pyro_model` and so is a step back in terms of the hackability of the code. Passing a dict of kwargs seems like the right way to keep things flexible now that we can no longer just pass the model. This PR is just a draft, as I am not sure yet what else I need to touch.
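To make the two styles concrete, here is a hedged sketch. Only the plain construction uses the existing public API; the commented variants show the old-style pre-initialized `pyro_model` and the proposed catch-all kwargs dict (the `pyro_model_kwargs` name is a placeholder, not necessarily what this PR will land on).

```python
# Sketch of the flexibility under discussion. Only the first construction uses
# the existing API; the commented variants are illustrative.
import torch
from botorch.fit import fit_fully_bayesian_model_nuts
from botorch.models.fully_bayesian import SaasFullyBayesianSingleTaskGP

train_X = torch.rand(16, 3, dtype=torch.float64)
train_Y = train_X.sin().sum(dim=-1, keepdim=True)

# Standard construction: the model instantiates its default SaasPyroModel.
model = SaasFullyBayesianSingleTaskGP(train_X, train_Y)
fit_fully_bayesian_model_nuts(
    model, warmup_steps=32, num_samples=32, thinning=16, disable_progbar=True
)

# Old-style flexibility: hand over a pre-initialized (possibly customized)
# pyro model, e.g. the deep-kernel variant sketched earlier.
# model = SaasFullyBayesianSingleTaskGP(train_X, train_Y, pyro_model=my_pyro_model)

# Proposed-style flexibility: forward a dict of kwargs to the pyro model's
# constructor (argument name is a placeholder).
# model = SaasFullyBayesianSingleTaskGP(
#     train_X, train_Y, pyro_model_kwargs={"feature_extractor": my_net}
# )
```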
Have you read the [Contributing Guidelines on pull requests]?
Yes
Test Plan
Ensure existing tests pass and add new tests if needed.
Related PRs
None