sampling without inferring the prior transform #713
Comments
Perhaps I could just use rejection sampling by checking whether each sample lies within the support?
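That rejection-sampling idea can be sketched in a few lines. This is a minimal illustration only, not sbi code: the polytope `A`, `b` (here the unit square) and the uniform proposal are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical polytope: the unit square [0, 1]^2 written as A @ theta <= b.
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 1.0, 0.0])

def in_polytope(theta):
    """True iff theta satisfies all linear inequalities A @ theta <= b."""
    return np.all(A @ theta <= b)

def rejection_sample(n, proposal, accept):
    """Draw from `proposal` and keep only samples passing `accept`."""
    samples = []
    while len(samples) < n:
        theta = proposal()
        if accept(theta):
            samples.append(theta)
    return np.array(samples)

# Broad uniform proposal covering the polytope.
proposal = lambda: rng.uniform(-2.0, 2.0, size=2)
samples = rejection_sample(100, proposal, in_polytope)
```

Note that plain rejection sampling gets slow when the polytope occupies a small fraction of the proposal's volume, which is presumably why a transform to unconstrained space is attractive in the first place.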
Hi there! Thanks for the compliments, we're happy that you're enjoying the package :) If you want to turn off the transformation, you should probably use the sampler interface. Could you try this (I did not try it yet)?

```python
from sbi.inference import SNPE, DirectPosterior

inference = SNPE()
posterior_estimator = inference.append_simulations(theta, x).train()
posterior = DirectPosterior(posterior_estimator, prior)
```
So at the instantiation of […]
Ah, great that you found this fix! I'll leave this issue open because we should probably make this easier...
Thanks @Patrickens, interesting problem! But a plain identity transform should work on your custom support, right? I added a […]. Could you please paste your custom prior class here for testing?
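For reference, torch already ships a ready-made identity transform, which is the kind of no-op mapping meant here: it leaves samples untouched instead of mapping them to unconstrained space. A minimal sketch:

```python
import torch
from torch.distributions.transforms import identity_transform

# identity_transform maps theta to itself, so samples stay on the
# original (constrained) support.
theta = torch.tensor([0.2, 0.7])
mapped = identity_transform(theta)
```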
Dear SBI team,

I'm absolutely loving the package! Very well written and documented.

I need to fit a distribution that has convex polytope support, meaning that all samples `theta` need to satisfy the following: `A @ theta <= b`. I made the following support class for my prior: […]

Now I run into the following issue. Once a round of inference is done and I try to build a posterior, `sbi` assumes that if you have a bounded and dependent support, there must be a bijective transform to an unconstrained space: […]

Is there any way around this issue? I've thought of such a transform (I have not implemented it yet, since it's a real headache), but for now I would like to just check whether a sample is within the support without transforming it to unconstrained space.

Thank you for your efforts!
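The convex-polytope support described above can be sketched as a custom `torch.distributions` constraint. This is a minimal sketch of the kind of class the issue refers to, not the author's actual code; `A` and `b` (here again the unit square) are placeholder values.

```python
import torch
from torch.distributions import constraints

class PolytopeSupport(constraints.Constraint):
    """Support { theta : A @ theta <= b }, a convex polytope."""

    is_discrete = False
    event_dim = 1  # each event is a full parameter vector theta

    def __init__(self, A, b):
        self.A = A
        self.b = b
        super().__init__()

    def check(self, value):
        # value has shape (..., dim); returns a boolean tensor of
        # shape (...) that is True where all inequalities hold.
        return (value @ self.A.T <= self.b).all(dim=-1)

# Placeholder polytope: the unit square [0, 1]^2.
A = torch.tensor([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = torch.tensor([1.0, 0.0, 1.0, 0.0])
support = PolytopeSupport(A, b)
```

A support like this is enough for membership checks (`support.check(theta)`), which is exactly the "just check whether a sample is within the support" behaviour asked for; the bijective transform to unconstrained space is a separate, harder problem.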