Folks, Mastodon didn't reach critical mass for my personal usage, and I will probably discontinue this account at some point.
I am trying out Bluesky, which seems promising and has been growing very fast over the past couple of weeks (wonder why).
You can find me there at: https://bsky.app/profile/lacerbi.bsky.social
21.11.2024 09:07

📯I am hiring again!
Postdoc or PhD position in Sample-Efficient Probabilistic Machine Learning at the University of Helsinki with strong links to @FCAI
Please see blurb below, and full ad here:
https://helsinki.fi/en/researchgroups/machine-and-human-intelligence/postdocphd-position-in-sample-efficient-probabilistic-machine-learning
Main goal: Extend our VBMC framework for efficient probabilistic inference (https://github.com/acerbilab/pyvbmc), with applications and collaborators in comp neuro, e.g. @internationalbrainlab
Applications evaluated on a rolling basis. Please RT!
17.11.2023 09:26

Hi folks, I have multiple openings for PhD/postdoc positions in probabilistic machine learning at the University of Helsinki and @FCAI
starting later in the year!
For more info see: https://www.helsinki.fi/en/researchgroups/machine-and-human-intelligence/open-positions
Please boost!
10.7.2023 08:47

11/ PS: We have other tools coming out soon, follow me or our spaces for more info: https://github.com/acerbilab
5.4.2023 18:49

10/ We are very interested in the kinds of problems you might apply PyVBMC to, and in connecting with the broader open-source and probabilistic programming language crowd (e.g., @pymc, @mcmc_stan, @ArviZ).
Please get in touch if you have any questions or feedback for us!
5.4.2023 18:48

9/ Work on this package is a joint effort of many fantastic machine learning developers working in my group at various times: Bobby Huggins, Chengkun Li, Marlon Tobaben, Mikko Aarnos.
Thanks to @FCAI & UnivHelsinkiCS for funding and supporting the project!
5.4.2023 18:46

8/ References:
Our original VBMC papers, detailing the algorithm:
- NeurIPS 2018: https://arxiv.org/abs/1810.05558
- NeurIPS 2020: http://arxiv.org/abs/2006.08655
And we have a tl;dr software preprint: https://arxiv.org/abs/2303.09519
5.4.2023 18:45

7/ PyVBMC is available on both PyPI (pip install pyvbmc) and conda-forge, and provides a user-friendly Pythonic interface.
GitHub repo: https://github.com/acerbilab/pyvbmc
Docs: https://acerbilab.github.io/pyvbmc/
Detailed notebook tutorials and documentation make it accessible to new and experienced users:
https://acerbilab.github.io/pyvbmc/examples.html
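As a quick sanity check after installing (nothing here beyond importing the VBMC entry point that the tutorials use):

```python
# Installed via: pip install pyvbmc  (the package is also on conda-forge, as noted above)
from pyvbmc import VBMC  # main class used throughout the PyVBMC tutorials

print(VBMC)  # if this prints the class, the installation is working
```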
6/ Like all methods, there are limitations!
PyVBMC:
- Works up to ~10 continuous model parameters
- Works with reasonably smooth target posteriors (no weird posteriors)
- Can deal with mild multimodality, but (for now) might not work with widely separated modes
5/ PyVBMC uses Gaussian processes and a variational approximation to efficiently compute a flexible (non-Gaussian) approximate posterior distribution of the model parameters, from which statistics and posterior samples can be easily extracted.
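In code, that extraction step looks roughly like this (a fragment, not standalone: it assumes `vp` is the variational posterior returned by `vbmc.optimize()` as in the sketch under 3/ below, and that `vp.sample()` returns samples plus mixture-component indices, as in the PyVBMC docs):

```python
import numpy as np

# `vp` is the variational posterior returned by vbmc.optimize() (see the sketch under 3/).
samples, _ = vp.sample(100_000)                       # draw samples from the variational posterior
print(samples.mean(axis=0))                           # posterior mean of each model parameter
print(np.percentile(samples, [2.5, 97.5], axis=0))    # 95% credible interval per parameter
```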
5.4.2023 18:42

4/ PyVBMC is particularly effective when your model's likelihood evaluation:
- is at least a bit expensive (e.g., ~ one second per evaluation),
OR
- is stochastic (e.g., estimated by Monte Carlo sampling, aka simulation),
(or both). A toy sketch of a stochastic likelihood estimate follows below.
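To make "stochastic likelihood" concrete, here is a toy sketch (the latent-variable model, data, and function names are invented for illustration and are not part of PyVBMC):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(0.5, 1.2, size=100)  # toy dataset, made up for this sketch

def noisy_log_likelihood(theta, n_mc=100):
    """Monte Carlo estimate of a log-likelihood with a latent variable.

    Toy model: y = mu + z + eps, with z ~ N(0, tau^2) integrated out by simulation
    and eps ~ N(0, 1). Each call returns a slightly different (noisy) estimate.
    """
    mu, tau = theta
    z = rng.normal(0.0, tau, size=(n_mc, 1))                       # simulate the latent variable
    lik = np.exp(-0.5 * (data - (mu + z)) ** 2) / np.sqrt(2 * np.pi)
    return np.sum(np.log(lik.mean(axis=0)))                        # noisy (and slightly biased) estimate

print(noisy_log_likelihood((0.5, 1.0)))  # two calls with the same theta
print(noisy_log_likelihood((0.5, 1.0)))  # give different values: the target is stochastic
```

(The naive log-of-a-mean estimator above is slightly biased; the IBS post further down is about getting an unbiased estimate for simulator-based models.)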
5.4.2023 18:41

3/ Using PyVBMC is super simple.
Define your model (prior, likelihood, a few options), give them to the VBMC optimization process, and go!
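Roughly like this, in a minimal sketch loosely following the basic PyVBMC tutorial (the toy 2-D Gaussian target and the bound values are made up for the example; the `elbo`/`elbo_sd` result keys are assumed from the docs):

```python
import numpy as np
from pyvbmc import VBMC

D = 2  # number of model parameters

def log_joint(theta):
    # Unnormalized log posterior = log prior + log likelihood (toy standard-normal target).
    return -0.5 * np.sum(theta**2) - 0.5 * D * np.log(2 * np.pi)

x0 = np.zeros((1, D))                             # starting point
lb, ub = np.full(D, -10.0), np.full(D, 10.0)      # hard bounds
plb, pub = np.full(D, -3.0), np.full(D, 3.0)      # plausible bounds (where most mass should be)

vbmc = VBMC(log_joint, x0, lb, ub, plb, pub)
vp, results = vbmc.optimize()                     # vp: variational posterior; results: diagnostics

# The ELBO approximates the log model evidence (key names assumed from the docs).
print(results["elbo"], results["elbo_sd"])
```

From `vp` you can then draw samples and compute posterior statistics, as mentioned in 5/ above.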
5.4.2023 18:40

2/ PyVBMC works similarly to Bayesian optimization, but with the goal of *inference* (getting the full posterior distribution) instead of *optimization* (getting a single point estimate).
Probabilistic numerics FTW.
5.4.2023 18:39

1/ PyVBMC 1.0 is out! 🎉
https://github.com/acerbilab/pyvbmc
A new Python package for efficient Bayesian inference.
Get a posterior distribution over model parameters + the model evidence with a small number of likelihood evaluations.
No AGI was created in the process!
5.4.2023 18:37

I am trying out this new Bayesian optimization algorithm.
23.12.2022 16:03

Interesting proposal to use inverse binomial sampling for unbiased estimation of the log-likelihood (see https://github.com/acerbilab/ibs) in quantum machine learning: https://arxiv.org/abs/2211.04965
(slightly cross-posted from the bird)
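For reference, the core IBS estimator itself is tiny; here is a toy sketch (not the interface of the actual ibs package; the Bernoulli "model" and names are made up):

```python
import numpy as np

rng = np.random.default_rng(1)

def ibs_log_likelihood(simulate_response, observed, params):
    """Unbiased IBS estimate of sum_i log p(observed_i | params).

    For each trial, keep simulating until the simulator reproduces the observed
    response; if that takes K draws, that trial contributes -sum_{k=1}^{K-1} 1/k.
    """
    total = 0.0
    for i, obs in enumerate(observed):
        k = 1
        while simulate_response(i, params) != obs:
            total -= 1.0 / k
            k += 1
    return total

# Toy example (made up): a "model" that answers yes with probability p.
p_true = 0.7
observed = rng.random(200) < p_true                      # fake observed yes/no responses
simulate = lambda i, p: rng.random() < p                 # simulator for a single trial
print(ibs_log_likelihood(simulate, observed, p_true))    # unbiased estimate of the log-likelihood
print(np.sum(np.where(observed, np.log(p_true), np.log(1 - p_true))))  # exact value, for comparison
```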
12.11.2022 13:53

Thorough benchmark showing the superiority of hybrid Bayesian optimization methods, and specifically our BADS (https://github.com/acerbilab/bads), in several control engineering problems: https://arxiv.org/abs/2211.02571
(cross-posted from the bird)
10.11.2022 11:52

First toot!
New version of the BADS optimization algorithm just released (v1.1.1), with full support for heteroskedastic (user-provided) noise and several fixes: https://github.com/acerbilab/bads
(Also new lab repo! https://github.com/acerbilab)
What about Python? Coming *soon* — for real. Stay tuned!
31.10.2022 15:53