acerbiluigi - Network

Posts

Folks, Mastodon didn't reach critical mass for my personal usage, and I will probably discontinue this account at some point.

I am trying out Bluesky, which seems promising and has been growing very fast over the past couple of weeks (wonder why).

You can find me there at: bsky.app/profile/lacerbi.bsky.

21.11.2024 09:07
https://mastodon.social/@AcerbiL...

📯I am hiring again!

Postdoc or PhD position in Sample-Efficient Probabilistic Machine Learning at the University of Helsinki with strong links to @FCAI

Please see blurb below, and full ad here:
helsinki.fi/en/researchgroups/

Main goal: Extend our VBMC framework for efficient probabilistic inference (github.com/acerbilab/pyvbmc), with applications and collaborators in comp neuro, e.g. @internationalbrainlab

Applications evaluated on a rolling basis. Please RT!

17.11.2023 09:26
https://mastodon.social/@AcerbiL...

Hi folks, I have multiple openings for PhD/postdoc positions in probabilistic machine learning at the University of Helsinki and @FCAI, starting later in the year!

For more info see: helsinki.fi/en/researchgroups/

Please boost!

10.7.2023 08:47
https://mastodon.social/@AcerbiL...

11/ PS: We have other tools coming out soon, follow me or our spaces for more info: github.com/acerbilab

5.4.2023 18:49
https://mastodon.social/@AcerbiL...

10/ We are very interested in which kinds of problems you might apply PyVBMC to, and in connecting to the broader open-source and probabilistic programming language crowd (e.g., @pymc, @mcmc_stan, @ArviZ).

Please get in touch if you have any questions or feedback for us!

5.4.2023 18:48
https://mastodon.social/@AcerbiL...

9/ Work on this package is a joint effort of many fantastic machine learning developers working in my group at various times: Bobby Huggins, Chengkun Li, Marlon Tobaben, Mikko Aarnos.

Thanks to @FCAI & UnivHelsinkiCS for funding and supporting the project!

5.4.2023 18:46
https://mastodon.social/@AcerbiL...

8/ References:

Our original VBMC papers, detailing the algorithm:
- NeurIPS 2018: arxiv.org/abs/1810.05558
- NeurIPS 2020: arxiv.org/abs/2006.08655

And we have a tl;dr software preprint: arxiv.org/abs/2303.09519

5.4.2023 18:45
https://mastodon.social/@AcerbiL...

7/ PyVBMC is available on both PyPI (pip install pyvbmc) and conda-forge, and provides a user-friendly Pythonic interface.

GitHub repo: github.com/acerbilab/pyvbmc

Detailed notebook tutorials and documentation make it accessible to new and experienced users:
acerbilab.github.io/pyvbmc/exa

5.4.2023 18:44
https://mastodon.social/@AcerbiL...

7/ PyVBMC is available on both PyPI (pip install pyvbmc) and conda-forge, and provides a user-friendly Pythonic interface.

Docs: acerbilab.github.io/pyvbmc/

Detailed notebook tutorials make it accessible to new and experienced users:
acerbilab.github.io/pyvbmc/exa

5.4.2023 18:43
https://mastodon.social/@AcerbiL...

6/ Like all methods, there are limitations!

PyVBMC:
- Works up to ~10 continuous model parameters
- Works with reasonably smooth target posteriors (no weird posteriors)
- Can deal with mild multi-modality, but (for now) might not work with widely separated modes

5.4.2023 18:42
https://mastodon.social/@AcerbiL...

5/ PyVBMC uses Gaussian processes and a variational approximation to efficiently compute a flexible (non-Gaussian) approximate posterior distribution of the model parameters, from which statistics and posterior samples can be easily extracted.

5.4.2023 18:42
https://mastodon.social/@AcerbiL...
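
A minimal sketch of how that extraction looks in code, based on my reading of the PyVBMC docs (the VBMC call signature, the (vp, results) return pair, vp.sample, and the "elbo" result key are assumptions to check against the documentation; the banana-shaped toy target is made up for illustration):

    import numpy as np
    from pyvbmc import VBMC

    # Toy non-Gaussian ("banana"-shaped) unnormalized log posterior in 2D.
    def log_density(theta):
        theta = np.atleast_1d(np.squeeze(theta))  # accept (2,) or (1, 2)
        x, y = theta[0], theta[1]
        return -0.5 * (x**2 / 4.0 + (y - x**2) ** 2)

    x0 = np.zeros((1, 2))                                   # starting point
    lb, ub = np.full((1, 2), -10.0), np.full((1, 2), 10.0)  # hard bounds
    plb, pub = np.full((1, 2), -3.0), np.full((1, 2), 3.0)  # plausible bounds

    vbmc = VBMC(log_density, x0, lb, ub, plb, pub)
    vp, results = vbmc.optimize()        # vp: the variational posterior

    # Statistics and posterior samples, extracted from vp.
    samples, _ = vp.sample(10_000)
    print("posterior mean:", samples.mean(axis=0))
    print("posterior cov:\n", np.cov(samples.T))
    print("ELBO (lower bound on log evidence):", results["elbo"])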

4/ PyVBMC is particularly effective when the likelihood evaluation of your model:

- is at least a bit expensive (e.g., ~ one second per evaluation),

OR

- is stochastic (e.g., estimated by Monte Carlo sampling, aka simulation).

(or both)

5.4.2023 18:41
https://mastodon.social/@AcerbiL...
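
In code, a stochastic target of this kind might look like the sketch below. This is hedged: the convention of returning an (estimate, noise SD) pair and the "specify_target_noise" option name are my reading of the PyVBMC material on noisy targets, so verify them against the docs:

    import numpy as np
    from pyvbmc import VBMC

    def noisy_log_density(theta):
        theta = np.atleast_1d(np.squeeze(theta))
        exact = -0.5 * np.sum(theta**2)  # stand-in for an intractable log-likelihood
        noise_sd = 0.5                   # SD of the stochastic estimator
        # Return the noisy estimate together with its estimated SD.
        return exact + noise_sd * np.random.randn(), noise_sd

    vbmc = VBMC(noisy_log_density, np.zeros((1, 1)),
                np.array([[-10.0]]), np.array([[10.0]]),  # hard bounds
                np.array([[-3.0]]), np.array([[3.0]]),    # plausible bounds
                options={"specify_target_noise": True})   # assumed option name
    vp, results = vbmc.optimize()

In practice the estimate and its SD would come from your simulator (e.g., the SD of a Monte Carlo average over simulations), not from artificially injected Gaussian noise as in this toy.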

3/ Using PyVBMC is super simple.

Define your model (prior, likelihood, a few options), give them to the VBMC optimization process, and go!

5.4.2023 18:40
https://mastodon.social/@AcerbiL...
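
Concretely, that "define and go" workflow might look like this sketch (same assumed VBMC signature as in the snippets above; the Gaussian prior, likelihood, and data are placeholders for your own model):

    import numpy as np
    from scipy import stats
    from pyvbmc import VBMC

    data = 0.5 + np.random.randn(50)     # placeholder observations

    # 1. Define your model: prior and likelihood over one parameter.
    def log_prior(theta):
        return stats.norm.logpdf(theta, loc=0.0, scale=3.0)

    def log_likelihood(theta):
        return np.sum(stats.norm.logpdf(data, loc=theta, scale=1.0))

    def log_joint(theta):
        theta = float(np.squeeze(theta))
        return log_prior(theta) + log_likelihood(theta)

    # 2. Give it to the VBMC optimization process (start point + bounds)...
    vbmc = VBMC(log_joint, np.zeros((1, 1)),
                np.array([[-10.0]]), np.array([[10.0]]),
                np.array([[-3.0]]), np.array([[3.0]]))

    # 3. ...and go!
    vp, results = vbmc.optimize()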

2/ PyVBMC works similarly to Bayesian optimization, but with the goal of *inference* (getting the full posterior distribution) instead of *optimization* (getting a single point estimate).

Probabilistic numerics FTW.

5.4.2023 18:39
https://mastodon.social/@AcerbiL...

1/ PyVBMC 1.0 is out! 🎉

github.com/acerbilab/pyvbmc

A new Python package for efficient Bayesian inference.

Get a posterior distribution over model parameters + the model evidence with a small number of likelihood evaluations.

No AGI was created in the process!

5.4.2023 18:37
https://mastodon.social/@AcerbiL...

I am trying out this new Bayesian optimization algorithm.

23.12.2022 16:03
https://mastodon.social/@AcerbiL...

Interesting proposal to use inverse binomial sampling for unbiased estimation of the log-likelihood (see github.com/acerbilab/ibs) in quantum machine learning: arxiv.org/abs/2211.04965

(slightly cross-posted from the bird)

12.11.2022 13:53
https://mastodon.social/@AcerbiL...
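
For context, the core IBS estimator is only a few lines. This sketch follows the construction in the paper behind the linked repo (van Opheusden, Acerbi & Ma, 2020): simulate from the model until the first draw that matches the observed response, then return a truncated harmonic sum, which is an unbiased estimate of the log-likelihood. The Bernoulli simulator here is a made-up toy:

    import numpy as np

    def ibs_log_likelihood(simulate, observed):
        # Draw from the model until the first match with the observed
        # response; if the match occurs at draw K, the IBS estimate is
        # -sum_{k=1}^{K-1} 1/k, which is unbiased for log p(observed).
        k = 1
        while simulate() != observed:
            k += 1
        return -np.sum(1.0 / np.arange(1, k))  # empty sum = 0 when k == 1

    # Toy check against a Bernoulli model with p = 0.3.
    rng = np.random.default_rng(0)
    simulate = lambda: bool(rng.random() < 0.3)
    estimates = [ibs_log_likelihood(simulate, True) for _ in range(20_000)]
    print(np.mean(estimates), "vs exact", np.log(0.3))  # should be close

For a full dataset you would sum one such estimate per trial; the linked repo adds practical machinery (e.g., repeated sampling and variance estimates) on top of this core idea.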

Thorough benchmark showing the superiority of hybrid Bayesian optimization methods, and specifically our BADS (github.com/acerbilab/bads), in several control engineering problems: arxiv.org/abs/2211.02571

(cross-posted from the bird)

10.11.2022 11:52
https://mastodon.social/@AcerbiL...

First toot!

New version of the BADS optimization algorithm just released (v1.1.1), with full support for heteroskedastic (user-provided) noise and several fixes: github.com/acerbilab/bads

(Also new lab repo! github.com/acerbilab)

What about Python? Coming *soon* — for real. Stay tuned!

31.10.2022 15:53
https://mastodon.social/@AcerbiL...