Gah!
Bayesian inference is neat etc., but it is not magic - whatever FB thinks - and I would not really call it machine learning any more than I would call linear regression machine learning. It's just that the number crunching for Bayesian inference has become feasible for complex-ish models on current hardware. After all, much of what happens in your favourite algorithm[1] (a Gibbs sampler or whatever) is linear algebra, and graphics cards are good at that.
[1] assuming you have a model where you cannot write down the posterior PDFs analytically, or you are just a lazy sod (like me...)
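For what it's worth, a toy sketch of the kind of number crunching I mean: a Gibbs sampler for a bivariate normal (made-up correlation rho=0.8, chosen purely for illustration), where each update is just a bit of arithmetic plus a random draw - nothing magical.

```python
import numpy as np

def gibbs_bivariate_normal(rho=0.8, n_samples=5000, burn_in=500, seed=0):
    """Toy Gibbs sampler for a standard bivariate normal with correlation rho.

    Both full conditionals are normal, so each sweep is just a tiny
    linear-algebra-flavoured update: x | y ~ N(rho*y, 1-rho^2) and
    y | x ~ N(rho*x, 1-rho^2). Illustrative sketch, not production code.
    """
    rng = np.random.default_rng(seed)
    cond_sd = np.sqrt(1.0 - rho**2)  # sd of each conditional distribution
    x, y = 0.0, 0.0
    samples = np.empty((n_samples, 2))
    for i in range(burn_in + n_samples):
        x = rng.normal(rho * y, cond_sd)  # draw x given current y
        y = rng.normal(rho * x, cond_sd)  # draw y given new x
        if i >= burn_in:
            samples[i - burn_in] = (x, y)
    return samples

samples = gibbs_bivariate_normal()
print(np.corrcoef(samples.T)[0, 1])  # sample correlation, should land near rho
```

In a real model (the kind where the posterior PDF has no closed form, per the footnote) the conditional draws involve bigger matrix operations - which is exactly where the hardware helps.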