Comments on: Mean-Field Variational Inference Made Easy
https://lingpipe-blog.com/2013/03/25/mean-field-variational-inference-made-easy/
Natural Language Processing and Text Analytics
Wed, 10 Oct 2018 20:16:42 +0000
By: Bob Carpenter
https://lingpipe-blog.com/2013/03/25/mean-field-variational-inference-made-easy/#comment-22168
Tue, 09 Apr 2013 00:16:38 +0000

Not until I understand it a bit better. As I said, I’m not very good with these exponential family derivations.
By: Rob
https://lingpipe-blog.com/2013/03/25/mean-field-variational-inference-made-easy/#comment-22167
Tue, 09 Apr 2013 00:05:27 +0000

Is there any chance you could work through an example, such as the beta-binomial model that you covered in an earlier posting?
By: brendan o'connor (@brendan642)
https://lingpipe-blog.com/2013/03/25/mean-field-variational-inference-made-easy/#comment-22142
Sun, 31 Mar 2013 23:36:52 +0000

I thought the presentation in the Koller and Friedman textbook was helpful for variational inference — a bit less tied up with a particular model, if I remember right. The Murphy textbook also has a general derivation for mean field, then references out to other chapters for particular instantiations (section 21.3).
By: Aki Vehtari
https://lingpipe-blog.com/2013/03/25/mean-field-variational-inference-made-easy/#comment-22126
Mon, 25 Mar 2013 20:51:16 +0000

With the power-EP method it is possible to use the alpha-divergence, which includes both KL divergences and the (symmetric) Hellinger distance as special cases.
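For reference, the alpha-divergence family Aki mentions can be written (in one common parameterization, due to Amari) as:

```latex
D_{\alpha}(p \,\|\, q)
  = \frac{1}{\alpha(1-\alpha)}
    \left( 1 - \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx \right),
```

where the limit $\alpha \to 1$ recovers $\mathrm{KL}(p \,\|\, q)$, the limit $\alpha \to 0$ recovers $\mathrm{KL}(q \,\|\, p)$, and $\alpha = \tfrac{1}{2}$ gives $4\,\mathrm{Hel}^2(p, q)$, four times the squared Hellinger distance.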
By: Bob Carpenter
https://lingpipe-blog.com/2013/03/25/mean-field-variational-inference-made-easy/#comment-22125
Mon, 25 Mar 2013 19:36:23 +0000

It certainly makes sense, but the issue is whether you can come up with some computable way to do it. The trick to variational inference is that the hairy integral involved in the componentwise hill climbing can be solved in closed form for conjugate priors and approximated elsewhere.
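As a sketch of what those closed-form componentwise updates look like, here is mean-field variational inference for the classic conjugate example (normal data with unknown mean and precision, as in Bishop's PRML, section 10.1.3). The prior values and data here are made up for illustration.

```python
import numpy as np

# Model (conjugate, so each coordinate update is closed form):
#   x_i | mu, tau ~ Normal(mu, 1/tau)
#   mu  | tau     ~ Normal(mu0, 1/(lambda0 * tau))
#   tau           ~ Gamma(a0, b0)
# Mean-field factorization: q(mu, tau) = q(mu) q(tau).

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=200)  # simulated data
N, xbar = len(x), x.mean()

mu0, lambda0, a0, b0 = 0.0, 1.0, 1.0, 1.0  # hypothetical prior values

E_tau = a0 / b0  # initialize E_q[tau]
for _ in range(50):  # componentwise hill climbing on the ELBO
    # Update q(mu) = Normal(mu_N, 1/lambda_N), holding q(tau) fixed.
    mu_N = (lambda0 * mu0 + N * xbar) / (lambda0 + N)
    lambda_N = (lambda0 + N) * E_tau
    # Update q(tau) = Gamma(a_N, b_N), holding q(mu) fixed.
    a_N = a0 + (N + 1) / 2
    expected_sq = np.sum((x - mu_N) ** 2) + N / lambda_N
    expected_sq += lambda0 * ((mu_N - mu0) ** 2 + 1 / lambda_N)
    b_N = b0 + 0.5 * expected_sq
    E_tau = a_N / b_N

print("q(mu) mean:", mu_N, " E[tau]:", E_tau)
```

Each update is the expectation of the log joint under the other factor; conjugacy is what makes those expectations land back in the same exponential family, so no numerical integration is needed.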
By: michael
https://lingpipe-blog.com/2013/03/25/mean-field-variational-inference-made-easy/#comment-22124
Mon, 25 Mar 2013 18:56:09 +0000

I am actually not a fan of any Taylor-expansion-based approximation methods, including the reference above. They simply break the lower-bound guarantee (except for first-order approximations).
Actually, there is another paper, by a Japanese researcher, whose title I cannot recall. They show that mean field (including for non-conjugate models), seen from the dual problem, amounts to minimizing a Bregman divergence block-coordinate-wise.
By: Eric
https://lingpipe-blog.com/2013/03/25/mean-field-variational-inference-made-easy/#comment-22123
Mon, 25 Mar 2013 18:36:29 +0000

Thanks, Bob. Nice overview. Would it make any sense to try to minimize some symmetric divergence, such as the Jensen-Shannon divergence?
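For reference, the Jensen-Shannon divergence Eric mentions is the symmetrized, smoothed KL divergence against the mixture of the two distributions:

```latex
m = \tfrac{1}{2}(p + q), \qquad
\mathrm{JS}(p, q)
  = \tfrac{1}{2}\,\mathrm{KL}(p \,\|\, m)
  + \tfrac{1}{2}\,\mathrm{KL}(q \,\|\, m).
```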