Log 03-14-2013

What I did today:

1. Tried to modify the maximum-entropy method to use p(x|y) instead of p(y|x), but it turns out this probably won't work, because we inherently need to consider counterfactual values of y to get good performance (sketch 1 after the list).

2. Also tried an approach that trades off badness-of-fit against p(y) [with the intuition that it's okay to fit models that are unlikely under your model class], but couldn't get it to work (sketch 2 below).

3. Did some experiments with Kalman filters for the p(y|x) approach (sketch 3 below), worked out how the approach should play out for semantic parsing, and convinced myself that it is at least reasonable, even if I can't prove that it works well.

4. Came up with a somewhat sketchy formalization of which variables one ought to add for relevance-directed inference on a subset of variables in a graphical model: greedily add the variable whose inclusion shifts the answer the most, measured by KL divergence against the answer computed without it (sketch 4 below).

5. Worked out a few possible ways to deal with merging of abstract states in undirected graphical models (where a clique potential may mention variables that are not represented due to the abstraction). One way involves dynamic programming; the other involves breaking dependencies, which is justified under the maximum-entropy formalism (sketch 5 below).

6. Thought a little bit about how to adapt these ideas to sequential Monte Carlo, and how to add in refinements.

7. Wrote up a post on the other blog describing the probabilistic abstractions formalism. [read it here]
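
Sketch 1 (for item 1). In the conditional maxent model the partition function sums over labels, while in the flipped model it sums over inputs (here $\phi$ is a generic feature map; this is my reconstruction of the obstacle, not a derivation from the actual setup):

$$p_\theta(y \mid x) = \frac{\exp(\theta^\top \phi(x,y))}{\sum_{y'} \exp(\theta^\top \phi(x,y'))}, \qquad p_\theta(x \mid y) = \frac{\exp(\theta^\top \phi(x,y))}{\sum_{x'} \exp(\theta^\top \phi(x',y))}$$

The log-likelihood gradient of the first form is $\phi(x,y) - \mathbb{E}_{y' \sim p_\theta(\cdot \mid x)}[\phi(x,y')]$, which explicitly compares the observed y against counterfactual labels; the second form only ever contrasts x against other inputs, which is why dropping the counterfactual y's hurts.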
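
Sketch 2 (for item 2). The log doesn't spell out the objective, so this is only one plausible form of the trade-off, with badness(x, y) and $\lambda$ as placeholders:

$$\hat{y} = \arg\min_y \; \big[\, \mathrm{badness}(x, y) - \lambda \log p(y) \,\big]$$

A small $\lambda$ tolerates outputs that are unlikely under the model class as long as they fit well, which matches the bracketed intuition.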
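
Sketch 3 (for item 3). For reference, the textbook linear-Gaussian predict/update recursion underlying any Kalman filter experiment; this is a generic numpy sketch, not the actual experimental code, and all names are illustrative.

```python
import numpy as np

def kalman_step(mu, Sigma, y, A, Q, C, R):
    """One predict/update step of a linear-Gaussian Kalman filter.

    Dynamics:     x_t = A x_{t-1} + w,  w ~ N(0, Q)
    Observation:  y_t = C x_t + v,      v ~ N(0, R)
    """
    # Predict: push the previous posterior through the dynamics.
    mu_pred = A @ mu
    Sigma_pred = A @ Sigma @ A.T + Q

    # Update: correct the prediction with the observation y.
    S = C @ Sigma_pred @ C.T + R               # innovation covariance
    K = Sigma_pred @ C.T @ np.linalg.inv(S)    # Kalman gain
    mu_post = mu_pred + K @ (y - C @ mu_pred)
    Sigma_post = (np.eye(len(mu)) - K @ C) @ Sigma_pred
    return mu_post, Sigma_post
```

A constant-velocity model (A = [[1, 1], [0, 1]], C = [[1, 0]]) with noisy position observations is enough to sanity-check it.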
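
Sketch 4 (for item 4). A toy version of the greedy rule. The log doesn't say what "inference without the variable" means, so this sketch uses a deliberately crude scheme (only use factors whose scope lies inside the current active set); the scheme and all names here are illustrative.

```python
import numpy as np
from itertools import product

def kl(p, q, eps=1e-12):
    """KL divergence between two discrete distributions."""
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

def approx_marginal(factors, active, target, card=2):
    """Approximate marginal of `target` (which must be in `active`),
    using only factors whose scope is contained in `active` -- a crude
    stand-in for whatever the real partial-inference scheme is.
    `factors` is a list of (scope, table) pairs: `scope` is a tuple of
    variable names, `table` a numpy array indexed in scope order."""
    used = [(scope, table) for scope, table in factors
            if set(scope) <= active]
    vars_ = sorted({v for scope, _ in used for v in scope} | {target})
    marg = np.zeros(card)
    for assign in product(range(card), repeat=len(vars_)):
        a = dict(zip(vars_, assign))
        w = 1.0
        for scope, table in used:
            w *= table[tuple(a[v] for v in scope)]
        marg[a[target]] += w
    return marg / marg.sum()

def best_variable(factors, active, candidates, target):
    """Greedy step: add the candidate whose inclusion moves the
    target's marginal the most, in KL divergence."""
    base = approx_marginal(factors, active, target)
    return max(candidates,
               key=lambda v: kl(approx_marginal(factors, active | {v},
                                                target), base))
```

On a chain of pairwise factors with active = {target}, only the target's neighbor changes the marginal at all (every other factor still has a variable outside the active set), so the rule picks the neighbor first, as one would hope.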
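
Sketch 5 (for item 5). The maxent justification for breaking dependencies, as I understand it: among all joint distributions with the given single-variable marginals, the independent product has maximum entropy,

$$\max_{q \,:\, q(x_i) = p_i(x_i),\; q(x_j) = p_j(x_j)} H(q) \quad \text{is attained by} \quad q(x_i, x_j) = p_i(x_i)\, p_j(x_j),$$

since $H(q) = H(p_i) + H(p_j) - I_q(x_i; x_j)$ and mutual information is nonnegative, vanishing exactly when the variables are independent. So when a clique potential mentions variables the abstraction doesn't represent, severing the cross-dependence is the maxent-consistent default.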

I felt much more productive today than yesterday. Yay!
