Log 02-19-2013

Writing this post-hoc. Apparently today I wrote a long-ass summary of progress on the very lazy evaluation project so far (which I’ve been thinking of as “probabilistic abstractions”). This includes the notions of a probabilistic abstraction, of a \Pi-system, and of a valid abstraction of a function, as well as efficient algorithms for subsampling \Pi-systems that are tree-structured, and a criterion for which of a collection of random variables to include if we can only include a limited number k of them (it turns out the answer is to take the ones with minimum marginal entropy).
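The entropy criterion is simple enough to sketch in code. Below is a minimal illustration in Python, assuming the variables are discrete with marginals given as probability vectors; the function names here are mine, just for illustration, not anything from the write-up:

```python
import numpy as np

def marginal_entropy(p):
    """Entropy of a discrete marginal given as a probability vector."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]  # 0 * log(0) contributes nothing
    return -np.sum(nz * np.log(nz))

def pick_k_variables(marginals, k):
    """Pick the k variables with minimum marginal entropy.

    `marginals` is a list of probability vectors, one per random
    variable; returns the indices of the k lowest-entropy ones.
    """
    entropies = [marginal_entropy(p) for p in marginals]
    return sorted(np.argsort(entropies)[:k].tolist())

# Example: three binary variables; the nearly-deterministic ones win.
marginals = [[0.5, 0.5], [0.99, 0.01], [0.9, 0.1]]
print(pick_k_variables(marginals, 2))  # -> [1, 2]
```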

Log 02-07-2013

(This is a post-hoc writeup).

Apparently today I prepared a talk to give at NLP lunch. This was, I believe, a pretty busy week, as I gave 4 talks in total (one for Vannevar, one for NLP lunch, one for ML Reading Group, and one for 229T section). The NLP lunch talk was about my research so far, and included a presentation on weighted KL divergence (there was also transformed KL divergence, but I cut it from the talk), as well as an attempted definition of probabilistic abstractions in terms of piecewise uniform approximations, although I would later find out that this definition doesn’t work.
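For reference, one plausible shape for a weighted KL divergence is the pointwise-weighted sum sketched below. The exact weighting scheme from the talk isn’t reproduced in this log, so treat this as a sketch of the general form rather than the definition I presented:

```python
import numpy as np

def weighted_kl(p, q, w):
    """Weighted KL divergence: sum_x w(x) p(x) log(p(x)/q(x)).

    p and q are discrete distributions over the same support; w assigns
    an importance weight to each outcome. With w = 1 everywhere this
    reduces to the ordinary KL divergence.
    """
    p, q, w = (np.asarray(a, dtype=float) for a in (p, q, w))
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return np.sum(w[mask] * p[mask] * np.log(p[mask] / q[mask]))

# Sanity check: uniform weights recover standard KL.
p = np.array([0.7, 0.3]); q = np.array([0.5, 0.5])
print(weighted_kl(p, q, np.ones(2)))
```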

In the talk I go over abstract beam search (it’s actually relatively similar to what I have right now in my write-up, minus the dynamic programming insight and the punting on conditioning at the end of the day). I also didn’t yet have a way to subsample a \Pi-system (that gets resolved over the next week and a half, though, at least for trees).
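For the record, here is a rough sketch of what abstract beam search looks like as code: ordinary beam search where candidate scores are computed on the abstraction rather than the full model. The `expand` and `abstract_score` hooks are hypothetical placeholders, not the actual interface from my write-up:

```python
import heapq

def abstract_beam_search(init, expand, abstract_score, beam_width, depth):
    """Beam search where candidates are scored via an abstraction.

    `expand(state)` yields successor states; `abstract_score(state)` is
    an approximate score computed on the abstraction instead of the
    full model (both placeholders). At each depth we keep only the
    `beam_width` best-scoring states.
    """
    beam = [init]
    for _ in range(depth):
        candidates = [s for state in beam for s in expand(state)]
        if not candidates:
            break
        beam = heapq.nlargest(beam_width, candidates, key=abstract_score)
    return max(beam, key=abstract_score)
```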