Stuff I did today:
1. Attended Fei-Fei Li’s comp. neuro lab meeting
2. Wrote better logging (with the help of fig) so I can actually keep track of the weights output by L-BFGS.
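The note doesn't say which L-BFGS implementation or logging setup is in use; as a sketch of the idea, here is how per-iteration weight logging might look with Python's `logging` module and scipy's L-BFGS-B, using its `callback` hook (the toy objective and all names are hypothetical stand-ins):

```python
import logging
import numpy as np
from scipy.optimize import minimize

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("lbfgs")

def objective(w):
    # toy quadratic stand-in for the real objective
    return np.sum((w - 1.0) ** 2)

def gradient(w):
    return 2.0 * (w - 1.0)

history = []

def log_weights(w):
    # called once per L-BFGS iteration with the current weight vector,
    # so the full trajectory of weights is recorded, not just the final one
    history.append(w.copy())
    log.info("iter %d: ||w|| = %.4f", len(history), np.linalg.norm(w))

result = minimize(objective, np.zeros(5), jac=gradient,
                  method="L-BFGS-B", callback=log_weights)
```

Keeping the snapshots in `history` (or writing them to disk) makes it easy to inspect how the weights evolve across iterations.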
4. Realized that L-BFGS is converging really slowly and increased the tolerances from 1e-5 to 1e-2 as a stopgap (I'll have to investigate the slow convergence in detail later).
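The effect of loosening the tolerances can be sketched with scipy's L-BFGS-B (an assumption; the note doesn't name the implementation): raising `ftol`/`gtol` makes the solver declare convergence earlier, trading accuracy for fewer iterations.

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.zeros(5)

# Loosened stopping tolerances as a stopgap for slow convergence:
# the solver stops once relative objective improvement (ftol) or the
# projected gradient norm (gtol) drops below the threshold.
loose = minimize(rosen, x0, jac=rosen_der, method="L-BFGS-B",
                 options={"ftol": 1e-2, "gtol": 1e-2})
tight = minimize(rosen, x0, jac=rosen_der, method="L-BFGS-B",
                 options={"ftol": 1e-5, "gtol": 1e-5})
```

Since both runs follow the same iterate sequence until a stopping criterion fires, the loose run never takes more iterations than the tight one.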
5. Added an extra “dummy predicate” to make it possible to extract the first argument of a predicate.
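The note doesn't define the dummy predicate, so this is only one hypothetical reading: if predicates are sets of (arg1, arg2) pairs composed by a join operation, then a dummy predicate relating every entity to a single placeholder token lets the existing join machinery project out first arguments. All names here (`join`, `born_in`, `"ANY"`) are illustrative inventions, not from the actual system.

```python
def join(p, q):
    # compose two binary predicates: {(a, c) : (a, b) in p, (b, c) in q}
    return {(a, c) for (a, b) in p for (b2, c) in q if b == b2}

born_in = {("obama", "hawaii"), ("lincoln", "kentucky")}

entities = {e for pair in born_in for e in pair}
# dummy predicate: relates every entity to one placeholder token
dummy = {(e, "ANY") for e in entities}

# joining with dummy discards the second argument, leaving the
# first arguments of born_in paired with the placeholder
firsts = join(born_in, dummy)
```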
6. Examined the predicates in our data set in more detail to get a bit more intuition for what’s going on.
7. Found a bug in my pre-processing code for the bottom-up search that was making things really slow (not a bug so much as a bunch of unnecessary computation). Rewrote that code to make it substantially faster and less memory-intensive; also wrote a script to pre-process in batches of 10 so that I don’t run into memory problems. That script is currently running and will hopefully terminate overnight.
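The batching trick can be sketched as follows (a minimal illustration; the real pre-processing step and batch script are not shown in the note, so `preprocess` here is a placeholder):

```python
def batches(items, size=10):
    # yield successive fixed-size chunks so that only one batch's
    # worth of data needs to be held in memory at a time
    for i in range(0, len(items), size):
        yield items[i:i + size]

def preprocess(batch):
    # stand-in for the real pre-processing computation
    return [x * 2 for x in batch]

results = []
for batch in batches(list(range(25)), size=10):
    results.extend(preprocess(batch))
    # in the real script, each batch's output would be flushed to disk
    # here before moving on, keeping peak memory usage bounded
```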
7. Met with Percy to discuss project goals (short-term and long-term). In the short term, there’s still one more bug I need to track down (related to beam search); there are also lots of alignment issues, which may be related to the bug, but which I need to fix either way. I’m hoping that fixing the bug + running on the full training set will deal with this.
8. Lab group dinner.