07-12-2016

[0:05] Plan Day
[0:48] Think about product distribution case for semi-adversarial setting
[0:44] Write up product distribution case
[0:19] Improve de-serialization speed for risk estimation code
[0:48] Set up 2 more datasets for risk estimation project
[0:34] Understand where estimates are off and why
Possible reasons I discovered:
- a very narrow set of words gets classified as anchors (causes distributional skew, often leading to an under-estimate of risk)
- very easy and common examples end up not being classified as anchors (leads to an over-estimate of risk)
- removing the anchor from the prediction can be a large hit to accuracy if the classification is based on only a few features (leads to an over-estimate of risk)
[0:10] Think about how to improve the estimates
[0:15] Look over Uri / Frederik’s papers on representation learning for counterfactual estimation
[0:42] More reading about non-negative matrix factorization
[1:05] Meeting about code translation project
[1:00] Climbing


07-11-2016

Back after a long hiatus. Decided to start doing these again because my productivity / fitness were slipping over the summer.

[0:17] Plan Day
[1:32] Think about semi-adversarial setting
[0:38] Write about semi-adversarial setting
[1:34] Set up risk estimation experiments
[0:55] Lifting / cardio
squat 3×5@125
bench press 3×5@105
stationary bike 4×(60s jog/30s sprint)
[0:45] Read about non-negative matrix factorization
[1:15] Skype call
[2:15] E-mails 😦
[0:10] Think about PCFG approach
[0:10] E-mail collaborators about PCFG approach

T minus 5,6,7,8,9,10,11,12,13

This is all the remaining backlog of logs that I plan to catch up on. There are yet more, but they devolved into chicken scratch at some point.

1. NLP reading group [1:30]
2. Meet with Percy [1:15]
3. Admin [1:15]
4. PL reading [1:20]
5. Minimax [1:00]

1. Minimax [1:00]
2. NLP lunch [1:30]
3. Admin [2:00]
4. Vannevar prep [1:10]
5. Think about MCMC [0:30]
6. Prep MD tutorial [2:00]

1. MD prep [1:00]
2. ML reading group [2:00]
3. Software lunch [1:00]
4. Meet with Rahul [1:00]
5. Meet with Percy [1:00]
6. Minimax implementation [1:20]

1. Minimax hacking [0:30]
2. Lester [1:30]
3. John Duchi talk [1:00]
4. Stand-up [1:00]

1. Read about generalized entropies [0:45]
2. SDL reading group [1:00]
3. Play around with verification code [2:00]
4. John Duchi meeting [0:45]
5. Matt Fischer [0:30]

1. Read about loop invariants [2:20]
2. NLP lunch [1:30]
3. Meet with Alex/Rahul [1:00]
4. Write up MWU paper [1:30]

1. Adaptive regularizer [2:00]
2. Stand-up [1:00]
3. Meet with Rahul [1:30]
4. Read about AdaGrad [1:00]

[This was 2 days, I think]
1. Clean up source code [1:15]
2. Address reviewer comments [2:00]
3. Address Percy comments [2:15] + [5:00] + [2:00]
4. Meet with Percy [1:00]