| Statistical concept / technique | Neuroscience application |
|---|---|
| Point processes; conditional intensity functions | Neural spike trains; photon-limited image data |
| Time-rescaling theorem for point processes | Fast simulation of network models; goodness-of-fit tests for spiking models |
| Bias, consistency, principal components | Spike-triggered averaging; spike-triggered covariance |
| Generalized linear models | Neural encoding models including spike-history effects; inferring network connectivity |
| Regularization; shrinkage estimation | Maximum a posteriori estimation of high-dimensional neural encoding models |
| Laplace approximation; Fisher information | Model-based decoding and information estimation; adaptive design of optimal stimuli |
| Mixture models; EM algorithm; Dirichlet processes | Spike-sorting / clustering |
| Optimization and convexity techniques | Spike-train decoding; ML estimation of encoding models |
| Markov chain Monte Carlo: Metropolis-Hastings and hit-and-run algorithms | Firing rate estimation and spike-train decoding |
| State-space models; sequential Monte Carlo / particle filtering | Decoding spike trains; optimal voltage smoothing |
| Fast high-dimensional Kalman filtering | Optimal smoothing of voltage and calcium signals on large dendritic trees |
| Markov processes; first-passage times; Fokker-Planck equation | Integrate-and-fire-based neural models |
| Hierarchical Bayesian models | Estimating multiple neural encoding models |
| Amortized inference | Spike sorting; stimulus decoding |
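
As a small taste of the techniques in the table above, here is a minimal sketch of the time-rescaling goodness-of-fit idea: if a point-process model's intensity is correct, rescaling spike times by the integrated intensity yields unit-rate Poisson intervals. The sinusoidal firing rate and all parameter values below are illustrative, not from the course readings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inhomogeneous Poisson intensity (Hz) and recording length (s)
lam = lambda t: 20 + 15 * np.sin(2 * np.pi * t)
lam_max, T = 35.0, 200.0

# Simulate spikes by thinning: propose at rate lam_max, accept with prob lam/lam_max
cand = np.cumsum(rng.exponential(1 / lam_max, size=int(3 * lam_max * T)))
cand = cand[cand < T]
spikes = cand[rng.random(cand.size) < lam(cand) / lam_max]

# Time-rescaling: Lambda(t) = integral of lam from 0 to t;
# rescaled inter-spike intervals z should be i.i.d. Exponential(1)
Lam = lambda t: 20 * t - (15 / (2 * np.pi)) * (np.cos(2 * np.pi * t) - 1)
z = np.diff(Lam(spikes))

# Goodness-of-fit: transform to uniforms and compute a KS statistic
u = np.sort(1 - np.exp(-z))
n = u.size
ks = np.max(np.abs(u - (np.arange(1, n + 1) - 0.5) / n))
print(ks)  # small (on the order of 1/sqrt(n)) when the model matches the data
```

A mismatched intensity model (e.g. a constant rate fit to these data) would inflate the KS statistic well past the ~1.36/sqrt(n) critical band, which is how the test detects lack of fit.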
| Date | Topic | Reading | Notes |
|---|---|---|---|
| Sept 3 | Intro and overview | Paninski and Cunningham '18; International Brain Lab '17 | |
| Sept 10 | No class | | |
| Sept 17 - Oct 8 | Signal acquisition: single-cell-resolution functional imaging | Overview: Pnevmatikakis and Paninski '18; Compression and denoising: Buchanan et al '18, Sun et al '19; Demixing: Pnevmatikakis et al '16, Zhou et al '18, Friedrich et al '17b, Lu et al '17, Giovanucci et al '17, Charles et al '19; Deconvolution: Deneux et al '16, Picardo et al '16, Friedrich et al '17a, Berens et al '18, Wei and Zhou et al '19 | HMM tutorial by Rabiner; HMM notes |
| Oct 8-15 | Signal acquisition: spike sorting | Lewicki '98; Pachitariu et al '16; Lee et al '17; Calabrese and Paninski '11, Wang et al '19 | EM notes; Blei et al review on variational inference |
| Oct 22 | Poisson regression models; estimating time-varying firing rates; hierarchical models for sharing information across cells | Kass et al (2003), Wallstrom et al (2008), Batty et al (2017), Cadena et al (2017), Seely et al (2017) | Generalized linear model notes |
| Oct 22 | Point processes: Poisson process, renewal process, self-exciting process, Cox process; time-rescaling: goodness-of-fit, fast simulation of network models | Brown et al '01, Mena and Paninski '14 | Uri Eden's point process notes; supplementary notes |
| Oct 29 | Presentations of project ideas | | Just two minutes each |
| Oct 29 - Nov 19 | Expected log-likelihood. Network models. Optimal experimental design. | Ramirez and Paninski '14, Field et al '10, Lewi et al '09, Shababo et al '13, Soudry et al '15 | |
| Nov 5 | No class (University holiday) | | |
| Nov 26 | State space models and neural decoding | HMM tutorial by Rabiner; Kalman filter notes by Minka; Roweis and Ghahramani '99; Wu et al '05; Brown et al '98; Smith et al '04; Yu et al '05; Kulkarni and Paninski '08; Paninski et al '10, Vidne et al '12, Gao et al '16, Parthasarathy, Batty et al '17, Batty, Whiteway et al '19 | state-space notes (need updating) |
| Dec 3 | No class | | |
| Dec 10 | Project presentations | | E-mail me your report as a PDF by the end of this week. |
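
The state-space decoding material (Nov 26) can be illustrated with a minimal scalar Kalman filter: a latent trajectory observed through Gaussian noise, with the filter recursively updating a posterior mean and variance. The model and all parameter values below are hypothetical, chosen only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear-Gaussian state-space model:
#   x_t = A x_{t-1} + w_t,  w_t ~ N(0, Q)   (latent state, e.g. arm position)
#   y_t = C x_t     + v_t,  v_t ~ N(0, R)   (noisy observation)
A, Q, C, R, T = 0.99, 0.1, 1.0, 0.5, 500

x, y = np.zeros(T), np.zeros(T)
for t in range(1, T):
    x[t] = A * x[t - 1] + rng.normal(0, np.sqrt(Q))
    y[t] = C * x[t] + rng.normal(0, np.sqrt(R))

# Kalman filter: predict, then correct with the Kalman gain K
m, P = np.zeros(T), np.zeros(T)
P[0] = 1.0
for t in range(1, T):
    m_pred = A * m[t - 1]                      # prior mean
    P_pred = A * P[t - 1] * A + Q              # prior variance
    K = P_pred * C / (C * P_pred * C + R)      # Kalman gain
    m[t] = m_pred + K * (y[t] - C * m_pred)    # posterior mean
    P[t] = (1 - K * C) * P_pred                # posterior variance

# Filtering should beat the raw observations at recovering the latent state
err_filt = np.mean((m - x) ** 2)
err_obs = np.mean((y / C - x) ** 2)
print(err_filt, err_obs)
```

The same predict/correct recursion, with the Gaussian observation model swapped for a point-process likelihood, underlies the spike-train decoders discussed in the Nov 26 readings.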