| Statistical concept / technique | Neuroscience application |
|---|---|
| Point processes; conditional intensity functions | Neural spike trains; photon-limited image data |
| Time-rescaling theorem for point processes | Fast simulation of network models; goodness-of-fit tests for spiking models |
| Bias, consistency, principal components | Spike-triggered averaging; spike-triggered covariance |
| Generalized linear models | Neural encoding models including spike-history effects; inferring network connectivity |
| Regularization; shrinkage estimation | Maximum a posteriori estimation of high-dimensional neural encoding models |
| Laplace approximation; Fisher information | Model-based decoding and information estimation; adaptive design of optimal stimuli |
| Mixture models; EM algorithm; Dirichlet processes | Spike-sorting / clustering |
| Optimization and convexity techniques | Spike-train decoding; ML estimation of encoding models |
| Markov chain Monte Carlo: Metropolis-Hastings and hit-and-run algorithms | Firing rate estimation and spike-train decoding |
| State-space models; sequential Monte Carlo / particle filtering | Decoding spike trains; optimal voltage smoothing |
| Fast high-dimensional Kalman filtering | Optimal smoothing of voltage and calcium signals on large dendritic trees |
| Markov processes; first-passage times; Fokker-Planck equation | Integrate-and-fire-based neural models |

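As a concrete illustration of the generalized linear models row above, here is a minimal sketch of fitting a linear-nonlinear-Poisson encoding model by maximum likelihood. All names and numbers are illustrative (not from the course materials); the baseline log-rate is assumed known to keep the example short.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-nonlinear-Poisson encoding model: spike counts y_t are
# Poisson with rate exp(x_t . w + b), where x_t is the (lagged) stimulus.
T, D = 5000, 8                               # time bins, filter length
b = -2.0                                     # baseline log-rate (assumed known here)
X = rng.standard_normal((T, D))              # design matrix of stimuli
w_true = np.sin(np.linspace(0.0, np.pi, D))  # ground-truth stimulus filter
y = rng.poisson(np.exp(X @ w_true + b))      # simulated spike counts

# Maximum-likelihood fit by Newton's method: the Poisson log-likelihood
# with an exponential link is concave in w, so Newton converges to the
# global optimum (this is the convexity point from the table above).
w = np.zeros(D)
for _ in range(25):
    mu = np.exp(X @ w + b)             # predicted rates under current w
    grad = X.T @ (y - mu)              # gradient of the log-likelihood
    hess = X.T @ (X * mu[:, None])     # negative Hessian (Fisher information)
    w = w + np.linalg.solve(hess, grad)

print("max filter error:", np.max(np.abs(w - w_true)))
```

Spike-history effects, as covered in class, would simply add lagged spike counts as extra columns of the design matrix.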
| Date | Topic | Reading | Notes |
|---|---|---|---|
| Sept 5 | Intro and overview | Paninski and Cunningham '17 | |
| Sept 5 | Signal acquisition: spike sorting | Lewicki '98; Pachitariu et al '16; Lee et al '17; Calabrese and Paninski '11 | EM notes; Blei et al review on variational inference |
| Sept 12 | Neuroscience review by Gonzalo Mena | | |
| Sept 19 - Oct 3 | Signal acquisition: calcium imaging | Demixing: Pnevmatikakis et al '16; Zhou et al '16; Friedrich et al '17b; Lu et al '17; Giovanucci et al '17; Deconvolution: Deneux et al '16; Picardo et al '16; Friedrich et al '17a; Berens et al '17 | HMM tutorial by Rabiner; HMM notes |
| Oct 10-17 | Poisson regression models; estimating time-varying firing rates; hierarchical models for sharing information across cells | Kass et al (2003), Wallstrom et al (2008), Batty et al (2017), Cadena et al (2017), Seely et al (2017) | Generalized linear model notes |
| Oct 24 | Presentations of project ideas | | Just two minutes each |
| Oct 31 | Expected log-likelihood. Network models. Optimal experimental design. | Ramirez and Paninski '14, Field et al '10, Lewi et al '09, Shababo et al '13, Soudry et al '15 | |
| Nov 7 | No class (University holiday) | | |
| Nov 14 | Point processes: Poisson process, renewal process, self-exciting process, Cox process; time-rescaling: goodness-of-fit, fast simulation of network models | Brown et al. '01, Mena and Paninski '14 | Uri Eden's point process notes; supplementary notes. |
| Nov 21 - 28 | State space models; autoregressive models; Kalman filter; extended Kalman filter; fast tridiagonal methods. Applications in neural prosthetics, optimal smoothing of voltage/calcium traces, fitting common-input models for population spike train data | HMM tutorial by Rabiner; Kalman filter notes by Minka; Roweis and Ghahramani '99; Huys et al '06; Paninski et al '04; Jolivet et al '04; Beeman's notes on conductance-based neural modeling; Wu et al '05; Brown et al '98; Smith et al '04; Yu et al '05; Kulkarni and Paninski '08; Paninski et al '10, Vidne et al '12, Pfau et al '13, Gao et al '16 | state-space notes (need updating) |
| Dec 5 | No class (office hours) | | Stop by if you want to discuss your project. |
| Dec 12 | Project presentations | | E-mail me your report as a PDF by Dec 15. |
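The time-rescaling material from the Nov 14 session can also be made concrete with a few lines of code. The sketch below (intensity, horizon, and thresholds are all illustrative choices, not from the course materials) simulates an inhomogeneous Poisson process by thinning, rescales the spike times through the integrated intensity, and computes a KS-type goodness-of-fit distance.

```python
import numpy as np

rng = np.random.default_rng(1)

# Inhomogeneous Poisson process with intensity lam(t) = 1 + 0.8 sin(t)
# on [0, T]; all numbers here are illustrative.
T = 2000.0
lam_max = 1.8

def lam(t):
    return 1.0 + 0.8 * np.sin(t)

# Simulate by thinning: propose events at the constant rate lam_max,
# then keep each proposal t with probability lam(t) / lam_max.
props = np.sort(rng.uniform(0.0, T, rng.poisson(lam_max * T)))
spikes = props[rng.uniform(0.0, 1.0, props.size) < lam(props) / lam_max]

# Time-rescaling theorem: with Lambda(t) = t - 0.8 cos(t) + 0.8 (the
# integral of lam), the rescaled intervals
# z_k = Lambda(t_k) - Lambda(t_{k-1}) are i.i.d. Exp(1) under the true
# model; the additive constant cancels in the differences.
z = np.diff(spikes - 0.8 * np.cos(spikes))

# Goodness-of-fit: map the z's to uniforms and measure a KS-type
# distance from the uniform CDF; small values indicate a good fit.
u = np.sort(1.0 - np.exp(-z))
n = u.size
ks = np.max(np.abs(u - (np.arange(1, n + 1) - 0.5) / n))
print("spikes:", spikes.size, "mean rescaled ISI:", z.mean(), "KS:", ks)
```

Fitting a wrong intensity (say, a constant rate) in place of the true one inflates the KS distance, which is exactly how the test flags model misfit.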