BIOS 362: Advanced Statistical Inference (Statistical Learning)

Instructor

  • Matt Shotwell

Teaching Assistant

  • Yuqi Tian

Dates, Time, and Location

  • First meeting: Mon. Jan. 6, 2020; Last meeting: Mon. Apr. 20, 2020
  • Monday, Wednesday, and Friday 11:00-11:55AM
  • Room 11139 (small classroom), 11th floor, 2525 WEA. Beginning 3/16, all classes will be held virtually.
  • Office hours: open door and by request
  • We will use the Graduate School Academic Calendar

Textbook

The book for this course is listed below and is free to download in PDF format from the book webpage: Hastie, Tibshirani, and Friedman (2009). The elements of statistical learning: data mining, inference and prediction. Springer, 2nd edition. In the course outline and class schedule, the textbook is abbreviated "HTF", often followed by chapter or page references "Ch. X-Y" or "pp. X-Y", respectively. The BibTeX entry for the book is as follows:

@book{HTF2009,
  author = {Hastie, Trevor and Tibshirani, Robert and Friedman, Jerome},
  title = {The elements of statistical learning: data mining, inference and prediction},
  url = {http://www-stat.stanford.edu/~tibs/ElemStatLearn/},
  publisher = {Springer},
  year = 2009,
  edition = 2
}

The wide margins of the linked PDF version of the book make it difficult to read on smart devices (e.g., an iPhone). The margins may be removed using the following Ghostscript command on Linux, where "output.pdf" and "input.pdf" are substituted for the appropriate file names. Please see Dr. Shotwell for help with this.

gs -o output.pdf -sDEVICE=pdfwrite -c "[/CropBox [130 140 460 685] /PAGES pdfmark" -f input.pdf

Other Resources

Course Topics

  • Overview of Supervised Learning and Review of Linear Methods: HTF Ch. 2-4
  • Splines and Kernel Methods: HTF Ch. 5-6
  • Model Assessment, Selection, and Inference: HTF Ch. 7-8
  • Neural Networks: HTF Ch. 11
  • Support Vector Machines: HTF Ch. 12
  • Unsupervised Learning: HTF Ch. 14

Other information

  • Unless otherwise stated, assigned homework is due in one week.
  • Students are encouraged to work together on homework problems, but they must turn in their own write-ups.
  • Class participation is encouraged.
  • Please bring a laptop to class.

Grading

  • Homework: 40%
  • Take-home Midterm Exam: 30%
  • Take-home Final Exam: 30%

Schedule of Topics

| Date | Reading (before class) | Homework | Topic/Content | Presentation |
|---|---|---|---|---|
| Mon. 1/6 | none | none | Syllabus, introduction | Intro.pdf |
| Wed. 1/8 | HTF Ch. 1 and Ch. 2.1, 2.2, and 2.3 | none | Least-squares, nearest-neighbors | lecture-1.pdf mixture-data-lin-knn.R |
| Fri. 1/10 | HTF Ch. 2.4 | none | Decision theory | lecture-2.pdf |
| Mon. 1/13 | none | Mon. 1/13 (below) | Loss functions in practice | lecture-2a.pdf prostate-data-lin.R |
| Wed. 1/15 | HTF Ch. 2.7, 2.8, and 2.9 | none | Structured regression | lecture-3.pdf ex-1.R ex-2.R ex-3.R |
| Fri. 1/17 | HTF Ch. 3.1 and 3.2 | none | Linear methods for regression | lecture-4a.pdf linear-regression-examples.R |
| Mon. 1/20 | none | none | Martin Luther King Jr. Holiday (no class) | |
| Wed. 1/22 | HTF Ch. 3.1, 3.2, 3.3, and 3.4 | none | Linear methods: subset selection, ridge, and lasso | lecture-5.pdf lasso-example.R |
| Fri. 1/24 | HTF Ch. 3.1, 3.2, 3.3, and 3.4 | none | Linear methods: subset selection, ridge, and lasso (cont.) | lecture-5.pdf lasso-example.R |
| Mon. 1/27 | HTF Ch. 3.5 and 3.6 | none | Guest lecture by Ryan Jarrett: Linear methods: principal components regression; meet at 1:10-2:00PM, ESB/the Wond'ry, 044 | |
| Wed. 1/29 | none | none | Guest lecture by Nathan James; meet at Dept. Journal Club, 12PM | Nature article, Presentation, NYT article, WIRED article |
| Fri. 1/31 | HTF Ch. 4.1, 4.2, and 4.3 | Fri. 1/31 (below) | Linear methods: Linear discriminant analysis | lecture-8.pdf simple-LDA-3D.R |
| Mon. 2/3 | none | none | SVD, LDA, and g-inverse | lec6.pdf lec7.pdf lec8.pdf pca-and-g-inverses.html |
| Wed. 2/5 | HTF Ch. 4.4 and 4.5 | none | Linear methods: Reduced-rank LDA | LA Examples lecture-9.pdf example.R |
| Fri. 2/7 | HTF Ch. 4.4 and 4.5 | none | Linear methods: Reduced-rank LDA (cont.) | LA Examples lecture-9.pdf example.R |
| Mon. 2/10 | HTF Ch. 4.4 and 4.5 | none | Linear methods: Logistic regression and Newton-Raphson | lecture-10.pdf vowel-data-LR.Rmd mLR-delta.Rmd |
| Wed. 2/12 | HTF Ch. 5.1 and 5.2 | Wed. 2/12 (below) | Basis expansions: piecewise polynomials & splines | lecture-11.pdf splines-example.R |
| Fri. 2/14 | HTF Ch. 5.1 and 5.2 | none | Basis expansions: piecewise polynomials & splines (cont.) | lecture-11.pdf splines-example.R |
| Mon. 2/17 | HTF Ch. 6.1-6.5 | none | Guest lecture by Nathan James: Kernel methods; meet at 1:10-2:00PM, ESB/the Wond'ry, 044 | lecture-13.pdf mixture-data-knn-local.R kernel-methods-examples-mcycle.R |
| Wed. 2/19 | none | none | Guest lecture by Hannah Weeks: medExtractR; meet at 1:10-2:00PM, ESB/the Wond'ry, 044 | medExtractR_lecture.pdf |
| Fri. 2/21 | none | none | Guest lecture by Nick Strayer; meet at 1:10-2:00PM, ESB/the Wond'ry, 044 | |
| Mon. 2/24 | HTF Ch. 6.1-6.5 | none | Assign Midterm; Kernel methods (cont.) | lecture-13.pdf mixture-data-knn-local-kde.R multivariate-KDE.html |
| Wed. 2/26 | HTF Ch. 5.4 and 5.5 | none | More on smoothing splines, equivalent kernels, etc. | lecture-12.pdf smoothing-splines-example.R |
| Fri. 2/28 | none | none | Bootstrap iteration | lecture-16.pdf |
| Mon. 3/2 | none | none | Spring Break (no class) | |
| Wed. 3/4 | none | none | Spring Break (no class) | |
| Fri. 3/6 | none | none | Spring Break (no class) | |
| Mon. 3/9 | HTF Ch. 7.1, 7.2, 7.3, 7.4 | none | Model assessment: Cp, AIC, BIC | lecture-14.pdf effective-df-aic-bic-mcycle.R |
| Wed. 3/11 | none | none | VU cancellation due to COVID-19 (no class) | |
| Fri. 3/13 | none | Fri. 3/13 (below) | VU cancellation due to COVID-19 (no class) | |
| Mon. 3/16 | HTF Ch. 7.10 | none | Cross validation | lecture-15.pdf kNN-CV.R |
| Wed. 3/18 | none | none | Midterm Exam Review | |
| Fri. 3/20 | HTF Ch. 9.2 | none | Classification and Regression Trees | lecture-21.pdf mixture-data-rpart.R |
| Mon. 3/23 | HTF Ch. 8.7, 8.8, 8.9 | none | Bagging | lecture-18.pdf mixture-data-rpart-bagging.R nonlinear-bagging.html |
| Wed. 3/25 | HTF Ch. 15.1, 15.2 | Wed. 3/25 (below) | Random Forest | lecture-25.pdf |
| Fri. 3/27 | HTF Ch. 10.1 | none | Boosting and AdaBoost.M1 (part 1) | lecture-22.pdf boosting-trees.R |
| Mon. 3/30 | HTF Ch. 10.2-10.9 | Work through this GBM tutorial | Boosting and AdaBoost.M1 (part 2) | lecture-23.pdf |
| Wed. 4/1 | HTF Ch. 10.10, 10.13 | none | Boosting and AdaBoost.M1 (part 3) | lecture-24.pdf gradient-boosting-example.R |
| Fri. 4/3 | HTF Ch. 10.10, 10.13 | none | Boosting and AdaBoost.M1 (part 3; cont.) | lecture-24.pdf gradient-boosting-example.R |
| Mon. 4/6 | HTF Ch. 11.1, 11.2, 11.3, 11.4, 11.5 | none | Introduction to neural networks | lecture-31.pdf nnet.R |
| Wed. 4/8 | HTF Ch. 11.1, 11.2, 11.3, 11.4, 11.5 | none | Introduction to neural networks (cont.) | lecture-31.pdf nnet.R |
| Fri. 4/10 | HTF Ch. 14.5 | none | Principal curves and surfaces | lecture-28.pdf principal-curves.R |
| Mon. 4/13 | HTF Ch. 14.5.3 | none | k-means, hierarchical, and spectral clustering | lecture-29.pdf spectral-clustering.R |
| Wed. 4/15 | HTF Ch. 14.8 | none | Multidimensional scaling | lecture-30.pdf MDS-examples.R |
| Fri. 4/17 | none | none | Clustering with mixtures | lecture-32.pdf normal-mixture-examples.R |
| Mon. 4/20 | none | none | Distribute Final Exam | |

Homework/Laboratory (other than problems listed in HTF)

Mon. 1/13

Using the RMarkdown/knitr/github mechanism, implement the following tasks by extending the example R script (prostate-data-lin.R):
  • Write functions that implement the L1 loss and tilted absolute loss functions.
  • Create a figure that shows lpsa (x-axis) versus lcavol (y-axis). Add and label (using the 'legend' function) the linear model predictors associated with L2 loss, L1 loss, and tilted absolute value loss for tau = 0.25 and 0.75.
  • Write functions to fit and predict from a simple exponential (nonlinear) model with three parameters defined by 'beta[1] + beta[2]*exp(-beta[3]*x)'. Hint: make copies of 'fit_lin' and 'predict_lin' and modify them to fit the nonlinear model. Use c(-1.0, 0.0, -0.3) as 'beta_init'.
  • Create a figure that shows lpsa (x-axis) versus lcavol (y-axis). Add and label (using the 'legend' function) the nonlinear model predictors associated with L2 loss, L1 loss, and tilted absolute value loss for tau = 0.25 and 0.75.
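The two loss functions in the first task might look like the following sketch; the function names are illustrative and not taken from prostate-data-lin.R. The tilted absolute (check) loss weights positive residuals by tau and negative residuals by 1 - tau, so tau = 0.5 recovers one half of the L1 loss.

```r
## L1 (absolute) loss: mean absolute deviation of predictions from outcomes
loss_L1 <- function(y, yhat) {
  mean(abs(y - yhat))
}

## Tilted absolute loss with tilt parameter tau in (0, 1):
## residuals above the prediction are weighted by tau,
## residuals below the prediction are weighted by (1 - tau)
loss_tilted <- function(y, yhat, tau) {
  r <- y - yhat
  mean(ifelse(r > 0, tau * r, (tau - 1) * r))
}
```

Passing these to the course's fitting routine in place of the L2 loss (and varying tau) produces the predictors requested in the figure tasks.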

Fri. 1/31

Goal: Understand ridge regression and its implementation in R.

Due: Before class Fri. 2/7.

Using the RMarkdown/knitr/github mechanism, implement the following tasks:
  • Use the prostate cancer data from the ElemStatLearn package for R.
  • Use the cor function to reproduce the correlations listed in HTF Table 3.1, page 50.
  • Treat lpsa as the outcome, and use all other variables in the data set as predictors.
  • With the training subset, train a least-squares regression model with all predictors using the lm function.
  • Use the testing subset to compute the test error using the fitted least-squares regression model.
  • Train a ridge regression model using the glmnet function, and tune the value of lambda.
  • Create a path diagram of the ridge regression analysis.
  • Create a figure that shows the training and test error associated with ridge regression as a function of lambda.
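The ridge-regression steps above might be organized as in the following sketch; this is a minimal outline under stated assumptions, not the required solution. It assumes the ElemStatLearn and glmnet packages are installed (ElemStatLearn has been archived on CRAN, so it may need to be installed from the CRAN archive), and that the prostate data frame has its usual layout, with lpsa in column 9 and the logical train indicator in column 10.

```r
library(glmnet)
data(prostate, package = "ElemStatLearn")

## split on the built-in train indicator, then drop that column
train <- subset(prostate,  train)[, -10]
test  <- subset(prostate, !train)[, -10]

x_train <- as.matrix(train[, -9])  # column 9 is the outcome, lpsa
y_train <- train$lpsa

## alpha = 0 selects the ridge penalty; glmnet fits a whole lambda path
fit <- glmnet(x_train, y_train, alpha = 0)

## path diagram: coefficient profiles as a function of log(lambda)
plot(fit, xvar = "lambda", label = TRUE)

## test error (mean squared error) at each lambda on the path
x_test <- as.matrix(test[, -9])
pred   <- predict(fit, newx = x_test)   # one column per lambda
test_mse <- colMeans((test$lpsa - pred)^2)
```

Computing the analogous training MSE from predict(fit, newx = x_train) gives the second requested figure (both errors versus lambda).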

Wed. 2/12

Goal: Understand and implement reduced rank LDA in R.

Due: Before class Wed. 2/19.

Using the RMarkdown/knitr/github mechanism, implement the following tasks:
  • Retrieve the vowel data (training and testing) from the HTF website or R package.
  • Review the class notes and HTF section 4.3.3.
  • Implement reduced-rank LDA using the vowel training data. Check your work by plotting the first two discriminant variables as in HTF Figure 4.4. Hint: Center the 10 training predictors before implementing LDA. See the built-in R function 'scale'. The singular value or eigenvalue decompositions may be computed using the built-in R functions 'svd' or 'eigen', respectively.
  • Use the vowel testing data to estimate the expected prediction error (assuming zero-one loss), varying the number of canonical variables used for classification.
  • Plot the EPE as a function of the number of discriminant variables, and compare this with HTF Figure 4.10.
  • (Optional) Reproduce HTF Figure 4.11. Note: The reproduction need not be exact. However, the information content should be preserved.
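One way to organize the reduced-rank LDA computation, using only the base R functions named in the hint ('scale', 'eigen', 'svd'), is sketched below. The function name and some details (e.g., the unweighted centering of the class means) are illustrative choices, not taken from the course materials; here x is the matrix of training predictors and y the vector of class labels.

```r
lda_canonical <- function(x, y) {
  x <- scale(x, center = TRUE, scale = FALSE)   # center the predictors
  classes <- sort(unique(y))

  ## class means (K x p) and pooled within-class covariance W
  M <- t(sapply(classes, function(k) colMeans(x[y == k, , drop = FALSE])))
  W <- Reduce(`+`, lapply(classes, function(k) {
    xk <- scale(x[y == k, , drop = FALSE], center = TRUE, scale = FALSE)
    t(xk) %*% xk
  })) / (nrow(x) - length(classes))

  ## sphere with respect to W via its eigenvalue decomposition
  eW <- eigen(W)
  W_inv_sqrt <- eW$vectors %*% diag(1 / sqrt(eW$values)) %*% t(eW$vectors)
  M_star <- M %*% W_inv_sqrt

  ## principal components of the sphered class means give the
  ## discriminant (canonical) directions
  V <- svd(scale(M_star, center = TRUE, scale = FALSE))$v
  list(scores = x %*% W_inv_sqrt %*% V,    # discriminant variables
       directions = W_inv_sqrt %*% V)
}
```

Plotting the first two columns of the returned scores, colored by class, should resemble HTF Figure 4.4 when applied to the centered vowel training data.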

Fri. 3/13

  • Review HTF sections 7.10 and 7.11.
  • Consider a 1-nearest neighbor classifier applied to a two-class classification problem, where the marginal probability associated with either class is one half, and where the distribution of a univariate predictor is standard normal, independent of class (i.e., not a very good predictor). Do the following:
    • Show that the expected prediction error (EPE; HTF expression 7.3) is equal to 0.5.
    • Show that $E_z[\hat{\mathrm{Err}}_{\mathrm{boot}}]$ (expectation of HTF expression 7.54) is approximately equal to 0.184, where z represents the training sample of N class and predictor pairs. Thus, demonstrate that the bootstrap estimate of EPE is optimistic.
    • Compute or approximate $E_z[\hat{\mathrm{Err}}^{(1)}]$ (expectation of HTF expression 7.56).
    • Compute or approximate $E_z[\hat{\mathrm{Err}}^{(0.632)}]$ (expectation of HTF expression 7.57).
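A fact that is useful for the bootstrap tasks above (it mirrors the discussion in HTF section 7.11): the probability that a given training observation appears in a bootstrap sample of size $N$ is

```latex
P(\text{obs. } i \in \text{bootstrap sample})
  = 1 - \left(1 - \tfrac{1}{N}\right)^{N}
  \;\longrightarrow\; 1 - e^{-1} \approx 0.632
  \qquad (N \to \infty).
```

For a 1-nearest-neighbor classifier, an observation that appears in a bootstrap sample is its own nearest neighbor there and so contributes zero apparent error; only the remaining fraction of cases (about $0.368$) contributes the chance error of $0.5$, which is the source of the optimism demonstrated in the second task.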

Wed. 3/25

  • Complete HTF Exercise 15.4.

Links

RStudio/Knitr

Topic attachments
| Attachment | Size | Date | Who | Comment |
|---|---|---|---|---|
| 2016-midterm-phoneme.R | 3.7 K | 25 Mar 2016 - 08:33 | MattShotwell | Code for solution to 2016 midterm. |
| HW10.pdf | 44.3 K | 09 Mar 2015 - 09:51 | MattShotwell | Homework 10 |
| Intro.pdf | 781.2 K | 06 Jan 2020 - 08:21 | MattShotwell | |
| LA_Examples_DS_Bootcamp.html | 2374.0 K | 05 Feb 2020 - 11:04 | MattShotwell | |
| LAozone.R | 3.1 K | 14 Mar 2018 - 10:57 | MattShotwell | |
| LagrangeMultipliers-Bishop-PatternRecognitionMachineLearning.pdf | 1574.4 K | 06 Apr 2016 - 17:53 | MattShotwell | Lagrange Multipliers; Bishop; Pattern Recognition and Machine Learning |
| MCB-20121115.pdf | 676.5 K | 17 Dec 2014 - 10:32 | MattShotwell | The Matrix Cookbook (version 15 November 2012) |
| MDS-examples.R | 2.0 K | 15 Apr 2020 - 09:51 | MattShotwell | |
| airquality-EM-mixture.R | 2.2 K | 11 Apr 2016 - 10:59 | MattShotwell | EM algorithm with finite normal mixture |
| airquality-agnes.R | 1.3 K | 13 Apr 2016 - 11:22 | MattShotwell | [Ag]glomerative [nes]ting (clustering) with airquality data |
| boosting-trees.R | 3.3 K | 27 Mar 2020 - 10:32 | MattShotwell | Boosting a tree stump with the AdaBoost.M1 algorithm |
| bootstrap-calibration.R | 3.2 K | 23 Feb 2018 - 11:43 | MattShotwell | |
| df-stepwise.RData | 2.5 K | 17 Feb 2016 - 16:43 | MattShotwell | |
| df-stepwise.Rmd | 5.5 K | 12 Feb 2017 - 20:50 | MattShotwell | |
| df-stepwise.html | 737.8 K | 12 Feb 2017 - 20:50 | MattShotwell | |
| effective-df-aic-bic-mcycle.R | 3.9 K | 09 Mar 2020 - 10:58 | MattShotwell | |
| gradient-boosting-example.R | 8.5 K | 03 Apr 2020 - 12:43 | MattShotwell | |
| kNN-CV.R | 4.1 K | 16 Mar 2020 - 09:26 | MattShotwell | |
| kernel-manipulate-example.R | 1.2 K | 15 Jan 2020 - 10:20 | MattShotwell | |
| kernel-methods-examples-mcycle.R | 3.6 K | 24 Feb 2020 - 10:32 | MattShotwell | |
| lab1.pdf | 226.6 K | 12 Jan 2015 - 14:04 | GuanhuaChen | BIOS362_lab1 |
| lab2.pdf | 1901.4 K | 21 Jan 2015 - 11:20 | GuanhuaChen | slides from Dr. Jojic (UNC)'s Machine learning class |
| lasso-example.R | 5.0 K | 24 Jan 2020 - 11:13 | MattShotwell | |
| lecture-1.pdf | 408.0 K | 08 Jan 2020 - 09:08 | MattShotwell | |
| lecture-10.Rmd | 3.8 K | 10 Feb 2020 - 11:01 | MattShotwell | |
| lecture-10.pdf | 170.9 K | 10 Feb 2020 - 11:17 | MattShotwell | |
| lecture-11.pdf | 285.1 K | 12 Feb 2020 - 08:42 | MattShotwell | |
| lecture-12.pdf | 473.8 K | 26 Feb 2020 - 10:48 | MattShotwell | |
| lecture-13.pdf | 376.9 K | 12 Feb 2018 - 10:31 | MattShotwell | |
| lecture-14.pdf | 382.7 K | 09 Mar 2020 - 10:22 | MattShotwell | |
| lecture-15.pdf | 353.6 K | 16 Mar 2020 - 08:53 | MattShotwell | |
| lecture-16.pdf | 240.5 K | 28 Feb 2020 - 10:50 | MattShotwell | |
| lecture-17.pdf | 372.7 K | 20 Feb 2019 - 11:05 | MattShotwell | |
| lecture-18.pdf | 190.7 K | 28 Feb 2018 - 10:43 | MattShotwell | |
| lecture-2.pdf | 243.4 K | 10 Jan 2020 - 09:36 | MattShotwell | |
| lecture-20.pdf | 143.4 K | 14 Mar 2018 - 10:57 | MattShotwell | |
| lecture-21.pdf | 456.8 K | 20 Mar 2020 - 09:10 | MattShotwell | |
| lecture-22.pdf | 499.9 K | 27 Mar 2020 - 10:33 | MattShotwell | |
| lecture-23.pdf | 292.3 K | 30 Mar 2020 - 09:50 | MattShotwell | |
| lecture-24.pdf | 494.2 K | 01 Apr 2020 - 09:45 | MattShotwell | |
| lecture-25.pdf | 410.4 K | 25 Mar 2020 - 12:54 | MattShotwell | |
| lecture-26.pdf | 569.1 K | 27 Mar 2019 - 11:01 | MattShotwell | |
| lecture-27.pdf | 190.7 K | 29 Mar 2019 - 10:59 | MattShotwell | |
| lecture-28.pdf | 330.7 K | 10 Apr 2020 - 10:58 | MattShotwell | |
| lecture-29.pdf | 955.6 K | 13 Apr 2020 - 09:34 | MattShotwell | |
| lecture-2a.pdf | 97.6 K | 13 Jan 2020 - 12:02 | MattShotwell | |
| lecture-3.pdf | 569.0 K | 15 Jan 2020 - 10:20 | MattShotwell | |
| lecture-30.pdf | 626.1 K | 15 Apr 2020 - 10:04 | MattShotwell | |
| lecture-31.pdf | 4059.4 K | 08 Apr 2020 - 10:17 | MattShotwell | |
| lecture-32.pdf | 165.5 K | 17 Apr 2020 - 10:03 | MattShotwell | |
| lecture-4.pdf | 175.7 K | 14 Jan 2019 - 10:58 | MattShotwell | |
| lecture-4a.pdf | 152.8 K | 17 Jan 2020 - 10:11 | MattShotwell | |
| lecture-5.pdf | 578.2 K | 22 Jan 2020 - 10:18 | MattShotwell | |
| lecture-6.pdf | 259.8 K | 18 Jan 2019 - 10:02 | MattShotwell | |
| lecture-7.pdf | 136.6 K | 23 Jan 2019 - 11:18 | MattShotwell | |
| lecture-8.pdf | 596.0 K | 31 Jan 2020 - 10:18 | MattShotwell | |
| lecture-9.pdf | 1199.6 K | 05 Feb 2020 - 11:05 | MattShotwell | |
| linear-regression-examples.R | 4.9 K | 17 Jan 2020 - 10:11 | MattShotwell | |
| linear-spline-manipulate-example.R | 1.2 K | 15 Jan 2020 - 10:20 | MattShotwell | |
| mLR-delta.Rmd | 4.7 K | 12 Feb 2020 - 09:41 | MattShotwell | |
| medExtractR_lecture.pdf | 5878.6 K | 27 Feb 2020 - 14:26 | HannahWeeks | medExtractR_lecture |
| mixture-data-complete.R | 5.7 K | 10 Feb 2015 - 09:12 | MattShotwell | splines regression, local regression, and kernel density classification of the mixture data |
| mixture-data-knn-local-kde.R | 8.1 K | 24 Feb 2020 - 10:31 | MattShotwell | |
| mixture-data-knn-local.R | 4.7 K | 17 Jan 2018 - 10:25 | MattShotwell | |
| mixture-data-lin-knn.R | 3.8 K | 08 Jan 2020 - 09:08 | MattShotwell | |
| mixture-data-rpart-bagging.R | 3.7 K | 23 Mar 2020 - 07:44 | MattShotwell | |
| mixture-data-rpart.R | 2.5 K | 20 Mar 2020 - 09:09 | MattShotwell | |
| mixture-data-svm.R | 3.3 K | 07 Apr 2017 - 12:31 | MattShotwell | SVM with mixture data; 3D graphic |
| mixture-data.R | 2.1 K | 30 Jan 2015 - 09:31 | MattShotwell | Lab 3; demo code for mixture data |
| mnist-convnet.R | 2.6 K | 08 Apr 2019 - 11:09 | MattShotwell | |
| multivariate-KDE.html | 862.9 K | 24 Feb 2020 - 10:31 | MattShotwell | |
| nlls_v2.R | 3.2 K | 19 Jan 2018 - 10:47 | MattShotwell | |
| nnet.R | 3.0 K | 05 Apr 2020 - 16:09 | MattShotwell | |
| nonlinear-bagging.csv | 0.5 K | 29 Feb 2016 - 11:09 | MattShotwell | nonlinear bagging example data |
| nonlinear-bagging.html | 656.0 K | 23 Mar 2020 - 10:46 | MattShotwell | |
| normal-mixture-examples.R | 1.8 K | 17 Apr 2020 - 10:03 | MattShotwell | |
| pca-and-g-inverses.Rmd | 2.4 K | 03 Feb 2020 - 08:43 | MattShotwell | |
| pca-and-g-inverses.html | 1507.4 K | 03 Feb 2020 - 08:43 | MattShotwell | |
| presentation.pdf | 333.3 K | 11 Mar 2019 - 10:41 | MattShotwell | |
| principal-curves.R | 3.7 K | 10 Apr 2020 - 10:09 | MattShotwell | Principal curves example |
| prostate-data-lin.R | 1.2 K | 13 Jan 2020 - 10:20 | MattShotwell | |
| prostate.R | 2.9 K | 26 Jan 2015 - 09:45 | MattShotwell | least-squares, ridge, and principal components regression with prostate data. |
| simple-LDA-3D.R | 2.7 K | 31 Jan 2020 - 10:18 | MattShotwell | |
| simple-neural-network.R | 3.8 K | 28 Mar 2017 - 09:43 | MattShotwell | Neural network with one hidden layer, 20 units, fully connected |
| smooth-splines-manipulate-example.R | 1.0 K | 15 Jan 2020 - 10:20 | MattShotwell | |
| smoothing-splines-example.R | 1.1 K | 26 Feb 2020 - 08:22 | MattShotwell | |
| spectral-clustering.R | 3.1 K | 13 Apr 2020 - 09:34 | MattShotwell | |
| sphered-and-canonical-inputs.R | 6.3 K | 07 Feb 2020 - 11:24 | MattShotwell | |
| splines-example.R | 3.8 K | 14 Feb 2020 - 11:02 | MattShotwell | Splines example |
| vowel-data-LR.Rmd | 3.2 K | 10 Feb 2020 - 12:19 | MattShotwell | |
| yuying_1.pdf | 953.4 K | 13 Feb 2015 - 15:30 | GuanhuaChen | |
| yuying_2.pdf | 2283.6 K | 13 Feb 2015 - 15:31 | GuanhuaChen | |
Topic revision: r313 - 20 Apr 2020, MattShotwell
 
