PCA & multivariate signal processing, applied to neural data
What you’ll learn
Understand advanced linear algebra methods
Includes a 3+ hour “crash course” on linear algebra
Apply advanced linear algebra methods in MATLAB and Python
Simulate multivariate data for testing analysis methods
Analyze multivariate time series datasets
Appreciate the challenges neuroscientists are struggling with!
Learn about modern neuroscience data analysis
Requirements
Some linear algebra background (3+ hour crash course is provided)
Some neuroscience background (or interest in learning!)
Some MATLAB/Python programming experience (needed only to complete the exercises)
Interest in learning applied linear algebra
Description
What is this course all about?

Neuroscience (brain science) is changing: new brain-imaging technologies are producing increasingly large datasets, and analyzing the resulting Big Data is one of the biggest struggles in modern neuroscience (if you don’t believe me, ask a neuroscientist!). The increase in the number of simultaneously recorded data channels allows new discoveries about spatiotemporal structure in the brain, but it also presents new challenges for data analysis. Because the data are stored in matrices, algorithms developed in linear algebra are extremely useful.

The purpose of this course is to teach you matrix-based analysis methods for neural time series data, with a focus on multivariate dimensionality reduction and source-separation methods. Topics include covariance matrices, principal components analysis (PCA), generalized eigendecomposition (GED; even better than PCA!), and independent components analysis (ICA). The course is mathematically rigorous but approachable to individuals with no formal mathematics background. It comes with MATLAB and Python code (note that the videos show the MATLAB code; the Python code is a close match).

You should take this course if you are a…

neuroscience researcher who is looking for ways to analyze your multivariate data.
student who wants to be competitive for a neuroscience PhD or postdoc position.
non-neuroscientist who is interested in learning more about the big questions in modern brain science.
independent learner who wants to advance your linear algebra knowledge.
mathematician, engineer, or physicist who is curious about applied matrix decompositions in neuroscience.
person who wants to learn more about principal components analysis (PCA) and/or independent components analysis (ICA).
person intrigued by the image that starts off the Course Preview who wants to know what it means! (The answers are in this course!)

Unsure if this course is right for you?

I worked hard to make this course accessible to anyone with at least minimal linear algebra and programming background. But this course is not right for everyone. Check out the preview videos, and feel free to contact me if you have any questions.

I look forward to seeing you in the course!
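To give a flavor of the matrix-based approach described above, here is a minimal Python sketch (assuming only NumPy; the variable names are illustrative and not taken from the course code) that simulates multichannel data, builds a channel covariance matrix, and performs PCA via eigendecomposition:

import numpy as np

# Simulate multichannel data: 5 latent sources mixed into 30 channels.
rng = np.random.default_rng(0)
sources = rng.standard_normal((5, 1000))        # sources x time
mixing = rng.standard_normal((30, 5))           # channels x sources
data = mixing @ sources                         # channels x time

# Mean-center each channel (essential for PCA).
data -= data.mean(axis=1, keepdims=True)

# Channel covariance matrix (channels x channels).
covmat = data @ data.T / (data.shape[1] - 1)

# PCA: eigendecomposition of the covariance, sorted descending.
evals, evecs = np.linalg.eigh(covmat)
evals, evecs = evals[::-1], evecs[:, ::-1]

# Project the data onto the top principal component.
comp_timeseries = evecs[:, 0] @ data

GED and ICA replace the eigendecomposition step with decompositions tailored to a hypothesis or to statistical independence; both are covered in depth in the sections below.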
Overview
Section 1: Introduction
Lecture 1 Target audience and learning from this course
Lecture 2 What is multivariate neuroscience?
Lecture 3 What are linear spatial filters?
Lecture 4 Why spatial filters are useful for neuroscience
Section 2: Download all course materials
Lecture 5 IMPORTANT: Download all course materials
Lecture 6 Download Python code
Section 3: Dimensions and sources
Lecture 7 The concept of “dimension” in measured signals
Lecture 8 The concept of “source” in measured signals
Lecture 9 Sources, mixing, and unmixing
Lecture 10 Dimension reduction vs. source separation
Lecture 11 Linear vs. nonlinear filtering
Lecture 12 Data requirements for source separation
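The central idea of this section, that measured channel data are a linear mixture of latent sources, can be illustrated with a short Python toy example (NumPy assumed; a sketch, not the course code):

import numpy as np

rng = np.random.default_rng(1)
S = rng.standard_normal((3, 500))     # 3 latent source time series
A = rng.standard_normal((10, 3))      # mixing matrix: channels x sources
X = A @ S                             # measured data: 10 channels x time

# Unmixing tries to recover S from X. With A known and noiseless data,
# the pseudoinverse recovers the sources exactly; real analyses must
# estimate the unmixing from the data itself (PCA, GED, ICA).
S_hat = np.linalg.pinv(A) @ X
print(np.allclose(S, S_hat))          # True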
Section 4: Linear algebra crash course
Lecture 13 Introduction to this section
Lecture 14 Vectors and matrices
Lecture 15 Vector multiplications (incl. dot product)
Lecture 16 Matrix multiplications
Lecture 17 MATLAB: vectors and matrices
Lecture 18 Linear independence
Lecture 19 Matrix rank
Lecture 20 Shifting a matrix
Lecture 21 MATLAB: rank and shifting
Lecture 22 Matrix inverse
Lecture 23 A transpose A
Lecture 24 MATLAB: Inverse and AtA
Lecture 25 Eigenvalues/vectors and diagonalization
Lecture 26 The singular value decomposition (SVD)
Lecture 27 SVD for compression
Lecture 28 MATLAB: eig and svd
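As a quick taste of the crash-course material, the following NumPy snippet (illustrative only) connects two of the section's decompositions: the singular values of a matrix A are the square roots of the eigenvalues of AᵀA:

import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 4))

# Eigendecomposition of the symmetric matrix A'A.
evals, evecs = np.linalg.eigh(A.T @ A)

# Singular value decomposition of A itself: A = U S V'.
U, svals, Vt = np.linalg.svd(A)

# Singular values squared equal the eigenvalues of A'A.
print(np.allclose(np.sort(svals**2), np.sort(evals)))   # True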
Section 5: Creating and interpreting covariance matrices
Lecture 29 Using real and simulated data
Lecture 30 Correlation and covariance: terms and matrices
Lecture 31 Creating covariance matrices in data
Lecture 32 MATLAB: covariance of simulated data
Lecture 33 MATLAB: covariance with real data
Lecture 34 Proof: Covariance matrices are symmetric
Lecture 35 Evaluating and improving covariance quality
Lecture 36 MATLAB: Single trial covariance distances
Lecture 37 The quadratic form and the covariance surface
Lecture 38 MATLAB: visualizing the quadratic form
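To preview this section's material, here is a small Python sketch (NumPy assumed; not the course code) that builds a covariance matrix "by hand," checks it against np.cov, and evaluates the quadratic form wᵀCw along unit-length directions w:

import numpy as np

rng = np.random.default_rng(3)
data = rng.standard_normal((2, 1000))
data[1] += 0.7 * data[0]                  # induce correlation between channels
data -= data.mean(axis=1, keepdims=True)  # mean-center before covariance

C = data @ data.T / (data.shape[1] - 1)   # covariance "by hand"
print(np.allclose(C, np.cov(data)))       # matches NumPy's built-in

# Quadratic form: f(w) = w' C w for unit vectors w at angles theta.
thetas = np.linspace(0, np.pi, 180)
W = np.vstack([np.cos(thetas), np.sin(thetas)])
quadform = np.einsum('ij,jk,ki->i', W.T, C, W)
print(thetas[np.argmax(quadform)])        # direction of maximal variance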
Section 6: Dimension reduction with PCA
Lecture 39 PCA: Goals, objective, and solution
Lecture 40 MATLAB: PCA intuition with 2D data
Lecture 41 How to perform a principal components analysis
Lecture 42 Exercise: PCA on non-phase-locked data
Lecture 43 The geometry of PCA
Lecture 44 Proof of principal component orthogonality
Lecture 45 Scree plots and eigenspectra
Lecture 46 MATLAB: PCA of simulated EEG data
Lecture 47 MATLAB: PCA of real EEG data
Lecture 48 Exercise: Repeat PCA using pca()
Lecture 49 MATLAB: importance of mean-centering for PCA
Lecture 50 Dimension reduction using SVD instead of eigendecomposition
Lecture 51 MATLAB: PCA via SVD and covariance
Lecture 52 PCA for state-space representation
Lecture 53 MATLAB: state-space representation via PCA
Lecture 54 MATLAB: PCA on multitrial data
Lecture 55 Limitations of principal components analysis
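Lectures 50-51 make the point that PCA can be computed either from the covariance matrix or directly from the data matrix via the SVD. A minimal NumPy demonstration of their equivalence (illustrative, not the course code):

import numpy as np

rng = np.random.default_rng(4)
data = rng.standard_normal((8, 2000))
data -= data.mean(axis=1, keepdims=True)

# Route 1: eigendecomposition of the covariance matrix.
C = data @ data.T / (data.shape[1] - 1)
evals, evecs = np.linalg.eigh(C)
evals, evecs = evals[::-1], evecs[:, ::-1]        # sort descending

# Route 2: SVD of the mean-centered data matrix.
U, svals, _ = np.linalg.svd(data, full_matrices=False)

# Left singular vectors match the eigenvectors (up to sign),
# and eigenvalues are the squared singular values scaled by n-1.
print(np.allclose(np.abs(U[:, 0]), np.abs(evecs[:, 0])))   # True
print(np.allclose(svals**2 / (data.shape[1] - 1), evals))  # True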
Section 7: Source separation with GED
Lecture 56 Tutorial paper on GED
Lecture 57 Hypothesis-driven motivation for GED
Lecture 58 GED: Goals, objective, and solution
Lecture 59 MATLAB: GED intuition with covariance surfaces
Lecture 60 GED weights and nonorthogonality
Lecture 61 MATLAB: GED in a simple example
Lecture 62 Visualizing the spatial filter vs. spatial patterns
Lecture 63 Component sign uncertainty
Lecture 64 MATLAB: Adjusting component signs
Lecture 65 MATLAB: 2 components in simulated EEG data
Lecture 66 Constructing the S and R matrices
Lecture 67 MATLAB: Task-relevant component in EEG
Lecture 68 MATLAB: Spectral scanning in MEG and EEG
Lecture 69 Two-stage compression and source separation
Lecture 70 Exercise: Two-stage source separation in real EEG data
Lecture 71 ZCA prewhitening
Lecture 72 MATLAB: Simulated data with and without ZCA
Lecture 73 Exercise: ZCA+two-stage separation on real EEG data
Lecture 74 Source separation with nonstationary covariances
Lecture 75 MATLAB: Simulated EEG data with alternating dipoles
Lecture 76 Regularization: Theory, math, and intuition
Lecture 77 MATLAB: Effects of regularization in real data
Lecture 78 Empirical methods for regularization amount
Lecture 79 MATLAB: Regularization cross-validation
Lecture 80 Complex-valued solutions
Lecture 81 MATLAB: GED vs. factor analysis
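GED finds a spatial filter w that maximizes the Rayleigh quotient wᵀSw / wᵀRw, solved as a generalized eigendecomposition of two covariance matrices. A bare-bones SciPy sketch (toy data; constructing S and R in practice is the subject of Lecture 66):

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(5)

# Toy covariances: S from "signal" data, R from "reference" data.
sig = rng.standard_normal((6, 1000))
sig[2] *= 3                           # channel 2 carries extra variance
ref = rng.standard_normal((6, 1000))
S, R = np.cov(sig), np.cov(ref)

# GED: solve S w = lambda R w (scipy returns ascending eigenvalues).
evals, evecs = eigh(S, R)
w = evecs[:, -1]                      # top filter; filters are NOT orthogonal

component = w @ sig                   # component time series
pattern = S @ w                       # spatial pattern to visualize (Lecture 62)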
Section 8: Source separation for steady-state responses
Lecture 82 The steady-state evoked potential
Lecture 83 Motivations for a spatial filter for the steady-state response
Lecture 84 RESS analysis pipeline
Lecture 85 MATLAB: example with real EEG data
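In essence, RESS (rhythmic entrainment source separation) is a GED in which S is the covariance of data narrowband-filtered at the steady-state stimulation frequency and R comes from broadband or neighboring-frequency data. A rough Python sketch of that idea (SciPy/NumPy assumed; a simplified stand-in for the pipeline detailed in Lecture 84):

import numpy as np
from scipy.linalg import eigh

def narrowband(data, srate, peak_hz, fwhm_hz):
    # Gaussian narrowband filter applied in the frequency domain.
    hz = np.fft.rfftfreq(data.shape[1], 1 / srate)
    gauss = np.exp(-4 * np.log(2) * (hz - peak_hz)**2 / fwhm_hz**2)
    return np.fft.irfft(np.fft.rfft(data, axis=1) * gauss, data.shape[1], axis=1)

srate, ssvep_hz = 1000, 12                     # e.g., a 12 Hz flicker response
data = np.random.default_rng(6).standard_normal((20, 5000))   # toy "EEG"

S = np.cov(narrowband(data, srate, ssvep_hz, 1))   # covariance at 12 Hz
R = np.cov(data)                                   # broadband covariance

evals, evecs = eigh(S, R)                      # generalized eigendecomposition
ress_component = evecs[:, -1] @ data           # RESS component time series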
Section 9: Independent components analysis (ICA)
Lecture 86 Overview of independent components analysis
Lecture 87 MATLAB: Data distributions and ICA
Lecture 88 MATLAB: ICA, PCA, GED on simulated data
Lecture 89 MATLAB: Explore IC distributions in real data
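As a point of comparison for this section, scikit-learn's FastICA can stand in for the ICA algorithms discussed in the lectures (the course code may use a different implementation); the key contrast with PCA is that ICA optimizes for statistical independence rather than variance:

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(7)
t = np.linspace(0, 10, 2000)

# Two non-Gaussian sources: a sawtooth and a gated sine wave.
s1 = np.mod(t, 1.0) - 0.5
s2 = np.sin(2 * np.pi * 3 * t) * (np.sin(2 * np.pi * 0.2 * t) > 0)
S = np.vstack([s1, s2])

A = rng.standard_normal((5, 2))       # mixing matrix: 5 channels, 2 sources
X = A @ S                             # mixed channel data

# FastICA expects samples x features, hence the transposes.
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X.T).T      # recovered up to order, sign, and scale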
Section 10: Overfitting and inferential statistics
Lecture 90 What is overfitting and why is it inappropriate?
Lecture 91 Unbiased filter creation and application
Lecture 92 Cross-validation (in- vs. out-of-sample testing)
Lecture 93 Permutation testing
Lecture 94 MATLAB: Permutation testing
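Permutation testing, the topic of Lectures 93-94, builds a null distribution by shuffling condition labels. A generic Python sketch of the logic (not the course's MATLAB code):

import numpy as np

rng = np.random.default_rng(8)
cond_a = rng.standard_normal(50) + 0.5    # condition A (true effect present)
cond_b = rng.standard_normal(50)          # condition B
observed = cond_a.mean() - cond_b.mean()

# Null distribution: shuffle the condition labels many times.
pooled = np.concatenate([cond_a, cond_b])
null = np.empty(1000)
for i in range(1000):
    perm = rng.permutation(pooled)
    null[i] = perm[:50].mean() - perm[50:].mean()

# Two-tailed p-value: fraction of shuffles at least as extreme as observed.
p = np.mean(np.abs(null) >= np.abs(observed))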
Section 11: Big questions in multivariate neuroscience
Lecture 95 Math, physiology, and anatomy
Lecture 96 Functional networks vs. volume conduction
Lecture 97 Interpreting individual differences
Lecture 98 A surfeit of source separation selections (and a reading list!)
Lecture 99 Is reducing dimensionality always good?
Section 12: Bonus section
Lecture 100 Bonus lecture
Who this course is for:
Anyone interested in next-generation neuroscience data analyses
Learners interested in applying linear algebra to modern big-data challenges
Neuroscientists dealing with “big data”
Mathematicians, engineers, and physicists who are interested in learning about neuroscience data
Course Information:
Udemy | English | 17h 33m | 6.28 GB
Created by: Mike X Cohen