Schedule and Readings

The course schedule is subject to change. The textbooks and materials for this course are:

SLP Speech and Language Processing by Dan Jurafsky and James H. Martin
CNL Cognitive Neuroscience of Language by David Kemmerer
Mat1 Brain_Encoding_Decoding
Notes Course notes

These textbooks are primarily for reference; we will not be “following” them in any sense. Additional readings will include research papers and blog posts.

Week 1, Feb. 29

Basics of brain language processing

We will introduce the concepts of language, the brain, deficit-lesion studies, neuroimaging, etc. We will learn what language and the human brain are, how to study language processing in the brain, and what is known about brain language processing so far.

Lecture
Slides

Reading: Language Cognition and Language Computation (语言认知和语言计算)

Neural Bases of Language with Liina Pylkkänen

Week 2, Mar. 7

Basics of machine language processing

We will introduce the concepts of grammar, embeddings, language models, etc. We will learn what machine language processing and understanding are, and how to represent the meaning and syntax of text.
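
To make these ideas concrete before the coding exercise, here is a minimal, self-contained sketch (not the course code) of two of this week's concepts: word embeddings compared with cosine similarity, and a count-based bigram language model. The toy vectors and corpus are invented purely for illustration.

```python
import numpy as np
from collections import Counter

# --- Toy word embeddings: each word is a dense vector (hand-made here, not trained) ---
emb = {
    "dog": np.array([0.8, 0.1, 0.0]),
    "cat": np.array([0.7, 0.2, 0.1]),
    "car": np.array([0.0, 0.9, 0.4]),
}

def cosine(u, v):
    """Cosine similarity: how strongly two word vectors point in the same direction."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print("dog~cat:", cosine(emb["dog"], emb["cat"]))   # high: related meanings
print("dog~car:", cosine(emb["dog"], emb["car"]))   # low: unrelated meanings

# --- Toy bigram language model: P(w_t | w_{t-1}) estimated from counts ---
corpus = "the dog chased the cat the cat ran".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])

def bigram_prob(prev, word):
    """Maximum-likelihood estimate of P(word | prev); 0 if the bigram was never seen."""
    return bigrams[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0

print("P(dog | the) =", bigram_prob("the", "dog"))
print("P(cat | the) =", bigram_prob("the", "cat"))
```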

Lecture
Slides
Coding exercise
code & data

Reading: Research on Text Vector Representation Methods (文本向量表示方法研究), PhD dissertation by Shaonan Wang (王少楠)

syntactic complexity (Node Count)

word embedding (Efficient Estimation of Word Representations in Vector Space)

interpreting word embeddings (Investigating inner properties of multimodal representation and semantic compositionality with brain-based componential semantics)

language model (A Neural Probabilistic Language Model)

Week 3, Mar. 14

Statistical methods for brain encoding

We will introduce the concepts of brain encoding, the generalized linear model (GLM), etc. We will learn how to work with neuroimaging data and implement a simple brain encoding system.
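
As a preview of the idea behind the coding exercise, here is a minimal sketch of a linear encoding model, assuming stimulus features and voxel responses are already extracted and time-aligned; the data below are simulated placeholders, not the course dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: n time points, d stimulus features, v voxels
n, d, v = 200, 10, 50
X = rng.standard_normal((n, d))                      # stimulus features (e.g., word embeddings)
W_true = rng.standard_normal((d, v))
Y = X @ W_true + 0.5 * rng.standard_normal((n, v))   # simulated voxel responses

# Split into train / test along time
X_tr, X_te, Y_tr, Y_te = X[:150], X[150:], Y[:150], Y[150:]

# Encoding model: ordinary least squares fit for all voxels at once
W, *_ = np.linalg.lstsq(X_tr, Y_tr, rcond=None)
Y_hat = X_te @ W

# Evaluate: Pearson correlation between predicted and measured response, per voxel
def pearson_per_voxel(a, b):
    a = (a - a.mean(0)) / a.std(0)
    b = (b - b.mean(0)) / b.std(0)
    return (a * b).mean(0)

r = pearson_per_voxel(Y_hat, Y_te)
print("mean encoding correlation across voxels:", r.mean().round(3))
```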

Lecture
Slides
Coding exercise
code & data

Reading: Generalized linear model

Brain encoding papers (The organization of the semantic network as reflected by the neural correlates of six semantic dimensions)

Experimental papers on brain encoding models (How Does the Experimental Setting Affect the Conclusions of Neural Encoding Models?)

Week 4, Mar. 21

Neural network methods for brain encoding

We will introduce the concepts of ridge regression, permutation tests, t-tests, etc. We will learn how to import raw neuroimaging data and use neural network models to extract meaningful representations. We will then build a more complete brain encoding system that combines fMRI data with these model representations.
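
Below is a minimal sketch of ridge-regression encoding plus a simple permutation test, assuming feature and fMRI matrices are already prepared; the variable names and data are placeholders, and real fMRI analyses typically use block-wise permutations to respect temporal autocorrelation.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Placeholder data: model-derived features X and fMRI responses for one voxel y
n, d = 240, 20
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + rng.standard_normal(n)

X_tr, X_te, y_tr, y_te = X[:180], X[180:], y[:180], y[180:]

# Ridge regression: L2-regularized linear encoding model
model = Ridge(alpha=10.0).fit(X_tr, y_tr)
r_obs = np.corrcoef(model.predict(X_te), y_te)[0, 1]

# Permutation test: shuffle the held-out responses to build a null distribution
n_perm = 1000
null = np.empty(n_perm)
for i in range(n_perm):
    null[i] = np.corrcoef(model.predict(X_te), rng.permutation(y_te))[0, 1]
p_value = (np.sum(null >= r_obs) + 1) / (n_perm + 1)

print(f"observed r = {r_obs:.3f}, permutation p = {p_value:.4f}")
```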

Lecture
Slides
Coding exercise (student version)
code & data
Coding exercise (full version)
code & data

Reading: Experimental papers on brain encoding models (How Does the Experimental Setting Affect the Conclusions of Neural Encoding Models?)

Week 5, Mar. 28

Large language models for brain encoding

We will introduce the concepts of language models, representational similarity analysis (RSA), etc. We will learn how to use RSA to compare the information encoded in model representations and in brain activation data, and we will also learn about different brain visualization tools.
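
Below is a minimal RSA sketch under the standard setup: build one representational dissimilarity matrix (RDM) from model features and one from brain patterns for the same stimuli, then rank-correlate the two. The arrays are simulated placeholders, not the example data linked below.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Placeholder data: 40 stimuli, model features (e.g., LLM embeddings) and brain patterns (e.g., voxels)
n_stim = 40
model_feats = rng.standard_normal((n_stim, 300))
brain_patterns = model_feats[:, :100] + rng.standard_normal((n_stim, 100))  # loosely related

# RDMs: pairwise dissimilarity (1 - correlation) between stimuli, as condensed vectors
rdm_model = pdist(model_feats, metric="correlation")
rdm_brain = pdist(brain_patterns, metric="correlation")

# RSA: rank-correlate the two RDMs
rho, p = spearmanr(rdm_model, rdm_brain)
print(f"model-brain RDM similarity: rho = {rho:.3f}, p = {p:.3g}")
```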

Lecture
Slides1, Slides2, Example data
Coding exercise (student version)
code & data
Coding exercise (full version)
code & data

Reading: GPT/Transformer (Attention Is All You Need)

RSA (Representational similarity analysis – connecting the branches of systems neuroscience)

Week 6, Apr. 11

Decoding words from brain activation

We will introduce the concepts of multi-voxel pattern analysis (MVPA), time-series decoding, etc. We will learn about the characteristics of MEG and EEG data, and build a word decoding model using the MVPA method.
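
Below is a minimal sketch of time-resolved MVPA decoding, assuming epoched data of shape (trials, sensors, time points); the simulated MEG/EEG-like data and the injected class effect are purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated epochs: (trials, sensors, time points), with a class effect after t = 40
n_trials, n_sensors, n_times = 100, 32, 80
X = rng.standard_normal((n_trials, n_sensors, n_times))
y = rng.integers(0, 2, n_trials)          # two word categories
X[y == 1, :5, 40:] += 0.8                 # inject a decodable signal late in the epoch

# Decode separately at each time point (time-resolved MVPA)
acc = np.zeros(n_times)
for t in range(n_times):
    clf = LogisticRegression(max_iter=1000)
    acc[t] = cross_val_score(clf, X[:, :, t], y, cv=5).mean()

print("accuracy before effect:", acc[:40].mean().round(2))
print("accuracy after effect: ", acc[40:].mean().round(2))
```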

Lecture
Slides
Coding exercise (student version)
code & data
Coding exercise (full version)
code & data

Reading: TGM (Studying language in context using the temporal generalization method)

MVPA (The time course of cross-modal representations of conceptual categories)

Week 7, Apr. 25

Decoding sentences from brain activation

We will introduce the concepts of word rate and beam search, among others. We will then walk through an example of a model that decodes coherent sentences from a sequence of fMRI recordings.
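
Below is a minimal sketch of beam search, the inference procedure such decoders rely on: keep the k highest-scoring partial sentences and extend each with candidate next words. The scoring function here is a random stand-in; in the readings below it combines a language-model prior with how well a candidate word's predicted brain response matches the recorded fMRI data.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "dog", "cat", "ran", "sat", "quickly"]

def score_next(prefix, word):
    """Stand-in scorer: a real decoder would combine a language-model probability
    with the match between the word's predicted and observed brain responses."""
    return float(rng.standard_normal()) - 0.1 * len(prefix)

def beam_search(n_steps=4, beam_width=3):
    beams = [([], 0.0)]                       # (word sequence, cumulative score)
    for _ in range(n_steps):
        candidates = []
        for seq, s in beams:
            for w in vocab:
                candidates.append((seq + [w], s + score_next(seq, w)))
        # keep only the beam_width best partial sentences
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    return beams

for seq, s in beam_search():
    print(f"{' '.join(seq):35s} score={s:.2f}")
```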

Lecture
Slides
Coding exercise (full version)
code & data

Reading: Neural decoding (Semantic reconstruction of continuous language from non-invasive brain recordings)

Neural decoding (MapGuide: A Simple yet Effective Method to Reconstruct Continuous Language from Brain Activities)

Week 8, Apr. 28

Decoding speech from brain activation

We will introduce the concepts of temporal receptive field analysis, zero-shot classification, and other related topics. Additionally, we will explore the characteristics of electrocorticography (ECoG) data and develop a neural decoder to interpret speech from ECoG signals.
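
Below is a minimal sketch of a temporal-receptive-field-style linear decoder: stack time-lagged copies of the neural signal and fit a regularized linear map to a speech feature. The simulated ECoG data, the lag window, and the regularization value are arbitrary placeholders, not the settings from the reading.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: ECoG from a few electrodes and one speech feature (e.g., audio envelope)
n_times, n_elec = 2000, 8
ecog = rng.standard_normal((n_times, n_elec))
envelope = np.convolve(ecog[:, 0], np.ones(5) / 5, mode="same") + 0.3 * rng.standard_normal(n_times)

def lagged(X, lags):
    """Stack time-lagged copies of X so the decoder can use a window of past samples."""
    return np.column_stack([np.roll(X, lag, axis=0) for lag in lags])

lags = range(0, 10)                       # use the current sample and 9 past samples
X = lagged(ecog, lags)[10:]               # drop edge samples affected by np.roll wraparound
y = envelope[10:]

# Ridge-style decoder: regularized least squares from lagged ECoG to the speech feature
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
y_hat = X @ W
print("decoding correlation:", np.corrcoef(y_hat, y)[0, 1].round(3))
```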

Coding exercise (full version)
code & data

Reading: Neural decoder (Decoding speech perception from non-invasive brain recordings)