## Prerequisites

- Real Analysis (MIT 18.100C)
- Linear Algebra (MIT 18.065)
- Introduction to Topology (MIT 18.901)
- Introduction to Functional Analysis (MIT 18.102)
- Introduction to Probability and Statistics (MIT 18.05)

## Courses

**Introduction to Deep Learning** (MIT 6.S191)

MIT’s introductory program on deep learning methods, with applications to computer vision, natural language processing, biology, and more! Students will gain foundational knowledge of deep learning algorithms and get practical experience building neural networks in TensorFlow. The program concludes with a project proposal competition with feedback from staff and a panel of industry sponsors. Prerequisites assume calculus (i.e., taking derivatives) and linear algebra (i.e., matrix multiplication); we’ll try to explain everything else along the way! Experience in Python is helpful but not necessary.
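
The blurb’s two stated prerequisites can be made concrete with a small sketch (mine, not from the course materials): a two-layer network trained with plain NumPy, where the forward pass is matrix multiplication and the weight updates come from taking derivatives via the chain rule.

```python
import numpy as np

# Minimal two-layer network on a toy regression task, illustrating the
# prerequisites: matrix multiplication (forward pass) and derivatives
# (backpropagation). This is an illustrative sketch, not course code.
rng = np.random.default_rng(0)

X = rng.normal(size=(64, 3))               # 64 samples, 3 features
y = (X @ np.array([1.0, -2.0, 0.5]))[:, None]  # linear target to learn

W1 = rng.normal(scale=0.1, size=(3, 8))    # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(8, 1))    # hidden -> output weights
lr = 0.1

losses = []
for _ in range(200):
    h = np.tanh(X @ W1)                    # hidden layer (matrix multiply + tanh)
    pred = h @ W2                          # linear output layer
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Gradients of the mean-squared error, by the chain rule
    g = 2 * err / len(X)
    dW2 = h.T @ g
    dW1 = X.T @ ((g @ W2.T) * (1 - h ** 2))
    W1 -= lr * dW1                         # gradient-descent updates
    W2 -= lr * dW2

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Frameworks like TensorFlow automate exactly these two steps (the matrix products and the derivative bookkeeping), which is why the course needs nothing more than calculus and linear algebra.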

**Machine Learning** (Stanford University)

This course provides a broad introduction to machine learning and statistical pattern recognition. Topics include: supervised learning (generative/discriminative learning, parametric/non-parametric learning, neural networks, support vector machines); unsupervised learning (clustering, dimensionality reduction, kernel methods); learning theory (bias/variance tradeoffs, practical advice); and reinforcement learning and adaptive control. The course will also discuss recent applications of machine learning, such as robotic control, data mining, autonomous navigation, bioinformatics, speech recognition, and text and web data processing.
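
The bias/variance tradeoff mentioned above can be seen in a few lines (a sketch of my own, not from the course): fit polynomials of increasing degree to noisy data and compare training error against held-out error. A low degree underfits (high bias); a high degree drives training error down while held-out error stops improving (high variance).

```python
import numpy as np

# Bias/variance illustration: polynomial regression on noisy sin(3x) data.
rng = np.random.default_rng(1)

x = rng.uniform(-1, 1, size=200)
y = np.sin(3 * x) + rng.normal(0, 0.3, size=200)
x_tr, y_tr = x[:100], y[:100]          # training half
x_te, y_te = x[100:], y[100:]          # held-out half

def mse(deg):
    """Train/test mean-squared error of a degree-`deg` least-squares fit."""
    coeffs = np.polyfit(x_tr, y_tr, deg)
    err_tr = float(np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2))
    err_te = float(np.mean((np.polyval(coeffs, x_te) - y_te) ** 2))
    return err_tr, err_te

for deg in (1, 5, 15):
    tr, te = mse(deg)
    print(f"degree {deg:2d}: train {tr:.3f}, test {te:.3f}")
```

Training error can only decrease as the model class grows, so the interesting quantity is the held-out error, which is minimized at an intermediate degree.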

**Mathematics of Machine Learning** (MIT 18.657)

Broadly speaking, Machine Learning refers to the automated identification of patterns in data. As such it has been a fertile ground for new statistical and algorithmic developments. The purpose of this course is to provide a mathematically rigorous introduction to these developments with emphasis on methods and their analysis.

**Statistical Learning Theory and Applications** (MIT 9.520)

The main goal of this course is to study the generalization ability of a number of popular machine learning algorithms, such as boosting, support vector machines, and neural networks. Topics include Vapnik-Chervonenkis theory, concentration inequalities in product spaces, and other elements of empirical process theory.

**Theory of Probability** (MIT 18.175)

This course covers topics such as sums of independent random variables, central limit phenomena, infinitely divisible laws, Lévy processes, Brownian motion, conditioning, and martingales.
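
The "central limit phenomena" mentioned above are easy to see empirically (a quick sketch of my own, not course material): standardized sums of i.i.d. uniform variables behave like a standard normal.

```python
import numpy as np

# Central limit theorem, empirically: S_n = X_1 + ... + X_n with X_i ~ Uniform(0, 1),
# which has mean 1/2 and variance 1/12. The standardized sum
# (S_n - n*mu) / sqrt(n*sigma^2) should be approximately N(0, 1).
rng = np.random.default_rng(42)

n, trials = 1000, 20000
sums = rng.uniform(size=(trials, n)).sum(axis=1)
z = (sums - n * 0.5) / np.sqrt(n / 12)

# Roughly mean 0, variance 1, and ~95% of mass inside ±1.96.
print("mean:", round(float(z.mean()), 3))
print("var: ", round(float(z.var()), 3))
print("P(|Z| < 1.96):", round(float(np.mean(np.abs(z) < 1.96)), 3))
```

The course proves this convergence rigorously (and characterizes its limits via infinitely divisible laws); the simulation just shows what the theorem asserts.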

## Options

**Machine Learning with Graphs** (Stanford University)

Complex data can be represented as a graph of relationships between objects. Such networks are a fundamental tool for modeling social, technological, and biological systems. This course focuses on the computational, algorithmic, and modeling challenges specific to the analysis of massive graphs. By studying the underlying graph structure and its features, students are introduced to machine learning techniques and data mining tools suited to revealing insights about a variety of networks. **Topics include:** representation learning and Graph Neural Networks; algorithms for the World Wide Web; reasoning over Knowledge Graphs; influence maximization; disease outbreak detection; and social network analysis.
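
As a taste of the "algorithms for the World Wide Web" topic, here is a hedged sketch (mine, not course code) of PageRank by power iteration on a toy four-node directed graph:

```python
import numpy as np

# PageRank by power iteration on a small directed graph.
# adjacency[i, j] = 1 means an edge (link) from node i to node j.
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

M = A / A.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
d, n = 0.85, A.shape[0]                # damping factor, number of nodes

rank = np.full(n, 1 / n)               # start from the uniform distribution
for _ in range(100):
    # "Random surfer": with prob. d follow a link, else jump uniformly.
    rank = (1 - d) / n + d * (M.T @ rank)

print(np.round(rank, 3))               # node 2 collects the most links
```

Node 2 receives links from three of the four nodes, so it ends up with the highest rank; the ranks always sum to 1 because the update preserves total probability.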

### Machine Learning for Healthcare (MIT 6.S897)

Recommended for healthcare projects, this course introduces students to machine learning in healthcare, including the nature of clinical data and the use of machine learning for risk stratification, disease progression modeling, precision medicine, diagnosis, subtype discovery, and improving clinical workflows.

### Measure and Integration (MIT 18.125)

This graduate-level course covers Lebesgue’s integration theory with applications to analysis, including an introduction to convolution and the Fourier transform.

### Introduction to Stochastic Processes (MIT 18.445)

This course is an introduction to Markov chains, random walks, martingales, and Galton-Watson trees. The course requires basic knowledge of probability theory and linear algebra, including conditional expectation and matrices.
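
A typical computation from such a course (sketched here on a toy example of my own) is the stationary distribution of a Markov chain, found two ways: exactly, as the left eigenvector of the transition matrix for eigenvalue 1, and empirically, from a long simulated sample path.

```python
import numpy as np

# Two-state Markov chain: each row of P gives the next-state probabilities.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Exact stationary distribution: solve pi = pi P, i.e. the left eigenvector
# of P with eigenvalue 1, normalized to sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.isclose(vals, 1)].ravel())
pi /= pi.sum()

# Monte Carlo estimate: long-run fraction of time in each state.
rng = np.random.default_rng(0)
state, counts = 0, np.zeros(2)
for _ in range(100_000):
    state = rng.choice(2, p=P[state])
    counts[state] += 1
freq = counts / counts.sum()

print("exact:    ", np.round(pi, 4))    # [5/6, 1/6]
print("simulated:", np.round(freq, 4))
```

Balancing the flows `0.1 * pi[0] = 0.5 * pi[1]` gives the exact answer (5/6, 1/6) by hand, which both computations reproduce.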

### Advanced Stochastic Processes (MIT 15.070J)

This class covers the analysis and modeling of stochastic processes. Topics include measure-theoretic probability; martingales, filtration, and stopping theorems; elements of large deviations theory; Brownian motion and reflected Brownian motion; stochastic integration and Itô calculus; and functional limit theorems. In addition, the class will go over some applications to finance theory, insurance, and queueing and inventory models.
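
Brownian motion, the central object above, can be simulated as a limit of scaled random walks (a sketch of my own, not course material): partition [0, T] into small steps and accumulate independent Gaussian increments of variance dt.

```python
import numpy as np

# Brownian motion via Gaussian increments: W has independent N(0, dt)
# increments, so Var(W_T) = T and E[W_T] = 0 (martingale property).
rng = np.random.default_rng(7)

T, steps, paths = 1.0, 500, 20000
dt = T / steps
increments = rng.normal(0.0, np.sqrt(dt), size=(paths, steps))
W = increments.cumsum(axis=1)          # W[:, k] approximates W at time (k+1)*dt

print("Var(W_T):", round(float(W[:, -1].var()), 3))   # should be near T = 1
print("E[W_T]:  ", round(float(W[:, -1].mean()), 3))  # should be near 0
```

The functional limit theorems covered in the class (Donsker-type results) make this discretization rigorous: the piecewise paths converge in distribution to Brownian motion as the step size shrinks.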