
Commit ce42635

update
1 parent a091280 commit ce42635

File tree

2 files changed: +127 -8 lines changed


LICENSE

+1 -6
@@ -1,6 +1 @@
- Copyright Zheng Zhao and all lecturers (add their names here) - All rights reserved
-
- The course materials are proprietary, however, the authors grant you the right to freely download and use the materials.
-
- The codes in folder `./***` are distributed under the MIT license.
-
+ Copyright Zheng Zhao and all the lecturers (add their names here) - All rights reserved

README.md

+126 -2
@@ -1,8 +1,132 @@
# A computational introduction to stochastic differential equations

This course aims to develop a computational view of stochastic differential equations (SDEs) for students who have an applied or engineering background, e.g., machine learning, signal processing, electrical engineering, control, and statistics.

# Prerequisites

1. Linear algebra
2. Real analysis (not essential)
3. Probability theory
4. Ordinary differential equations

# How to register

Please fill in the Google form https://forms.gle/mC7tLBUnPdEUL4XeA to register for the course.

# Essential lectures (6 credits)
1. **Introduction to the course**. <br>
17 Oct, 2022. Room 4005 Ångström.

2. **Stochastic differential/integral equations**. <br>
21 Oct, 2022. Room 101132 Ångström.

3. **Numerical solution of stochastic differential equations**. <br>
24 Oct, 2022. Room 101127 Ångström. (A minimal simulation sketch appears after this schedule.)

4. **Statistical properties of SDE solutions**. <br>
28 Oct, 2022. Room 101142 Ångström.

5. **Linear SDEs and Gaussian processes**. <br>
31 Oct, 2022. Room 101146 Ångström.

6. **Exercise 1**. <br>
2 Nov, 2022. Room 101127 Ångström.

7. **Filtering and smoothing problems I (i.e., regression with an SDE prior)**. <br>
4 Nov, 2022. Room 101127 Ångström.

8. **Filtering and smoothing problems II (i.e., regression with an SDE prior)**. <br>
7 Nov, 2022. Room 101127 Ångström. (A minimal filtering sketch appears after this schedule.)

9. **Exercise 2**. <br>
9 Nov, 2022. Room 101127 Ångström.

10. **SDE system identification**. <br>
11 Nov, 2022. Room 101142 Ångström. <br>
Lecturer: [Mohamed Abdalmoaty](https://people.kth.se/~abda/)

11. **Exercise 3**. <br>
18 Nov, 2022. Room 101127 Ångström.

12. **Student project work presentation**. <br>
16 Dec, 2022. Room 101142 Ångström.

Time is 13:15 - 17:00 for all the lectures.
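
To give a concrete picture of what lecture 3 covers, below is a minimal, illustrative sketch (not course material) of the Euler–Maruyama scheme applied to a hypothetical Ornstein–Uhlenbeck model dX_t = -theta X_t dt + sigma dW_t. The drift, dispersion, and all numerical values are arbitrary choices.

```python
# Euler-Maruyama simulation of a hypothetical Ornstein-Uhlenbeck SDE
#   dX_t = -theta * X_t dt + sigma * dW_t,  X_0 = x0.
# Illustrative only; parameter values are arbitrary.
import numpy as np

def euler_maruyama(drift, dispersion, x0, dt, num_steps, rng):
    """Simulate one trajectory X_0, X_1, ..., X_{num_steps} on a uniform grid."""
    xs = np.empty(num_steps + 1)
    xs[0] = x0
    for k in range(num_steps):
        dw = rng.normal(scale=np.sqrt(dt))  # Wiener increment ~ N(0, dt)
        xs[k + 1] = xs[k] + drift(xs[k]) * dt + dispersion(xs[k]) * dw
    return xs

theta, sigma = 1.0, 0.5
rng = np.random.default_rng(0)
path = euler_maruyama(lambda x: -theta * x, lambda x: sigma,
                      x0=1.0, dt=1e-2, num_steps=1000, rng=rng)
print(path[-1])
```
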
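
Related to lectures 5, 7, and 8, the following is a minimal, illustrative sketch (not course material) of regression with an SDE prior: an Ornstein–Uhlenbeck prior is discretized exactly, and noisy observations of a hypothetical signal are assimilated with a Kalman filter. The signal, noise level, and parameter values are made up for illustration.

```python
# Kalman filtering with an Ornstein-Uhlenbeck (linear SDE) prior:
#   dX_t = -lam * X_t dt + sigma * dW_t,   y_k = X_{t_k} + r_k,  r_k ~ N(0, R).
# Illustrative sketch only; the signal and all values are made up.
import numpy as np

lam, sigma, R, dt = 1.0, 1.0, 0.1, 0.1
a = np.exp(-lam * dt)                         # exact discrete transition
q = sigma ** 2 / (2 * lam) * (1 - a ** 2)     # exact discrete process noise

rng = np.random.default_rng(1)
ts = np.arange(0.0, 5.0, dt)
truth = np.sin(ts)                            # hypothetical signal to be recovered
ys = truth + rng.normal(scale=np.sqrt(R), size=ts.size)

m, p = 0.0, sigma ** 2 / (2 * lam)            # stationary prior mean and variance
means = []
for y in ys:
    m, p = a * m, a * a * p + q               # predict through the SDE prior
    s = p + R                                 # innovation variance
    k = p / s                                 # Kalman gain
    m, p = m + k * (y - m), (1 - k) * p       # update with the measurement
    means.append(m)

print(np.mean((np.array(means) - truth) ** 2))  # posterior-mean fit (MSE)
```

The exactly discretized Ornstein–Uhlenbeck prior is the same object as a Gaussian process with an exponential (Matérn-1/2) covariance, which is the link between the linear-SDE/GP lecture and the filtering lectures.
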
# Seminar lectures (9 credits)

By attending the seminar lectures (not necessarily all of them) and completing their writing assignments/exercises, you can upgrade to 9 credits.

1. **Continuous-time filtering**. <br>
This lecture is concerned with continuous-time filtering, for example, the Zakai equation, the Kushner equation, and projection filtering. <br>
Date: 14 Nov, 2022. Room 101127 Ångström <br>
Lecturer: [Muhammad Fuady Emzir](https://scholar.google.com/citations?user=nfBRAHAAAAAJ&hl=en) (KFUPM)

2. **SDEs and Markov chain Monte Carlo**. <br>
In this lecture, we present a general recipe for constructing Markov chain Monte Carlo (MCMC) samplers, including stochastic gradient (SG) versions, from stochastic continuous dynamics (SDEs). We also explore the connections between SG-MCMC and stochastic optimization methods via simple annealing techniques. (A minimal sampler sketch appears at the end of this section.) Recommended readings: 1) A Complete Recipe for Stochastic Gradient MCMC. 2) Bridging the gap between stochastic gradient MCMC and stochastic optimization. <br>
Date: 21 Nov, 2022. Room 101127 Ångström <br>
Lecturer: [Cagatay Yildiz](https://cagatayyildiz.github.io/) (University of Tübingen)

3. **Probabilistic numerics for ordinary differential equations**. <br>
Probabilistic numerical methods aim to explicitly represent the numerical uncertainty that results from limited computational resources. In this lecture, we present a class of probabilistic numerical solvers for ODEs which pose the numerical solution of an ODE as a Gauss–Markov regression problem. The resulting methods, called "ODE filters", efficiently compute posterior distributions over ODE solutions with methods from Bayesian filtering and smoothing. (A minimal solver sketch appears at the end of this section.) <br>
Date: 25 Nov, 2022. Room 101127 Ångström <br>
Lecturer: [Nathanael Bosch](https://nathanaelbosch.github.io/) (University of Tübingen)

4. **Applications of SDEs in statistical signal processing**. <br>
In this lecture, we present a few applications that use SDEs to solve signal processing problems. These include, for example, using SDEs to construct non-stationary processes to model complicated signals, and using SDEs to estimate the spectrogram of signals with uncertainty. <br>
Date: 28 Nov, 2022. Room 101127 Ångström

5. TBD. <br>
2 Dec, 2022. Room 101127 Ångström. You are very welcome to contact me if you would like to give a guest lecture.

6. **Constructions of Wiener processes and stochastic integrals**. <br>
This lecture explains the constructions of Brownian motion and Itô integrals. (A minimal construction sketch appears at the end of this section.) <br>
Date: 5 Dec, 2022. Room 101127 Ångström

Time is 13:15 - 17:00 for all the lectures.

Note that the dates for the seminar lectures are not fixed. They are subject to change depending on the schedule of the lecturers.
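
As a concrete instance of the recipe discussed in seminar lecture 2, below is a minimal, illustrative sketch (not lecture material) of the unadjusted Langevin algorithm: the Euler–Maruyama scheme applied to the Langevin diffusion whose drift is the gradient of the log target density, which samples approximately from that target. The Gaussian target, step size, and iteration counts are arbitrary.

```python
# Unadjusted Langevin algorithm (ULA): Euler-Maruyama discretization of
#   d theta_t = grad log pi(theta_t) dt + sqrt(2) dW_t,
# whose stationary distribution is the target pi. Sketch only; the target,
# step size, and iteration counts are arbitrary choices for illustration.
import numpy as np

target_cov = np.array([[1.0, 0.8], [0.8, 1.0]])   # hypothetical Gaussian target
precision = np.linalg.inv(target_cov)

def grad_log_pi(theta):
    """Gradient of log N(0, target_cov) at theta."""
    return -precision @ theta

rng = np.random.default_rng(2)
step, num_iters, burn_in = 1e-2, 20_000, 2_000
theta = np.zeros(2)
samples = []
for i in range(num_iters):
    noise = rng.normal(size=2)
    theta = theta + step * grad_log_pi(theta) + np.sqrt(2 * step) * noise
    if i >= burn_in:
        samples.append(theta.copy())

print(np.cov(np.array(samples).T))   # should be close to target_cov (up to discretization bias)
```
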
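
As a rough illustration of the ODE filters discussed in seminar lecture 3, here is a minimal sketch (not lecture material) of a zeroth-order ODE filter: the solution of a hypothetical scalar ODE x'(t) = f(x(t)) is given a once-integrated Wiener process prior over the state [x, x'], and at each grid point the residual f(x) - x' is assimilated as a zero-noise observation with a Kalman update. The test ODE and all tuning values are made up.

```python
# A zeroth-order ODE filter (EKF0) sketch for x'(t) = f(x(t)), x(0) = x0,
# with a once-integrated Wiener process prior on the state s = [x, x'].
# Illustrative only; the test ODE and all tuning values are made up.
import numpy as np

def logistic_rhs(x):
    return x * (1.0 - x)                      # hypothetical test ODE

h, num_steps, sigma2 = 0.05, 100, 1.0
A = np.array([[1.0, h], [0.0, 1.0]])          # prior transition over one step h
Q = sigma2 * np.array([[h ** 3 / 3, h ** 2 / 2],
                       [h ** 2 / 2, h]])      # prior process noise

x0 = 0.1
m = np.array([x0, logistic_rhs(x0)])          # initial mean [x(0), x'(0)]
P = np.diag([0.0, 1e-6])                      # (nearly) certain initial state
means = [m[0]]
for _ in range(num_steps):
    # Predict through the integrated Wiener process prior.
    m = A @ m
    P = A @ P @ A.T + Q
    # Zero-noise "observation": the predicted derivative should equal f(x).
    residual = logistic_rhs(m[0]) - m[1]
    S = P[1, 1]                               # innovation variance
    K = P[:, 1] / S                           # Kalman gain for H = [0, 1]
    m = m + K * residual
    P = P - np.outer(K, K) * S
    means.append(m[0])

print(means[-1])   # approximate solution x(num_steps * h)
```
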
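
As a small illustration of seminar lecture 6, the sketch below simulates a Brownian motion on a grid from independent Gaussian increments and approximates the Itô integral of B_t with respect to B_t over [0, T] by its left-endpoint Riemann sums, comparing against the exact value (B_T^2 - T)/2. This is not lecture material, and the grid size is an arbitrary choice.

```python
# Brownian motion from independent Gaussian increments, and a left-point
# (Ito) approximation of the stochastic integral  int_0^T B_t dB_t,
# whose exact value is (B_T**2 - T) / 2.  Sketch only; grid size is arbitrary.
import numpy as np

T, n = 1.0, 100_000
dt = T / n
rng = np.random.default_rng(3)

increments = rng.normal(scale=np.sqrt(dt), size=n)   # B_{t_{k+1}} - B_{t_k}
B = np.concatenate(([0.0], np.cumsum(increments)))   # B_0 = 0, then running sums

ito_sum = np.sum(B[:-1] * increments)    # sum_k B_{t_k} (B_{t_{k+1}} - B_{t_k})
exact = 0.5 * (B[-1] ** 2 - T)

print(ito_sum, exact)   # the two should be close for a fine grid
```

Replacing the left endpoint B[:-1] with the midpoint average (B[:-1] + B[1:]) / 2 gives the Stratonovich sum instead, which converges to B_T^2 / 2; this illustrates why the choice of evaluation point matters for stochastic integrals.
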
# Course arrangement

The course consists of lectures, exercises, and project work. Specifically, each week there will be one or two lectures (45 + 45 mins) and an exercise session (60 mins). The students present and discuss their exercise solutions during the exercise session.

The total credit is 6 or 9.

In order to get 6 credits, you need to

- actively participate in all the essential lectures,
- pass the three exercise assignments,
- present the project work. Depending on the number of students, you may do the project work in groups.

If you would like to get 9 credits, you need to fulfill the requirements for the 6 credits as stated above, and in addition,

- actively participate in all the seminar lectures,
- select five of the seminar lectures and pass their exercises, or do a writing assignment if a selected lecture has no exercise (the writing assignments will be defined later).

The course grade is pass/fail.

# Project work

To be added.

# Reading materials

This course is mainly based on the following textbooks.

- Hui-Hsiung Kuo. Introduction to stochastic integration. Universitext. Springer, 2006.
- Simo Särkkä and Arno Solin. Applied stochastic differential equations. Cambridge University Press, 2019.
- Ioannis Karatzas and Steven E. Shreve. Brownian motion and stochastic calculus. Springer, 2nd edition, 1991.

# Course history

- Oct - Dec, 2022, Uppsala Universitet, FTN0332 TN22H006.

# Contact

Zheng Zhao, Uppsala University.

https://zz.zabemon.com
