SciML Seminar

Scientific Machine Learning Seminar

The main purpose of this seminar is to build a community of SciML researchers at Northwestern University and provide a venue where students can share their work and informal ideas. We hope the short and fun talks will spark new discussions, encourage collaboration, and create opportunities for feedback. Beyond presenting, the seminar also serves as a space to learn new approaches, exchange insights, and discover ongoing projects, helping everyone broaden their perspectives in scientific machine learning.

2025 Fall

In Fall 2025, the schedule of the Scientific Machine Learning seminar is TBD. Specific information about the format of each talk will be provided in the email announcements and posted below. If you're interested in attending the seminar, please contact Yiping Lu.

Topics of Interest

Diffusion Models
1. Basics of Diffusion Models
2. Discrete Diffusion Models
3. Posterior Sampling with Diffusion Models
Optimizers
1. Quasi-Newton Methods, Gauss-Newton Methods
2. Adam and SOAP
3. SignGD and Muon
Operator Learning
1. Basics of Operator Learning
2. Mathematics of Operator Learning

Syllabus

Time and Location: Friday 3pm at M416/M434
2025 Fall SciML Seminar

Date: 10/3/2025
Speaker: Yiping Lu
Title: Probabilistic Foundations of Diffusion Models and Monte Carlo PDE Solvers: From Particle Systems to Macroscopic Dynamics
We study large systems of interacting stochastic particles undergoing motion, birth, and death, and explore the probabilistic mechanisms connecting microscopic randomness to macroscopic deterministic behavior. Each particle follows a stochastic differential equation with position-dependent drift and diffusion and experiences independent birth-death events modeled by exponential clocks. The state of the system is described through its empirical measure, for which we establish a martingale decomposition of test function observables. As the number of particles tends to infinity, the empirical measure converges in probability to a deterministic limit, recovering classical advection-diffusion-reaction PDEs as a law of large numbers. This probabilistic framework forms the foundation for modern diffusion models and Monte Carlo PDE solvers. We further illustrate how forward and backward stochastic differential equations can be used to simulate, denoise, and reconstruct distributions, and discuss Feynman–Kac representations and particle methods, emphasizing their convergence and stochastic underpinnings.
Reference:
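
To make the particle-to-PDE connection concrete, here is a minimal NumPy sketch. It is not the speaker's code: the drift b, diffusion sigma, and death rate c are illustrative assumptions, and birth events are omitted for brevity. Particles follow an Euler–Maruyama discretization of the SDE and die on exponential clocks; the sub-probability empirical measure they define approximates, as N grows, the solution of an advection-diffusion-reaction PDE.

```python
# Minimal particle-system sketch (illustrative assumptions throughout).
# N particles follow dX = b(X) dt + sigma dW and die at rate c(X);
# their empirical measure approximates the advection-diffusion-reaction
# PDE  u_t = -(b u)_x + (sigma^2 / 2) u_xx - c u  as N -> infinity.
import numpy as np

rng = np.random.default_rng(0)

def b(x):   # drift toward the origin (assumed form)
    return -x

def c(x):   # position-dependent death rate (assumed form)
    return 0.5 * x**2

sigma, dt, T, N = 1.0, 1e-3, 1.0, 100_000
X = rng.standard_normal(N)            # initial positions ~ N(0, 1)
alive = np.ones(N, dtype=bool)

for _ in range(int(T / dt)):
    x = X[alive]
    # Euler-Maruyama step for the SDE dX = b(X) dt + sigma dW
    X[alive] = x + b(x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.size)
    # Exponential clocks: each particle dies with probability c(x) dt + o(dt)
    dead = rng.random(x.size) < c(x) * dt
    idx = np.flatnonzero(alive)
    alive[idx[dead]] = False

# Empirical (sub-probability) measure: a 1/N-weighted histogram; the lost
# mass corresponds to the killing term -c u in the limiting PDE.
hist, edges = np.histogram(X[alive], bins=50, range=(-4.0, 4.0))
density = hist / (N * np.diff(edges))
print(f"surviving mass: {alive.mean():.3f}, peak density: {density.max():.3f}")
```

Adding birth events (particles spawning copies at some rate) or reweighting surviving particles instead of killing them leads to the Feynman–Kac representations and particle methods mentioned in the abstract.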
Date: TBD
Speaker: Yiping Lu
Title: Recent Advances in Optimization: From Quasi-Newton to Geometry-Aware Algorithms
Optimization lies at the heart of modern machine learning and numerical computation. In this seminar, we provide a comprehensive overview of recent developments in optimization algorithms. We begin with classical and quasi-Newton methods, including BFGS, L-BFGS, and K-FAC, highlighting their efficiency in approximating second-order information. Next, we explore adaptive algorithms inspired by Adagrad, such as Adam, Shampoo, and SOAP, which leverage per-parameter scaling to accelerate convergence. Finally, we discuss geometry-aware methods, including SignGD and Muon, which exploit alternative geometrical structures to improve optimization performance. The seminar aims to give participants both a conceptual understanding and practical insights into these cutting-edge algorithms, bridging theoretical ideas with real-world applications.
Reference:
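
As a companion to this abstract, the sketch below contrasts a per-parameter adaptive update (Adam) with a sign-based, geometry-aware update (SignGD) on an ill-conditioned quadratic. The toy objective, step sizes, and loop length are illustrative assumptions, not material from the talk; Muon's additional Newton–Schulz orthogonalization of matrix-shaped updates is noted in a comment but omitted.

```python
# Toy comparison of Adam vs. SignGD on f(x) = 0.5 * x^T A x (illustrative).
# Adam rescales each coordinate by a second-moment estimate; SignGD keeps
# only the gradient's sign (a steepest-descent step in the l-infinity
# geometry). Muon goes further for matrix parameters, orthogonalizing the
# update via a Newton-Schulz iteration (omitted here).
import numpy as np

A = np.diag([1.0, 100.0])                 # badly conditioned quadratic
grad = lambda x: A @ x

def adam_step(x, m, v, t, lr=1e-2, b1=0.9, b2=0.999, eps=1e-8):
    g = grad(x)
    m = b1 * m + (1 - b1) * g             # first-moment EMA
    v = b2 * v + (1 - b2) * g**2          # second-moment EMA
    m_hat = m / (1 - b1**t)               # bias corrections
    v_hat = v / (1 - b2**t)
    return x - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

def signgd_step(x, lr=1e-2):
    return x - lr * np.sign(grad(x))      # ignores gradient magnitude

x_adam = np.array([1.0, 1.0])
x_sign = x_adam.copy()
m = np.zeros(2)
v = np.zeros(2)
for t in range(1, 201):
    x_adam, m, v = adam_step(x_adam, m, v, t)
    x_sign = signgd_step(x_sign)
print("Adam  :", x_adam)                  # both head toward the optimum x* = 0
print("SignGD:", x_sign)
```

Despite the 100x conditioning gap, both methods move the two coordinates at comparable speeds, which illustrates the per-parameter scaling intuition behind the Adagrad family described in the abstract.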
Date: TBD
Title: Discrete Diffusion Models
Date: TBD
Title: Prediction-Powered Inference