Recent years have seen considerable advances in generative models, which learn distributions from data and generate new data instances from the learned distribution, and in dynamical models, which capture systems with a dynamical or temporal component. Both developments have leveraged advances in deep learning. The course covers key advances in generative and dynamical models, including variational auto-encoders, normalizing flows, generative adversarial networks, neural differential equations, and physics-guided machine learning, among other topics.
The course website is at https://arindam.cs.illinois.edu/courses/f21cs598/
Posts
DS1 Physics Informed Neural Networks
DS1 Discovering Equations
NODE2 Augmented Neural ODE
NODE1 Neural ODE
GAN5 Understanding the Connection Between Privacy and Generalization in Generative Adversarial Networks
GAN5 Generalization and Equilibrium in Generative Adversarial Nets (GANs)
GAN4 Toward a Better Global Loss Landscape of GANs
GAN4 Which Training Methods for GANs do actually Converge? (ICML 2018)
GAN3 Wasserstein GANs
GAN3 Towards Principled Methods for Training Generative Adversarial Networks
GAN2 Cycle-GAN
GAN2 Understanding Deep Convolutional Generative Adversarial Networks
GAN1 f-GAN: Training Generative Neural Samplers using Variational Divergence Minimization
NF4 Discrete Flows - Invertible Generative Models of Discrete Data
VAE4 Can VAE learn concepts from data unsupervised?