I will introduce In-Context Operator Networks (ICON), a framework in which a single neural network learns solution operators for differential equations directly from a few prompted input-output examples at inference time, without any weight updates. ICON acts as a few-shot learner across forward and inverse problems for ODEs, PDEs, and mean-field control. I will then present a probabilistic interpretation: under a random differential equation data model, ICON implicitly computes the posterior predictive mean given the context, linking operator learning to Bayesian inference. This motivates GenICON, a generative variant that samples from the posterior predictive for principled uncertainty quantification, yielding a unified Bayesian view of in-context operator learning.
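The Bayesian view above can be illustrated with a toy experiment (not the speakers' code; the Gaussian data model and all variable names below are assumptions for illustration): when tasks are drawn from a prior and a predictor is trained to minimize squared error given an in-context prompt, the optimal predictor is the posterior mean, so the fitted model recovers the Bayesian shrinkage factor.

```python
# Toy sketch of "least-squares training given context = posterior mean".
# Assumed model: per-task latent theta ~ N(0, 1); context y_i ~ N(theta, 1).
import numpy as np

rng = np.random.default_rng(0)
M, n = 200_000, 5                             # simulated tasks, context length

theta = rng.normal(size=M)                    # latent parameter per task
y = theta[:, None] + rng.normal(size=(M, n))  # in-context input-output examples
ybar = y.mean(axis=1)                         # summary statistic of the context

# Least-squares fit of theta on ybar: the minimizer of the training loss.
a_hat = (ybar @ theta) / (ybar @ ybar)

# For this conjugate model, E[theta | y] = (n / (n + 1)) * ybar, so the
# fitted coefficient should approach the posterior-mean shrinkage n/(n+1).
print(a_hat, n / (n + 1))
```

With enough simulated tasks, `a_hat` matches the Bayes coefficient n/(n+1) ≈ 0.833, mirroring how a network trained on prompted examples from a random differential equation data model implicitly performs posterior inference.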
Siting Liu is an Assistant Professor in the Department of Mathematics at the University of California, Riverside. She received her PhD in Mathematics from UCLA, advised by Professor Stanley J. Osher. Her research interests center on mathematical modeling and computational techniques, spanning optimization, data science, machine learning, mean-field games, optimal control, inverse problems, and related areas.