Uncertainty Quantification

Uncertainty quantification is a critical tool for building trustworthy AI. We consider the following key question:
Can we rigorously learn and quantify the uncertainty of AI models (e.g., Large Language Models (LLMs), price predictors, or drones) under distribution shift and adversarial manipulation?
Ongoing and Potential Projects
- Selective Prediction: improve methods that allow a model to abstain from predicting when its confidence is low, trading coverage for accuracy.
- Conformal Prediction: improve methods that output prediction sets with distribution-free coverage guarantees.
- Calibration: improve methods that align a model's reported confidence with its empirical accuracy.
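As a minimal sketch of the selective prediction idea above: a classifier predicts only when its top softmax confidence clears a threshold and otherwise abstains. The threshold value and the toy probabilities are illustrative assumptions, not part of any specific method from this page.

```python
import numpy as np

def selective_predict(probs, threshold=0.8):
    """Predict the argmax class, or abstain (return -1) when the
    model's top confidence falls below `threshold`."""
    probs = np.asarray(probs)
    confidence = probs.max(axis=-1)
    preds = probs.argmax(axis=-1)
    return np.where(confidence >= threshold, preds, -1)

# Toy batch of class probabilities for 3 examples.
probs = np.array([
    [0.95, 0.03, 0.02],   # confident -> predict class 0
    [0.40, 0.35, 0.25],   # uncertain -> abstain
    [0.10, 0.85, 0.05],   # confident -> predict class 1
])
print(selective_predict(probs))  # [ 0 -1  1]
```

Raising the threshold lowers coverage (more abstentions) but typically raises accuracy on the examples the model does answer.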
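The conformal prediction bullet can be illustrated with split conformal classification: compute nonconformity scores on a held-out calibration set, take an adjusted quantile, and return every class whose score is within that quantile. The nonconformity score 1 - p(true class) and the tiny hand-made calibration set are assumptions for this sketch, not a prescribed design.

```python
import numpy as np

def conformal_quantile(cal_probs, cal_labels, alpha=0.1):
    """Split conformal: score s_i = 1 - p(true class), then take the
    ceil((n+1)(1-alpha))/n adjusted empirical quantile of the scores."""
    n = len(cal_labels)
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, level, method="higher")

def prediction_set(test_probs, qhat):
    """All classes whose nonconformity 1 - p(y) is within the quantile."""
    return np.where(1.0 - test_probs <= qhat)[0]

# Tiny hand-made calibration set: 4 examples, 3 classes.
cal_probs = np.array([
    [0.9, 0.05, 0.05],
    [0.1, 0.8, 0.1],
    [0.2, 0.2, 0.6],
    [0.7, 0.2, 0.1],
])
cal_labels = np.array([0, 1, 2, 0])

qhat = conformal_quantile(cal_probs, cal_labels, alpha=0.1)
print(prediction_set(np.array([0.7, 0.2, 0.1]), qhat))  # [0]
```

Under exchangeability of calibration and test data, sets built this way contain the true label with probability at least 1 - alpha; uncertain inputs simply yield larger sets.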
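For the calibration bullet, a standard way to measure miscalibration is the expected calibration error (ECE): bin predictions by confidence and average the gap between each bin's accuracy and its mean confidence. The bin count and toy values below are illustrative assumptions.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: bin predictions by confidence and sum the per-bin
    |accuracy - mean confidence| gaps, weighted by bin frequency."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap
    return ece

# Toy model: 95%-confident predictions are always right,
# 55%-confident ones are right half the time.
conf = [0.95, 0.95, 0.55, 0.55]
corr = [1, 1, 1, 0]
print(expected_calibration_error(conf, corr))
```

A perfectly calibrated model has ECE near zero; here each bin is off by 0.05, so the weighted gap works out to 0.05 overall.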
Keywords
- Uncertainty Quantification
- Calibration
- Conformal Prediction
- Learning Theory
- Distribution Shift