PyData Global 2025

Decisions Under Uncertainty: A Hands‑On Guide to Bayesian Decision Theory
2025-12-10, Analytics, Visualization & Decision Science

We often must make decisions under uncertainty—should you carry an umbrella if there's a 30% chance of rain? Bayesian decision theory provides a principled, probabilistic framework to answer such questions by combining beliefs (probabilities), utilities (what matters to us), and actions to maximize expected gain.

This talk:
- Introduces key decision‑theoretic concepts in intuitive terms.
- Uses a toy umbrella example to ground ideas in relatable context.
- Demonstrates applications in Bayesian optimization (PoI/EI) and Bayesian experimental design.
- Is hands‑on—with Python code and practical tools—so participants leave ready to apply these ideas to real‑world problems.


This talk bridges everyday decision-making (the umbrella example) with advanced techniques like Bayesian optimization and experimental design, and equips attendees with conceptual clarity and code they can adapt immediately to their data-driven workflows.

Audience

Primarily data scientists, ML practitioners, and statisticians who:

  • Have applied Bayesian models but want a broader decision-theory perspective.
  • Want actionable insight into uncertainty-aware decision frameworks.
  • Seek practical demos in Python.

Outline

Motivation & Core Concepts (5 min)

  • Frame real-world decision problems: rain or shine, clinical trials, A/B testing.
  • Introduce Bayesian decision theory: beliefs × utilities → action via expected utility maximization (written out just below).
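
In standard notation, the decision rule the whole talk revolves around is expected utility maximization under the posterior (generic notation, not tied to any one example):

    a^\ast = \arg\max_{a \in \mathcal{A}} \; \mathbb{E}_{\theta \sim p(\theta \mid \mathrm{data})}\left[\, U(a, \theta) \,\right]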

Toy Example: Should I Bring an Umbrella? (8 min)

  • Define: probability p of rain; utility/loss matrix:

        Action         Rain            No Rain
        Umbrella       -1 (weight)     -1 (inconvenience)
        No Umbrella    -10 (soaked)     0
  • Derive the expected utilities:

        EU_umbrella    = -1
        EU_no_umbrella = -10p

    so bring the umbrella whenever p > 0.1.

  • Interactive Python demo: explore how p and the utility values shift the decision point (a minimal sketch follows below).
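
A minimal sketch of that demo (the rain probability of 0.3 is just an illustrative forecast; the utilities are the ones from the matrix above):

    # Expected-utility comparison for the umbrella decision.
    utilities = {
        ("umbrella", "rain"): -1,        # carrying the weight
        ("umbrella", "no rain"): -1,     # inconvenience
        ("no umbrella", "rain"): -10,    # getting soaked
        ("no umbrella", "no rain"): 0,
    }

    def expected_utility(action, p_rain):
        return (p_rain * utilities[(action, "rain")]
                + (1 - p_rain) * utilities[(action, "no rain")])

    p_rain = 0.3  # the forecast; the live demo lets the audience vary this
    actions = ("umbrella", "no umbrella")
    best = max(actions, key=lambda a: expected_utility(a, p_rain))
    print({a: expected_utility(a, p_rain) for a in actions}, "->", best)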

Bayesian Optimization: PoI & EI (8 min)

  • Introduce Gaussian-process-based optimization and the need to trade off exploration vs. exploitation.
  • Define Probability of Improvement (PoI) and Expected Improvement (EI) as acquisition functions.
  • Show how they're derived from decision theory: choosing the next point to maximize expected gain.
  • Python demo using GPyTorch: fit a GP, compute the PoI/EI acquisition functions, and visualize the decision boundary—why one chooses a high-uncertainty point vs. one near known good values (a minimal sketch follows below).
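
A minimal sketch of that demo, assuming GPyTorch as named above; the toy objective, the candidate grid, and the exploration margin xi are illustrative choices rather than anything prescribed by the talk:

    import torch
    import gpytorch

    torch.manual_seed(0)

    # Toy 1-D objective and a handful of noisy observations.
    def f(x):
        return torch.sin(3 * x) + 0.5 * x

    train_x = torch.rand(6)
    train_y = f(train_x) + 0.05 * torch.randn(6)

    class ExactGP(gpytorch.models.ExactGP):
        def __init__(self, x, y, likelihood):
            super().__init__(x, y, likelihood)
            self.mean_module = gpytorch.means.ConstantMean()
            self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

        def forward(self, x):
            return gpytorch.distributions.MultivariateNormal(
                self.mean_module(x), self.covar_module(x)
            )

    likelihood = gpytorch.likelihoods.GaussianLikelihood()
    model = ExactGP(train_x, train_y, likelihood)

    # Fit kernel and noise hyperparameters by maximizing the marginal likelihood.
    model.train(); likelihood.train()
    opt = torch.optim.Adam(model.parameters(), lr=0.1)
    mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
    for _ in range(100):
        opt.zero_grad()
        loss = -mll(model(train_x), train_y)
        loss.backward()
        opt.step()

    # Posterior mean / stddev on a dense grid of candidate points.
    model.eval(); likelihood.eval()
    grid = torch.linspace(0, 1, 200)
    with torch.no_grad(), gpytorch.settings.fast_pred_var():
        post = likelihood(model(grid))
        mu = post.mean
        sigma = post.variance.clamp_min(1e-12).sqrt()

    # Decision-theoretic acquisition functions (maximization convention).
    best_y = train_y.max()
    xi = 0.01  # small exploration margin
    z = (mu - best_y - xi) / sigma
    std_normal = torch.distributions.Normal(0.0, 1.0)
    poi = std_normal.cdf(z)                                               # Probability of Improvement
    ei = (mu - best_y - xi) * poi + sigma * std_normal.log_prob(z).exp()  # Expected Improvement

    next_x = grid[ei.argmax()]
    print(f"Next evaluation suggested by EI: {next_x.item():.3f}")

Plotting mu, sigma, and the two acquisition curves over the grid is what makes the exploration-vs-exploitation trade-off visible: PoI tends to cluster near the incumbent best, while EI is more willing to pay for uncertainty.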

Bayesian Experimental Design (BED): Minimizing Uncertainty (8 min)

  • Motivation: cost-sensitive data collection (labeling, surveys, medical tests).
  • Define an information-based utility (e.g., expected reduction in entropy).
  • Show how decision theory prescribes choosing the next experiment to maximize this expected utility.
  • Python demo using OptBayesExpt (a simplified sketch of the underlying computation follows below).
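
A library-free sketch of the expected-information-gain utility behind that demo (it intentionally does not use the OptBayesExpt API; the binary sigmoid measurement model and the grids are illustrative assumptions):

    import numpy as np

    theta = np.linspace(-3, 3, 301)              # grid over the unknown parameter
    prior = np.full_like(theta, 1 / theta.size)  # uniform prior belief

    def lik_one(x, theta):
        """P(outcome = 1 | parameter theta, measurement setting x)."""
        return 1.0 / (1.0 + np.exp(-(x - theta)))

    def entropy(p):
        p = np.clip(p, 1e-12, None)
        return -np.sum(p * np.log(p))

    def expected_information_gain(x, prior):
        p1_given_theta = lik_one(x, theta)
        p1 = np.sum(prior * p1_given_theta)               # predictive P(outcome = 1)
        post1 = prior * p1_given_theta / p1               # posterior if outcome = 1
        post0 = prior * (1 - p1_given_theta) / (1 - p1)   # posterior if outcome = 0
        expected_post_entropy = p1 * entropy(post1) + (1 - p1) * entropy(post0)
        return entropy(prior) - expected_post_entropy     # expected entropy reduction

    candidates = np.linspace(-3, 3, 61)
    gains = np.array([expected_information_gain(x, prior) for x in candidates])
    print(f"Most informative next setting: {candidates[gains.argmax()]:.2f}")

Choosing the setting that maximizes this quantity is exactly expected utility maximization with information gain as the utility, which is the bridge back to the umbrella example.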

Summary & Takeaways (1 min)

  • Reiterate the decision-theoretic arc: belief → utility → action.
  • Emphasize the unifying framework across umbrella example, optimization, and experimental design.
  • Share resources & practical tips: GPyTorch / scikit-optimize, OptBayesExpt

Prior Knowledge Expected:

Yes