2025-11-08, Room 301A
Want to create AI agents that can do more than just generate text? Join us to explore how combining Databricks' Agent Bricks with the Model Context Protocol (MCP) unlocks powerful tool-calling capabilities. We'll show you how MCP provides a standardized way for AI agents to interact with external tools, data and APIs, solving the headache of fragmented integration approaches. Learn to build agents that can retrieve both structured and unstructured data, execute custom code and tackle real enterprise challenges.
Key takeaways:
- Implementing MCP-enabled tool-calling in your AI agents
- Prototyping in AI Playground and exporting for deployment
- Integrating Unity Catalog functions as agent tools
- Ensuring governance and security for enterprise deployments
Whether you're building customer service bots or data analysis assistants, you'll leave with practical know-how to create powerful, governed AI agents.
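The core idea behind MCP tool-calling can be sketched in plain stdlib Python: each tool advertises a name, description, and JSON-schema input spec, and the agent runtime routes every model-issued call through one uniform dispatch point. This is an illustrative pattern only, not the actual MCP SDK API; all names here are hypothetical.

```python
import json
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Tool:
    name: str
    description: str
    input_schema: dict            # JSON Schema describing the arguments
    handler: Callable[..., Any]   # the code the tool actually runs

class ToolRegistry:
    """Minimal stand-in for an MCP server's tool surface (illustrative)."""

    def __init__(self) -> None:
        self._tools: dict[str, Tool] = {}

    def register(self, tool: Tool) -> None:
        self._tools[tool.name] = tool

    def list_tools(self) -> list[dict]:
        # What a client sees when it asks for available tools:
        # metadata only, never the implementation.
        return [
            {"name": t.name, "description": t.description,
             "inputSchema": t.input_schema}
            for t in self._tools.values()
        ]

    def call(self, name: str, arguments: dict) -> str:
        # Uniform entry point: the model emits {"name": ..., "arguments": ...}
        # and the runtime dispatches it here, whatever the tool is.
        result = self._tools[name].handler(**arguments)
        return json.dumps(result)

registry = ToolRegistry()
registry.register(Tool(
    name="lookup_order",
    description="Fetch an order record by its ID.",
    input_schema={"type": "object",
                  "properties": {"order_id": {"type": "string"}},
                  "required": ["order_id"]},
    # Hypothetical handler; a real agent might back this with a
    # Unity Catalog function or an external API.
    handler=lambda order_id: {"order_id": order_id, "status": "shipped"},
))

print(registry.call("lookup_order", {"order_id": "A-123"}))
```

The single `call` entry point is what replaces per-tool glue code: adding a new capability means registering metadata plus a handler, not writing another bespoke integration.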
Denny Lee is a long-time Apache Spark™ and MLflow contributor, Unity Catalog and Delta Lake maintainer, and a Product Management Director and Principal Developer Advocate at Databricks. He is a hands-on distributed systems and data science engineer with extensive experience developing internet-scale data platforms and predictive analytics and AI systems. He previously built enterprise DW/BI and big data systems at Microsoft, including Azure Cosmos DB, Project Isotope (HDInsight), and SQL Server, and served as Senior Director of Data Sciences Engineering at SAP Concur. He holds a Master's in Biomedical Informatics from Oregon Health & Science University and has implemented data solutions for enterprise healthcare customers. His current technical focuses include AI, distributed systems, Delta Lake, Apache Spark, deep learning, machine learning, and genomics.