
Online Talk

Build and Scale Real-Time Enterprise AI Agents with Confluent and Databricks

Watch Now

Available On-Demand

Agents are only as good as the context they run on – and many fall short because of stale, siloed data and unscalable frameworks. With Confluent and Databricks, agents are event-driven and replayable by design, use fresh contextualized data to optimize decision-making and outputs, and are built on fully managed platforms with proven security, governance, and reliability. Learn how data streaming and Databricks together provide a complete foundation for agents that can learn, act, and adapt on live context at enterprise scale – making intelligent automation finally production ready.

Join AI experts Sean Falconer, Britton LaRoche, and Brenner Heintz as they walk through an agentic AI tutorial and operationalize the vision of agent-powered use cases.

Register now to learn:

  • What Streaming Agents are and why event-driven architecture (EDA) is key
  • How Confluent and Databricks unify data processing and AI workflows
  • Agentic AI use cases and reference architectures
  • How to prepare data for AI, sync data across your systems, and serve fresh context to agents
  • Demo tutorial on building and scaling event-driven agents using Databricks, Confluent, Flink, Tableflow, Claude LLM, and ML models
  • How to ensure security and governance for your multi-agent systems
  • Plus: get all your questions answered in live Q&A
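To make the event-driven pattern behind these topics concrete, here is a minimal Python sketch of an agent loop that consumes events from an append-only log, joins each event with the freshest available context, and emits a decision. The in-memory `EventLog` stands in for a Kafka topic, and all names (`EventLog`, `run_agent`, the churn-alert rule) are illustrative assumptions, not the API used in the webinar's demo; in the real architecture the log would be a Confluent topic processed by Flink.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class EventLog:
    """Hypothetical in-memory stand-in for a Kafka topic: an append-only log."""
    events: list = field(default_factory=list)

    def append(self, event: dict) -> None:
        self.events.append(event)

    def replay(self, from_offset: int = 0):
        # Replayability: any consumer can re-read the log from an offset,
        # so agent decisions can be reproduced after a logic fix.
        yield from self.events[from_offset:]

def run_agent(log: EventLog, context_store: dict,
              decide: Callable[[dict, dict], str],
              from_offset: int = 0) -> list:
    """Event-driven loop: join each event with fresh context, then decide."""
    decisions = []
    for event in log.replay(from_offset):
        context = context_store.get(event["customer_id"], {})
        decisions.append(decide(event, context))
    return decisions

# Usage: a toy "churn alert" agent.
log = EventLog()
log.append({"customer_id": "c1", "type": "support_ticket"})
log.append({"customer_id": "c2", "type": "login"})
context = {"c1": {"plan": "enterprise"}, "c2": {"plan": "free"}}

def decide(event, ctx):
    if event["type"] == "support_ticket" and ctx.get("plan") == "enterprise":
        return "escalate"
    return "ignore"

print(run_agent(log, context, decide))  # → ['escalate', 'ignore']
```

Because decisions are a pure function of the event log plus the context store, re-running `run_agent` from any offset reproduces or recomputes them, which is the replayability property the session highlights.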

Sean is Senior Director of AI Product Management at Confluent, where he drives AI strategy and thought leadership. A former academic, startup founder, and Googler, he has published work on topics ranging from AI to quantum computing. He also hosts the popular engineering podcasts Software Engineering Daily and Software Huddle.

Britton is a seasoned enterprise solutions architect and technical sales expert with 20 years of experience, including roles at Oracle, MongoDB, and Confluent. He’s a certified MongoDB Developer and DBA, known for developing AI/ML models.

Brenner Heintz is Staff Technical PMM for Apache Flink at Confluent, where he focuses on applying stream processing and real-time data transformation techniques to LLM- and AI-based applications. A practitioner of and evangelist for open table formats and streaming data pipelines, he previously worked at Fivetran and Databricks.