Video description
Stream processing is becoming something like a “grand unifying paradigm” for data processing. Outgrowing its original space of real-time data processing, stream processing is becoming a technology that offers new approaches to data processing (including batch processing), real-time applications, and even distributed transactions.
Stephan Ewen (Ververica) dives into these developments from the view of Apache Flink and presents some of the major efforts in the Flink community to build a unified stream processor for data processing and data-driven applications. Flink already powers many of the world’s most demanding stream processing applications. He explores the approach of Flink’s next-generation streaming runtime, which offers a state-of-the-art batch processing experience and performance. A new machine learning library, built on top of a unique new API, supports many algorithms that train dynamically across static and real-time data. And he examines the new building blocks stream processing offers for data-driven applications, which open a new direction for solving application consistency. You’ll see use cases from different users, showing how companies apply this broader streaming paradigm in practice.
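The core idea behind unifying batch and stream processing is that a batch is simply a bounded stream, so the same incremental logic can serve both. The sketch below is not Flink's actual API; it is a minimal plain-Python illustration of that idea, with one aggregation function applied unchanged to a bounded list (a "batch") and to a generator standing in for a live stream.

```python
# Conceptual sketch: batch processing as a special case of stream processing.
# The same stateful aggregation runs over bounded and unbounded-style input.
from typing import Dict, Iterable


def word_count(events: Iterable[str]) -> Dict[str, int]:
    """State maintained incrementally, updated one event at a time."""
    counts: Dict[str, int] = {}
    for word in events:
        counts[word] = counts.get(word, 0) + 1
    return counts


# Bounded input -- a "batch" job:
batch = ["flink", "stream", "flink"]
print(word_count(batch))  # {'flink': 2, 'stream': 1}


# Unbounded-style input -- a generator standing in for a live feed
# (truncated here so the example terminates):
def live_feed():
    for word in ["stream", "stream", "batch"]:
        yield word


print(word_count(live_feed()))  # {'stream': 2, 'batch': 1}
```

In a real stream processor the state would be checkpointed and results emitted continuously rather than returned once, but the unification principle is the same: bounded data is just a stream that happens to end.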
Prerequisite knowledge
- A basic understanding of stream processing (useful but not required)
What you'll learn
- Understand the breadth of changes that stream processing is introducing to data infrastructures and to the way we build applications, and which areas are most likely to be affected next
This session is from the 2019 O'Reilly Strata Conference in New York, NY.
Table of Contents
Stream processing beyond streaming data - Stephan Ewen (Ververica)