Jay Kreps, CEO of Confluent, discusses an enterprise integration architecture organized around an event log. Robert Blumen spoke with Jay about the N-squared problem of data integration; how LinkedIn tried and failed to solve the integration problem; the nature of events; the enterprise event schema; schema definition languages; the use of an event log in single-node and distributed databases; how the concept of an event log was generalized from a database to the enterprise; the initial development of Kafka; the evolution of Kafka over the last four years; the addition of a SQL-like interface to Kafka; operational requirements around the use of Kafka as an event log; how programs get their events into Kafka; adding new databases to the Kafka-centric architecture; applications that consume and produce internal events; and correctness and how to handle bad events.
- The Log: What every software engineer should know about real-time data’s unifying abstraction by Jay Kreps
- All Aboard the Databus! LinkedIn’s Scalable Consistent Change Data Capture Platform by multiple authors
- Apache Kafka
- SE Radio episode 162: Project Voldemort with Jay Kreps
- SE Radio episode 219: Apache Kafka with Jun Rao
- Confluent blog post Introducing ksqlDB by Jay Kreps
- I Heart Logs: Event Data, Stream Processing, and Data Integration by Jay Kreps