Building data pipelines with PostgreSQL and Kafka
Presented by:
Oskari Saarenmaa
Oskari Saarenmaa is the CEO and one of the founders of Aiven, a next-generation managed cloud services company offering the best Open Source database and messaging technologies as fully-managed cloud services to businesses around the world. Oskari has previously worked as a software architect designing secure, large-scale database systems and network security infrastructure.
Oskari has been using PostgreSQL professionally since version 6.4 in 1999.
Apache Kafka is a high-performance open-source stream processing platform for collecting and processing large numbers of messages in real-time. It's used in an increasingly large number of data pipelines to handle events such as website click streams, transactions and other telemetry in real-time and at scale.
This session focuses on connecting Kafka and PostgreSQL to automatically update a relational database with incoming Kafka events, allowing you to use PostgreSQL's powerful data aggregation and reporting features on the live data stream.
We'll demonstrate a production IoT setup that uses Kafka and PostgreSQL to monitor device states.
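As a rough illustration of the kind of pipeline the session describes (not code from the talk itself), the sketch below consumes JSON device-state events from a Kafka topic and upserts them into a PostgreSQL table. The topic name, table schema, connection settings and message format are all assumptions made for the example.

```python
# Minimal sketch: stream device-state events from Kafka into PostgreSQL.
# Topic name, table layout and message format are illustrative assumptions.
import json

import psycopg2
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "device-states",                      # hypothetical topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

conn = psycopg2.connect("dbname=iot")     # hypothetical database
conn.autocommit = True

with conn.cursor() as cur:
    # Assumed table:
    #   device_state(device_id text primary key, state jsonb, updated_at timestamptz)
    for message in consumer:
        event = message.value
        cur.execute(
            """
            INSERT INTO device_state (device_id, state, updated_at)
            VALUES (%s, %s, now())
            ON CONFLICT (device_id)
            DO UPDATE SET state = EXCLUDED.state, updated_at = now()
            """,
            (event["device_id"], json.dumps(event["state"])),
        )
```

With the data landing in a relational table like this, PostgreSQL's aggregation and reporting features can be applied directly to the live event stream, which is the pattern the session walks through.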
- Date: 2018 April 18 11:30 EDT
- Duration: 50 min
- Room: Liberty I
- Conference: PostgresConf US 2018
- Language: English
- Track: Data
- Difficulty: Medium