KEDAS: SYNC RELATIONAL AND NON-RELATIONAL DATABASES USING KAFKA
Background
In the era of digitalization, the landscape of data has evolved dramatically. Data that was once collected mainly for monitoring and after-the-fact analysis quickly became a vital asset for real-time decision-making. As volumes soared, the value of static snapshots dwindled while continuous data streams became ever more important. Within this dynamic context, Kafka emerged as a pivotal tool for data management.
What is Kafka?
Kafka sits at the heart of this data revolution as an event streaming platform. It captures data from diverse sources, then processes, stores, and delivers it to whichever downstream systems need it (see the short producer sketch after the list below). While Kafka shares similarities with traditional pub/sub message brokers like RabbitMQ, it sets itself apart in several critical ways:
- Operates as a modern distributed system
- Offers robust data storage capabilities
- Processes data streams, creating new events beyond traditional message brokering
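To give a concrete feel for the event model, here is a minimal sketch of publishing a change event to a Kafka topic with the standard Java client. The broker address, topic name, key, and payload are assumptions for illustration; KEDAS's own producers are not shown in this article.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ChangeEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address is an assumption for a local test setup.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Topic name, key, and payload are illustrative only.
            ProducerRecord<String, String> record = new ProducerRecord<>(
                "customer-changes",
                "customer-42",
                "{\"op\":\"update\",\"email\":\"jane@example.com\"}");
            producer.send(record);
        } // close() flushes any pending records
    }
}
```

Any number of consumers can subscribe to the same topic and receive this event independently, which is the property the synchronization described below builds on.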
Purpose of this Project
Recognizing Kafka's versatility and surging popularity, we embarked on a journey to harness its capabilities for delivering superior solutions to our clients.
Our mission? To dive deep into this technology and build an innovative solution leveraging Kafka's capabilities.
Project Overview
KEDAS is a data synchronization application designed to work seamlessly across various database servers, including MS SQL, MySQL, PostgreSQL, and more. To make it more versatile, we extended its capabilities to synchronize between relational and non-relational databases as well. Unlike most data synchronization tools available, our solution lets changes made in one data source be propagated to multiple data sources with minimal configuration, as the sketch below illustrates.
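The article does not describe KEDAS's internal topology, but the fan-out it mentions (one source, many targets) maps naturally onto Kafka topics. Below is a minimal Kafka Streams sketch of that idea; the application id and the topic names source-db-changes, sync-to-postgres, and sync-to-mongodb are assumptions for illustration only.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class FanOutTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "kedas-fanout-demo"); // assumed id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Change events captured from the source database land on one topic...
        KStream<String, String> changes = builder.stream("source-db-changes");
        // ...and are routed to per-target topics, one per downstream data store.
        changes.to("sync-to-postgres");
        changes.to("sync-to-mongodb");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

In a real deployment, per-target sink connectors would drain those topics into the corresponding relational or non-relational stores.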
This project stands as a testament to our commitment to embracing innovative technologies like Kafka to offer cutting-edge solutions to our clients. As data continues to be a driving force in the digital landscape, Kafka remains at the forefront of efficient and real-time data management, enabling us to deliver exceptional results.
Outcomes
We have developed a Kafka processor and a set of Kafka connectors to enable data synchronization between different data sources. To make the system testable, we have also built a simple web application that interacts with two independent databases kept in sync by KEDAS.
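KEDAS's connector internals are not shown here, but to give a feel for what a sink-side worker does, here is a hedged Java sketch that drains the assumed sync-to-postgres topic from the earlier example and upserts each change event into a target table. The broker address, consumer group, connection details, and the customers table (id text primary key, payload jsonb) are all assumptions for illustration, not KEDAS's actual implementation.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PostgresSinkWorker {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("group.id", "kedas-demo-sink");          // assumed consumer group
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        // Requires the PostgreSQL JDBC driver on the classpath; credentials are placeholders.
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection db = DriverManager.getConnection(
                 "jdbc:postgresql://localhost:5432/demo", "demo", "demo")) {
            consumer.subscribe(List.of("sync-to-postgres"));
            PreparedStatement upsert = db.prepareStatement(
                "INSERT INTO customers (id, payload) VALUES (?, ?::jsonb) " +
                "ON CONFLICT (id) DO UPDATE SET payload = EXCLUDED.payload");
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // In this sketch each change event is keyed by the row's primary key.
                    upsert.setString(1, record.key());
                    upsert.setString(2, record.value());
                    upsert.executeUpdate();
                }
            }
        }
    }
}
```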
Kafka Demo App
Check out the video below for a quick overview of how everything works together, and if you would like to try it for yourself, head over to our demo application.
Demo Video: Kedas Demo
Try it yourself: Kedas Dashboard
Want to know more about Kafka-based solutions? Talk to Creators
For more information on Kafka-driven projects by Fidenz, follow us for the latest updates!