LuxSE adopts Apache Kafka to translate data into valuable services
Messaging systems are to modern businesses what the nervous system is to the human body. They form the technological backbone of a world where organisations are increasingly automated and software-defined, and where the user of an application is often another piece of software. Pascal Juliot, Head of Shared & Transversal Solutions at the Luxembourg Stock Exchange, tells us how the adoption of a new distributed streaming platform, Apache Kafka, enables LuxSE to move forward with its transformation strategy.
"The solutions we offer to our clients are largely based on the collection, processing and exploitation of large amounts of data. We use this information to build products and services tailored to market expectations and to our clients' needs," explains Pascal Juliot. "Data is therefore at the heart of our products, and unlocking its full value lies at the centre of our strategy."
A perfect fit
To ensure the proper handling of this data, LuxSE had to find a robust, reliable and efficient solution, which is easy to use and integrate into existing business chains. This solution also had to match the group's overall strategy, which involves migrating to the cloud and transitioning its entire infrastructure to microservice architectures. "Kafka fits perfectly into our data processing and integration strategy, and is now a key part of our overall infrastructure and business strategy," says Mr Juliot.
Kafka is a distributed data streaming platform published and maintained by the Apache Software Foundation. Its main function is to centralise data flows. "The Kafka platform is a smart alternative to traditional enterprise messaging systems called ESB, or Enterprise Service Bus," Pascal Juliot points out. "It is a modern way of communicating data and messages between different components and applications."
The messaging system is based on a publish-subscribe model, where producers publish messages and consumers consume or pull that data. The platform can publish, store, process and subscribe to streams of records in real time. It is designed to manage data flows from multiple sources and deliver them to multiple users. Initially developed by LinkedIn as an internal system for the management of its hundreds of millions of daily messages, it is now an open source project supported by a large community of contributors, in particular large corporations that have substantial performance needs, including LinkedIn, Netflix, Microsoft, and Airbnb.
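The publish-subscribe flow described above can be sketched in a few lines of Python. This is an in-memory model of the pattern, not the real Kafka client API (a real client such as the kafka-python package needs a running broker); the topic name and record fields are illustrative.

```python
from collections import defaultdict


class MiniBroker:
    """In-memory stand-in for a Kafka broker: each topic is an append-only log."""

    def __init__(self):
        self.logs = defaultdict(list)  # topic name -> ordered list of records

    def publish(self, topic, record):
        """Producer side: append a record to the topic's log."""
        self.logs[topic].append(record)

    def subscribe(self, topic):
        """Consumer side: read the topic's records from the beginning."""
        return list(self.logs[topic])


broker = MiniBroker()
broker.publish("trades", {"isin": "LU0000000001", "price": 101.5})
broker.publish("trades", {"isin": "LU0000000002", "price": 99.8})

# Two independent consumers each receive every record on the topic,
# unlike a queue, where each message would go to only one of them.
quotes_app = broker.subscribe("trades")
audit_app = broker.subscribe("trades")
```

The key contrast with point-to-point queuing is visible in the last two lines: both subscribers see the full stream, because consuming a record does not remove it from the log.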
A flexible, scalable, API-based platform
"Kafka is a distributed platform designed to integrate and manage large data streams from a wide variety of sources," Pascal Juliot continues. "A set of API-based connectors allows developers to easily transform and integrate various data sets into Kafka's messaging solution. There are connectors to access traditional databases, to dig into Big Data, or to exploit any other data source. This is one of the great strengths of the product," he underlines. Data can also be provided to various users and applications, through specifically designed, standardised and easy-to-use APIs.
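To illustrate how such a connector is wired up, a Kafka Connect source connector is typically declared with a small JSON configuration posted to the Connect REST API. The fragment below sketches a JDBC source connector pulling rows from a relational database into topics; the connector class is Confluent's, while the connection URL, column and topic names are illustrative, not taken from LuxSE's setup.

```json
{
  "name": "market-data-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db-host:5432/marketdata",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "db-"
  }
}
```

With this configuration, each table the connector reads is streamed into a topic named after it with the `db-` prefix, and new rows are picked up as the `id` column grows.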
"The publish-subscribe model on which Kafka is based enables messages to be available to all recipients. Any recipient who needs to access the published data can subscribe to it and consume it on demand, without having to be known to the producer in advance. This means that if a new application or user needs to access the data exposed through a topic, the candidate consumer just has to subscribe to the topic to retrieve the information and process it. This method offers much more flexibility than conventional ESB solutions, making developments far more agile," Mr Juliot declares.
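The point about new consumers joining at any time rests on Kafka retaining records in a log and letting each consumer track its own read position, or offset. The sketch below models that, again as a minimal in-memory stand-in rather than the real client API:

```python
class TopicLog:
    """A retained, append-only log, as Kafka keeps per topic partition."""

    def __init__(self):
        self.records = []

    def append(self, record):
        """Append a record and return its offset in the log."""
        self.records.append(record)
        return len(self.records) - 1

    def read(self, offset=0):
        """Read all records from a given offset onwards."""
        return self.records[offset:]


log = TopicLog()
log.append("order-1")
log.append("order-2")

# A consumer that subscribes after the fact still sees the retained history...
late_subscriber = log.read(offset=0)

# ...while another consumer resumes from its last committed offset.
resumed_consumer = log.read(offset=1)
```

Because reading never deletes anything, a brand-new application can replay the topic from offset 0 without the producers being changed or even aware of it.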
Increasing efficiency and agility
Compared to traditional tools relying on the message queuing method, Kafka offers significant advantages, particularly in terms of speed and throughput, which allows for more efficient processing of large volumes of information.
"The platform also brings a substantial reduction in latency," adds Pascal Juliot. "This means that users can access data faster, which is essential when the information needs to be available in real time. In addition, data is partitioned and distributed in a cluster. Should throughput become a bottleneck, we can easily add an additional node - a broker - to the Kafka cluster to increase the level of performance."
"Moreover," he says, "Kafka gives data a persistent, long-lasting character as messages are replicated within the cluster. This guarantees that valuable information is not lost, which is an essential criterion for this type of system. Since the system is distributed, it offers great resilience. This is one of the reasons why Kafka was an obvious choice for the LuxSE Group."
The rich ecosystem surrounding the platform gives users great agility and allows developers to deliver services with reduced time-to-market. "Producer APIs allow you to produce messages, Consumer APIs give the possibility to consume them, while Streams APIs enable you to transform data," Pascal Juliot explains. "And Kafka Connectors - APIs that allow you to integrate data flows of different kinds - are among the most notable features," he underlines.
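The three roles named in this quote - producing, transforming, consuming - chain together naturally. The sketch below shows that shape with plain Python generators rather than the actual Producer, Streams and Consumer APIs; the trade fields and the derived "notional" value are illustrative:

```python
def produce():
    """Producer role: emit raw trade events as a stream."""
    yield {"isin": "LU0001", "price": 100.0, "qty": 5}
    yield {"isin": "LU0002", "price": 50.0, "qty": 2}


def transform(records):
    """Streams role: derive a new stream (here, notional value per trade)."""
    for r in records:
        yield {"isin": r["isin"], "notional": r["price"] * r["qty"]}


def consume(records):
    """Consumer role: materialise the transformed stream downstream."""
    return list(records)


result = consume(transform(produce()))
```

In a real deployment each stage would read from and write to Kafka topics, so stages can be scaled, restarted and replaced independently; the dataflow shape, however, is exactly this pipeline.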
Providing better services, faster
As an essential inter-application communication element, Kafka is becoming the central nervous system of LuxSE's IT infrastructure. In the future, the platform is expected to play a growing transversal role in the core applications of the LuxSE Group.
"Our developers can now better exchange and integrate data between applications. We are able to disseminate data and translate information into products and services for clients more rapidly. All this with a robust and efficient solution," Pascal Juliot concludes.
Original interview by Michael Renotte for ITOne.