VIDEO RECAP | Modernizing towards a Streaming First Data Architecture

by Caroline Maier

April 28, 2021 8:25am




Business leaders in data-intensive organizations are increasingly expected to unify fragmented data from across silos to power decision making in real time. Improve the customer experience while customers are still engaging. Drive new revenue streams with real-time data. Deliver faster, more accurate analytic insights to streamline operations. You get the picture.

But traditional change data capture (CDC) tools and legacy batch ETL processes are straining under a constantly growing volume and velocity of data. In-house, do-it-yourself implementations of powerful open-source frameworks are complex and costly to build and maintain, requiring large teams for custom code and the endless patch-and-fix work that inevitably follows.

Information technology leaders need a new approach.

Join Equalum & Gulf Consulting as we discuss how to move towards a Streaming-First Data Architecture, the continued role of batch processing within your new framework, and the immediate gains your data evolution can deliver for strategic business initiatives.

We will also walk you through a variety of design choices that data architects need to consider when creating a modern data management environment.


Erez Alsheich | CPO & Co-Founder - Equalum
Erez is a data platform professional with over 20 years of experience in hands-on, managerial, and business development roles. He also founded, led, and sold two leading Israeli data platform consulting companies.

Henk Schouten | Founder & CSO - Gulf Consulting

With three decades in IT, Henk brings a wealth of experience and a love of technology to his global client base. Drawing on expertise in 3D / VR / AR / AI / Cloud / Quantum / Open Source / Crypto / Blockchain / Asset Management and more, Gulf Consulting drives technology-forward improvements that achieve major business gains.

Navaneetha Babu C | Partner - Gulf Consulting | CTO - Big Data Services
Navaneetha has guided teams through transformations from traditional datastores and data warehouses to Hadoop, cloud, and Cassandra data lakes. He has experience setting up enterprise data ingestion layers using Confluent and Apache Kafka. A Cloudera certified trainer and consultant, he holds extensive knowledge of Hadoop, Spark, and Apache Kafka development and administration on CDP (data center and cloud, AWS and Azure), as well as Confluent, DataStax Cassandra administration and development, Apache Spark for data science, and Apache NiFi administration and development.

Over the years he has trained more than 5,500 engineers from Infosys, Vodafone, Western Digital, TCL, Groupon, Epsilon, Xebia, Fourkites, Adobe, RBS (Royal Bank of Scotland), IBM, JP Morgan, Bank of America, Facebook, Flipkart, Etisalat, and Ooreedoo.

Why You Can’t Ignore a Streaming First Data Architecture
Plus... Design Tradeoffs as You Modernize

Ready to Get Started?

Experience Enterprise-Grade Data Integration + Real-Time Streaming