Senior Data Engineer

Xenon7 Croatia
About Us
Where elite tech talent meets world-class opportunities! At Xenon7, we work with leading enterprises and innovative startups on exciting, cutting-edge projects that leverage the latest technologies across various domains of IT, including Data, Web, Infrastructure, AI, and many others. Our expertise in IT solutions development and on-demand resources allows us to partner with clients on transformative initiatives, driving innovation and business growth. Whether it's empowering global organizations or collaborating with trailblazing startups, we are committed to delivering advanced, impactful solutions that meet today's most complex challenges.

About the Client
Join one of Egypt's premier financial institutions, renowned for its extensive suite of banking services, including Institutional Banking, Personal Banking, and Islamic Banking. With a global presence through over 50 branches and correspondents, we serve a diverse and dynamic clientele. As we embark on a groundbreaking digital transformation journey, we are committed to leveraging the latest technologies to establish a state-of-the-art data architecture that will redefine our performance and service delivery.

Position Overview
We are seeking a highly motivated and experienced Senior Data Engineer to join our growing data team. In this role, you will be at the forefront of designing and developing scalable, high-performance data pipelines and lakehouse architectures. You will work closely with data modelers, analysts, and business stakeholders to deliver trusted real-time and batch-based data solutions. Your expertise will directly contribute to the backbone of our analytics and AI-driven strategies.

Key Responsibilities
• Design, implement, and optimize data pipelines using both batch and streaming processing frameworks.
• Architect and maintain data lakehouse solutions using Apache Iceberg and object storage such as S3.
• Implement scalable Data Vault and Star Schema models.
• Build and manage real-time ingestion pipelines with Kafka, Spark, or Flink.
• Integrate and orchestrate workflows using tools like Airflow, dbt, NiFi, or Airbyte.
• Enforce data governance, data quality, and access control policies.
• Troubleshoot pipeline performance and reliability issues.