Data Engineer

iKnowHow S.A. Kentriki Makedonia, Greece

iKnowHow Group is a leading Software & Robotics Solutions group of companies, operating internationally for over 24 years, with 300+ professionals delivering innovative technology solutions across the Energy, Telecommunications, Banking & Financial Services, and Public Sector industries. The group is structured into specialized subsidiaries, each focused on distinct technology domains and market verticals.

We are now looking for a mid-level Data Engineer to work on new, challenging outsourced projects. You will design and develop scalable data pipelines, modernize legacy data flows into a cloud-native architecture, and partner with data scientists, analysts, and business stakeholders to ensure trusted, well-governed data is available across the enterprise. The primary technology footprint is Microsoft Azure, with selected workloads on Google Cloud Platform and a smaller Amazon Web Services presence.

Responsibilities:

• Design, build, and maintain scalable batch and streaming data pipelines across Azure Data Factory, Azure Synapse, and Databricks, ingesting data from policy administration, claims, CRM, and external data providers.

• Develop curated data models in a medallion (bronze/silver/gold) architecture using Delta Lake, ensuring data quality, lineage, and reusability across analytics and AI use cases.

• Develop and optimise SQL and PySpark transformations for high-volume datasets, with strong attention to performance, cost, and reliability.

• Operationalise pipelines through Azure DevOps and/or GitHub Actions, embedding automated testing, deployment, and observability into the data delivery lifecycle.

• Implement data quality checks, monitoring, and alerting across critical data products, working with platform engineering on lineage and cataloguing (e.g., Microsoft Purview, Unity Catalog).

• Collaborate with data architects to align pipelines with the enterprise data model and governance standards, including PII handling, retention, and access controls relevant to insurance regulation.

• Work closely with analytics, actuarial, and data science teams to translate business requirements into robust data products and self-service datasets.

• Participate in code reviews, design sessions, and Agile ceremonies, contributing to engineering standards and continuous improvement of the data platform.