Data Engineer

Job Title: Data Engineer

Department: Data Engineering

Reports to: Principal Data Architect

Role Overview:

We are seeking a Data Engineer to join Simpaisa Holdings, a cross-border payments and remittances company operating across the Middle East and South Asia. The successful candidate will build and maintain the data pipelines, analytics infrastructure, and real-time streaming systems that power the organisation's payment processing, transaction analytics, regulatory reporting, and business intelligence capabilities. Strong expertise in ETL/ELT development, database technologies, and data quality practices is essential to ensure data remains accessible, reliable, and timely. Experience with agile methodologies and with collaborating alongside data architects, data scientists, and product teams is also preferable.

Key Responsibilities:

  • Build and maintain robust, scalable data pipelines for ingesting, transforming, and loading (ETL/ELT) transaction data from payment gateways, banking partners, FX providers, and internal systems.
  • Develop and operate real-time streaming pipelines for payment transaction events, enabling live dashboards, alerting, and fraud detection capabilities.
  • Implement and maintain the organisation's data warehouse and data lake infrastructure on cloud platforms (AWS), ensuring performance, scalability, and cost-effectiveness.
  • Build analytics datasets and reporting views to support transaction analytics, corridor performance, FX margin analysis, settlement reconciliation, and regulatory reporting.
  • Ensure data quality, integrity, and consistency across all pipelines and data stores, implementing automated data validation and monitoring.
  • Support master data management processes for core entities: customers, merchants, counterparties, and corridor configurations.
  • Collaborate with the Principal Data Architect to implement enterprise data models, data lineage tracking, and data governance standards.
  • Develop and maintain data documentation in Bitbucket repositories, including pipeline specifications, data dictionaries, and runbooks.
  • Participate in on-call rotations for critical data pipeline failures affecting payment processing and regulatory reporting.
  • Stay up-to-date with the latest trends and advancements in data engineering tools and techniques.

Required Skills and Experience:

  • Agile: Awareness of agile principles and experience working with cross-functional teams in an agile environment.
  • Communication: Good written and verbal communication skills with the ability to articulate technical data concepts clearly to both technical and non-technical audiences.
  • Strategy and Planning: Ability to understand and follow data architecture plans and pipeline designs. Strong organisational skills for managing tasks and priorities.
  • Problem-solving and Analytical skills: Strong problem-solving and analytical skills to diagnose data quality issues, optimise pipeline performance, and address data-related challenges.
  • Data Engineering Expertise: Solid understanding of data modelling principles, database technologies (SQL and NoSQL), and data warehousing concepts. Experience with ETL/ELT tools and frameworks (e.g., Apache Airflow, dbt, AWS Glue). Experience with real-time streaming technologies (e.g., Apache Kafka, Kinesis). Proficiency in Python and SQL. Experience with cloud-based data platforms (AWS Redshift, Athena, S3, Glue).
  • Payments Domain Awareness: Familiarity with payment transaction data, reconciliation workflows, FX data, and regulatory reporting data requirements is desirable but not essential.
  • Teamwork and Collaboration: Ability to work effectively in a collaborative team environment across geographically distributed teams.

General Requirements for the Role:

  • Bachelor's Degree in related field: A bachelor's degree in Information Systems, Computer Science, Engineering, or a closely related STEM field is required.
  • 3+ years of experience in data engineering or related roles: Minimum of 3 years of progressive experience in building and maintaining data pipelines, data warehouses, or analytics infrastructure.
  • Experience with data pipeline tools and cloud platforms: Demonstrated experience in using ETL/ELT tools, streaming platforms, and cloud data services.
  • Proven track record of delivering reliable data solutions: A verifiable history of contributing to the delivery of data pipelines and analytics infrastructure that support business operations.

Benefits and Perks:

  • Competitive salary and comprehensive benefits package.
  • Opportunity to work with cutting-edge payments and fintech data solutions and collaborate with skilled professionals across multiple markets.
  • Professional development and training opportunities.
  • Inclusive company culture that values diversity and innovation.