
Senior Data Engineer (Max Budget SGD 9k)

Posted on Oct. 14, 2025 by Rapsys Technologies Pte. Ltd.

  • full-time, contract


Key Responsibilities

  • Design, build, and optimize ETL/ELT pipelines using Databricks, Apache Spark, PySpark, or Azure Synapse.
  • Develop real-time streaming pipelines using Kafka, Confluent, or Azure Event Hubs.
  • Integrate and transform data from Core Banking, Trade, Payments, Treasury, and CRM systems.
  • Implement data quality, validation, and lineage controls using tools like dbt, Great Expectations, or Deequ.
  • Create and maintain data models (3NF, dimensional, or Data Vault) to support analytics and reporting.
  • Work with Security and Governance teams to apply data masking, encryption, and tokenization per MAS TRM, PDPA, and PCI-DSS requirements.
  • Support regulatory reporting data flows (e.g., MAS 610/649, Basel III/IV, IFRS 9/17).
  • Collaborate with Data Scientists and AI Engineers to enable ML feature stores and model-serving pipelines.
  • Participate in data platform modernization projects (e.g., migration from Teradata / DB2 to Snowflake / Databricks / Synapse).
  • Automate data workflows and maintain CI/CD pipelines using Azure DevOps, Terraform, or GitHub Actions.

Technical Skills

  • Languages: Python, PySpark, SQL, Scala
  • Data Platforms: Azure Data Lake, Synapse, Databricks, Snowflake
  • Data Orchestration: Airflow, Azure Data Factory, dbt
  • Streaming: Kafka, Confluent, Event Hubs
  • Governance: Azure Purview, Apache Atlas, Collibra
  • Security: Data encryption, masking, RBAC, tokenization
  • CI/CD & Infrastructure: Terraform, Azure DevOps, GitHub Actions

Experience & Qualifications

  • 6–10 years of experience in data engineering, with at least 3 years in banking or financial services.
  • Proven hands-on experience building real-time and batch data pipelines on Azure or AWS.
  • Exposure to regulatory data frameworks (MAS 610, Basel III/IV, IFRS 9/17, BCBS 239).
  • Understanding of DevOps and MLOps integration for data workloads.
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • Preferred certifications:
      • Microsoft Azure Data Engineer Associate
      • Databricks Data Engineer Professional
      • Snowflake SnowPro Core

Key Attributes

  • Strong analytical and problem-solving mindset.
  • Excellent communication and teamwork skills.
  • High attention to data quality, performance, and compliance.
  • Ownership and accountability in delivering high-impact solutions.

Job Types: Full-time, Contract
Contract length: 12 months

Pay: Up to SGD 9,000.00 per month

Location:

  • Singapore (Required)

Advertised until:
Nov. 13, 2025

