Cloud Data Engineer Banking

Cape Town, Western Cape, South Africa

Job Description

EMPLOYMENT TYPE:
12-month contract with the possibility of extension
COMPANY:
A leading service provider of comprehensive IT solutions.
LOCATION:
Cape Town
WORKING MODEL:
Hybrid (3 days in office, 2 days remote)
JOB OVERVIEW:
We are seeking an experienced Cloud Data Engineer to join our Vito data engineering team for one of our clients, which is launching a new bank. You will play a pivotal role in designing, building, and maintaining the data infrastructure and pipelines that underpin banking products (e.g. retail banking, credit, payments, deposits), regulatory reporting, risk analytics, fraud detection, and business intelligence. You will collaborate with data scientists, analysts, compliance, product, and operations teams to deliver high-quality, secure, auditable data assets.
You will need to balance cloud engineering best practices with the stringent requirements of the financial industry: governance, security, compliance, traceability, and performance.
DESCRIPTION OF POSITION:
Data Architecture & Pipeline Development

  • Design, build, and maintain robust data pipelines (batch, micro-batch, streaming) on cloud platforms to ingest, transform, integrate, and deliver data from diverse source systems (transaction systems, core banking, payment gateways, external feeds).
  • Architect data repositories and data product layers (e.g. raw, cleaned, curated, aggregated) to support analytics, reporting, and machine learning models.
  • Ensure data pipelines meet latency, throughput, and SLA requirements, particularly in transactional / real-time / near-real-time contexts (e.g. fraud detection, alerts).
  • Implement data modelling (star, snowflake, dimensional, normalized) suited for both operational and analytical workloads.
  • Design and enforce data partitioning, indexing, and sharding strategies for performance and scalability.
Banking/Financial Domain Integration
  • Work with transactional financial data (payments, deposits, loans, card transactions, interest/fees, accruals, reconciliations).
  • Support credit risk, market risk, liquidity risk, stress testing, provisioning, and IFRS 9 workflows by providing clean, traceable data inputs.
  • Enable fraud detection and anti-money laundering (AML / KYC) pipelines, ensuring timely and reliable data delivery to analytics and alerting systems.
  • Oversee reconciliation pipelines (e.g. core banking vs ledger / accounting system vs external settlements).
  • Provide enrichment, aggregation, and derivation of financial metrics (e.g. balances, exposures, delinquencies, charge-offs, utilization).
Governance, Compliance & Auditability
  • Build in data lineage, versioning, change tracking, and metadata capture to support audit trails and regulatory compliance.
  • Enforce data retention, archival, and disposal practices in line with legal, regulatory, and internal policies.
  • Ensure anonymization, pseudonymization, or masking of sensitive PII / financial data where required.
  • Implement role-based access control, encryption (in transit, at rest), key management, tokenization, and secure credentials handling.
  • Collaborate with compliance / risk / security teams to embed controls, checks, monitoring, and alerting in pipelines.
Quality, Monitoring & Operations
  • Define and enforce data quality checks, validation, exception handling, and remediation workflows.
  • Instrument pipelines with logging, metrics, monitoring, and alerting (SLIs / SLOs).
  • Regularly review and optimize performance and cost of cloud infrastructure.
  • Participate in on-call / standby rotations to support production stability.
  • Document architecture, pipeline logic, data contracts, and operational runbooks.
Collaboration & Mentorship
  • Work closely with data scientists, analysts, business stakeholders, product owners, compliance and risk teams to translate business requirements into engineering deliverables.
  • Mentor junior engineers, perform peer reviews, and propagate best practices.
  • Provide regular progress updates, contribute to planning sessions (e.g. agile / scrum), and help manage technical debt.
KNOWLEDGE AND SKILLS:
  • Strong analytical and critical-thinking skills; ability to translate ambiguous business needs into technical solutions.
  • Excellent communication skills (written & verbal); ability to present and explain complex technical concepts to non-technical stakeholders.
  • Strong sense of ownership and proactiveness; accountability for end-to-end deliverables.
  • Detail-oriented, with the ability to handle multiple workstreams and shifting priorities in a fast-paced environment.
  • Team player mindset, willingness to mentor others and collaborate across functions.
  • Knowledge of cloud platforms (e.g. AWS, Azure, Google Cloud) and their data services.
QUALIFICATIONS REQUIRED:
  • Bachelor's Degree in Computer Science, Information Technology, Engineering, or a related field.
EXPERIENCE REQUIRED:
  • 5 - 8 years of data warehousing and ETL experience.
  • Hands-on experience with any of the major cloud data platforms (e.g. BigQuery, Dataflow, Pub/Sub, Cloud Storage, AWS, Azure, MS Fabric, Snowflake, Databricks, etc.).
  • Proficiency in SQL and experience with relational databases (BigQuery, PostgreSQL, MySQL, SQL Server, etc.).
  • Strong programming skills in Python, Java, or equivalent, plus scripting experience (e.g. Shell, Bash).
  • Experience with version control tools (Git, SVN) and working within a software development lifecycle (SDLC).
  • Exposure to Agile methodologies and continuous integration / deployment (CI/CD).
  • Familiarity with data modelling (dimensional, normalized, hybrid), partitioning, indexing, and performance tuning.
  • Ability to optimize cloud costs, manage resource provisioning, and monitor usage.
  • Knowledge of financial products (loans, credit, deposits, cards), interest/fee accruals, amortization, settlements / reconciliations.
  • Experience in designing data pipelines for risk / fraud / compliance use cases.
  • Strong skills in data reconciliation, exception handling, and maintaining data integrity across systems.
  • Familiarity with real-time / streaming data processing (e.g. Kafka, Pub/Sub, streaming frameworks).
  • Security / privacy expertise: encryption, masking, tokenization, and handling of PII and financial data in regulated environments.
Please note: If you have not heard from us within 2 weeks, please consider your application unsuccessful.

Job Detail

  • Job Id
    JD1557065
  • Total Positions
    1
  • Job Type
    Full Time
  • Employment Status
    Permanent
  • Job Location
    Cape Town, Western Cape, South Africa