Through our client-facing brands Metropolitan and Momentum, with Multiply (wellness and rewards programme), and our other specialist brands, including Guardrisk and Eris Property Group, the group enables businesses and people from all walks of life to achieve their financial goals and life aspirations.
We help people grow their savings, protect what matters to them and invest for the future. We help companies and organisations care for and reward their employees and members. Through our own network of advisers, via independent brokers and by utilising new platforms, Momentum Metropolitan provides practical financial solutions for people, communities and businesses. Visit us at www.momentummetropolitan.co.za
Disclaimer
As an applicant, please verify the legitimacy of this job advert on our company career page.
Role Purpose
We are looking for a technically skilled Data Engineer with solid hands-on experience in building and maintaining end-to-end data solutions on AWS. The ideal candidate will be comfortable working across the data pipeline, from ingestion to transformation to delivery, while ensuring performance, scalability, and data quality.
Requirements
Preferably a Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
5-7 years of experience in data engineering.
3+ years of hands-on experience with AWS, including:
S3, Glue, Spark, Athena, Redshift, RDS, Lambda, Lake Formation
Strong SQL skills and experience with relational databases (e.g., PostgreSQL, Oracle, RDS).
Proficiency in Python or Scala for data processing.
Familiarity with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
Understanding of data governance, security, and compliance in cloud environments.
Exposure to AI/ML platforms (e.g., AWS AI services, SageMaker, OpenAI) is an advantage.
Duties & Responsibilities
Collaborate with analysts, developers, architects, and business stakeholders to understand data needs and deliver technical solutions.
Design, build, and maintain data pipelines and integrations using AWS services such as S3, Glue, Lambda, and Redshift.
Develop and manage data lakes and data warehouses on AWS.
Support and maintain production and non-production data environments.
Optimize data storage and query performance through schema design and efficient data processing.
Implement CI/CD practices for data infrastructure, including monitoring, logging, and alerting.
Ensure data quality, security, and governance across all stages of the data lifecycle.
Document data models, pipelines, and architecture for internal use and knowledge sharing.
Stay current with AWS data services and best practices.
Contribute to a culture of continuous improvement and knowledge sharing within the team.
Competencies
S3, Glue, Spark, Athena, Redshift, RDS, Lambda, Lake Formation
Strong SQL skills and experience with relational databases (e.g., PostgreSQL, Oracle, RDS).
Proficiency in Python or Scala for data processing.
Familiarity with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
Understanding of data governance, security, and compliance in cloud environments.
Beware of fraud agents! Do not pay money to get a job.
MNCJobs.co.za will not be responsible for any payment made to a third party. All Terms of Use are applicable.