Data Engineer (expert) 1155 Evdb

Pretoria, Gauteng, South Africa

Job Description

Data Engineers are responsible for building and maintaining big data pipelines on the client's data platforms.



Data Engineers are custodians of data and must ensure that data is shared on a need-to-know basis, in line with information classification requirements.



Data Engineers are also expected to:

  • Stay up to date with the latest data engineering tools, technologies, and industry trends.
  • Identify opportunities for process improvement and automation to enhance the efficiency and reliability of data pipelines.
  • Explore and evaluate new data engineering approaches and technologies to drive innovation within the organisation.
  • Mentor, train, and upskill team members.



Qualifications/Experience:

  • Relevant IT / Business / Engineering degree.
  • Certifications (candidates with one or more of the following are preferred): AWS Certified Cloud Practitioner, AWS Certified SysOps Administrator - Associate, AWS Certified Developer - Associate, AWS Certified Solutions Architect - Associate, AWS Certified Solutions Architect - Professional, HashiCorp Certified: Terraform Associate.
  • Experience with enterprise collaboration tools such as Confluence and JIRA.
  • Experience developing technical documentation and artefacts.
  • Knowledge of data formats such as Parquet, Avro, JSON, XML, and CSV.
  • Experience with data quality tools such as Great Expectations.
  • Experience developing and working with REST APIs is a bonus.
  • Basic experience in networking and troubleshooting network issues.
  • Knowledge of the Agile Working Model.
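To give candidates a feel for the data-format and data-quality expectations above, here is a minimal, self-contained Python sketch: hand-rolled checks in the spirit of tools like Great Expectations (not the actual Great Expectations API), converting a small JSON payload to CSV. The field names and sample data are invented for illustration.

```python
import csv
import io
import json

# A tiny JSON payload standing in for a real source extract.
RAW = '[{"id": 1, "amount": "19.99"}, {"id": 2, "amount": "5.00"}]'

def validate(row: dict) -> dict:
    """Minimal expectation-style checks: required fields, types, ranges."""
    assert isinstance(row["id"], int), "id must be an integer"
    amount = float(row["amount"])
    assert amount >= 0, "amount must be non-negative"
    return {"id": row["id"], "amount": amount}

def json_to_csv(raw: str) -> str:
    """Validate every record, then serialise the result as CSV."""
    rows = [validate(r) for r in json.loads(raw)]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(json_to_csv(RAW))
```

In a real pipeline the same validate-then-transform shape applies, with the checks expressed declaratively in the quality tool rather than as inline assertions.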



What we offer:

  • A highly motivating, energetic, and fast-paced working environment.
  • Modern, state-of-the-art offices.
  • Dynamic global team collaboration.
  • Application of the Agile Working Model methodology.



Essential Skills Requirements:

Above-average experience/understanding of:

  • Terraform
  • Python 3.x
  • SQL (Oracle / PostgreSQL)
  • PySpark
  • Boto3
  • ETL
  • Docker
  • Linux / Unix
  • Big Data
  • PowerShell / Bash
  • Cloud Data Hub (CDH)
  • CDEC Blueprint
  • AWS Glue
  • CloudWatch
  • SNS
  • Athena
  • S3
  • Kinesis (Kinesis Data Streams, Kinesis Firehose)
  • Lambda
  • DynamoDB
  • Step Functions
  • Parameter Store
  • Secrets Manager
  • CodeBuild / CodePipeline
  • CloudFormation
  • Business Intelligence (BI) experience
  • Technical data modelling and schema design (not "drag and drop")
  • Kafka
  • AWS EMR
  • Redshift
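As a taste of the S3 / Glue / Athena work listed above, a minimal Python sketch that builds Hive-style partition keys of the kind Glue crawlers and Athena expect. The table and file names are hypothetical.

```python
from datetime import date

def partition_key(table: str, day: date, filename: str) -> str:
    """Build a Hive-style S3 key (year=/month=/day=) as used by Glue and Athena."""
    return (
        f"{table}/year={day.year}/month={day.month:02d}/"
        f"day={day.day:02d}/{filename}"
    )

key = partition_key("sales", date(2024, 7, 3), "part-0000.parquet")
print(key)  # sales/year=2024/month=07/day=03/part-0000.parquet
```

Uploading such an object with Boto3 would then be a single call along the lines of `boto3.client("s3").put_object(Bucket=..., Key=key, Body=...)`, with the bucket name supplied by the platform.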



Advantageous Skills Requirements:

  • Demonstrated expertise in data modelling with Oracle SQL.
  • Exceptional analytical skills for analysing large and complex data sets.
  • Thorough testing and data validation to ensure the accuracy of data transformations.
  • Strong written and verbal communication skills, with precise documentation.
  • Self-driven team player, able to work independently and multi-task.
  • Experience building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms.
  • Familiarity with data stores such as AWS S3, AWS RDS, or DynamoDB.
  • Experience with, and a solid understanding of, common software design patterns.
  • Experience preparing specifications from which programs are designed, coded, tested, and debugged.
  • Strong organizational skills.
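As one concrete instance of the software design patterns mentioned above, a minimal Python sketch of the strategy pattern applied to pipeline transformations; the transform names and registry are invented for illustration.

```python
from typing import Callable, Dict, List

# Registry of interchangeable transformation strategies (the strategy pattern):
# each entry maps a step name to a function with the same signature.
TRANSFORMS: Dict[str, Callable[[str], str]] = {
    "strip": str.strip,
    "upper": str.upper,
}

def run_pipeline(values: List[str], steps: List[str]) -> List[str]:
    """Apply the named transforms to every value, in order."""
    for step in steps:
        fn = TRANSFORMS[step]
        values = [fn(v) for v in values]
    return values

print(run_pipeline(["  hello ", " world"], ["strip", "upper"]))
# ['HELLO', 'WORLD']
```

New transformations can be added to the registry without touching `run_pipeline`, which is the property that makes the pattern useful in ETL code.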



Experience: more than 10 years.

Job Detail

  • Job Id: JD1611414
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Pretoria, Gauteng, South Africa
  • Education: Not mentioned