Key Responsibilities
Lead and mentor a team of Ab Initio Data Engineers in the delivery of complex data integration projects.
Architect, design, and implement end-to-end data solutions using Ab Initio tools and frameworks.
Collaborate with business stakeholders, data analysts, and solution architects to translate business requirements into scalable technical solutions.
Ensure adherence to data governance, data quality, and security best practices.
Manage task allocation, performance reviews, and professional development of team members.
Contribute to process improvement and best practices within the data engineering function.
Technical Experience & Skills
Strong hands-on experience in Ab Initio (Graphical Development Environment, Co>Operating System, Conduct>It, etc.)
Proficiency in SQL and working with major databases such as Oracle, Teradata, DB2, or Sybase (a brief illustrative example follows this list).
Familiarity with data governance, metadata management, and data quality tools (e.g., SAS DataFlux, Trillium).
Exposure to big data ecosystems (e.g., Hadoop, Spark, Hive).
Knowledge of cloud platforms, preferably Azure.
Experience working in Unix environments with shell scripting.
Proficiency in programming languages such as C, C++, Java, or COBOL is advantageous.
Familiarity with real-time applications, web services, and data warehousing concepts.
Understanding of ETL job scheduling and workflow orchestration (a simplified sketch follows this list).
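To indicate the level of SQL fluency expected, here is a minimal sketch of a parameterized Oracle query from Python, assuming the python-oracledb driver; the credentials, DSN, table, and column names are hypothetical placeholders, not part of this role's actual environment.

    import datetime

    import oracledb  # python-oracledb, the maintained Oracle driver for Python

    # Hypothetical credentials and DSN -- substitute real connection details.
    conn = oracledb.connect(
        user="etl_user",
        password="change_me",
        dsn="dbhost.example.com/ORCLPDB1",
    )
    cur = conn.cursor()

    # Bind variables (:d) keep the statement injection-safe and let Oracle
    # reuse the parsed execution plan across runs.
    cur.execute(
        "SELECT account_id, SUM(amount) AS total "
        "FROM transactions "  # hypothetical table
        "WHERE trade_date = :d "
        "GROUP BY account_id",
        {"d": datetime.date(2024, 1, 31)},
    )
    for account_id, total in cur:
        print(account_id, total)

    cur.close()
    conn.close()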
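Similarly, the workflow-orchestration item above can be pictured as dependency-ordered job execution. The sketch below is a deliberately simplified illustration using Python's standard-library topological sorter; real deployments would use Ab Initio Conduct>It or an enterprise scheduler, and the job names here are hypothetical.

    from graphlib import TopologicalSorter  # standard library, Python 3.9+

    # Hypothetical ETL jobs mapped to the jobs they depend on.
    dependencies = {
        "extract_orders": set(),
        "extract_customers": set(),
        "transform_join": {"extract_orders", "extract_customers"},
        "load_warehouse": {"transform_join"},
    }

    def run(job: str) -> None:
        # Placeholder: a real runner would launch an Ab Initio graph or plan here.
        print(f"running {job}")

    # static_order() yields each job only after all of its dependencies.
    for job in TopologicalSorter(dependencies).static_order():
        run(job)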
Minimum Requirements
Bachelor's or Master's Degree in Science, Engineering, Information Technology, or a related field.
Ab Initio certification is mandatory.
Minimum of 5 years' experience in data engineering and ETL development, with at least 2 years in a leadership or team lead capacity.
Proven ability to lead and manage technical teams and complex data integration initiatives.
Total experience of 7 to 10 years.