Job Purpose:
To design, build, and maintain robust, scalable data infrastructure that ensures secure, real-time access to operational and supply chain data. This role underpins accurate reporting, advanced analytics, and automation, enabling performance optimisation, cost reduction, and smarter decision-making across the logistics value chain.
Duties and Key Performance Indicators:
Design, build, and maintain ELT pipelines
Data latency (extraction to availability)
Error resolution turnaround time
Integrate data across systems (WMS, TMS, ERP, IoT)
Number of successful system integrations
Data completeness and consistency across sources
Average time to onboard new data source
System uptime and availability
Query performance (execution speed)
Data quality score (accuracy, completeness, validity)
Number of reported data issues
Time to deliver datasets for reports/models
Internal stakeholder satisfaction rating
Compliance with access control policies
Number of unauthorised access incidents
Audit readiness/completion rate
Manual hours reduced via automation
Number of recurring tasks automated
Stability of automated workflows
Time to issue resolution (from investigation to recommendation)
Number of root causes correctly identified
Query performance improvements (execution speed, efficiency)
SQL-based data validation, transformation, and cleansing coverage
Reusable SQL pipelines for recurring logistics workflows (inventory, shipments, routing)
SQL-based reconciliation across multiple systems
Version-controlled, documented SQL scripts aligned with governance standards
Reduction in errors or rework due to SQL inefficiencies
Bachelor's or Master's degree in Computer Science or a related field
Preferred certifications:
Microsoft Certified: Azure Data Engineer Associate
Google Professional Data Engineer
AWS Certified Data Analytics
Certifications in BI or analytics tools (e.g. Power BI, Tableau, SQL)
3-5 years' experience in data engineering, preferably in the logistics or supply chain sector
Required Knowledge:
Development of ELT/ETL pipelines using tools like Apache Airflow, SSIS, or Azure Data Factory
Data integration across logistics systems (ERP, WMS, TMS, IoT)
Data modelling, schema design, and SQL optimisation
Data warehousing concepts (e.g. star/snowflake schemas)
Version control and CI/CD pipelines for data products
Supply chain/logistics data structures and flows
Required Skills:
Advanced proficiency in SQL, plus Python or another scripting language
Strong debugging, problem-solving, and performance tuning skills
Data validation, cleansing, and transformation techniques
Building scalable and reusable data pipelines
Communication skills
Working knowledge of cloud-based data platforms (e.g. Azure, AWS, GCP)
Required Competencies:
Ability to work under pressure
Time management
Collaboration
Problem solving
Attention to detail
Analytical thinking
Working effectively in an agile environment
Job Reference: RTT74779