Design and implement scalable and efficient data pipelines using Microsoft Fabric components such as OneLake, Dataflows Gen2, and Lakehouse.
Develop ETL/ELT processes using Azure Data Factory, PySpark, Spark SQL, and Python.
Ensure data quality, integrity, and security across all platforms.
Collaborate with stakeholders to gather requirements and deliver technical solutions.
Optimize data workflows and troubleshoot performance issues.
Support hybrid cloud deployments and integrate on-premises and cloud environments.
Maintain documentation and follow data engineering best practices, including version control and modular code design.
Qualifications
BSc in Computer Science or Information Technology, as well as a Microsoft certification in Azure Data Engineering or Microsoft Fabric
Minimum of 3 years' experience in a data engineering role, with strong hands-on experience in Microsoft Fabric, Azure Synapse, Azure SQL, and Databricks
Proficiency in SQL, Python, and Power BI
Solid understanding of data modelling, data governance, and data warehousing
Experience with CI/CD pipelines, DevOps, or machine learning workflows is a plus.
Additional Information
Behavioural Competencies:
Adopting Practical Approaches
Checking Things
Developing Expertise
Embracing Change
Examining Information
Technical Competencies:
Big Data Frameworks and Tools
Data Engineering
Data Integrity
IT Knowledge
Stakeholder Management (IT)
Please note:
All our recruitment processes comply with the applicable local laws and regulations. We will never ask for money or any form of payment as part of our recruitment process. If you experience this, please contact our Fraud line on +27 800222050 or TransactionFraudOpsSA@standardbank.co.za.