Business Intelligence Developer

Fourways, GP, ZA, South Africa

Job Description

### Overview




The Business Intelligence Developer is responsible for designing and delivering end-to-end Business Intelligence and advanced analytics solutions across Omnia's modern data platform. This includes building scalable data pipelines in Azure and Databricks, developing semantic models, and creating Power BI reports that enable trusted, self-service analytics.



The role bridges data engineering and front-end development, translating business requirements into technical solutions, ensuring data quality and reconciliation, and supporting the delivery of Omnia's centralized, governed Data Hub.



Working closely with cross-functional teams, this role plays a key part in enabling data-driven decision-making across the Group.

### Qualifications



  • Matric or NQF Level 5
  • BSc/BA in Computer Science, Engineering, or Statistics; a master's degree in Informatics is advantageous
  • Certifications in BI technologies (e.g., Microsoft Power BI, Oracle Business Intelligence, T-SQL)
### Experience



  • Minimum of 5 years' experience in data engineering, data management, or related fields.
  • Demonstrated experience implementing cloud-based data solutions (Azure and Databricks preferred).
  • Proven experience designing and optimizing complex data pipelines, ETL/ELT processes, and automated workflows.
  • Practical experience implementing data governance and data quality frameworks.
  • Proven experience building and managing scalable, cloud-based or hybrid data platforms.
  • Strong collaboration experience with cross-functional stakeholders to translate business requirements into effective technical solutions.
  • Experience supporting self-service analytics and semantic layer governance.
  • Practical experience in modern analytics environments, enabling advanced analytics and data science use cases.
  • Track record of continuous improvement and knowledge sharing within teams.
### Duties



End-to-End Data Product Development


Design, build, and maintain full-stack BI solutions using Azure, Databricks, and Power BI--supporting data ingestion, transformation, modeling, and reporting within the Data Hub architecture.

Data Pipeline Engineering and Optimization


Develop and maintain ELT pipelines across medallion layers (bronze/silver/gold), ensuring performance, scalability, and observability using tools such as ADF and Databricks.
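To illustrate the medallion pattern referenced above, here is a minimal sketch of a bronze-to-silver-to-gold flow. All table names, fields, and quality rules are invented for the example, and plain Python structures stand in for the Delta tables a Databricks pipeline would actually use.

```python
# Minimal medallion-style ELT sketch: bronze (raw) -> silver (cleaned)
# -> gold (aggregated). Names and rules are illustrative assumptions.

def to_silver(bronze_rows):
    """Clean and deduplicate raw (bronze) records."""
    seen, silver = set(), []
    for row in bronze_rows:
        key = row["order_id"]
        if key in seen or row["amount"] is None:
            continue  # drop duplicates and records failing a basic quality check
        seen.add(key)
        silver.append({**row, "amount": round(float(row["amount"]), 2)})
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned records into a reporting-ready (gold) summary."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "region": "GP", "amount": "100.00"},
    {"order_id": 1, "region": "GP", "amount": "100.00"},  # duplicate
    {"order_id": 2, "region": "KZN", "amount": None},     # fails quality check
    {"order_id": 3, "region": "GP", "amount": "50.00"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'GP': 150.0}
```

In a real pipeline each layer would be persisted as a Delta table and orchestrated by ADF or Databricks Workflows; the sketch only shows the layering idea.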

Semantic Modeling and Visualization


Design tabular models, implement DAX calculations, and build Power BI reports that enable intuitive, governed self-service analytics.

Business Requirements Translation


Engage with business users to understand analytical needs and translate them into scalable, governed technical solutions aligned with domain-driven architecture.

Data Reconciliation and Validation


Implement robust data validation and reconciliation across sources, semantic models, and reports to ensure accuracy, trust, and traceability of all data products.
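As a hypothetical illustration of the reconciliation duty above, a simple control-total check compares a source extract against the figure surfaced in a report. The data, field names, and tolerance are invented for the sketch.

```python
# Illustrative source-to-report reconciliation: compare row counts and a
# summed control total against the reported figure.

def reconcile(source_rows, report_total, tolerance=0.01):
    """Return (ok, details) comparing a source control total to the report."""
    source_total = sum(r["amount"] for r in source_rows)
    diff = abs(source_total - report_total)
    return diff <= tolerance, {
        "source_rows": len(source_rows),
        "source_total": source_total,
        "report_total": report_total,
        "difference": diff,
    }

source = [{"amount": 120.0}, {"amount": 80.0}, {"amount": 300.0}]
ok, details = reconcile(source, report_total=500.0)
print(ok, details["difference"])  # True 0.0
```

In practice such checks would run per layer (source, staging, semantic model, report) and log their results, so any drift is traceable to the layer where it appeared.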

Data Governance, Quality, and Security


Apply and support data governance policies, data cataloguing (e.g., Unity Catalog), lineage tracking, and access control to ensure trusted and secure data delivery.

Cross-Functional Collaboration and Enablement


Collaborate with data owners, analysts, and data stewards to ensure alignment of data solutions with business outcomes and to enable self-service analytics capabilities.

Continuous Improvement and Innovation


Stay current with emerging BI and data engineering trends, identifying and implementing opportunities to improve performance, automation, and maintainability of the data platform.


OTHER SPECIAL REQUIREMENTS




This role supports the development of a centralized Omnia Data Hub based on domain-driven architecture. Involves collaboration across divisions (e.g., Fertilizer, BME, Group) and ERPs (D365, AX). Candidates must understand or be willing to adopt Unity Catalog, Databricks SQL, and self-service enablement through governed semantic models. Expected to contribute to the AI-readiness vision and the enablement of intelligent data products.
### Job Competencies



Core Competencies




  • Creativity and Innovation
  • Problem Solving and Critical Thinking
  • Planning and Organizing
  • Collaboration and Communication
  • Customer-Centric Thinking
  • Resilience and Integrity
  • Agile and Outcome-Focused


Knowledge




Modern Data Architectures


Deep understanding of Lakehouse architecture, medallion (bronze/silver/gold) data modelling, and domain-driven data product design.

Cloud Data Platforms


In-depth knowledge of Microsoft Azure services relevant to data engineering (e.g., Data Factory, Data Lake, Synapse, Key Vault).

Databricks Ecosystem


Working knowledge of Databricks notebooks, Delta Lake, Unity Catalog, job orchestration, and SQL/Python usage within the platform.

Semantic Modelling


Strong grasp of dimensional modelling (Kimball methodology), star/snowflake schemas, and semantic layer design using tabular models (SSAS/Power BI datasets).
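The star-schema idea behind the dimensional modelling knowledge above can be sketched as a fact table whose surrogate keys resolve against dimension tables. All table and column names here are invented for illustration, not Omnia specifics.

```python
# Toy star-schema join: a fact table referencing surrogate keys in two
# dimensions, resolved into a denormalized reporting view.

dim_product = {1: {"product": "Urea"}, 2: {"product": "Ammonia"}}
dim_date = {20240101: {"year": 2024, "month": 1}}

fact_sales = [
    {"date_key": 20240101, "product_key": 1, "qty": 10},
    {"date_key": 20240101, "product_key": 2, "qty": 5},
]

def star_join(facts, products, dates):
    """Resolve each fact row's dimension keys (the core of a star query)."""
    return [
        {**f, **products[f["product_key"]], **dates[f["date_key"]]}
        for f in facts
    ]

rows = star_join(fact_sales, dim_product, dim_date)
print(rows[0]["product"], rows[0]["year"])  # Urea 2024
```

In a tabular model the same relationships are declared once between tables, and the engine performs the equivalent resolution at query time.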

Power BI Best Practices


Comprehensive understanding of report design principles, self-service enablement, row-level security, and workspace governance.

ETL/ELT Methodologies


Familiarity with data ingestion and transformation design patterns, pipeline orchestration, and automated deployment processes (CI/CD for data).

Data Governance and Quality


Knowledge of data stewardship, lineage tracking, master data management, and quality control within a governed architecture.

Data Security & Compliance


Awareness of enterprise data security practices, including access control, data masking, and compliance with POPIA/GDPR.

Data Reconciliation and Validation


Ability to trace and reconcile data across source systems, staging layers, semantic models, and final reports to ensure accuracy and trust in data products.

Business-to-Technical Translation


Skilled at engaging with stakeholders to capture business requirements and translate them into technical specifications that align with the data platform architecture and delivery capabilities.

Agile Delivery in BI Context


Understanding of agile methodologies (Scrum/Kanban) applied to BI development, including backlog management and iterative delivery.

Advanced Analytics Enablement


General knowledge of how to structure data to support machine learning and AI use cases, including feature store readiness and model consumption pipelines.

### General



Skills




Data Engineering & Platform




  • Advanced SQL and Python
  • Azure Data Factory / Databricks workflows
  • Medallion architecture (Bronze, Silver, Gold layers)
  • Lakehouse and Delta Lake
  • Data pipeline automation and orchestration
  • Data quality, governance, and CI/CD




Analytics & Front-End




  • Dimensional Modelling (Kimball)
  • DAX, Power BI Tabular Model (SSAS), and Power Query (M)
  • Dashboard & Report Development (Power BI Desktop & Service)
  • Semantic layer development and certification
  • Version control integration with Git/Azure DevOps




General




  • Agile delivery (Azure Boards or equivalent)
  • Stakeholder requirement analysis and communication
  • Troubleshooting and performance tuning
  • Collaboration across data, analytics, and business teams



Job Detail

  • Job Id
    JD1453140
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Fourways, GP, ZA, South Africa
  • Education
    Not mentioned