Data Engineer EMEA

Cape Town, Western Cape, South Africa

Job Description



At First Quantum, we free the talent of our people by taking a very different approach which is underpinned by a very different, very definite culture – the “First Quantum Way”.

Working with us is not like working anywhere else, which is why we recruit people who will take a bolder, smarter approach to spot opportunities, solve problems and deliver results.

Our culture is all about encouraging you to think independently and to challenge convention to deliver the best result. That’s how we continue to achieve extraordinary things in extraordinary locations.

Purpose of the Role:
The Data Engineer will be responsible for designing, developing, and maintaining data pipelines and systems in the Azure cloud environment. The incumbent will work collaboratively within an integrated team of Data Engineers, Data Designers, Data Scientists, Database Administrators, DevOps Engineers and Data Architects. Together, this integrated team ensures the smooth deployment and vigilant monitoring of data solutions.

Key Responsibilities:
Data Engineering:

  • Data acquisition & source management
    • Develop and deploy efficient data pipelines using Azure Synapse Analytics, adhering to Medallion architecture principles and ensuring alignment with the Semantic layer concept.
  • Develop APIs & data feeds
    • Extract, Transform, and Load (ETL) data from various sources (e.g., databases, APIs, flat files) into Azure data platforms.
  • Data Quality
    • Ensure data quality and consistency during the ETL process.
  • Data Governance
    • Implement and enforce data governance policies and security measures in compliance with industry standards and organizational requirements.
    • Monitor and audit data access and usage to ensure data privacy and security.
  • Testing
    • Work closely with cross-functional teams to design and implement automated testing strategies for Synapse resources.
    • Participate in release management processes and coordinate deployments across different environments.
    • Monitor and manage build and deployment processes, troubleshoot issues, and implement improvements.
    • Ensure seamless deployment and monitoring of data solutions.
    • Implement and manage CI/CD pipelines to streamline the deployment process.
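By way of illustration only (this sketch is not part of the role description), the Medallion pattern the bullets above refer to moves data through raw “bronze”, cleaned “silver”, and reporting-ready “gold” layers. A minimal plain-Python stand-in, assuming hypothetical record fields; in practice this logic would run as PySpark in an Azure Synapse pipeline:

```python
# Minimal sketch of the Medallion pattern (bronze -> silver -> gold).
# Plain-Python stand-in for what would normally be PySpark on Azure Synapse;
# the record fields and cleaning rules below are hypothetical examples.
from collections import defaultdict

def to_silver(bronze_rows):
    """Clean and validate raw rows: drop records missing a key field and
    normalise types (the data-quality step of the ETL process)."""
    silver = []
    for row in bronze_rows:
        if row.get("site") and row.get("tonnes") is not None:
            silver.append({"site": row["site"].strip().lower(),
                           "tonnes": float(row["tonnes"])})
    return silver

def to_gold(silver_rows):
    """Aggregate the cleaned layer into a reporting-ready (semantic) view."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["site"]] += row["tonnes"]
    return dict(totals)

bronze = [{"site": "Kansanshi ", "tonnes": "120.5"},
          {"site": None, "tonnes": "10"},        # rejected: missing key field
          {"site": "kansanshi", "tonnes": 30}]
print(to_gold(to_silver(bronze)))  # {'kansanshi': 150.5}
```

The same shape applies whatever the engine: validation happens once, between bronze and silver, so every downstream consumer reads data that has already passed the quality gate.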

Data Architecture:
  • Data Integration
    • Contribute to integration standards for system to system requirements.
    • Implement integrations with single sources of truth.
  • Data Storage, transfer & access
    • Maintain and set standards for how data is stored, transferred and accessed in the data lake.

Data Operations:
  • Monitor Data Products
    • Leverage Parquet partitioning strategies to improve data retrieval performance.
    • Employ and optimise orchestration tools to guarantee timely availability of data in the required storage while minimising excess workload.
    • Monitor and fine-tune system performance for optimal efficiency.
    • Utilise automation with utility templates to streamline data processing workflows.
    • Optimise data storage and retrieval so that performance and cost efficiency are maximised.
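As a hedged aside (again, not part of the posting itself): the Parquet partitioning mentioned above relies on Hive-style `key=value` directory layout, so a query filtered on a partition column can skip whole directories without opening any files. A plain-Python sketch of that path logic, with illustrative column names:

```python
# Sketch of Hive-style partition paths as used by Parquet data lakes.
# A query filtered on a partition column reads only matching directories,
# which is what "partition pruning" means. Column names are illustrative.
def partition_path(base, **keys):
    """Build a partition directory like base/year=2024/month=01."""
    parts = [f"{k}={v}" for k, v in keys.items()]
    return "/".join([base] + parts)

def prune(paths, **predicate):
    """Keep only paths whose partition values match the predicate,
    without opening any data files."""
    keep = []
    for p in paths:
        kv = dict(seg.split("=", 1) for seg in p.split("/") if "=" in seg)
        if all(kv.get(k) == v for k, v in predicate.items()):
            keep.append(p)
    return keep

paths = [partition_path("lake/ore", year="2024", month=m)
         for m in ("01", "02", "03")]
print(prune(paths, month="02"))  # ['lake/ore/year=2024/month=02']
```

In Spark the same layout is produced by `DataFrameWriter.partitionBy`; the choice of partition columns (low-cardinality, frequently filtered) is what makes the strategy pay off.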

DevOps:
  • Assist with developing, maintaining, and optimising DevOps agent pipelines for Azure Synapse Analytics.


Qualifications Required:
  • Degree in Computer Science, Engineering, or related field.


Experience & Skills Required:
  • At least 5 to 8 years’ experience.
  • Strong proficiency in Azure cloud services and tools, including Azure Synapse Analytics, and Medallion architecture.
  • Minimum of 5 years’ hands-on experience in designing and implementing solutions using the Common Data Model (CDM), demonstrating proficiency in data modelling and integration.
  • Proficient in Scala or PySpark for developing and optimising data processing pipelines.
  • Strong command of SQL for querying and manipulating structured data.
  • Familiarity with Azure Stream Analytics.
  • Familiarity with Azure Event Hubs.
  • Experience with source control management systems like Git.
  • Solid understanding of both streaming and batch data processing techniques, with a demonstrated ability to work with real-time and batch data pipelines.
  • Experience with data governance, security, and compliance practices.
  • In-depth knowledge of Parquet partitioning strategies for optimizing data retrieval performance.
  • Proven experience in automating data processing workflows using utility templates.


Behavioural Traits Required:
  • Performance and results orientated
  • Collaborative team member and comfortable with duty rotation
  • Effective communication abilities
  • Proficiency in navigating change within an evolving environment
  • Capability to perform effectively in high-pressure situations
  • Exhibit leadership abilities, while also being adaptable to follow when necessary or step up to lead when required
  • Knowledge and interest in computer systems and the latest technologies


Other Requirements:
  • Travel: Regionally, minimum twice a year
  • Location: Cape Town
  • Place of work: Hybrid (2-3 Office days per week)


Visit our website and register for instant job alerts at
careers.first-quantum.com
Follow us for the latest news at
LinkedIn

If you are already a First Quantum employee and have access to the First Quantum network, log into First Quantum MINE > Careers to apply internally for this opportunity.

If you are an employee without network access, contact your Site Recruiter.



Job Detail

  • Job Id
    JD1270730
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type:
    Full Time
  • Salary:
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Cape Town, Western Cape, South Africa
  • Education
    Not mentioned