Data Architect job at First Quantum Minerals

Vacancy title:
Data Architect

[Type: FULL TIME, Industry: Business Management and Administration, Category: Management]

Jobs at:

First Quantum Minerals

Deadline of this Job:
Monday, September 30 2024 

Duty Station:
Within Zambia, Lusaka, South-Central Africa

Summary
Date Posted: Monday, September 16 2024, Base Salary: Not Disclosed


JOB DETAILS:
Job description:
Job title: Data Architect
Site: Kansanshi Mining Plc.
Department: IT & Digital
Section: Business Analytics and Engagement
Position reports to: IT Enterprise Architect Lead

Purpose
Manage site data governance and contribute to regional and group data engineering teams in pursuit of the vision of analytics-driven mining. Provide expertise in building and deploying databases.
• The role includes the development and design of data and software strategies, and monitoring and improving system and data performance.
• The Data Architect is responsible for planning for future upgrades, expansion, and capacity requirements. In addition, the Data Architect plans, coordinates, and implements security measures to safeguard the data and related environment.
• The role also includes setting design, configuration, and development standards for all databases on site.
• The Data Architect determines database structural and functional requirements by analysing operations, applications, and programming.

Key Responsibilities
• Develop and oversee the creation and updating of database solutions by designing proposed systems/enhancements; define database structure and functional capabilities, security, and backup and recovery specifications; document the design through to implementation.
• Design, develop, deploy, and support an enterprise data platform based on Microsoft Azure services, aligned to regional and group data engineering and analytics guidelines
• Create and enforce policies for effective data management, including techniques for data accuracy and legitimacy
• Maintain database performance across the technical environment by identifying and resolving production and application development problems; calculating optimum values for parameters; evaluating, integrating, and installing new releases following an established version control change management methodology; ensuring proper organization, indexing, and optimization for efficient data retrieval and storage; completing maintenance activities; publishing release notes; and addressing user training and information needs.
• Design, develop, integrate, and review real-time/bulk data pipelines from a variety of internal and external sources (streaming data, APIs, data warehouse, messages, images, video, etc.)
• Perform data transformation and normalization tasks to prepare data for analytics, modelling, and reporting purposes. This also includes development of data models and structures that facilitate efficient data analysis and retrieval
• Implement data quality checks, monitoring, and error handling mechanisms to ensure data accuracy, completeness, and consistency
• Ensure the IT & Digital team is following established design patterns for data ingest, transformation, and egress
• Develop documentation of Data Lineage and Data Dictionaries to create a broad awareness of the enterprise data model and its applications
• Apply best practices within DataOps (Version Control, P.R.-Based Development, Schema Change Control, CI/CD, Deployment Automation, Test Automation, Shift Left on Security, Loosely Coupled Architectures, Monitoring, Proactive Notifications)
• Provide thought leadership in problem solving to enrich possible solutions by constructively challenging paradigms and actively soliciting other opinions. Actively participate in R&D initiatives
• Architecture: Utilize modern cloud technologies and employ best practices from DevOps/DataOps to produce enterprise-quality production Python and SQL code with minimal errors. Identify and direct the implementation of code optimization opportunities during code review sessions and proactively pull in external experts as needed.
• Develop interactive dashboards, reports, and visualizations using tools like Power BI and Python, presenting data in a user-friendly and insightful manner

Qualifications
• Bachelor’s degree in Engineering, Computer Science, or an analytical field (Statistics, Mathematics, etc.); a Master’s or PhD will be an added advantage.

Experience
• Minimum 5 years related experience
• Knowledgeable Practitioner of SQL development, with experience designing high-quality production SQL codebases
• Knowledgeable Practitioner of Python development, with experience designing high-quality production Python codebases
• Knowledgeable in object-oriented programming languages like C#
• Knowledgeable Practitioner in data engineering, software engineering, and ML systems architecture
• Knowledgeable Practitioner of data modelling
• Experience applying software development best practices in data engineering projects, including Version Control, P.R.-Based Development, Schema Change Control, CI/CD, Deployment Automation, Test Driven Development/Test Automation, Shift Left on Security, Loosely Coupled Architectures, Monitoring, and Proactive Notifications, using Python and SQL
• Data science experience: data wrangling, model selection, model training, model validation (e.g., Operational Readiness Evaluator and Model Development and Assessment Framework), and deployment at scale
• Working knowledge of Azure Stream Architectures, DBT, Schema Change tools, Data Dictionary tools, Azure Machine Learning Environment, and GIS Data
• Working knowledge of Software Engineering and Object-Oriented Programming Principles
• Working knowledge of Distributed Parallel Processing Environments such as Spark or Snowflake
• Working knowledge of problem solving/root cause analysis on production workloads
• Working knowledge of Agile, Scrum, and Kanban
• Working knowledge of workflow orchestration using tools such as Airflow, Prefect, Dagster, or similar tooling
• Working knowledge of CI/CD and automation tools like Jenkins or Azure DevOps
• Experience with containerization tools such as Docker
• Member of ICTAZ or EIZ

Behavioural Traits
• Effective communication
• Ability to influence managers and employees
• Ability to demonstrate leadership
• Critical thinking
• Conflict management
• Problem solving skills
• Ability to work in pressured and deadline-driven operating environment
• Detail-orientated with the technical aptitude and ability to perform tasks accurately and comprehensively
• Expert in multi-tasking, time management and planning of work
• Excellent presentation skills

Work Hours: 8


Experience in Months: 60

Level of Education:
Bachelor Degree

Job application procedure
• To apply for this job, please visit firstquantum.wd3.myworkdayjobs.com.

Job Info
Job Category: Data, Monitoring, and Research jobs in Zambia
Job Type: Full-time
Deadline of this Job: Monday, September 30 2024
Duty Station: Lusaka
Posted: 16-09-2024
No of Jobs: 1
Start Publishing: 16-09-2024