International Rescue Committee
Data Engineer
Nairobi • Kenya

Description
Experience
- 3–6 years of hands-on experience in data engineering, analytics engineering, or a related technical role.
- Demonstrated experience building or maintaining data pipelines in a professional setting.
- Exposure to cloud-based data platforms, preferably Azure (Databricks, Data Factory, or Synapse).
Technical Skills — Required
dbt:
- Working knowledge of dbt model development including staging and mart layers.
- Familiarity with dbt tests, documentation, and source configurations.
- Eagerness to deepen dbt skills including incremental models and CI/CD integration.
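For illustration, a staging model in dbt's usual layering pattern might look like the following (source, table, and column names are hypothetical; this is a sketch of the staging idea, not the team's actual project):

```sql
-- models/staging/stg_crm__donations.sql (hypothetical model)
with source as (
    -- {{ source(...) }} resolves to the raw table declared in a sources.yml file
    select * from {{ source('crm', 'donations') }}
),

renamed as (
    select
        id        as donation_id,
        donorid   as donor_id,
        amount    as donation_amount,
        createdon as created_at
    from source
)

select * from renamed
```

Mart-layer models would then build on this via `{{ ref('stg_crm__donations') }}`, and tests such as `not_null` or `unique` on `donation_id` would be declared in the model's YAML file.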
Databricks:
- Hands-on experience with Databricks notebooks and basic job/workflow setup.
- Familiarity with Delta Lake concepts and Databricks SQL.
- Exposure to PySpark for data transformation tasks.
SQL:
- Solid SQL skills: joins, CTEs, window functions, aggregations, and basic performance awareness.
- Experience writing SQL for data transformation and validation in a cloud data warehouse.
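As a sketch of the kind of transformation-and-validation query described above, using Python's bundled sqlite3 as a stand-in for a cloud warehouse (table and column names are hypothetical):

```python
import sqlite3

# In-memory database standing in for a warehouse; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    create table donations (donor_id integer, donated_at text, amount real);
    insert into donations values
        (1, '2024-01-05', 100.0),
        (1, '2024-02-10', 250.0),
        (2, '2024-01-20',  75.0);
""")

# CTE + window function + aggregation: running total per donor.
query = """
with ordered as (
    select
        donor_id,
        amount,
        sum(amount) over (
            partition by donor_id order by donated_at
        ) as running_total
    from donations
)
select donor_id, max(running_total) as lifetime_total
from ordered
group by donor_id
order by donor_id
"""
rows = conn.execute(query).fetchall()
print(rows)  # [(1, 350.0), (2, 75.0)]

# Basic validation check: no negative amounts should have landed.
bad = conn.execute("select count(*) from donations where amount < 0").fetchone()[0]
assert bad == 0, "validation failed: negative donation amounts"
```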
Pipeline Engineering:
- Experience building or supporting ELT pipelines with monitoring and basic data validation.
- Familiarity with pipeline orchestration tools such as Azure Data Factory or Databricks Workflows.
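The extract–load–validate shape described above can be sketched in plain Python (record structure and quarantine logic are hypothetical; in practice the orchestration would live in Azure Data Factory or Databricks Workflows):

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def extract():
    # Stand-in for pulling records from a source system (e.g. a CRM export).
    raw = '[{"id": 1, "amount": "100.5"}, {"id": 2, "amount": "none"}]'
    return json.loads(raw)

def load(records):
    # Stand-in for landing rows unmodified in a raw table.
    return list(records)

def validate(records):
    # Basic data validation: quarantine rows whose amount is not numeric.
    good, bad = [], []
    for row in records:
        try:
            good.append({**row, "amount": float(row["amount"])})
        except (ValueError, TypeError):
            bad.append(row)
    if bad:
        log.warning("quarantined %d bad row(s): %s", len(bad), bad)
    return good, bad

landed = load(extract())
clean, quarantined = validate(landed)
log.info("loaded %d clean row(s)", len(clean))
```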
Python:
- Basic to intermediate Python skills for data processing, scripting, and automation.
- Familiarity with PySpark is a plus.
Data Modeling:
- Understanding of star/snowflake schemas and fact & dimension table concepts.
- Exposure to Lakehouse or medallion architecture (Bronze/Silver/Gold) is a plus.
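The star-schema idea above can be sketched with Python's sqlite3: a fact table of transactions keyed to a date dimension, queried by joining fact to dimension and aggregating on a dimension attribute (all names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension table: one row per calendar date.
    create table dim_date (date_key integer primary key, iso_date text, year integer);
    -- Fact table: one row per disbursement, keyed to the date dimension.
    create table fact_disbursement (
        date_key integer references dim_date(date_key),
        amount real
    );
    insert into dim_date values
        (20240105, '2024-01-05', 2024),
        (20250110, '2025-01-10', 2025);
    insert into fact_disbursement values
        (20240105, 500.0), (20240105, 200.0), (20250110, 300.0);
""")

# Typical star-schema query: fact joined to dimension, grouped by year.
rows = conn.execute("""
    select d.year, sum(f.amount)
    from fact_disbursement f
    join dim_date d on d.date_key = f.date_key
    group by d.year
    order by d.year
""").fetchall()
print(rows)  # [(2024, 700.0), (2025, 300.0)]
```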
Soft Skills
- Curious and eager to learn with a proactive approach to problem-solving.
- Good communication skills — able to collaborate across technical and non-technical teams.
- Attention to detail and a strong sense of data quality.
- Comfortable working in a collaborative, fast-paced, and remote team environment.
Preferred Additional Requirements
- Experience with Databricks or Azure Synapse Analytics.
- Familiarity with D365 CRM or similar data structures.
- Exposure to Git-based workflows and CI/CD practices for data pipeline deployments.
- Experience in a humanitarian, nonprofit, or international development context.
Responsibilities
Pipeline Engineering & Orchestration
- Build and maintain ELT data pipelines using Databricks Workflows and Azure Data Factory for batch and scheduled processing from internal and external sources.
- Support the ingestion of data from key systems (e.g., D365 CRM, ServiceNow) into the Lakehouse.
- Monitor pipeline execution, identify failures, and troubleshoot issues in collaboration with senior engineers.
- Contribute to pipeline documentation and help maintain runbooks and process standards.
dbt Development
- Develop and maintain dbt models across staging, intermediate, and mart layers under the guidance of senior team members.
- Write dbt tests and contribute to source freshness checks to support data quality.
- Learn and apply dbt best practices including modular design, ref dependencies, and incremental model patterns.
- Work with analysts and business teams to translate data requirements into dbt models.
SQL & Data Transformation
- Write intermediate to advanced SQL for data extraction, transformation, and validation tasks.
- Apply SQL techniques including joins, CTEs, window functions, and aggregations to support reporting and analytics needs.
- Assist in query optimization and performance troubleshooting within Databricks SQL environments.
- Support data model maintenance and help accommodate new source fields or schema changes.
Databricks & Cloud Platform
- Develop and maintain Databricks notebooks and jobs for data transformation workloads.
- Gain hands-on experience with Delta Lake concepts and PySpark for data processing.
- Follow Lakehouse design patterns (Bronze/Silver/Gold) as defined by the Data Architect.
- Support cloud resource management including basic cluster configuration and job scheduling.
Collaboration & Learning
- Actively collaborate with the Data Team on pipeline design, troubleshooting, and delivery.
- Participate in code reviews and incorporate feedback to improve code quality.
- Support documentation of processes, standards, and data flows.
- Engage with Finance, FP&A, and other business teams to understand data needs and assist in solution delivery.