Full-Stack Data Engineer (SA Remote)
NIVA Health
Posted: February 10, 2026
Quick Summary
A Full-Stack Data Engineer at NIVA Health builds data solutions end-to-end, working across the full data lifecycle: ingesting data, cleaning it up, building pipelines, and delivering dashboards and usable insights.
Job Description
This role is for you if you enjoy building real data solutions end-to-end — not just one piece of the puzzle.
At NIVA Health, we’re growing our data capability and looking for a Full-Stack Data Engineer who’s comfortable working across the full data lifecycle: from pulling data in and cleaning it up, through building pipelines, all the way to dashboards and usable insights.
You won’t be boxed into a single lane.
You’ll work on the back end and the front end of data, alongside a collaborative team, solving practical problems that impact healthcare operations every day.
What you’ll be working on
You’ll help design, build, and maintain data solutions that power reporting and decision-making across the business.
That includes:
• Building and maintaining data pipelines using Google Cloud Platform (BigQuery, Cloud Functions, Cloud Composer, Cloud Scheduler).
• Cleaning, transforming, and organising data from multiple sources (APIs, spreadsheets, internal systems).
• Automating ETL / ELT workflows to improve reliability and efficiency.
• Writing Python (and some Bash) scripts to support data processing and internal tools.
• Building and maintaining dashboards and KPI reports using Looker Studio (and supporting data visualisation needs).
• Preparing datasets for simple predictive or forecasting use cases as the team evolves.
This is a hands-on role: you’ll be writing code, fixing issues, improving pipelines, and seeing your work used by real teams.
Requirements:
You’ll be a great fit if:
• You have 2+ years’ experience in data engineering, analytics engineering, data science, or software engineering.
• You’re comfortable working with GCP, especially BigQuery.
• You use Python confidently for data processing and automation.
• You have solid SQL skills and understand data modelling basics.
• You’ve built or maintained data pipelines before (batch or streaming).
• You’ve worked with dashboards or BI tools (Looker / Looker Studio preferred).
• You enjoy working across both technical backend tasks and user-facing reporting.
Nice to have
• Experience with Apache Airflow / Cloud Composer.
• Exposure to Apache Beam.
• Familiarity with Vertex AI, AutoML, or basic ML workflows.
• Experience supporting operational or healthcare data.
Salary
$1500 - $1800
Final offer will depend on experience and technical depth.