Full Stack Data Engineer
Aristo Sourcing
Posted: April 30, 2026
Quick Summary
We are seeking a Full Stack Data Engineer to build and maintain robust data pipelines and transformations. The ideal candidate will have excellent analytical skills, experience with data science tools, and a passion for delivering data-driven insights.
Job Description
Our Client is hiring a Full‑Stack Data Engineer to strengthen their data foundation and support growing reporting needs across the organization. This is a hands‑on technical role for someone who thrives across the entire data lifecycle, from building pipelines and transformations to delivering user‑facing dashboards and predictive insights.
You will join a collaborative data team, working closely with engineering and analytics colleagues to ensure reliable data ingestion, efficient workflows, and clear reporting outputs that empower operational and clinical leadership.
Key Responsibilities
Data Engineering
• Design, build, and maintain data pipelines using GCP tools (BigQuery, Cloud Functions, Cloud Composer, Cloud Scheduler, Apache Beam, Airflow).
• Clean, transform, and organize data from multiple sources.
• Automate ETL/ELT workflows for reliability and scalability.
• Support ingestion from APIs, spreadsheets, and internal systems.
Backend Development
• Write Python and Bash scripts to process and automate data tasks.
• Develop lightweight backend services and utilities to streamline internal processes.
Front‑End / Dashboards
• Build and update dashboards in Looker Studio and D3.js.
• Deliver clean, intuitive KPI reports for operations and leadership.
• Support visualization needs across the Wellness Division.
Foundational ML / Predictive Work
• Contribute to simple predictive modeling and forecasting tasks.
• Prepare structured datasets for future machine learning initiatives.
Qualifications
Must‑Have:
• 2+ years in data engineering, data science, or software engineering.
• Strong experience with GCP, including:
  • BigQuery
  • Cloud Functions / Cloud Run
  • Apache Beam & Airflow
  • Looker / Looker Studio Pro
  • Vertex AI (AutoML, LLM engineering)
• Advanced Python (data processing, APIs, automation).
• Experience building end‑to‑end pipelines (batch + streaming preferred).
• Strong SQL skills for transformations and modeling.
• Proven ability to develop dashboards, KPIs, and BI outputs.
• Solid understanding of modern data architectures (lakehouse, warehousing, governance).
Nice‑to‑Have:
• Exposure to healthcare or multi‑location environments.
• Experience with EMR systems or similar platforms.
• Familiarity with predictive analytics and ML workflows.
Benefits:
• Location: South Africa (Remote)
• Compensation: $1,500-$1,800
• Working hours: US Time Zone