Senior/Principal Data Engineer
SigmaSoftware2
Posted: March 3, 2026
Quick Summary
Design and implement scalable, future-proof data infrastructure for a fast-growing AI-driven SaaS platform.
Job Description
We are looking for a Senior/Principal Data Engineer to join our team and design scalable, future-proof data infrastructure.
CUSTOMER
Our customer is a rapidly scaling AI‑driven SaaS platform that helps finance and accounting teams automate critical workflows — from billing and collections to revenue recognition and reporting. Their technology eliminates manual work, accelerates cash flow, and ensures compliance for high‑growth businesses.
PROJECT
This greenfield project focuses on building a robust data platform that will power analytics, business intelligence, and AI‑driven features. You will design scalable data lakes or lakehouses, develop ingestion pipelines, and ensure data quality and observability. The platform will integrate with multiple internal and external systems, enabling self‑service analytics for both business and technical teams.
RESPONSIBILITIES
• Design and implement a scalable data warehouse or data lakehouse to support analytics, reporting, and business KPIs
• Develop and maintain reliable batch and/or streaming data pipelines from internal databases and external systems
• Collaborate with stakeholders to translate business requirements into efficient data models and schemas
• Establish and maintain data modeling standards and best practices
• Implement monitoring, data quality controls, and observability for all data workflows
• Provide well‑structured datasets to enable self‑service analytics for BI and data teams
• Document the data platform, including lineage, definitions, and contracts, to create a shared source of truth for metrics
REQUIREMENTS
• 6+ years of experience in data engineering
• Strong programming skills in Python
• Proven track record of designing and delivering early‑stage data platforms from concept (v1) to production
• Strong expertise with modern data tooling (e.g., Snowflake/BigQuery/Redshift, dbt, Airflow/Dagster/Prefect, Fivetran/Airbyte)
• Solid understanding of data modeling, ETL/ELT, and pipeline optimization
• Strong knowledge of data quality, testing, and monitoring best practices
• Upper‑Intermediate English or higher
WILL BE A PLUS
• Experience with AI/ML data pipelines
• Familiarity with finance/accounting datasets
• Knowledge of compliance frameworks such as SOX or GDPR
• Experience mentoring junior engineers
PERSONAL PROFILE
• Proactive problem‑solver with a hands‑on approach
• Adaptable to fast‑moving environments
• Strong communication skills for cross‑team collaboration
• Ability to take ownership and drive initiatives to completion