Data Quality Engineer
Confidential
Posted: January 30, 2026
Job Description
About Gradera — Digital Twin & Physical AI Platform
At Gradera, we are building a next-generation Digital Twin and Physical AI platform that enables enterprises to model, simulate, and optimize complex real-world systems. Our work brings together strategy, architecture, data, simulation, and experience design to power decision-making across large-scale operational environments such as manufacturing, logistics, and supply chain networks.
This platform-led initiative applies AI-native execution, advanced simulation, and governed orchestration to help organizations test scenarios, predict outcomes, and continuously improve performance. We operate with an enterprise-first mindset, prioritizing reliability, transparency, and measurable business impact as we build intelligent systems that scale beyond a single industry or use case.
Overview
We are seeking a detail-oriented Data Quality Engineer to ensure the integrity, accuracy, and reliability of data powering our digital twin and AI platforms. You will design and implement data quality frameworks, build automated validation pipelines, and establish quality metrics that enable trusted, simulation-ready data products. This role is critical to ensuring that operational decisions and ML models are built on a foundation of high-quality, governed data.
Our core data quality stack includes:
Data Quality Frameworks
Delta Live Tables expectations for declarative quality enforcement
Great Expectations for comprehensive data validation
Databricks data profiling and quality monitoring
Platform & Tools
Databricks SQL and PySpark for quality checks at scale
Unity Catalog for lineage tracking and governance compliance
Python for custom validation logic and anomaly detection
Observability
Quality metrics dashboards and alerting
Data profiling and statistical analysis
Anomaly detection and drift monitoring
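To illustrate the kind of custom validation logic this role involves, here is a minimal sketch of rule-based quality metrics in plain Python. The field names, sample records, and thresholds are hypothetical, not Gradera's actual rules; production checks would typically run at scale via PySpark or Great Expectations.

```python
# Minimal sketch of rule-based data quality metrics in plain Python.
# Field names, records, and the plausibility range are illustrative only.

def completeness(records, field):
    """Fraction of records where `field` is present and non-null."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def validity(records, field, predicate):
    """Fraction of non-null values of `field` that satisfy `predicate`."""
    values = [r[field] for r in records if r.get(field) is not None]
    if not values:
        return 0.0
    return sum(1 for v in values if predicate(v)) / len(values)

# Hypothetical sensor readings: one null, one physically implausible value.
rows = [
    {"sensor_id": "s1", "temp_c": 21.5},
    {"sensor_id": "s2", "temp_c": None},
    {"sensor_id": "s3", "temp_c": 999.0},
]

temp_completeness = completeness(rows, "temp_c")                     # 2 of 3 filled
temp_validity = validity(rows, "temp_c", lambda v: -50 <= v <= 60)   # 1 of 2 plausible
```

The same pattern generalizes to the other metric dimensions listed below (consistency, timeliness), with each rule expressed as a predicate over records or fields.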
Key Responsibilities
Design and implement data quality frameworks using Delta Live Tables expectations and Great Expectations
Build automated data validation pipelines that enforce quality standards at ingestion and transformation stages
Develop data profiling processes to understand data distributions, patterns, and anomalies
Define and track data quality metrics (completeness, accuracy, consistency, timeliness, validity)
Implement anomaly detection mechanisms to identify data drift and quality degradation
Create quality dashboards and alerting systems for proactive issue identification
Collaborate with data engineers to embed quality checks into ETL/ELT pipelines
Partner with data architects to establish data quality standards and governance policies
Investigate and perform root cause analysis for data quality issues
Document data quality rules, thresholds, and remediation procedures
Support data certification processes for simulation-ready and ML-ready datasets
Drive continuous improvement in data quality practices and tooling
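As a sketch of the anomaly-detection responsibility above, the following compares a new batch's mean against a baseline using a z-score. This is a simplified illustration with hypothetical data; real drift monitoring would typically use richer statistics (e.g. KS tests or population stability index) over governed pipelines.

```python
# Hedged sketch of a simple drift check: flag a batch whose mean deviates
# from a baseline by more than `threshold` standard errors. All names and
# sample values are illustrative, not part of any real pipeline.
import math

def mean_drift_zscore(baseline, batch):
    """Z-score of the batch mean relative to the baseline distribution."""
    n = len(baseline)
    mu = sum(baseline) / n
    var = sum((x - mu) ** 2 for x in baseline) / (n - 1)  # sample variance
    se = math.sqrt(var / len(batch))                      # standard error of batch mean
    batch_mean = sum(batch) / len(batch)
    return (batch_mean - mu) / se

def has_drifted(baseline, batch, threshold=3.0):
    """True if the batch mean is more than `threshold` standard errors away."""
    return abs(mean_drift_zscore(baseline, batch)) > threshold

baseline = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]
stable_batch = [10.0, 10.1, 9.9, 10.2]     # within normal variation
shifted_batch = [12.5, 12.8, 12.4, 12.6]   # clear upward shift
```

In practice such a check would feed the quality dashboards and alerting described above, raising an alert when `has_drifted` returns true for a monitored column.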
Preferred Qualifications
6+ years of experience in data engineering or data quality roles, with 3+ years focused on data quality
Track record of implementing enterprise-scale data quality frameworks
Experience with lakehouse architectures (Delta Lake, Iceberg)
Familiarity with real-time data quality monitoring for streaming pipelines
Experience working in agile, cross-functional teams
Highly Desirable
Experience with data quality for digital twin or simulation platforms
Familiarity with operational state data validation and temporal consistency checks
Experience with graph data quality validation (Neo4j or similar)
Exposure to ML data quality (feature validation, training data quality)
Experience with data observability platforms
Exposure to industrial domains such as Manufacturing, Logistics, or Transportation is a plus
Location: Hyderabad, Telangana
Department: Engineering
Employment Type: Full-Time