Data Quality Automation Engineer
Nix
Posted: April 17, 2026
Quick Summary
The Data Quality Automation Engineer will be responsible for developing and maintaining complex software solutions to improve data quality and efficiency, with a focus on cloud-based systems.
Job Description
About the project:
The Client provides comprehensive operational support and a range of expert services to the world’s leading insurers, brokers, fleet managers, and automotive manufacturers. Its 3,300 employees across ten countries deliver exceptional standards at scale for more than 1,200 clients, helping the global insurance market handle millions of claims each year in the most cost-effective and efficient way possible.
The Client is embarking on an ambitious and challenging transformation program, and our software solutions, built on cloud computing and leading-edge design patterns, are a driving force behind this strategy.
Key Responsibilities
• Define and implement data quality rules across ingestion, transformation, and reporting layers
• Validate data in Databricks-based pipelines
• Monitor and test Databricks transformations (PySpark/SQL) for correctness and completeness
• Ensure Databricks / Power BI reports reflect accurate and reconciled data
• Set up data validation checks (schema, nulls, duplicates, ranges, referential integrity)
• Identify, log, and track data quality issues with root cause analysis
• Collaborate with data engineers and analysts to fix issues
• Build automated data quality monitoring and alerts
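The validation checks listed above (schema, nulls, duplicates, ranges) can be sketched in a framework-agnostic way. The following plain-Python example is a minimal illustration of these rule types; in the role itself they would run as PySpark/SQL jobs inside Databricks, and all names here (columns, helper functions, sample rows) are hypothetical.

```python
# Minimal, framework-agnostic sketch of the data quality checks listed
# above (schema, nulls, duplicates, ranges). In practice these would be
# PySpark/SQL jobs in Databricks; all names here are hypothetical.

EXPECTED_SCHEMA = {"claim_id": int, "amount": float, "status": str}

def check_schema(row):
    """Every expected column is present with the expected type."""
    return all(isinstance(row.get(col), typ) for col, typ in EXPECTED_SCHEMA.items())

def check_nulls(rows, column):
    """Return rows where a required column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_duplicates(rows, key):
    """Return key values that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        (dupes if r[key] in seen else seen).add(r[key])
    return dupes

def check_range(rows, column, lo, hi):
    """Return rows whose value falls outside [lo, hi]."""
    return [r for r in rows if not (lo <= r[column] <= hi)]

rows = [
    {"claim_id": 1, "amount": 120.0, "status": "open"},
    {"claim_id": 2, "amount": -5.0, "status": "open"},     # range violation
    {"claim_id": 2, "amount": 300.0, "status": "closed"},  # duplicate id
]

assert all(check_schema(r) for r in rows)
assert check_nulls(rows, "status") == []
assert check_duplicates(rows, "claim_id") == {2}
assert len(check_range(rows, "amount", 0.0, 1_000_000.0)) == 1
```

Each check returns the offending rows or keys rather than a bare pass/fail, which makes it straightforward to log issues for root-cause analysis and wire the results into automated alerts.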
Required Skills
• 4-5+ years of relevant work experience in data analysis, quality assurance, data governance, or a similar field is highly desirable
• Strong knowledge of Databricks / Spark (SQL, PySpark)
• Understanding of ETL/ELT pipelines and data transformations (dbt)
• Experience validating BI/reporting outputs (Power BI preferred)
• SQL proficiency for data validation and reconciliation
• Familiarity with data quality frameworks/tools (e.g., Great Expectations) is a plus
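The SQL proficiency for validation and reconciliation mentioned above typically comes down to a handful of query patterns. Here is a self-contained sketch using Python's built-in sqlite3 module; the tables, columns, and sample values are illustrative assumptions, not from the posting.

```python
import sqlite3

# Illustrative sketch of two common SQL validation patterns:
# referential-integrity checks (orphaned foreign keys) and
# row-count reconciliation. All table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE policies (policy_id INTEGER PRIMARY KEY);
    CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, policy_id INTEGER);
    INSERT INTO policies VALUES (10), (11);
    INSERT INTO claims VALUES (1, 10), (2, 11), (3, 99);  -- 99 has no policy
""")

# Referential integrity: claims pointing at no existing policy.
orphans = cur.execute("""
    SELECT c.claim_id
    FROM claims c
    LEFT JOIN policies p ON p.policy_id = c.policy_id
    WHERE p.policy_id IS NULL
""").fetchall()

# Reconciliation: compare the source row count against what a report layer
# would show (here, just the same table for illustration).
source_count = cur.execute("SELECT COUNT(*) FROM claims").fetchone()[0]

print(orphans)       # [(3,)]
print(source_count)  # 3
```

The same LEFT JOIN / IS NULL and COUNT-comparison patterns carry over directly to Databricks SQL when reconciling pipeline layers against Power BI report outputs.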
Nice to Have
• Experience with AWS data stack
• Experience with data governance or data catalog tools
• Exposure to CI/CD for data pipelines
• Knowledge of data lineage and observability tools
Success Criteria
• Reduced data defects in pipelines and reports
• Automated data quality checks are in place
• Clear visibility and tracking of data issues
We offer*:
• Flexible working format: remote, office-based, or a combination of both
• A competitive salary and good compensation package
• Personalized career growth
• Professional development tools (mentorship program, tech talks and training sessions, centers of excellence, and more)
• Active tech communities with regular knowledge sharing
• Education reimbursement
• Memorable anniversary presents
• Corporate events and team-building activities
• Other location-specific benefits
*not applicable for freelancers