Specialist, Data Quality (R-19250)
Dun & Bradstreet
Posted: May 14, 2026
Shape the Future with Dun & Bradstreet
At Dun & Bradstreet, we believe data has the power to create a better tomorrow. As a global leader in business decisioning data and analytics, we help companies worldwide grow, manage risk, and innovate. For over 180 years, businesses have trusted us to turn uncertainty into opportunity. We’re a diverse, global team that values creativity, collaboration, and bold ideas. Are you ready to make an impact and help shape what’s next? Join us! Explore opportunities at dnb.com/careers.
About the role:
You will be part of the team responsible for measuring the quality of our Global Data Cloud. As a Data Quality Engineer, you will take a lead role in executing data quality strategies, automating data quality processes, and collaborating closely with cross-functional teams to ensure the accuracy, consistency, and reliability of our data assets. Your technical skillset will be instrumental in managing the team’s pipelines for data quality monitoring.
Key Responsibilities:
• Execute a comprehensive data quality monitoring strategy which aligns with the organization's Data Quality Standards and business objectives.
• Develop a strong understanding of Dun & Bradstreet’s inventory data.
• Perform baseline data quality monitoring to proactively identify data quality issues.
• Employ advanced data analysis and profiling techniques.
• Liaise with business stakeholders to ensure requirements are clear and documented.
• Automate data quality monitoring solutions and internal processes.
• Create or update data models to ensure that data is stored in an organized structure.
• Utilize Power BI and/or Looker to design, create, connect, and administer dashboards which derive insights from data quality monitoring results.
• Implement a robust data validation framework with automated testing processes.
• Communicate with globally distributed stakeholders using JIRA and Confluence.
• Capture requirements accurately and seek strong understanding of use cases.
• Recommend improvements to data quality team’s internal processes.
• Generate regular reports on data quality metrics.
• Review data to identify patterns or trends that may indicate errors in processing.
• Develop comprehensive documentation of data quality processes, procedures, and findings, and ensure junior members document their work.
• Comply with data governance policies and procedures.
• Stay current with industry best practices and technologies related to data quality.
• Provide guidance and mentorship to junior data quality engineers, fostering their growth and development.
Key Requirements:
• Bachelor’s degree in Business Analytics, Computer Science, Information Technology, or a related field.
• 8+ years of experience and demonstrated in-depth knowledge of data analysis, query languages, data modelling, and the software development life cycle.
• Expertise in SQL (preferably BigQuery).
• Proficiency in Python.
• Familiarity with Airflow, GCP Cloud Composer, and Terraform.
• Agile mindset and a deep understanding of agile project management (Scrum/Kanban).
• Experience in Database design, modelling, and best practices.
• Experience with cloud computing technologies (preferably GCP).
• Experience with Firebase Studio or other application development platforms.
• Experience using AI tools such as Copilot Studio, Gemini Code Assist or Claude Code.
• Ability to mentor & provide guidance to less experienced members of the team.
• Analytical, process improvement and problem-solving skills.
• Strong communication skills and the ability to articulate data issues and solutions.
• Commitment to meet deadlines and uphold the release schedule.
• Experience collaborating across time zones as part of a global team.
• Experience with Microsoft Suite, including Excel, Word, Outlook and Teams.
• Experience with DevOps best practices including CI/CD, automation, monitoring, observability, agile project management, version control, and continuous feedback.
• Experience with data observability tools such as Acceldata or Informatica DQ.
• Experience with XML and JSON data structures.
• Understanding of ETL processes and their impact on data quality.
• Knowledge of Machine Learning, specifically anomaly detection.
• Experience developing agents and/or agentic systems.
All Dun & Bradstreet job postings can be found at https://jobs.lever.co/dnb. Official communication from Dun & Bradstreet will come from an email address ending in @dnb.com.
Notice to Applicants: Please be advised that this job posting page is hosted and powered by Lever, a subsidiary of Employ Inc. Your use of this page is subject to Employ's Privacy Notice and Cookie Policy, which governs the processing of visitor data on this platform.