D365 Customer Insight Data Engineer
Confidential
Posted: April 29, 2026
Quick Summary
This is a hands-on data engineering role supporting a customer's Dynamics 365 Customer Insights — Data (CID) environment end to end: administering CID itself, operating the Azure Data Lake and .NET pipelines that feed it, and monitoring data quality, refreshes, and exports. It calls for 3–6 years of data engineering experience, production CID administration, strong ADLS Gen2 and SQL skills, and the ability to debug C# pipeline code.
Job Description
JOB SUMMARY
This role is hands-on. You will be the day-to-day engineering pair of hands keeping a customer's CID environment healthy: from the data landing in the lake, through the pipelines that move and shape it, into CID itself, and out to the unified profiles and segments the business consumes.
JOB RESPONSIBILITIES
D365 CID administration & support
Administer Dynamics 365 Customer Insights Data: data sources, unification (match, merge, consolidate), enrichments, segments, measures, and exports.
Support and extend an existing CID setup: onboard new sources, tune unification rules, and manage refresh schedules and environment configuration.
Triage and resolve CID issues raised by the customer’s business and IT teams: failed refreshes, broken segments, profile-count discrepancies, export failures, permission issues.
Manage support tickets raised with Microsoft for CID issues.
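CID's unification engine is configured in the product itself, but the match-and-merge idea behind it can be illustrated with a toy sketch. The sources, field names, and survivorship rule below are illustrative assumptions, not CID's actual behaviour:

```python
# Toy illustration (not CID's actual engine) of match/merge unification:
# records from two sources are matched on a normalised email key, then
# merged so that the first non-empty value for each field wins.
def unify(crm_rows, web_rows, key="email"):
    """Merge two lists of dict records into unified profiles keyed on `key`."""
    profiles = {}
    for source in (crm_rows, web_rows):          # earlier source wins ties
        for row in source:
            k = (row.get(key) or "").strip().lower()
            if not k:
                continue                          # unmatchable record, skip
            merged = profiles.setdefault(k, {})
            for field, value in row.items():
                if merged.get(field) in (None, "") and value not in (None, ""):
                    merged[field] = value
    return profiles

crm = [{"email": "A@X.com", "name": "Ann"}]
web = [{"email": "a@x.com", "name": "", "city": "Perth"}]
profiles = unify(crm, web)
```

In this sketch the CRM record is treated as the preferred source, mirroring the kind of merge-precedence decision a unification rule set encodes.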
Data pipeline support & rectification
Operate and support upstream data pipelines feeding CID, including custom .NET console applications that orchestrate ingestion and transformation steps.
Support data flowing from multiple source systems through Azure Data Lake (ADLS Gen2) into CID: folder and Delta layouts, CDM (Common Data Model) folders, partitioning, and file freshness.
Validate data timeliness (is today's data actually there, on time, and complete?) and data fitness for CID (schema, keys, types, nulls, duplicates), and fix or escalate when it isn't.
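The kinds of fitness checks listed above can be sketched as a small validation pass over a batch of records before they are handed to CID. The field names (`customer_id`, `email`, `updated_at`) and thresholds are illustrative assumptions:

```python
# Hypothetical sketch of pre-CID "data fitness" checks: null keys,
# duplicate keys, missing required fields, and stale timestamps.
from datetime import date, datetime

def fitness_report(rows, key="customer_id", required=("customer_id", "email")):
    """Return a dict of basic quality findings for a list of dict records."""
    seen = set()
    report = {"rows": len(rows), "null_keys": 0, "duplicates": 0,
              "missing_fields": 0, "stale": 0}
    today = date.today()
    for row in rows:
        k = row.get(key)
        if k is None:
            report["null_keys"] += 1
        elif k in seen:
            report["duplicates"] += 1
        else:
            seen.add(k)
        if any(row.get(f) in (None, "") for f in required):
            report["missing_fields"] += 1
        ts = row.get("updated_at")
        if isinstance(ts, datetime) and ts.date() < today:
            report["stale"] += 1          # record older than today's load
    return report
```

In practice a report like this would feed the fix-or-escalate decision: small counts get patched, systemic ones go back to the source team.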
Monitoring, observability & root-cause
Build and maintain monitoring on pipeline runs, refresh status, row counts, and SLAs. Alert before the business notices.
Perform root-cause analysis end-to-end — source → lake → pipeline → CID → consumer — and produce clean write-ups and permanent fixes, not just patches.
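"Alert before the business notices" usually comes down to simple guardrails like the one sketched below: compare today's row count against a trailing baseline. The threshold and the shape of the alert are assumptions, not a prescribed design:

```python
# Illustrative row-count guardrail: flag a pipeline run whose volume
# drops sharply versus its recent history, before a refresh consumes it.
def volume_alert(history, todays_count, min_ratio=0.8):
    """Return an alert string if today's count falls below min_ratio of the
    trailing average, else None. `history` is a list of prior daily counts."""
    if not history:
        return None  # no baseline yet; nothing to compare against
    baseline = sum(history) / len(history)
    if todays_count < min_ratio * baseline:
        return (f"ALERT: {todays_count} rows vs baseline {baseline:.0f} "
                f"(below {min_ratio:.0%} threshold)")
    return None
```

The same pattern extends naturally to refresh durations and SLA deadlines: a cheap comparison against recent history, wired to whatever alerting channel the customer already uses.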
Microsoft Fabric (developing capability)
Work alongside the architecture team on the customer's Fabric roadmap: Lakehouse / OneLake patterns, Dataflows Gen2, notebooks, and the path to consolidating ADLS-based pipelines into Fabric where appropriate.
Customer-facing delivery
Run regular service reviews with the customer’s data and CRM stakeholders, log changes through formal change control, and keep documentation current.
Collaborate with PSH onshore architects and functional consultants across AU and UAE time zones.
WHAT WE’RE LOOKING FOR
3–6 years in a data engineering, data operations, or BI/analytics engineering role.
Hands-on production experience with Dynamics 365 Customer Insights — Data (formerly Customer Insights / CDP): data sources, unification, segments, measures, exports, environment administration. Setup, not just consumption.
Strong Azure Data Lake Storage Gen2 experience — folder structures, CDM folders, Delta / Parquet, access control, ingestion patterns.
Practical ability to read, debug, and modify .NET / C# console applications that orchestrate data pipelines — enough to triage failures, fix logic bugs, and ship small enhancements.
Solid SQL and data modelling fundamentals; comfortable with large data volumes and performance troubleshooting.
Disciplined incident management — clean tickets, clear comms, structured root-cause analysis.
The resilience to work effectively within complex customer environments: managing delays pragmatically, navigating stakeholder complexity, and maintaining momentum without becoming frustrated by organisational friction.
Strongly preferred
Exposure to Microsoft Fabric (Lakehouse, OneLake, Dataflows Gen2, Notebooks, Pipelines). Active learning is fine; production experience is a bonus.
Azure Data Factory / Synapse Pipelines, Azure Functions, Logic Apps, or similar orchestration tooling.
Power BI / Dataverse familiarity — enough to follow data through to consumption.
Experience supporting customers across multiple time zones (AU and/or Middle East a plus).
Nice to have
Microsoft certifications: DP-203 (Data Engineer), MB-260 (Customer Data Platform Specialist), DP-600 (Fabric Analytics Engineer).
Python for data wrangling and notebook work.
Source control and CI/CD for data assets (Azure DevOps, Git).