Senior Data Engineer (Microsoft Fabric Engineer)
Weekday AI
Posted: March 9, 2026
Quick Summary
This Senior Data Engineer will design, build, and scale modern cloud-native data platforms using Microsoft Fabric and Azure data technologies.
Job Description
This role is for one of Weekday's clients.
Min Experience: 5 years
Location: Remote (India)
Job Type: Full-time
We are seeking an experienced Senior Data Engineer (Microsoft Fabric Engineer) to design, build, and scale modern cloud-native data platforms. This role focuses on developing robust ETL/ELT pipelines, data architectures, and high-performance data engineering solutions using Microsoft Fabric and Azure data technologies.
The ideal candidate will combine strong architectural thinking with hands-on engineering expertise to build scalable data pipelines, support advanced analytics, and collaborate with machine learning teams on AI-driven data workflows. The role requires deep experience with Databricks, Spark, Delta Lake, Python, and SQL, along with modern data orchestration and governance practices.
Key Responsibilities
Data Architecture & Platform Design
• Design and implement scalable cloud-native data architectures using Microsoft Fabric and Azure data services.
• Define best practices for data governance, architecture standards, and platform scalability.
• Establish architecture standards covering naming conventions, layering (e.g., bronze/silver/gold), and environment promotion.
• Build robust data models and data warehouse architectures to support analytics and AI workloads.
ETL/ELT Pipeline Development
• Design and develop high-performance ETL and ELT pipelines for large-scale data processing.
• Build and maintain data pipelines using Python and SQL to process and transform complex datasets.
• Ensure reliability, scalability, and performance optimization across data workflows.
Data Engineering & Platform Development
• Develop and manage data engineering workflows using Databricks, Spark, and Delta Lake.
• Implement data ingestion frameworks and support large-scale data processing environments.
• Optimize data pipelines for performance, reliability, and cost efficiency.
Orchestration & Automation
• Design workflow orchestration using tools such as Airflow or Azure-native orchestration services.
• Automate data processing pipelines and maintain operational reliability across systems.
AI & Advanced Data Workflows
• Collaborate with machine learning teams to support LLM, NLP, and AI-driven data workflows.
• Enable feature engineering and data pipelines that support advanced analytics and AI models.
Governance & Best Practices
• Establish best practices for data architecture, pipeline management, documentation, and security.
• Ensure compliance with enterprise data governance and quality standards.
Required Skills & Experience
• 4+ years of experience in data engineering, data architecture, or ETL development.
• Hands-on experience with Microsoft Fabric data engineering capabilities.
• Strong expertise in ETL/ELT development and data pipeline design.
• Experience working with Databricks, Apache Spark, and Delta Lake.
• Strong programming skills in Python and SQL.
• Experience building scalable data platforms on Azure cloud environments.
• Knowledge of data warehousing, data modeling, and large-scale data processing.
• Familiarity with LLM/NLP workflows or AI-driven data pipelines is an advantage.
• Bachelor’s degree in Computer Science, Information Technology, or related field preferred.
Key Skills
• Microsoft Fabric
• ETL / ELT
• Data Engineering
• Data Warehousing
• Data Pipelines
• Azure Data Lake
• Data Management
• Data Architecture
• Azure Data Factory
• Databricks / Spark / Delta Lake