MisuJob - AI Job Search Platform

IN-Senior Associate_ Databricks Senior Data Engineer_Data and Analytics_Advisory_Bangalore

PwC

Bengaluru Millenia, Permanent

Posted: April 30, 2026

Interested in this position?

Create a free account to apply with AI-powered matching

Quick Summary

Design and develop data infrastructure and systems to enable efficient data processing and analytics for clients.

Job Description

Line of Service

Advisory

Industry/Sector

Not Applicable

Specialism

Data, Analytics & AI

Management Level

Senior Associate

Job Description & Summary

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth.

In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

We are seeking a hands-on Databricks Data Engineer with 5–7 years of experience who can design, build, and optimize scalable data pipelines on the Databricks Lakehouse. The ideal candidate is strong in SQL, Python, and PySpark, understands Delta Lake inside out, and is experienced with production-grade ETL/ELT, orchestration, cost/performance optimization, and data governance (e.g., Unity Catalog).

The candidate should have hands-on knowledge of Azure Data Factory (ADF) and Azure Data Lake Storage (ADLS).

Responsibilities:

You will:

• Build reliable, scalable batch and streaming pipelines using PySpark and Databricks Workflows/Jobs. 

• Implement Delta Lake best practices: schema enforcement/evolution, ACID transactions, time travel, OPTIMIZE/ZORDER, and VACUUM. 

• Develop Delta Live Tables (DLT) or Structured Streaming pipelines with Auto Loader for ingestion. 

• Write robust modular code in Python (packaging, logging, configuration, error handling). 

• Write efficient and scalable SQL queries for data extraction and reporting. 

• Optimize Spark jobs (partitions, joins, bucketing, caching, AQE, broadcast hints, file sizing). 

• Work hands-on with SQL Warehouses, Photon, DBR versions, cluster policies, and pools. 

• Orchestrate Workflows/Jobs: tasks, dependencies, parameters, retries, alerts, schedules. 

• Apply Unity Catalog governance: permissions, data lineage, audit, external locations, shares; catalog-first design. 

• Ensure idempotent, replayable, and incremental pipelines with proper checkpoints and watermarks. 

• Implement data quality checks (expectations in DLT or Great Expectations) and unit/integration tests for pipelines. 

• Ensure data quality, integrity, and governance across the pipeline. 

• Work with CI/CD tools to automate deployment and testing of data workflows. 

• Work with stakeholders to define data SLAs, schemas, and interface contracts. 

• Document pipelines, data sets (data dictionary, lineage), and usage guidelines. 
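The idempotency, checkpoint, and watermark requirements above can be illustrated, independent of any Databricks API, with a minimal incremental-load pattern in plain Python. This is a sketch only; all names (`run_increment`, `source`, `sink`, `checkpoint`) are hypothetical, and a real pipeline would rely on Spark Structured Streaming checkpoints or a Delta `MERGE` instead.

```python
# Sketch of an idempotent, incremental pipeline step: records carry an
# event timestamp, and a persisted watermark ensures each run processes
# only data newer than the last run, so replays do not duplicate output.

def run_increment(source, sink, checkpoint):
    """Process records newer than the stored watermark, then advance it."""
    watermark = checkpoint.get("watermark", 0)
    new_records = [r for r in source if r["ts"] > watermark]
    for r in new_records:
        sink[r["id"]] = r          # keyed upsert keeps the step idempotent
    if new_records:
        checkpoint["watermark"] = max(r["ts"] for r in new_records)
    return len(new_records)

source = [{"id": 1, "ts": 10}, {"id": 2, "ts": 20}]
sink, checkpoint = {}, {}
first = run_increment(source, sink, checkpoint)   # processes both records
second = run_increment(source, sink, checkpoint)  # replay: nothing new to do
```

Because the watermark lives outside the run (here a dict, in practice a checkpoint location), re-executing the same step after a failure is safe, which is the property the pipeline bullets above are asking for.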

 
 
Primary Skills: 

• Databricks Platform 

• PySpark 

• SQL 

• Data Modelling 

• Python (Scripting, Data Manipulation) 

• Azure – ADF, ADLS 

 

Secondary Skills: 

• CI/CD (Git, Jenkins, Azure DevOps, etc.) 

Mandatory skill sets: 

Databricks Certified Data Engineer Associate 

Preferred skill sets: 

Analytical mindset with strong problem-solving ability 

Years of experience required: 

Experience: 5–7 years 

Education Qualification:

Education (if blank, degree and/or field of study not specified)

Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering, Master of Business Administration, Master of Engineering

Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills

Microsoft Azure, Microsoft Azure Analytics Services

Optional Skills

Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Desired Languages (If blank, desired languages not specified)

Travel Requirements

Available for Work Visa Sponsorship?

Government Clearance Required?

Job Posting End Date

May 14, 2026

Why Apply Through MisuJob?

AI-Powered Job Matching: MisuJob uses advanced artificial intelligence to analyze your skills, experience, and career goals. Our matching algorithm compares your profile against thousands of job requirements to find positions where you have the highest chance of success. This saves you hours of manual job searching and ensures you only see relevant opportunities.

One-Click Applications: Once you create your profile, applying to jobs is effortless. Your resume and cover letter are automatically tailored to highlight the most relevant experience for each position. You can apply to multiple jobs in minutes, not hours.

Career Intelligence: Beyond job matching, MisuJob provides valuable career insights. See how your skills compare to market demands, identify skill gaps to address, and understand salary benchmarks for your experience level. Make data-driven decisions about your career path.

Frequently Asked Questions

How do I apply for this position?

Click the "Register to Apply" button above to create a free MisuJob account. Once registered, you can apply with one click and track your application status in your dashboard.

Is MisuJob free for job seekers?

Yes, MisuJob is completely free for job seekers. Create your profile, get matched with jobs, and apply without any cost. We help you find your dream job without any hidden fees.

How does AI matching work?

Our AI analyzes your resume, skills, and experience to understand your professional profile. It then compares this against job requirements using natural language processing to calculate a match percentage. Higher matches mean better fit for the role.

Can I apply to jobs in other countries?

Absolutely. MisuJob features jobs from companies worldwide, including remote positions. Filter by location or look for remote opportunities to find jobs that match your preferences.

Ready to Apply?

Join thousands of job seekers using MisuJob's AI to find and apply to their dream jobs automatically.

Register to Apply