IN_Senior Associate_ PySpark Developer _Data & Analytics _Advisory _Kolkata
PwC
Posted: April 17, 2026
Quick Summary
We are looking for a senior associate to join our data and analytics team in Kolkata, India, where you will work with clients to leverage data to drive insights and make informed business decisions.
Job Description
Line of Service
Advisory
Industry/Sector
Not Applicable
Specialism
Data, Analytics & AI
Management Level
Senior Associate
Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals.
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Key Roles & Responsibilities:
Must have / Primary Skills / Mandatory
· Hands-on experience building data pipelines on Hadoop/Spark ecosystem
· Strong Spark (Scala and/or PySpark), Hive/Impala SQL, and performance tuning
· Working knowledge of Kafka for streaming ingestion and NiFi (or StreamSets) for batch/near-real-time flows
· Experience with Cloudera Manager, YARN/Tez, HDFS, and job orchestration using Oozie/Airflow
· Good understanding of data warehousing concepts (dimensional modelling, partitioning, bucketing)
· Proficiency in Linux/Unix, Shell scripting, Git, and CI/CD (Jenkins/GitLab CI)
· Strong SQL and data modelling for BFSI use cases (lending, liabilities, risk, regulatory reporting)
· Experience in writing technical design documents (HLD/LLD) and unit/integration testing
· Exposure to SDLC/Agile and working in an onsite–offshore model
· Location - Mumbai
· Develop robust Spark jobs (batch and streaming) with unit tests and observability
· Implement ingestion patterns (Kafka/NiFi), data quality checks, and job scheduling
· Analyze and tune SQL/Spark for large-scale datasets
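The data-quality checks called out above can be illustrated with a minimal sketch. This is a pure-Python stand-in for logic that would normally run inside a PySpark ingestion job; the column names, records, and null-rate threshold are all hypothetical:

```python
# Illustrative data-quality gate of the kind a Spark ingestion job might apply
# before publishing a batch. Column names and the threshold are hypothetical.
from typing import Iterable, Mapping

def null_rate(rows: Iterable[Mapping], column: str) -> float:
    """Fraction of rows where `column` is missing or None."""
    rows = list(rows)
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def passes_quality_gate(rows, required_columns, max_null_rate=0.01):
    """Reject the batch if any required column exceeds the null-rate threshold."""
    rows = list(rows)
    return all(null_rate(rows, c) <= max_null_rate for c in required_columns)

batch = [
    {"loan_id": "L1", "amount": 1000.0},
    {"loan_id": "L2", "amount": None},   # record with a missing amount
    {"loan_id": "L3", "amount": 2500.0},
]
print(passes_quality_gate(batch, ["loan_id", "amount"]))  # False: ~33% nulls in amount
```

In a real PySpark pipeline the same check would typically be expressed with DataFrame aggregations over the batch, with a failed gate routing the batch to a quarantine path rather than the publish step.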
Good to have / Secondary Skills / Desired
· Experience with Cloudera Data Platform (CDP) Private Cloud Base/Public Cloud
· Security and governance: Apache Ranger, Atlas; Kerberos; Sentry (legacy)
· Cloud data services – AWS (EMR, Glue, S3), Azure (HDInsight, Synapse, ADLS), or GCP (Dataproc, BigQuery)
· Databricks experience (Spark, Delta Lake) for select workloads
· Containerization and orchestration (Docker/Kubernetes) for micro-batch/ML workloads
· Python for data processing and utilities; familiarity with Scala build tools (sbt/maven)
· Monitoring/observability – Cloudera Manager metrics, Grafana/Prometheus, log aggregation
· Experience with BI consumption patterns and semantic layers for risk/regulatory dashboards
Mandatory skill sets:
ETL Testing
Preferred skill sets:
ETL Testing
Years of experience required:
4–7 Years
Education qualification:
B.E.(B.Tech)/M.E/M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Bachelor of Technology
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills
DevOps
Optional Skills
Accepting Feedback, Active Listening, Analytical Thinking, Applied Macroeconomics, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Geopolitical Forecasting {+ 24 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements
Not Specified
Available for Work Visa Sponsorship?
No
Government Clearance Required?
No
Job Posting End Date
March 12, 2026