AWS Data Engineer
Weekday AI
Posted: March 17, 2026
Quick Summary
The AWS Data Engineer will design, develop, and manage scalable data pipelines and analytics infrastructure in a cloud-native environment, using AWS-managed services while improving data performance, security, and observability across systems.
Job Description
This role is for one of Weekday's clients.
Min Experience: 4 years
Location: Navi Mumbai
Job Type: Full-time
Role Overview:
The AWS Data Engineer designs, develops, and manages scalable data pipelines and analytics infrastructure in a cloud-native environment. This engineer will architect complex ETL processes using AWS-managed services, optimize data performance, and uphold data quality, security, and observability across systems. The ideal candidate has deep AWS knowledge, substantial experience in ETL design, and a solid grasp of modern data engineering practices.
Requirements:
Key Responsibilities:
• Design and execute comprehensive ETL workflows using AWS services such as Glue, Lambda, Step Functions, EMR, Redshift, Kinesis, and S3.
• Develop and sustain data ingestion pipelines that source from structured, semi-structured, and streaming data.
• Design and uphold data lake and data warehouse solutions (S3, Redshift, Lake Formation).
• Create transformation logic using PySpark, SQL, or Python, ensuring both performance and data integrity.
• Coordinate workflows through AWS Glue Workflows, Apache Airflow, or Step Functions.
• Implement data quality validation measures, monitoring systems, and automated alerts to ensure pipeline health.
• Work collaboratively with data scientists, analysts, and application engineering teams to guarantee data accessibility and alignment with analytical use cases.
• Adhere to data governance and security standards (IAM, encryption, GDPR/HIPAA where applicable).
• Contribute to data architecture reviews, sharing insights on best practices for reliability and scalability.
• Document all data flows, transformations, and pipeline specifications to ensure reproducibility and facilitate audits.
Required Technical Skills:
• Strong programming foundation in Python and SQL.
• Proficiency with AWS data services: Glue, Redshift, EMR, S3, RDS, Lambda, Kinesis, CloudWatch, and CloudFormation.
• In-depth understanding of ETL/ELT design patterns, including incremental loads and change data capture (CDC).
• Familiarity with data modeling techniques (Star/Snowflake schemas) and data lakehouse architectures.
• Experience with managing large-scale or real-time datasets.
• Knowledge of data quality frameworks and tools for data observability.
• Familiarity with DevOps practices and CI/CD workflows using Git, CodePipeline, or Terraform.
• Comprehensive knowledge of data security practices within AWS (IAM roles, encryption, network isolation).
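To illustrate the incremental-load pattern referenced above (one common ETL/ELT design pattern alongside CDC), here is a minimal, library-free sketch of a high-water-mark load. All names (`incremental_load`, the `updated_at` field, the dict-backed target) are illustrative, not part of the role or of any AWS API; in practice this logic would typically run in Glue or PySpark against S3/Redshift.

```python
from datetime import datetime, timezone

def incremental_load(source_rows, target, watermark):
    """High-water-mark pattern: copy only rows whose `updated_at` is
    newer than the last processed timestamp, then advance the watermark.
    Illustrative sketch only; identifiers are hypothetical."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    for row in new_rows:
        target[row["id"]] = row  # upsert keyed on the primary key
    # Advance the watermark to the newest row seen; keep it unchanged
    # if this run found nothing new.
    return max((r["updated_at"] for r in new_rows), default=watermark)

# Two runs against a changing source table: the second run picks up
# only the row added after the first run's watermark.
source = [
    {"id": 1, "updated_at": datetime(2026, 3, 1, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2026, 3, 5, tzinfo=timezone.utc)},
]
target = {}
wm = datetime.min.replace(tzinfo=timezone.utc)
wm = incremental_load(source, target, wm)  # loads rows 1 and 2
source.append({"id": 3, "updated_at": datetime(2026, 3, 9, tzinfo=timezone.utc)})
wm = incremental_load(source, target, wm)  # loads only row 3
```

The same idea underlies Glue job bookmarks and CDC-based pipelines: track how far you have processed, and touch only the delta on each run.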
Desired Skills:
• Practical experience with Snowflake, Databricks, or Athena.
• Understanding of BI/analytics tools (QuickSight, Power BI, Tableau).
• AWS certifications such as AWS Certified Data Engineer – Associate or AWS Certified Data Analytics – Specialty.
• Strong analytical and communication abilities to convert business data requirements into engineering solutions.
Educational Requirements:
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
• Preferred: AWS Data Engineering or Data Analytics certification.