Senior Data Engineer
GSSTech Group
Posted: March 31, 2026
Quick Summary
Design, build, and optimize scalable data pipelines to enable advanced analytics and machine learning initiatives in a fast-paced Agile environment.
Job Description
We are seeking a highly skilled Senior Data Engineer to join our Data Engineering team. The ideal candidate will play a critical role in designing, building, and optimizing scalable data pipelines and enabling advanced analytics and machine learning initiatives.
You will work closely with Data Scientists, Analytics Delivery Leads, and cross-functional teams to transform raw data into actionable insights in a fast-paced Agile environment.
Key Responsibilities
• Collaborate with stakeholders to gather and analyze data requirements
• Perform Exploratory Data Analysis (EDA) to understand data patterns and quality
• Design and develop robust, scalable, and high-performance data pipelines
• Ingest, process, and transform large-scale structured and unstructured datasets
• Implement feature engineering techniques to support machine learning models
• Optimize Spark jobs for performance, scalability, and cost efficiency
• Ensure data quality, integrity, and security across pipelines
• Work closely with Data Scientists and Analytics teams to deploy ML pipelines
• Participate in Agile ceremonies and contribute to continuous improvement
• Communicate technical solutions effectively to both technical and non-technical stakeholders
Required Skills & Qualifications
• Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
• 8–10 years of experience in Data Engineering and Big Data ecosystems
• Strong programming expertise in Python
• Hands-on experience with PySpark / Apache Spark, including performance tuning
• Solid understanding of the Hadoop ecosystem
• Advanced proficiency in SQL
• Experience with data pipeline development and ETL frameworks
• Familiarity with Machine Learning pipelines and feature engineering
• Experience with version control systems (Git)
• Strong problem-solving and analytical skills
Good to Have
• Experience with cloud platforms (AWS / Azure / GCP)
• Knowledge of data warehousing solutions
• Exposure to workflow orchestration tools (Airflow, etc.)
• Experience working in the banking or financial services domain
Key Competencies
• Strong collaboration and communication skills
• Ability to work in a fast-paced Agile environment
• Ownership mindset and attention to detail
• Stakeholder management and coordination