Senior Data Engineer (Data CoE)
Sigma Software
Posted: February 2, 2026
Quick Summary
We are seeking a Senior Data Engineer to join our team in Bucharest, Romania. The ideal candidate will have expertise in building scalable, secure, and high-performance data solutions and be passionate about innovation.
Job Description
Are you a Senior Data Engineer passionate about building scalable, secure, and high-performance data solutions? Join our Data Engineering Center of Excellence at Sigma Software and work on diverse projects that challenge your skills and inspire innovation.
At Sigma Software, we value expertise, continuous learning, and a supportive environment where your career path is shaped around your strengths. You’ll be part of a collaborative team, gain exposure to cutting-edge technologies, and work in an inclusive culture that fosters growth and innovation.
PROJECT
Our Data Engineering Center of Excellence (CoE) is a specialized unit focused on designing, building, and optimizing data platforms, pipelines, and architectures. We work across diverse industries, leveraging modern data stacks to deliver scalable, secure, and cost-efficient solutions.
RESPONSIBILITIES
• Research new technologies and design complex, secure, scalable, and reliable solutions, focusing on ETL process enhancement
• Work with the modern data stack to deliver well-designed technical solutions
• Implement data governance practices
• Collaborate effectively with customer teams
• Take ownership of major solution components and their delivery
• Participate in requirements gathering and propose architecture approaches
• Lead data architecture implementation
• Develop core modules and scalable systems
• Conduct code reviews and write unit/integration tests
• Scale distributed systems and infrastructure
• Build/enhance data platforms leveraging AWS or Azure
REQUIREMENTS
• 5+ years of experience with Python and SQL
• Hands-on experience with AWS services (API Gateway, Kinesis, Athena, RDS, Aurora)
• Proven experience building ETL pipelines for analytics/internal operations
• Experience developing and integrating APIs
• Solid understanding of Linux OS
• Familiarity with distributed applications and DevOps tools
• Strong troubleshooting/debugging skills
• English level: Upper-Intermediate
WILL BE A PLUS
• 2+ years with Hadoop, Spark, or Airflow
• Experience with DAGs/orchestration tools
• Experience with Snowflake-based data warehouses
• Experience developing event-driven data pipelines
PERSONAL PROFILE
• Strong communication skills
• Interest in dynamic, research-focused environments
• Passion for innovation and continuous improvement