Lead Data Engineer with Google Cloud (GCP)
Deutsche Telekom IT Solutions Slovakia
Posted: May 6, 2026
Quick Summary
We are looking for a Lead Data Engineer to join our team in Košice, Slovakia. The successful candidate will be responsible for designing, developing, and maintaining large-scale data pipelines and architectures in a fast-paced environment. The ideal candidate should have hands-on experience in data engineering and a strong understanding of cloud-based technologies.
Job Description
Our brand, Deutsche Telekom IT Solutions Slovakia, entered the life of the Košice region in 2006 under the name T-Systems Slovakia and has been inextricably linked with the region ever since, becoming one of the founding members of Košice IT Valley. We have grown from scratch into the second-largest employer in the eastern part of the country, with more than 3,900 employees. Our goal is to proactively find new ways to improve and to continuously transform into a company providing innovative information and communication technology services.
Purpose
As a lead data engineer, your purpose is to manage and oversee the entire data engineering process, from data acquisition to data analysis. You will be responsible for designing, building, and maintaining the data infrastructure necessary for effective data management and analysis. You will work closely with data scientists, analysts, and other stakeholders to ensure the accuracy, reliability, and accessibility of data. Your role will also involve mentoring and leading a team of data engineers, setting technical direction, and ensuring the successful delivery of data engineering projects.
About team/project
You will join a newly formed team working on a greenfield data platform, building a modern data product from the ground up. The team is co-located, enabling strong collaboration, fast decision-making, and a supportive environment. You will have real impact on architecture and ways of working from day one. The project leverages modern GCP technologies such as Google BigQuery, Google Cloud Dataflow, and Google Cloud Dataproc, offering interesting technical challenges without legacy constraints.
WHAT WILL YOU DO?
• Design, build, and maintain scalable data pipelines using GCP services such as Google BigQuery, Google Cloud Dataflow, Google Cloud Storage, and Google Cloud SQL.
• Develop and optimize ETL/ELT workflows for transforming and loading large datasets efficiently (see the sketch after this list).
• Work with SQL and Python (including PySpark, Pandas, and NumPy) to process and analyze data.
• Build and maintain distributed data processing solutions using Spark and data parallelism principles.
• Collaborate with data scientists, engineers, and stakeholders to understand data requirements and deliver reliable solutions.
• Implement security best practices, including role-based access control, IAM policies, and data encryption.
• Monitor, troubleshoot, and optimize system performance, identifying bottlenecks and improving efficiency.
• Contribute to data architecture and data modeling (e.g., star and snowflake schemas) where needed.
• Support best practices in version control and CI/CD pipelines (e.g., Git-based workflows).
• Mentor and guide junior data engineers, providing technical and professional development opportunities to help them grow in their careers.
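To give a concrete flavour of this kind of work, below is a minimal, illustrative PySpark sketch of an ETL pipeline that reads raw files from Cloud Storage and loads them into BigQuery. It assumes a Dataproc cluster where the GCS and spark-bigquery connectors are available; the bucket, dataset, and column names are hypothetical, not part of the role description.

# Minimal, illustrative sketch only: bucket, dataset, and column names are
# hypothetical. Assumes a Dataproc cluster with the GCS and spark-bigquery
# connectors available.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

# Extract: read raw CSV files from a Cloud Storage landing zone.
raw = spark.read.csv("gs://example-landing-bucket/orders/",
                     header=True, inferSchema=True)

# Transform: basic cleansing plus a derived date column.
orders = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("created_at"))
)

# Load: write to a BigQuery table via the connector's temporary
# GCS staging bucket.
(orders.write.format("bigquery")
       .option("table", "example_dataset.orders")
       .option("temporaryGcsBucket", "example-staging-bucket")
       .mode("append")
       .save())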
YOU WILL SUCCEED IF YOU:
• Have experience in data engineering and data pipeline development
• Have advanced technical skills in SQL
• Have advanced skills in Python (including PySpark, Pandas, and NumPy)
• Have advanced knowledge of GCP data services (e.g., Google BigQuery, Google Cloud Dataflow, Google Cloud Storage, Google Cloud SQL, Google Cloud Dataproc)
• Have advanced experience with ETL/ELT pipelines
• Have advanced experience with Spark and distributed data processing
• Have analytical and problem-solving skills, communicate and present effectively, collaborate across functions, and have the ability to innovate
• Have other skills: team management, project management, agile methodology
• Speak English at an advanced level (C1)
• Have experience with data warehousing and data modeling (star/snowflake schemas; see the sketch after this list)
• Have knowledge of cloud architecture and designing scalable data systems
• Have experience with performance tuning and optimization of data systems
• Understand security best practices (IAM, role-based access control, encryption, Git-based workflows)
• Speak German (an advantage)
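Since star and snowflake schemas appear in both the responsibilities and the requirements, here is a small, hypothetical PySpark sketch of a typical star-schema query: a fact table resolved against two dimension tables via surrogate keys, then aggregated. All table and column names are illustrative and assume the tables are registered in the cluster's metastore.

# Hypothetical star-schema sketch; table and column names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star-schema-demo").getOrCreate()

fact_sales = spark.table("dw.fact_sales")      # grain: one row per sale
dim_customer = spark.table("dw.dim_customer")  # surrogate key: customer_sk
dim_date = spark.table("dw.dim_date")          # surrogate key: date_sk

# Revenue per country and month: join the fact table to its dimensions
# on surrogate keys, then aggregate.
revenue = (
    fact_sales
    .join(dim_customer, "customer_sk")
    .join(dim_date, "date_sk")
    .groupBy("country", "year", "month")
    .agg(F.sum("amount").alias("revenue"))
)
revenue.show()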
WHY SHOULD YOU CHOOSE US?
We believe in a balance between work and personal life. An attractive and extensive work-life balance portfolio guarantees lasting motivation for employees and thus a better quality of life, promotes physical and mental well-being, and contributes to a positive work environment, all with the aim of providing more freedom in reconciling work, career growth, private life, and individual lifestyle. We therefore offer our employees more than 25 different benefits to improve their personal and professional lives in these areas:
• Financial benefits
• Benefits with a focus on learning and development
• Benefits with a focus on health and sport
• Benefits with a focus on family and work-life balance
• Other benefits
For more information about our benefits, see Benefits.
Salary
Final salary is negotiable.
We offer a base salary depending on the seniority level and previous experience of the candidate. In addition to the base salary, we provide a variable component and other financial benefits. The base salary will not be lower than €3,500 gross.
Additional information
* Please be informed that remote working is only possible within Slovakia due to European taxation regulations.