Data Engineer
Samba TV
Posted: March 25, 2026
Quick Summary
We're seeking a skilled Data Engineer to join our Data Technology team in Warsaw, focused on building and maintaining data pipelines to drive business growth for our clients.
Job Description
Samba TV tracks streaming and broadcast video across the world with our proprietary data and technology. We are on a mission to fundamentally transform the viewing experience for everyone. Our data enables media companies to connect with audiences for new shows and movies, and enables advertisers to engage viewers and measure reach across all their devices. We have an amazing story with a unique perspective on culture formed by a global footprint of data and AI-driven insights.
We are seeking a skilled Data Engineer to join our Data Technology team in Warsaw. Our team builds and maintains the data platform that powers the entire organization — from ingestion and analytics to reporting, from viewership and contextual datasets to scalable applications that enable data-driven decision making. You will contribute to the design, development, and maintenance of our data infrastructure, working primarily with AWS, Databricks, BigQuery, and Snowflake.
At this level, you are a self-sufficient contributor who takes clear ownership of well-scoped pipeline components and features, collaborates effectively with teammates and cross-functional stakeholders, and begins developing the breadth to navigate the full data lifecycle. You bring 2–4 years of hands-on experience, write production-quality code, and are ready to grow toward greater technical autonomy.
What You'll Do:
• Design, build, and maintain scalable data pipelines supporting both internal and external data consumers, using Apache Spark (PySpark), Airflow, Databricks, and BigQuery/Snowflake
• Develop and optimize data transformations for large-scale datasets, applying modern table formats such as Delta Lake and Iceberg
• Own and operate Airflow DAGs and orchestration workflows, ensuring reliable and timely delivery of data products
• Participate in the modernization of data frameworks and integrations across Databricks and BigQuery environments
• Build and integrate data validation and quality assurance tooling using frameworks such as Great Expectations or similar
• Implement monitoring, logging, and alerting for data workflows to ensure production reliability
• Debug and resolve pipeline issues across distributed environments, including cloud storage (AWS S3/GCS), databases, and orchestration tools
• Contribute to the implementation of data governance and access controls using Databricks Unity Catalog
• Collaborate with data scientists, analysts, and software engineers to deliver governed, reusable data assets
• Participate in code reviews, contribute to documentation, and help raise engineering standards within the team
• Identify bottlenecks in the development lifecycle and contribute ideas to improve them
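To give candidates a feel for the data-validation responsibilities above, here is a toy sketch of row-level quality checks in plain Python. It does not use Great Expectations or Spark (a production pipeline at this scale would), and all record fields and check names are hypothetical:

```python
# Toy illustration of data-quality checks, in the spirit of the validation
# tooling described above. Plain Python only; a real pipeline would run
# checks like these via a framework such as Great Expectations on Spark.

def validate_rows(rows, checks):
    """Apply each named check to every row; return (row_index, check_name)
    pairs for the checks that fail."""
    failures = []
    for i, row in enumerate(rows):
        for name, check in checks.items():
            if not check(row):
                failures.append((i, name))
    return failures

# Hypothetical viewership records and expectations.
rows = [
    {"device_id": "a1", "watch_minutes": 42},
    {"device_id": None, "watch_minutes": 13},
    {"device_id": "c3", "watch_minutes": -5},
]
checks = {
    "device_id_not_null": lambda r: r["device_id"] is not None,
    "watch_minutes_non_negative": lambda r: r["watch_minutes"] >= 0,
}

failures = validate_rows(rows, checks)
# failures == [(1, "device_id_not_null"), (2, "watch_minutes_non_negative")]
```

In practice the failing rows would be quarantined and an alert emitted, rather than simply collected.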
Samba TV is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We strive to empower connection with one another, reflect the communities we serve, and tackle meaningful projects that make a real impact.
Samba TV may collect personal information directly from you as a job applicant. Samba TV may also receive personal information from third parties, for example in connection with a background, employment, or reference check, in accordance with applicable law. For further details, please see Samba's Applicant Privacy Policy. For residents of the EU, Samba Inc. is the data controller.