Senior Data Engineer (4-Month Contract) - Remote - Octopus by RTG
robusta
Posted: April 14, 2026
Quick Summary
Design and implement scalable data pipelines and enterprise-level data architecture for analytics, reporting, and AI/ML initiatives.
Job Description
We are seeking a highly skilled Senior Data Engineer to design, build, and optimize scalable data pipelines and enterprise-level data architecture.
This is a technical leadership role with no people-management responsibilities: you will drive the architecture and delivery of high-quality data solutions supporting analytics, reporting, and AI/ML initiatives. The ideal candidate is hands-on, analytical, and experienced in building robust data platforms across cloud and on-prem environments.
Key Responsibilities
• Design and implement scalable, high-performance data pipelines for structured and unstructured data
• Build and maintain enterprise-grade data architecture across cloud and on-prem environments (AWS, Azure, or GCP)
• Define and implement data modeling strategies (data warehouses, data lakes, real-time streaming)
• Collaborate with business stakeholders to translate requirements into technical solutions
• Ensure data quality, governance, security, and compliance across all data assets
• Perform complex data analysis and support reporting and analytics initiatives
• Evaluate and recommend modern tools, frameworks, and best practices in data engineering
• Integrate data from multiple sources and optimize pipeline and query performance
• Provide technical direction and architectural guidance across data projects
Requirements
• Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
• 6+ years of experience in data engineering and data architecture in enterprise environments
• Strong experience with ETL/ELT pipelines, data lakes, and data warehouses
• Hands-on experience with cloud platforms (AWS, Azure, or GCP)
• Proficiency in SQL and at least one programming language (Python, Java, or Scala)
• Experience with streaming technologies (Kafka, Spark Streaming, or similar)
• Strong understanding of data modeling, dimensional modeling, and schema design
• Knowledge of data governance, security, and compliance
• Strong problem-solving skills and ability to work independently
• Excellent communication skills with both technical and non-technical stakeholders
Nice to Have
• Experience working in Middle East markets or familiarity with Saudi Arabia regulations
• Exposure to AI/ML integration within data pipelines
• Cloud data engineering certifications (AWS, Azure, or GCP)
• Experience with BI tools (Power BI, Tableau, Looker, or similar)