Data Engineer
COSMOTE GLOBAL SOLUTIONS NV
Posted: March 18, 2026
Quick Summary
Design and implement large-scale data pipelines for various data sources, ensuring data quality and integrity through data validation and cleansing.
Job Description
COSMOTE Global Solutions, a key player in the OTE Group of Companies, specializes in providing comprehensive ICT Solutions and Services tailored to meet diverse business needs.
The Data Engineer role is pivotal to our mission of strengthening data management and infrastructure, ensuring the seamless integration of our various data systems.
Key Responsibilities:
• Constructing, maintaining, and optimizing data pipelines for a variety of data sources.
• Working with various databases and cloud technologies to manage and store data effectively.
• Ensuring data quality and integrity through regular monitoring and validation.
• Collaborating with data scientists and analysts to implement data-driven solutions.
• Assisting in the establishment of best practices for data engineering within the organization.
• Contributing to the strategic direction of data-related projects and initiatives.
• Staying updated with the latest industry trends and technologies in data engineering.
Requirements:
• Bachelor's degree in Computer Science, Engineering, or a related field
• Excellent SQL skills, including complex transformations and query optimization
• Proven experience in the design, development, and maintenance of robust ETL/ELT pipelines
• Experience managing data flow across staging, raw, business vault, and data mart layers
• Ability to implement and maintain Data Vault 2.0 models designed by the Data Modeler
• Experience integrating SQL-based and NoSQL data sources (e.g., JSON format data)
• Ability to technically implement governance and lineage tracking mechanisms
• Capability to optimize workloads, queries, and processing performance
• Implementation of data validation, reconciliation, and quality controls in pipelines
• Application of performance, maintainability, and coding best practices
• Experience implementing monitoring, alerting, and logging frameworks
• Ability to define and document data engineering standards and best practices
• Experience querying MongoDB or other NoSQL databases
• Experience implementing automated data quality checks
• Ability to optimize cloud resource usage and manage costs
• Exposure to AI-driven performance optimization techniques
• Experience with modern cloud data platforms
• Experience with large-scale or near real-time data processing environments
• Knowledge of encryption, masking, and secure data handling practices