Data Platform Engineer
BlackStone eIT
Posted: April 10, 2026
Quick Summary
The Data Platform Engineer will design, build, and maintain scalable data platforms that support the company’s data processing and analytics needs.
Job Description
BlackStone eIT is looking for a skilled Data Platform Engineer to join our dynamic team. You will be responsible for designing, building, and maintaining scalable data platforms that support the company’s data processing and analytics needs.
In this role, you will collaborate with data engineers, analysts, and other stakeholders to develop efficient data pipelines, ensure data quality, and contribute to the overall data infrastructure architecture. This is a fantastic opportunity to advance your career by working with cutting-edge technologies and helping shape BlackStone eIT’s data capabilities.
Key Responsibilities
• Develop and maintain data pipelines for ingestion, processing, and storage of large datasets
• Implement ETL/ELT processes to transform raw data into usable formats
• Collaborate with cross-functional teams to understand data requirements and deliver solutions
• Ensure data quality, consistency, and reliability across platforms
• Optimize database performance and monitor platform health
• Assist in the design and implementation of data governance and security measures
• Document data infrastructure and processes for operational clarity
Required Skills & Technologies
- Python — async data pipelines, background jobs, scripting
- PostgreSQL — schema design, migrations (Alembic), query optimization
- Azure Data Lake Storage + Synapse Analytics
- dbt — transformation, testing, documentation
- Apache Airflow or Azure Data Factory
- Data quality frameworks (Great Expectations, dbt tests, or custom)
- Observability — structured logging, alerting, Azure Monitor or Prometheus/Grafana
- Microsoft Graph API — SharePoint, M365 data extraction
- Redis — queue management, caching
- Docker — containerized pipeline jobs
- SQL — advanced analytical queries, window functions, performance tuning
Requirements:
• Bachelor’s degree in Computer Science, Information Technology, or a related field
• 3+ years of experience in data engineering or data platform development
• Proficiency in SQL and experience with relational databases like PostgreSQL or MySQL
• Familiarity with data pipeline and ETL tools such as Apache Airflow, Azure Data Factory, or similar
• Experience with cloud platforms (AWS, Azure, or Google Cloud)
• Knowledge of Python or another programming/scripting language for data processing
• Understanding of data security, governance, and quality best practices
• Strong analytical and problem-solving skills
• Good communication skills and ability to work effectively in a team environment
Benefits:
• Paid Time Off
• Performance Bonus
• Training & Development