Senior Data Engineer
Avacone
Posted: May 4, 2026
Quick Summary
A hands-on Senior Data Engineer role centred on rebuilding a banking data platform from scratch, with a focus on scalability, risk management, and regulatory compliance.
Job Description
The Opportunity
We are supporting a major data platform transformation within a banking environment, moving from a legacy SQL Server and SSIS-based setup to a modern, scalable architecture built on dbt, Dagster, and OpenShift.
This role is not about maintaining existing systems. It is about rebuilding a critical data platform from the ground up, with direct impact on risk, trading PnL, and core financial data flows.
We are looking for a hands-on Senior Data Engineer who can take ownership of complex migration workstreams and deliver reliably in a regulated, high-stakes environment.
What You Will Do
You will play a central role in the end-to-end migration and modernisation of the data platform.
Platform Transformation
• Translate legacy ETL logic from SSIS and stored procedures into modern ELT pipelines using dbt
• Implement Data Vault 2.0 structures including Raw Vault and Business Vault
• Build datamarts and curated datasets for downstream analytics and reporting
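To give a flavour of the modelling work: Data Vault 2.0 hubs are keyed by a deterministic hash of the business key. The sketch below shows that pattern in plain Python; the MD5 algorithm, delimiter, and normalisation rules are common Data Vault conventions, not details taken from this role.

```python
import hashlib

def hub_hash_key(*business_keys: str, delimiter: str = ";") -> str:
    """Compute a deterministic hub hash key from one or more business keys.

    Normalisation (trim + uppercase) and MD5 are common Data Vault 2.0
    conventions; the delimiter and hash algorithm are project choices.
    """
    normalised = delimiter.join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# The same business key always yields the same hub key,
# regardless of incidental whitespace or casing in the source system.
key_a = hub_hash_key("ACC-1001 ")
key_b = hub_hash_key("acc-1001")
```

In a dbt project this logic typically lives in a reusable macro so every Raw Vault model derives keys identically.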
Orchestration & Infrastructure
• Design and operate workflows using Dagster, including scheduling, dependencies, and recovery mechanisms
• Deploy and run data workloads on OpenShift / Kubernetes environments
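In Dagster, recovery is normally configured through retry policies on assets and ops. As a standalone illustration of the recovery idea only (not Dagster's API), a minimal retry-with-backoff wrapper looks like this:

```python
import time

def run_with_retries(task, max_attempts: int = 3, base_delay: float = 1.0):
    """Run a task, retrying on failure with exponential backoff.

    Orchestrators such as Dagster provide this via retry policies;
    this self-contained sketch just illustrates the pattern.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            time.sleep(base_delay * 2 ** (attempt - 1))

# A task that fails twice before succeeding, to exercise the retries.
attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky, base_delay=0.0)
```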
Event-Driven Data Processing
• Enable near real-time data processing using Kafka-triggered pipelines
• Integrate with upstream data lake environments and external data providers
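A real implementation would read from Kafka (for example via a consumer client with committed offsets); the stdlib sketch below substitutes an in-memory queue for the topic and an ID set for offset tracking, purely to show why Kafka-triggered pipelines need idempotent handling of redelivered events:

```python
import queue

def consume_events(topic: queue.Queue, handler, processed_ids: set):
    """Drain events from an in-memory 'topic', applying a handler exactly once.

    A real pipeline would consume from Kafka and commit offsets; the
    queue and ID set here stand in for topic and offset tracking so the
    idempotency idea is visible.
    """
    results = []
    while not topic.empty():
        event = topic.get()
        if event["id"] in processed_ids:
            continue  # skip redelivered duplicates
        processed_ids.add(event["id"])
        results.append(handler(event))
    return results

topic = queue.Queue()
for e in [{"id": 1, "value": 10}, {"id": 2, "value": 20}, {"id": 1, "value": 10}]:
    topic.put(e)  # note the duplicate delivery of event 1

seen: set = set()
results = consume_events(topic, lambda e: e["value"] * 2, seen)
```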
Data Quality & Validation
• Establish robust data validation and reconciliation processes
• Implement automated testing and monitoring using dbt
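Reconciliation between the legacy and new platforms boils down to comparing datasets by key and flagging rows that are missing or differ. A minimal, illustrative sketch (in practice this would run as dbt tests or a dedicated validation job; the field names are invented):

```python
def reconcile(legacy_rows, new_rows, key="id"):
    """Compare two datasets by key; report missing and mismatched rows.

    An illustration of legacy-vs-new reconciliation, not a production
    implementation.
    """
    legacy = {r[key]: r for r in legacy_rows}
    new = {r[key]: r for r in new_rows}
    missing = sorted(set(legacy) - set(new))
    mismatched = sorted(k for k in set(legacy) & set(new) if legacy[k] != new[k])
    return {"missing": missing, "mismatched": mismatched}

report = reconcile(
    [{"id": 1, "pnl": 100.0}, {"id": 2, "pnl": -50.0}, {"id": 3, "pnl": 7.5}],
    [{"id": 1, "pnl": 100.0}, {"id": 2, "pnl": -49.0}],
)
# Row 3 never arrived; row 2 disagrees between the two systems.
```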
Operational Ownership
• Support production pipelines and resolve incidents when required
• Create clear documentation and ensure operational readiness
• Continuously improve performance, reliability, and maintainability
What You Bring
Technical Expertise
• Strong experience with SQL Server and T-SQL, including performance optimisation
• Proven hands-on experience with dbt in production environments
• Solid experience with workflow orchestration tools, ideally Dagster
• Practical knowledge of Data Vault 2.0 modelling concepts
• Experience working with container platforms such as OpenShift or Kubernetes
• Familiarity with event-driven architectures and Kafka
Domain Experience
• Experience working with financial data, ideally in banking or trading environments
• Understanding of risk and PnL data structures is a strong advantage
Working Style
• Strong ownership mindset with the ability to work independently
• Structured, pragmatic, and delivery-focused
• Comfortable operating in complex and regulated environments
• Clear communicator across both technical and business stakeholders
What Success Looks Like
Within your first few months, you will have:
• Delivered initial Data Vault structures and migrated datasets into the new platform
• Established stable, event-driven pipelines
• Ensured data consistency and validation between legacy and new systems
• Contributed to a production-ready, scalable data platform