Senior Data Architect
Zeal Group
Posted: March 19, 2026
Quick Summary
The Senior Data Architect will be responsible for designing and implementing a cloud-native data warehouse on Google Cloud Platform (GCP) with a focus on scalability, security, cost-efficiency, and analytics readiness.
Job Description
Role Overview
We are seeking a highly experienced Senior Data Architect to lead the design, evolution, and governance of a cloud-native data warehouse on Google Cloud Platform (GCP). This role owns the overall data architecture vision and ensures the data platform is scalable, secure, cost-efficient, and analytics-ready.
You will provide architectural leadership across data engineering and analytics teams, define standards and best practices, and partner with senior stakeholders to enable high-impact data use cases.
Key Responsibilities
• Own and evolve the end-to-end cloud data architecture on GCP, with BigQuery as the core analytical platform
• Define and enforce enterprise data modeling standards using dbt (dimensional, semantic, and analytics-layer models)
• Architect and govern ELT pipelines orchestrated by Airflow and/or Dagster, ensuring reliability and scalability
• Provide technical leadership and architectural guidance to data engineers and analytics engineers
• Review and approve data designs, dbt models, and pipeline implementations for architectural consistency
• Drive BigQuery performance optimization and cost governance, including partitioning, clustering, and workload management
• Establish and mature data quality, testing, observability, and lineage frameworks
• Define and enforce security, access control, and data governance standards across the data platform
• Partner with product, analytics, and business leaders to translate complex requirements into scalable data solutions
• Lead architectural decision-making for new data sources, tools, and platform enhancements
• Balance business requirements against data platform spend, and optimize costs toward agreed targets
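For a concrete flavor of the partitioning and clustering levers mentioned above: in BigQuery these are declared in the table DDL. Below is a minimal Python sketch that renders such a statement; the dataset, table, and column names are purely illustrative, not taken from this posting.

```python
def bigquery_table_ddl(table, columns, partition_col, cluster_cols):
    """Render a CREATE TABLE statement with partitioning and clustering.

    Partitioning prunes scanned bytes for date-bounded queries;
    clustering co-locates rows to cut cost on filtered scans.
    All identifiers used here are hypothetical examples.
    """
    col_list = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        f"CREATE TABLE {table} (\n  {col_list}\n)\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)};"
    )

ddl = bigquery_table_ddl(
    "analytics.fct_orders",
    [("order_id", "STRING"), ("customer_id", "STRING"),
     ("order_ts", "TIMESTAMP"), ("amount", "NUMERIC")],
    partition_col="order_ts",
    cluster_cols=["customer_id"],
)
print(ddl)
```

Partitioning on the event timestamp plus clustering on a common filter column is a typical starting point for the cost-governance work this role owns; real designs would be validated against actual query patterns and slot usage.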
Requirements
• 5+ years of experience in data architecture, data engineering, or analytics engineering roles
• Proven experience leading the design and implementation of cloud data warehouses
• Deep hands-on experience with GCP, especially BigQuery
• Strong expertise in dbt for data modeling, testing, documentation, and deployments
• Extensive experience with Airflow and/or Dagster for workflow orchestration
• Advanced SQL skills and strong command of data modeling patterns
• Experience designing scalable, reliable ELT architectures in production environments
• Ability to lead architecture discussions and influence technical direction across teams
• Experience with large-scale or multi-domain data platforms
• Knowledge of additional GCP services such as Cloud Storage, Dataproc, and IAM
• Experience enabling BI and semantic layers (e.g., Power BI, dbt metrics, Cube)
• Familiarity with data governance, metadata management, and data catalog tools
• Experience in regulated industries or environments with strict data controls
• Experience balancing platform scalability with cost efficiency
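To illustrate the dbt testing and data-quality work referenced above: dbt's built-in `not_null` and `unique` tests reduce to simple SQL checks that count failing rows. The sketch below reproduces that idea standalone, using an in-memory SQLite table as a stand-in for a BigQuery model; the table and column names are hypothetical.

```python
import sqlite3

# In-memory stand-in for a warehouse table; BigQuery would be the real target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_id TEXT, email TEXT)")
conn.executemany(
    "INSERT INTO dim_customer VALUES (?, ?)",
    [("c1", "a@example.com"), ("c2", None), ("c2", "b@example.com")],
)

def not_null_failures(conn, table, column):
    # Mirrors dbt's not_null test: count rows where the column is NULL.
    sql = f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    return conn.execute(sql).fetchone()[0]

def unique_failures(conn, table, column):
    # Mirrors dbt's unique test: count values appearing more than once.
    sql = (f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
           f"GROUP BY {column} HAVING COUNT(*) > 1)")
    return conn.execute(sql).fetchone()[0]

print(not_null_failures(conn, "dim_customer", "email"))      # 1
print(unique_failures(conn, "dim_customer", "customer_id"))  # 1
```

In practice these checks would live in a dbt `schema.yml` and run in CI and on every orchestrated pipeline execution; a test "passes" when its failure count is zero.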