Data Architecture and Modeling
Two95 International Inc.
Posted: March 11, 2026
Quick Summary
Designing and implementing efficient data models, schemas, and databases to support data storage, retrieval, and analysis.
Job Description
We are looking for a Data Architect and Modeler with 10+ years of experience.
Required Technical Skillset: TM Forum data models, schemas, and databases for efficient data storage; database expertise; cloud computing; programming languages: SQL, Python.
Must-Have:
• Designing and implementing TM Forum data models, schemas, and databases to support efficient data storage, retrieval, and analysis
• Collaborating with stakeholders to understand business requirements and translating these requirements into technical specifications and data architecture designs
• Defining the overall data architecture and best practices for the data ingestion framework
• Selecting and managing the best cloud services to support end-to-end Data platform implementation for the organization’s specific needs
• Developing and maintaining conceptual, logical, and physical data models leveraging GCP services such as BigQuery and Cloud SQL
• Working on end-to-end data initiatives, from requirements gathering to cutover
• Translating business requirements from stakeholders into effective data model designs
• Playing a key role in migrating data from legacy on-premises systems to cloud-native GCP solutions
Good-to-Have:
• Data management knowledge, MySQL, Oracle, data modeling tools (Erwin), big data technologies, ETL knowledge
Responsibilities of the role:
• Monitoring and optimizing data systems, data lifecycle, and infrastructure to ensure performance, scalability, and integrity
• Offering guidance to development teams, data analysts, and other stakeholders on best practices, data standards, and governance to maximize the effectiveness of data products
• Defining and enforcing usage standards, cost control policies, and overall governance models for GCP environments
• Optimizing data models for performance in both OLAP (Online Analytical Processing) and OLTP (Online Transaction Processing) systems
• Ensuring data consistency, quality, and reliability by defining constraints, validation rules, and data governance policies within data models
• Collaborating with data engineers, data scientists, and business teams to understand data needs and deliver structured data for analytics and machine learning
• Creating and maintaining comprehensive data model documentation, including entity-relationship diagrams, data dictionaries, and metadata definitions
• Leveraging understanding of telecom-specific data, such as subscriber data, billing information, network performance metrics, and customer insights
Required Skills and Experience:
• Proficiency in data modeling techniques (dimensional, relational) and tools such as Erwin, PowerDesigner, or open-source alternatives
• Expert-level SQL skills for querying and analyzing large datasets in BigQuery
• Strong hands-on experience with core GCP services
• Understanding of telecom data sources and industry-specific metrics