MisuJob - AI Job Search Platform

Senior Database Administrator Engineer

IntegriChain

Pune, MH, India (Permanent)

Posted: January 29, 2026


Quick Summary

We are seeking a Senior Database Administrator Engineer to join our team in Pune, India. The ideal candidate will have experience in database administration, data integration, and software development, with a strong background in life sciences and a passion for precision medicine.

Job Description

IntegriChain is the data and application backbone for market access departments of Life Sciences manufacturers. We deliver the data, the applications, and the business process infrastructure for patient access and therapy commercialization. More than 250 manufacturers rely on our ICyte Platform to orchestrate their commercial and government payer contracting, patient services, and distribution channels. ICyte is the first and only platform that unites the financial, operational, and commercial data sets required to support therapy access in the era of specialty and precision medicine. With ICyte, Life Sciences innovators are digitalizing labor-intensive market access operations, freeing up their best talent to identify and resolve coverage and availability hurdles, manage pricing and forecasting complexity, and focus on data-driven decision support.

We are headquartered in Philadelphia, PA (USA), with offices in Ambler, PA (USA); Pune, India; and Medellín, Colombia. For more information, visit www.integrichain.com, or follow us on Twitter @IntegriChain and LinkedIn.

Join our DevOps Engineering team as a Senior Database Administrator Engineer responsible for managing, optimizing, and securing our cloud-based database platforms. This hands-on role focuses on performance, reliability, and automation across AWS RDS (Oracle and PostgreSQL) environments. You’ll collaborate closely with DevOps and Product Engineering to ensure scalable, compliant, and resilient data operations supporting business-critical applications.

Key Responsibilities: 

Modern Data Architecture & Platform Engineering

• Design, build, and optimize database solutions using Snowflake, PostgreSQL, and Oracle RDS.
• Design and evolve cloud-native data lakehouse architectures using Snowflake, AWS, and open data formats where appropriate.
• Implement and manage Medallion Architecture (Bronze / Silver / Gold) patterns to support raw ingestion, curated analytics, and business-ready datasets.
• Build and optimize hybrid data platforms spanning operational databases (PostgreSQL / RDS) and analytical systems (Snowflake).
• Develop and maintain semantic layers and analytics models to enable consistent, reusable metrics across BI, analytics, and AI use cases.
• Engineer efficient data models, ETL/ELT pipelines, and query performance tuning for analytical and transactional workloads.
• Implement replication, partitioning, and data lifecycle management to enhance scalability and resilience.
• Manage schema evolution, data versioning, and change management across multi-environment deployments.

Advanced Data Pipelines & Orchestration

• Engineer highly reliable ELT pipelines using modern tooling (e.g., dbt, cloud-native services, event-driven ingestion).
• Design pipelines that support batch, micro-batch, and near–real-time processing.
• Implement data quality checks, schema enforcement, lineage, and observability across pipelines.
• Optimize performance, cost, and scalability across ingestion, transformation, and consumption layers.

AI-Enabled Data Engineering

• Apply AI and ML techniques to data architecture and operations, including:
  - Intelligent data quality validation and anomaly detection
  - Automated schema drift detection and impact analysis
  - Query optimization and workload pattern analysis
• Design data foundations that support ML feature stores, training datasets, and inference pipelines.
• Collaborate with Data Science teams to ensure data platforms are AI-ready, reproducible, and governed.

Automation, DevOps & Infrastructure as Code

• Build and manage data infrastructure as code using Terraform and cloud-native services.
• Integrate data platforms into CI/CD pipelines, enabling automated testing, deployment, and rollback of data changes.
• Develop tooling and automation (Python, SQL, APIs) to streamline provisioning, monitoring, and operational workflows.

Security, Governance & Compliance

• Implement enterprise-grade data governance, including role-based access control, encryption, masking, and auditing.
• Enforce data contracts, ownership, and lifecycle management across the lakehouse.
• Partner with Security and Compliance teams to ensure audit readiness and regulatory alignment.

Required Qualifications:

• 5+ years of experience in data engineering, database engineering, or data platform development in production environments.
• Strong hands-on experience with Snowflake, including performance tuning, security, and cost optimization.
• Deep expertise with PostgreSQL and AWS RDS in cloud-native architectures.
• Proven experience designing lakehouse or modern data warehouse architectures.
• Strong understanding of Medallion Architecture, semantic layers, and analytics engineering best practices.
• Experience building and operating advanced ELT pipelines using modern tooling (e.g., dbt, orchestration frameworks).
• Proficiency with SQL and Python for data transformation, automation, and tooling.
• Experience with Terraform and infrastructure-as-code for data platforms.
• Solid understanding of data governance, observability, and reliability engineering.

What Success Looks Like Within the First 90 Days:

• Fully onboarded and delivering enhancements to Snowflake and RDS environments.
• Partnering with DevOps and Product Engineering on data infrastructure improvements.
• Delivering optimized queries, schemas, and automation for key systems.

Ongoing Outcomes:

• Consistent improvement in data performance, scalability, and reliability.
• Effective automation of database provisioning and change management.
• Continuous collaboration across teams to enhance data availability and governance.

Bonus Experience (Nice to Have)

• Experience with dbt, AWS Glue, Airflow, or similar orchestration tools.
• Familiarity with feature stores, ML pipelines, or MLOps workflows.
• Exposure to data observability platforms and cost optimization strategies.
• Relevant certifications (Snowflake SnowPro, AWS Database Specialty, etc.).

What does IntegriChain have to offer?

• Mission driven: Work with the purpose of helping to improve patients' lives!
• Excellent and affordable medical benefits, plus non-medical perks including Flexible Paid Time Off (PTO) and much more!
• Robust Learning & Development opportunities, including 700+ development courses free to all employees.


IntegriChain is committed to equal treatment and opportunity in all aspects of recruitment, selection, and employment without regard to race, color, religion, national origin, ethnicity, age, sex, marital status, physical or mental disability, gender identity, sexual orientation, veteran or military status, or any other category protected under the law. IntegriChain is an equal opportunity employer committed to creating a community of inclusion and an environment free from discrimination, harassment, and retaliation.
