MisuJob - AI Job Search Platform

Informatica (ETL) Developer

Sutherland

Hyderabad, TS, India (Permanent)

Posted: April 16, 2026


Quick Summary

Design, implement, and maintain complex data engineering solutions in the Business Intelligence and Analytics team. This role involves working with various data sources, designing and developing ETL solutions, and ensuring data quality and integrity.

Job Description

Sutherland is seeking a reliable, technically skilled Informatica (ETL) Developer who will play a key role in driving our continued product growth and innovation. If you are looking to build a fulfilling career and are confident you have the skills and experience to help us succeed, we want to work with you!

The Informatica (ETL) Developer designs, implements, and maintains complex data engineering solutions within the Business Intelligence and Analytics (BIA) team.

The role is responsible for the design, development, implementation, testing, documentation, and support of analytical and data solutions requiring data aggregation, data pipelines, and ETL/ELT from multiple sources into an efficient reporting database or data warehouse, using tools such as Informatica, Azure Data Factory, and SSIS. This includes interacting with the business to gather requirements, analyzing and creating functional and technical specifications, and handling testing, training, escalation, and follow-up.
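The aggregate-and-load flow described above can be sketched in plain Python (a minimal illustration only; the role's actual tooling is Informatica, Azure Data Factory, or SSIS, and all table and column names here are hypothetical):

```python
import sqlite3

# Hypothetical in-memory database standing in for a source system and a
# reporting warehouse; table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO src_orders VALUES (1, 'APAC', 120.0), (2, 'APAC', 80.0), (3, 'EMEA', 50.0);
    CREATE TABLE rpt_sales_by_region (region TEXT PRIMARY KEY, total_amount REAL);
""")

def run_etl(conn):
    # Extract: read raw rows from the operational source table.
    rows = conn.execute("SELECT region, amount FROM src_orders").fetchall()
    # Transform: aggregate amounts per region.
    totals = {}
    for region, amount in rows:
        totals[region] = totals.get(region, 0.0) + amount
    # Load: upsert the aggregates into the reporting table.
    conn.executemany(
        "INSERT OR REPLACE INTO rpt_sales_by_region VALUES (?, ?)",
        sorted(totals.items()),
    )
    conn.commit()

run_etl(conn)
print(conn.execute("SELECT * FROM rpt_sales_by_region ORDER BY region").fetchall())
# [('APAC', 200.0), ('EMEA', 50.0)]
```

In a production pipeline the same extract/transform/load stages would be expressed as Informatica mappings or Data Factory activities rather than application code.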

Application support includes resolving issues reported by users, whether caused by application bugs, user errors, or programming errors. The resolution process includes, but is not limited to, investigating known bugs on the software vendor's support website, creating tickets or service requests with the vendor, developing scripts to fix data issues, making program changes, testing fixes, and applying the changes to production.

These tasks and activities will be completed with the help, and under the guidance, of the supervisor. Participation in team and/or project meetings to schedule work and discuss status will be required.

The position also requires staying abreast of changes in technology, programming languages, and software development tools.

Responsibilities:

• Data Pipeline / ETL (40%): Designs and implements data stores and ETL data flows and data pipelines to connect and prepare operational systems data for analytics and business intelligence (BI) systems.
• Support & Operations (10%): Manages production deployments, automation, monitoring, job control, and production support. Works with business users to test programs in the Development and Quality environments. Investigates issues using vendor support website(s).
• Data Modeling / Designing Datasets (10%): Reviews and understands business requirements for assigned development tasks and applies standard data modeling and design techniques based on a detailed understanding of requirements.
• Data Architecture and Technical Infrastructure (10%): Plans and drives the development of data engineering solutions ensuring that solutions balance functional and non-functional requirements. Monitors application of data standards and architectures including security and compliance.
• SDLC Methodology & Project Management (5%): Contributes to technical transitions between development, testing, and production phases of solutions' lifecycle, and the facilitation of the change control, problem management, and communication processes.
• Data Governance and Data Quality (5%): Identifies and investigates data quality/integrity problems, determines their impact, and provides solutions.
• Metadata Management & Documentation (5%): Documents all processes and mappings related to Data Pipelines work and follows development best practices as adopted by the BIA team.
• End-User Support, Education and Enablement (5%): Contributes to training and Data Literacy initiatives within the team and End user community.
• Innovation, Continuous Improvement & Optimization (5%): Continuously improves and optimizes existing Data Engineering assets/processes.
• Partnership and Community Building (5%): Collaborates with other IT teams, the business community, data scientists, and other architects to meet business requirements. Interacts with DBAs on data designs optimal for data engineering solution performance.
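As a small illustration of the data-quality work listed above (identifying integrity problems in a pipeline's output), the sketch below flags fact rows whose keys have no matching dimension row. It uses plain Python and SQLite with hypothetical table names, not the Informatica stack this posting names:

```python
import sqlite3

# Hypothetical star-schema fragment; all names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    CREATE TABLE fact_sales (sale_id INTEGER, product_key INTEGER, amount REAL);
    -- product_key 99 has no dimension row: an integrity defect to surface.
    INSERT INTO fact_sales VALUES (10, 1, 25.0), (11, 99, 40.0);
""")

# Anti-join: fact rows whose product_key is missing from the dimension.
orphans = conn.execute("""
    SELECT f.sale_id, f.product_key
    FROM fact_sales AS f
    LEFT JOIN dim_product AS d ON d.product_key = f.product_key
    WHERE d.product_key IS NULL
""").fetchall()
print(orphans)  # [(11, 99)]
```

Checks of this shape are typically scheduled alongside the load jobs so that referential defects are caught before they reach reports.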

Required Skills:

• Experience: 4 to 7 years of proven experience in the development, maintenance, and enhancement of data pipelines (ETL/ELT) and processes, with thorough knowledge of star/snowflake schemas
• Developing complex SQL queries and SQL optimization.
• Experience with other cloud platforms (e.g., AWS, Google Cloud) and multi-cloud environments
• Development experience must be full Life Cycle experience including business requirements gathering, data sourcing, testing/data reconciliation, and deployment within Business Intelligence/Data Warehousing Architecture.
Skills:
• Proficiency in Informatica / PowerCenter / IDMC tools (4+ Years)
• Informatica IDMC (Cloud Data Integration / Application Integration) (3+ Years)
• Data pipeline development using cloud platforms
• Snowflake data warehousing (2+ Years)
• Salesforce (SFDC) integration (2+ Years)
• Informatica Salesforce Data Connector configuration (SFDC)
• Real-time and batch data integration of Salesforce to Snowflake and Snowflake to Salesforce.
• Data migration and ETL/ELT processes
• API-based integrations and data orchestration
• Strong SQL and data modeling skills
• Performance tuning and troubleshooting of data pipelines

• Certifications: Snowflake, Salesforce connector, and IDMC tool certifications are a plus
• Understanding of Data Architecture
• Knowledge of ETL and data engineering standards and best practices for the design and development of data pipelines and data extract, transform and load processes
• Design, build and test data products based on feeds from multiple systems, using a range of different storage technologies, access methods or both
• Knowledge of data warehousing concepts, including multi-dimensional models and ETL logic for maintaining star-schemas
• Good understanding of the concepts and principles of data modeling
• Ability to produce, maintain and update relevant data models for specific needs.
• Ability to reverse-engineer data models from a live system
• SQL programming desirable (e.g., stored procedure development)
• Proficient in data analysis, defect identification, and resolution
• Strong professional verbal and written communication skills.
• Ability to work with little supervision and within changing priorities.
• Ability to analyze requirements and troubleshoot problems.
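For the star-schema knowledge these requirements call for, here is a minimal example (hypothetical tables, plain SQLite rather than Snowflake) of the fact-to-dimension join and aggregation pattern behind most BI reporting:

```python
import sqlite3

# Minimal star schema (hypothetical names): one fact table joined to two
# dimensions, then aggregated by the dimensions' descriptive attributes.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE dim_region (region_key INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE fact_sales (date_key INTEGER, region_key INTEGER, amount REAL);
    INSERT INTO dim_date VALUES (20250101, 2025), (20260101, 2026);
    INSERT INTO dim_region VALUES (1, 'APAC'), (2, 'EMEA');
    INSERT INTO fact_sales VALUES
        (20250101, 1, 100.0), (20260101, 1, 150.0), (20260101, 2, 60.0);
""")

# Join the fact table to each dimension, then roll up the measure.
rows = conn.execute("""
    SELECT d.year, r.region, SUM(f.amount) AS total
    FROM fact_sales AS f
    JOIN dim_date   AS d ON d.date_key   = f.date_key
    JOIN dim_region AS r ON r.region_key = f.region_key
    GROUP BY d.year, r.region
    ORDER BY d.year, r.region
""").fetchall()
print(rows)  # [(2025, 'APAC', 100.0), (2026, 'APAC', 150.0), (2026, 'EMEA', 60.0)]
```

The same shape scales to many dimensions: facts carry surrogate keys and numeric measures, while descriptive attributes live in the dimension tables.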

• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• Hybrid work model: In-office on Monday, Wednesday, and Friday.
• Working Time: India shift, with hours until 11:30 PM IST
• Work Location: Pune / Hyderabad

All your information will be kept confidential according to EEO guidelines.

Why Apply Through MisuJob?

AI-Powered Job Matching: MisuJob uses advanced artificial intelligence to analyze your skills, experience, and career goals. Our matching algorithm compares your profile against thousands of job requirements to find positions where you have the highest chance of success. This saves you hours of manual job searching and ensures you only see relevant opportunities.

One-Click Applications: Once you create your profile, applying to jobs is effortless. Your resume and cover letter are automatically tailored to highlight the most relevant experience for each position. You can apply to multiple jobs in minutes, not hours.

Career Intelligence: Beyond job matching, MisuJob provides valuable career insights. See how your skills compare to market demands, identify skill gaps to address, and understand salary benchmarks for your experience level. Make data-driven decisions about your career path.

Frequently Asked Questions

How do I apply for this position?

Click the "Register to Apply" button to create a free MisuJob account. Once registered, you can apply with one click and track your application status in your dashboard.

Is MisuJob free for job seekers?

Yes, MisuJob is completely free for job seekers. Create your profile, get matched with jobs, and apply without any cost. We help you find your dream job without any hidden fees.

How does AI matching work?

Our AI analyzes your resume, skills, and experience to understand your professional profile. It then compares this against job requirements using natural language processing to calculate a match percentage. Higher matches mean better fit for the role.

Can I apply to jobs in other countries?

Absolutely. MisuJob features jobs from companies worldwide, including remote positions. Filter by location or look for remote opportunities to find jobs that match your preferences.

Ready to Apply?

Join thousands of job seekers using MisuJob's AI to find and apply to their dream jobs automatically.

Register to Apply