#119004 - Data Engineer
Lifted, an Upwork Company
Posted: May 12, 2026
Quick Summary
Design, build, and maintain scalable data pipelines to power business functions, with a focus on reliability and data availability.
Job Description
We are seeking a Data Engineer to support the development of data products that power critical business functions. This role focuses on building reliable, scalable data pipelines and improving data availability for a variety of stakeholders. You will work in a collaborative, cross-functional Agile environment and partner closely with technical and business teams to deliver high-quality data solutions. Enterprise experience is strongly preferred.
 
Key Responsibilities
- Design, build, and maintain scalable data pipelines
- Develop and optimize ETL processes to support data products
- Work with structured and unstructured data across SQL and NoSQL systems
- Collaborate with cross-functional stakeholders to understand data requirements
- Ensure data quality, reliability, and availability across systems
- Manage and orchestrate workflows using pipeline management tools
- Contribute to Agile sprint planning and delivery commitments
- Troubleshoot and resolve data pipeline and workflow issues

Must-Have Skills
- Strong Python programming experience
- 5+ years of experience in Data Engineering or Data Warehousing
- 5+ years of experience building data pipelines using ETL tools
- 3+ years of hands-on experience with big data technologies such as Snowflake, Redshift, Hive, Kafka, or Spark
- Extensive experience with SQL and NoSQL databases
- Experience with workflow management tools such as Airflow
- Strong communication and stakeholder management skills
- Experience working in Agile, cross-functional teams
- Ability to manage competing priorities and deliver within sprint commitments
 
Nice-to-Have Skills
- Experience with dbt
- Experience supporting data needs for finance, accounting, payments, or tax functions
- Advanced degree in Math, Statistics, Computer Science, or related field

Required Tools & Platforms
- Python
- SQL and NoSQL databases
- ETL tools (dbt preferred)
- Airflow or similar workflow orchestration tools
- Big data platforms such as Snowflake, Redshift, Hive, Kafka, or Spark
 
Location, Time & Engagement
- Location: LATAM (remote)
- Engagement: Contract
- Allocation: Full-time (40 hours per week)
- Duration: Through March 31, 2027