DataOps Engineer
Trustly
Posted: April 7, 2026
Quick Summary
Develop and implement data operations processes to ensure seamless and secure data exchange between our platforms and customers.
Job Description
WHO WE ARE
At Trustly, we're building a smarter, faster, and more secure financial future by revolutionizing the world of payments. As a global leader in Open Banking Payments, we are establishing Pay by Bank as the new standard at checkout, providing unparalleled freedom, speed, and ease to millions of consumers and merchants worldwide.
Our Ambition: To build the world’s most disruptive payment network and redefine what the payment experience should feel like.
Trustly is a global team of innovators, collaborators, and doers. If you are driven by a strong sense of purpose and thrive in a dynamic, entrepreneurial, and high-growth environment, join us and be part of a team that’s transforming the way the world pays.
About the team
Trustly's DataOps team is responsible for delivering the data generated by our application, along with data from APIs and other tools, to the teams that need it. We do this in a secure, structured, scalable, and generic way, because we operate multiple environments across different regions and need to keep them consistent. We work on both the batch layer (using Airflow) and the streaming layer (Kafka), and we help teams automate processes so data is delivered more quickly and reliably. We also care about Data Quality, building an observability layer over our data so we can react to inconsistencies and failures preventively and immediately. Beyond that, we deliver products and services that make our data easier to use and query, maintaining tools such as Redshift and QuickSight. Last but not least, we work with the Data Science area to provide the support and infrastructure needed to put models into production. To achieve our goals, we follow good coding and development practices, apply end-to-end encryption in all our processes, and use infrastructure as code.
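To make the batch layer concrete, here is a minimal sketch of the kind of Airflow DAG this team owns: an hourly delivery job with a data-quality gate between extraction and load. It assumes Airflow 2.x; the DAG id, task names, and callables are hypothetical, and each body is a placeholder for real logic.

```python
# A minimal sketch, assuming Airflow 2.x. All names here are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_transactions(**context):
    """Placeholder: pull the latest application data (e.g., from an API or DB)."""


def check_row_count(**context):
    """Placeholder data-quality gate: raise if the batch looks wrong,
    so failures are caught before the data reaches consumers."""


def load_to_warehouse(**context):
    """Placeholder: deliver the validated batch to the warehouse (e.g., Redshift)."""


with DAG(
    dag_id="deliver_transactions_batch",  # hypothetical name
    start_date=datetime(2026, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_transactions)
    quality = PythonOperator(task_id="quality_check", python_callable=check_row_count)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    # Fail fast: the load only runs after the quality gate passes.
    extract >> quality >> load
```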
Our work gives teams full access to the company's data. The Data Analytics area builds on it with internal processes that deliver more structured data to the business, developing views and reports that put data at the center of decision making. Our process automation also makes teams more productive: dependencies are guaranteed, so each area can stay focused on its own activities. We make data securely available to the Risk & Data Science areas to support the training and production of models that directly impact transaction conversion and the fight against fraud. In short, we deliver data that is decisive for other teams' processes, and we turn it into a valuable asset that drives our growth and decision-making.
We still have many challenges ahead:
• Building a robust monitoring and visibility system on top of the data lake, using dedicated tools for this control.
• Developing and maintaining various services in Kubernetes.
• Creating layers of abstraction that make it easier to produce new data sets for the teams that use them.
• Making our processes more robust and automated with CI/CD.
• Improving the visibility of the data available for teams to query, increasing their productivity.
• Continuously improving our processes, applying optimizations and backing our changes with unit tests (a short sketch follows below).
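The last point is worth illustrating. Below is a minimal pytest sketch, under the assumption that pipeline transformations are factored into small pure functions; normalize_amount and its contract are hypothetical examples.

```python
# A minimal pytest sketch. normalize_amount is a hypothetical transformation;
# the point is that small pure functions make pipeline changes easy to verify.
import pytest


def normalize_amount(raw: str) -> int:
    """Hypothetical transform: parse a decimal string into minor units (cents)."""
    value = raw.strip().replace(",", ".")
    units, _, cents = value.partition(".")
    return int(units) * 100 + int((cents + "00")[:2])


@pytest.mark.parametrize(
    "raw, expected",
    [
        ("12.34", 1234),
        ("12,34", 1234),  # comma as the decimal separator
        (" 7 ", 700),     # surrounding whitespace, no fraction
        ("0.5", 50),      # a single fractional digit
    ],
)
def test_normalize_amount(raw, expected):
    assert normalize_amount(raw) == expected
```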
About the role
As a DataOps Engineer at Trustly, you are the backbone of our data delivery pipeline. You will be responsible for building and maintaining the infrastructure that transforms raw data into a strategic asset. Your mission is to ensure that data flows seamlessly—whether via Airflow for batch processing or Kafka for streaming—while upholding our rigorous standards for encryption and Infrastructure as Code (IaC).
What you’ll do:
• Design, implement, and maintain the scalable data platform infrastructure (AWS, Kubernetes, EMR, Redshift, Glue, etc.).
• Data Modeling & Transformation: Design, develop, and maintain modular and scalable data models (using dbt and SQL) within our Redshift data warehouse to transform raw data into analytics-ready layers.
• Manage and evolve client-facing tools and frameworks for data processing (e.g., QuickSight, Redshift Editor, Athena IDE, Metabase).
• Build and maintain secure, automated CI/CD pipelines for data components and infrastructure-as-code.
• Collaborate with DevOps and Security teams to ensure compliance, reliability, and scalability of the data platform.
• Provide development environments, standardized workflows, and tooling (e.g., SageMaker Studio, Athena IDE, Redshift Editor) to improve developer experience.
• Support version control, release workflows, and automation for data transformation, jobs, data workloads and integrations used by data producers and consumers.
• Ensure high availability and performance of data tools through observability and alerting integrations.
• Implement data quality and validation checks (e.g., dbt tests, unit tests); a CI-style sketch follows this list.
• Contribute to maintaining data catalogs and documentation for lineage and governance.
• Investigate and resolve issues in data pipelines, workloads, and integrations, ensuring SLAs are met.
• Document data flows, the architectural setup, and data models.
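As a sketch of the CI-style quality gate mentioned above, the snippet below runs dbt tests from a Python entrypoint and fails the build if any test fails. The project directory and target name are hypothetical assumptions; only standard dbt CLI flags are used.

```python
# A minimal sketch of a CI gate that runs dbt tests before a model change is
# promoted. The project directory and target name are hypothetical.
import subprocess
import sys


def run_dbt_tests(project_dir: str = "analytics", target: str = "ci") -> int:
    """Run `dbt test` and return its exit code so the CI job fails on errors."""
    result = subprocess.run(
        ["dbt", "test", "--project-dir", project_dir, "--target", target],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
    return result.returncode


if __name__ == "__main__":
    sys.exit(run_dbt_tests())
```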
Who you are:
• Bachelor’s or Master’s degree in IT/Math/CS/Engineering or another technical discipline
• Successful track record of building big data pipelines and orchestrating data workloads
• Experience with AWS cloud services (EKS, EC2, EMR, RDS) and big data tools (Spark, Redshift)
• Experience with relational databases (preferably Postgres), strong SQL coding, data modeling, data warehouses
• Experience with IaC tools (e.g., Terraform)
• Experience with Kubernetes, Docker
• Experience with CI/CD tools
• Experience with automation and workflow management tools (e.g., Airflow, SageMaker)
• Python programming skills
• Professional working proficiency in English for daily collaboration across a distributed global team
Our perks and benefits:
• Bradesco health and dental plan, for you and your dependents, with no co-payment cost;
• Life insurance with differentiated coverage;
• Meal voucher and supermarket voucher;
• Home Office Allowance;
• Wellhub - Platform that gives access to spaces for physical activities and online classes;
• Trustly Club - Discount at educational institutions and partner stores;
• English Program - Online group classes with a private teacher;
• Extended maternity and paternity leave;
• Birthday Off;
• Flexible hours/Home Office - our culture is remote-first! You can work from any city in Brazil;
• Welcome Kit - We work with Apple equipment (MacBook Pro, iPhone) and we send many more treats! Spoiler alert: the equipment can be purchased by you, according to internal criteria!;
• Referral Program - If you refer a candidate and we hire them, you will receive a reward!
At Trustly, we embrace and celebrate diversity of all forms and the value it brings to our employees and customers. We are proud and committed to being an Equal Opportunity Employer and believe an open and inclusive environment enables people to do their best work. All decisions regarding hiring, advancement, and any other aspects of employment are made solely on the basis of qualifications, merit, and business need.