MisuJob - AI Job Search Platform

Senior Databricks DWH Engineer - Banking

Qualysoft

Bucharest · Hybrid · Permanent

Posted: April 6, 2026


Quick Summary

Delivering end-to-end IT consulting services - from team augmentation and dedicated teams to custom software development

Job Description

About Qualysoft

· 25 years of experience in software engineering, established in Vienna, Austria
· Active in Romania since 2007, with office in central Bucharest (Bd. Iancu de Hunedoara 54B)
· Delivering end-to-end IT consulting services - from team augmentation and dedicated teams to custom software development
· We deliver scalable enterprise systems, intelligent automation frameworks, and digital transformation platforms
· Cross-industry experience supporting global players in BFSI (banking, financial services and insurance), Telecom, Retail & E-commerce, Energy and Utilities, Automotive, Manufacturing, Logistics, High Tech
· Global Presence: Switzerland, Germany, Austria, Sweden, Hungary, Slovakia, Serbia, Romania, and Indonesia
· International team of 500+ software engineers
· Strategic partnerships: Microsoft Cloud Certified Partner, Tricentis Solutions Partner in Test Automation and Test Management, Creatio Exclusive Partner, Doxee Implementation Partner
· Powered by cutting-edge technologies: AI, Data & Analytics, Cloud, DevOps, IoT, and Test Automation.
· Project beneficiaries ranging from large-scale enterprises to startups
· Stable growth and revenue increase year over year, a resilient organisation in volatile IT market conditions
· Quality-first mindset, culture of innovation, and long-term client partnerships
· Global and local reach – trusted by key industry players in Europe and the US


Responsibilities:
• Advanced Design & Implementation: Designing and implementing robust, scalable, high-performance
ETL/ELT data pipelines using PySpark/Scala and Databricks SQL on the Databricks platform.
• Delta Lake: Expertise in implementing and optimizing the Medallion architecture (Bronze, Silver, Gold)
using Delta Lake to ensure data quality, consistency, and historical tracking.
• Lakehouse Platform: Efficient implementation of the Lakehouse architecture on Databricks, combining
best practices from DWH and Data Lake environments.
• Performance Optimization: Optimizing Databricks clusters, Spark operations, and Delta tables (e.g., Z-ordering, compaction, query tuning) to reduce latency and compute costs.
• Streaming: Designing and implementing real-time/near-real-time data processing solutions using Spark
Structured Streaming and Delta Live Tables (DLT).
• Unity Catalog: Implementation and administration of Unity Catalog for centralized data governance, fine-grained security (row- and column-level security), and data lineage.
• Data Quality: Defining and implementing data quality standards and rules (e.g., using DLT or Great
Expectations) to maintain data integrity.
• Orchestration: Developing and managing complex workflows using Databricks Workflows (Jobs) or external
tools (e.g., Azure Data Factory, Airflow) to automate pipelines.
• DevOps/CI/CD: Integrating Databricks pipelines into CI/CD processes using tools such as Git, Databricks
Repos, and Bundles.
• Collaboration: Working closely with Data Scientists, Analysts, and Architects to understand business
requirements and deliver optimal technical solutions.
• Mentorship: Providing technical guidance to junior developers and promoting best practices.
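To give a concrete flavor of the pipeline work described above, here is a minimal Databricks SQL sketch of a bronze-to-silver Medallion step; table names, columns, and filter rules are illustrative assumptions, not part of this role's actual codebase:

```sql
-- Sketch of a bronze -> silver Medallion step in Databricks SQL.
-- All table and column names below are hypothetical examples.

-- Silver table: cleaned, conformed records with ACID guarantees via Delta Lake
CREATE TABLE IF NOT EXISTS silver.transactions (
  transaction_id STRING NOT NULL,
  customer_id    STRING,
  amount         DECIMAL(18, 2),
  updated_at     TIMESTAMP
) USING DELTA;

-- Incremental upsert from bronze: keep the latest record per key, drop null amounts
MERGE INTO silver.transactions AS t
USING (
  SELECT transaction_id, customer_id, amount, updated_at
  FROM bronze.transactions
  WHERE amount IS NOT NULL
  QUALIFY ROW_NUMBER() OVER (PARTITION BY transaction_id ORDER BY updated_at DESC) = 1
) AS s
ON t.transaction_id = s.transaction_id
WHEN MATCHED AND s.updated_at > t.updated_at THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;

-- Maintenance: compact small files and co-locate data on a hot filter column
OPTIMIZE silver.transactions ZORDER BY (customer_id);
```

The same upsert-plus-maintenance pattern applies whether the pipeline is authored in Databricks SQL, PySpark, or Scala.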


Qualifications:
• Professional Experience: Minimum 5 years of experience in Data Engineering, including at least 3 years
working with Databricks and large-scale Spark.
• Databricks Platform: Proven, expert-level experience with the full Databricks ecosystem (Workspace,
Cluster Management, Notebooks, Databricks SQL).
• Apache Spark: Deep knowledge of Spark architecture (RDD, DataFrames, Spark SQL) and advanced
optimization techniques.
• Delta Lake: Expertise in implementing and administering Delta Lake (ACID properties, Time Travel, Merge,
Optimize, Vacuum).
• Programming Languages: Advanced/expert-level proficiency in Python (with PySpark) and/or Scala (with
Spark).
• SQL: Advanced/expert skills in SQL and Data Modeling (Dimensional, 3NF, Data Vault).
• Cloud: Strong experience with a major Cloud platform (AWS, Azure, or GCP), particularly with storage
services (S3, ADLS Gen2, GCS) and networking.
• Unity Catalog: Hands-on experience with implementing and administering Unity Catalog.
• Lakeflow: Experience with Delta Live Tables (DLT) and Databricks Workflows.
• ML/AI Fundamentals: Understanding of basic MLOps concepts and experience with MLflow to support
integration with Data Science teams.
• DevOps: Experience with Terraform or equivalent tools for Infrastructure as Code (IaC).
• Certifications: Databricks certifications (e.g., Databricks Certified Data Engineer Professional) are a strong
advantage.
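The Delta Lake features named above (Time Travel, Optimize, Vacuum) can be illustrated with a short Databricks SQL sketch; the table name and retention values are assumed for the example:

```sql
-- Delta Lake operations sketched against a hypothetical silver.transactions table.

-- Time Travel: query the table as of an earlier version or timestamp
SELECT * FROM silver.transactions VERSION AS OF 42;
SELECT * FROM silver.transactions TIMESTAMP AS OF '2026-04-01';

-- Audit the table's commit history (writes, merges, optimizes)
DESCRIBE HISTORY silver.transactions;

-- Reclaim data files no longer referenced by versions inside the retention window
-- (168 hours = 7 days, a common default retention period)
VACUUM silver.transactions RETAIN 168 HOURS;
```

Note that VACUUM permanently limits how far back Time Travel can reach, so retention settings are a trade-off between storage cost and auditability.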


What We Offer

· Premium medical package
· Lunch Tickets & Pluxee Card
· Bookster subscription
· 13th salary and yearly bonuses
· Enterprise job security with a startup mentality (diverse & engaging environment, international exposure, flat hierarchy) under the stability of a secure multinational
· A supportive culture (we value ownership, autonomy, and healthy work-life balance) with great colleagues, team events and activities
· Flexible working program and openness to remote work
· Collaborative mindset – employees shape their own benefits, tools, team events and internal practices
· Diverse opportunities in Software Development with international exposure
· Flexibility to choose projects aligned with your career path and technical goals
· Access to leading learning platforms, courses, and certifications (Pluralsight, Udemy, Microsoft, Google Cloud)
· Career growth & learning – mentorship programs, certifications, professional development opportunities, and above-market salary

Why Apply Through MisuJob?

AI-Powered Job Matching: MisuJob uses advanced artificial intelligence to analyze your skills, experience, and career goals. Our matching algorithm compares your profile against thousands of job requirements to find positions where you have the highest chance of success. This saves you hours of manual job searching and ensures you only see relevant opportunities.

One-Click Applications: Once you create your profile, applying to jobs is effortless. Your resume and cover letter are automatically tailored to highlight the most relevant experience for each position. You can apply to multiple jobs in minutes, not hours.

Career Intelligence: Beyond job matching, MisuJob provides valuable career insights. See how your skills compare to market demands, identify skill gaps to address, and understand salary benchmarks for your experience level. Make data-driven decisions about your career path.

Frequently Asked Questions

How do I apply for this position?

Click the "Register to Apply" button above to create a free MisuJob account. Once registered, you can apply with one click and track your application status in your dashboard.

Is MisuJob free for job seekers?

Yes, MisuJob is completely free for job seekers. Create your profile, get matched with jobs, and apply without any cost. We help you find your dream job without any hidden fees.

How does AI matching work?

Our AI analyzes your resume, skills, and experience to understand your professional profile. It then compares this against job requirements using natural language processing to calculate a match percentage. Higher matches mean better fit for the role.

Can I apply to jobs in other countries?

Absolutely. MisuJob features jobs from companies worldwide, including remote positions. Filter by location or look for remote opportunities to find jobs that match your preferences.

Ready to Apply?

Join thousands of job seekers using MisuJob's AI to find and apply to their dream jobs automatically.

Register to Apply