MisuJob - AI Job Search Platform

Data Engineer (Fabric)

JobsForHumanity

Beirut, Beirut Governorate, Lebanon (Hybrid, Permanent)

Posted: March 30, 2026


Quick Summary

We are looking for entrepreneurial, tech-savvy people who are passionate and eager to learn, ready to lead the development of sustainable coffee solutions.

Job Description

The Company:

Sucafina is the leading sustainable Farm to Roaster coffee company, with a family tradition in commodities that stretches back to 1905. Today, with more than 1,400 employees in 34 countries, we help stakeholders worldwide to find the perfect coffee solutions. We embed technology, innovation, and sustainability throughout the supply chain, creating shared value for all by Investing in Farmers, Caring for People, and Protecting Our Planet. For more information, visit www.sucafina.com.

What are we looking for:

We are looking for entrepreneurial, tech-savvy people who are passionate, eager to learn, and humble, with a positive attitude and a high level of integrity; people who are flexible and willing to take on challenges, to work and live in coffee-producing countries, who want to build expertise and a career in the coffee business, and who are ready to go the extra mile.

What we offer:

Within our pleasant family environment, we offer great opportunities to learn and grow, challenges and exposure to multicultural environments, merit-based compensation, and free coffee around the clock!

Job Overview

The Microsoft Fabric Data Engineer designs, builds, and operates modern data platforms using Microsoft Fabric. This role focuses on ingesting, modeling, and serving data via OneLake, Lakehouse, Data Warehouse, Data Pipelines, and Power BI—delivering trusted, performant datasets and governed analytics at scale. The role collaborates closely with data architects, analytics engineers, BI developers, and business stakeholders.

Key Responsibilities

1. Data Platform Engineering (Fabric)

• Build and manage Lakehouses (Delta Lake) and Fabric Data Warehouses.
• Develop Data Pipelines and Dataflows Gen2 for batch and near-real-time ingestion.
• Create and optimize Notebook-based transformations (PySpark/SQL) and SQL stored procedures for DW workloads.
• Implement medallion architecture (bronze/silver/gold) for scalable curation.
• Publish certified semantic models and Power BI datasets aligned to business domains.
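As an illustration of the medallion (bronze/silver/gold) curation mentioned above, here is a minimal sketch in plain Python. This is a hypothetical example: in Fabric the same layering would typically be implemented as PySpark transformations over Delta tables in a Lakehouse, and the record fields (`order_id`, `country`, `amount`) are invented for illustration.

```python
# Hypothetical medallion sketch: plain dicts stand in for DataFrames.
# Bronze = raw ingested records, silver = cleaned/typed, gold = business-ready.

def to_silver(bronze_rows):
    """Clean raw (bronze) records: drop incomplete rows, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None or row.get("amount") is None:
            continue  # data-quality rule: reject incomplete records
        silver.append({
            "order_id": int(row["order_id"]),
            "country": str(row.get("country", "unknown")).upper(),
            "amount": float(row["amount"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate curated (silver) records into a business-ready (gold) view."""
    totals = {}
    for row in silver_rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "country": "lb", "amount": "20.0"},
    {"order_id": None, "country": "fr", "amount": "5.0"},  # dropped in silver
    {"order_id": 2, "country": "lb", "amount": "10.0"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'LB': 30.0}
```

The point of the pattern is that each layer has a single, auditable contract: bronze is append-only raw data, silver enforces schema and quality rules, and gold serves reporting.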

2. Performance & Reliability

• Optimize storage/compute in OneLake (file formats, partitioning, z-ordering).
• Tune Spark and SQL workloads (caching strategies, concurrency, workload isolation).
• Implement robust retry, alerting, and monitoring (Fabric Monitoring Hub, Metrics app).
• Conduct end-to-end pipeline performance testing and scalability assessments.
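The retry and alerting responsibilities above can be sketched generically in Python. This is a hypothetical helper, not a Fabric API: in practice, Fabric pipeline activities carry their own retry settings and the Monitoring Hub surfaces failures, but the exponential-backoff idea is the same.

```python
import time

def run_with_retry(task, max_attempts=3, base_delay=0.01):
    """Run a flaky task with exponential backoff; re-raise after the last attempt."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # surface the error for alerting after the final attempt
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff

# Hypothetical ingestion step that fails twice before succeeding.
calls = {"n": 0}
def flaky_ingest():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source failure")
    return "loaded"

print(run_with_retry(flaky_ingest))  # loaded
```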

3. Governance, Security & Compliance

• Enforce data governance with sensitivity labels, row-level/column-level security, and workspace roles.
• Manage item-level permissions (Lakehouse tables, DW schemas, datasets) and Managed Identities for sources.
• Apply data quality rules, lineage, and documentation (Descriptions, Tags, Owner metadata; Purview if applicable).
• Ensure compliance with organizational standards (PII handling, audit, retention).
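The data-quality rules mentioned above can be expressed as simple declarative checks. The sketch below uses a hypothetical rule set for an invented orders dataset, not a Fabric or Purview API; a real platform would log failures to a monitoring sink and attach lineage metadata.

```python
# Minimal data-quality rule engine: each rule is a (name, predicate) pair.

RULES = [
    ("order_id is present",
     lambda r: r.get("order_id") is not None),
    ("amount is non-negative",
     lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0),
    ("country is 2 letters",
     lambda r: isinstance(r.get("country"), str) and len(r["country"]) == 2),
]

def validate(rows):
    """Return (passing rows, list of (row index, failed rule name))."""
    passed, failures = [], []
    for i, row in enumerate(rows):
        bad = [name for name, check in RULES if not check(row)]
        if bad:
            failures.extend((i, name) for name in bad)
        else:
            passed.append(row)
    return passed, failures

rows = [
    {"order_id": 1, "amount": 9.5, "country": "LB"},
    {"order_id": 2, "amount": -1, "country": "LB"},  # fails non-negative rule
]
good, bad = validate(rows)
print(len(good), bad)  # 1 [(1, 'amount is non-negative')]
```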

4. DevOps & Lifecycle Management

• Use Fabric Git integration and Deployment Pipelines for CI/CD across dev/test/prod.
• Parameterize pipelines and environments; externalize configuration and secrets (Key Vault).
• Implement automated testing for data transformations and schemas.
• Drive release management, change control, and rollback strategies.
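Externalizing configuration and secrets, as described above, follows a simple pattern: code references secrets by name and resolves stage-specific values at runtime. The sketch below is hypothetical (invented variable and workspace names); in practice the secret value would be fetched from Azure Key Vault by name, never embedded in code or config files.

```python
import os

def load_config(env):
    """Build pipeline configuration from per-stage defaults plus environment
    variables; secrets are referenced by name only, never hardcoded."""
    defaults = {"dev": {"workspace": "ws-dev"}, "prod": {"workspace": "ws-prod"}}
    cfg = dict(defaults[env])
    # Hypothetical settings: the secret *name* is configuration; its value
    # would be resolved from Key Vault at runtime.
    cfg["source_url"] = os.environ.get("SOURCE_URL", "https://example.invalid/api")
    cfg["secret_name"] = os.environ.get("DB_SECRET_NAME", "db-connection-secret")
    return cfg

# Simulate a deployment stage selecting its configuration.
os.environ["DB_SECRET_NAME"] = "db-connection-prod"
cfg = load_config("prod")
print(cfg["workspace"], cfg["secret_name"])  # ws-prod db-connection-prod
```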

5. Collaboration & Stakeholder Engagement

• Partner with analytics engineers and BI teams to design star schemas, semantic models, and DAX measures.
• Work with data source owners for SLAs, schema change management, and contracts.
• Translate business requirements into technical designs and document architecture decisions.
• Provide knowledge transfer, best practices, and support to data consumers.

Required Skills & Qualifications

Technical Skills

• Microsoft Fabric (hands-on): OneLake, Lakehouse (Delta), Fabric Data Warehouse, Data Pipelines, Dataflows Gen2, Notebooks, Semantic Models/Power BI, Monitoring Hub.

• Programming & Querying: PySpark, SQL (T-SQL), Delta Lake operations; DAX familiarity is a plus.

• Modeling & Architecture: Dimensional modeling, Data Vault or medallion patterns, data quality frameworks.

• Performance & Ops: Partitioning, file formats (Parquet/Delta), caching/z-ordering, job orchestration, monitoring.

• DevOps: Git, Fabric Deployment Pipelines, YAML CI/CD (GitHub Actions/Azure DevOps), IaC exposure (Bicep/Terraform for non-Fabric infra).

• Security & Governance: RLS/CLS, sensitivity labels, access patterns, audit/logging, lineage.

Preferred Qualifications

• Experience with Power BI modeling (star schemas, relationships, calculation groups, DAX).
• Exposure to streaming/real-time: Eventstream, Real-Time Hub, KQL databases (if applicable).
• Experience integrating with external sources (SQL Server, SAP, Dataverse, REST APIs).
• Familiarity with Microsoft Purview for governance/lineage.
• Certifications:
  • DP-600: Microsoft Fabric Analytics Engineer Associate (strongly preferred)
  • DP-203: Data Engineering on Microsoft Azure (nice to have)

Soft skills

• Strong analytical skills and the capacity to challenge the information received
• Highly organised and able to manage multiple tasks with strong attention to detail
• Excellent communication skills with the ability to interact with international stakeholders
• Curious, proactive, keen to learn, and ready for new challenges
• Ability to work independently while also having a team-oriented mindset

Languages

• Excellent knowledge of English (written and verbal communication skills)
• Knowledge of any other language, particularly French, is a plus

Why Apply Through MisuJob?

AI-Powered Job Matching: MisuJob uses advanced artificial intelligence to analyze your skills, experience, and career goals. Our matching algorithm compares your profile against thousands of job requirements to find positions where you have the highest chance of success. This saves you hours of manual job searching and ensures you only see relevant opportunities.

One-Click Applications: Once you create your profile, applying to jobs is effortless. Your resume and cover letter are automatically tailored to highlight the most relevant experience for each position. You can apply to multiple jobs in minutes, not hours.

Career Intelligence: Beyond job matching, MisuJob provides valuable career insights. See how your skills compare to market demands, identify skill gaps to address, and understand salary benchmarks for your experience level. Make data-driven decisions about your career path.

Frequently Asked Questions

How do I apply for this position?

Click the "Register to Apply" button above to create a free MisuJob account. Once registered, you can apply with one click and track your application status in your dashboard.

Is MisuJob free for job seekers?

Yes, MisuJob is completely free for job seekers. Create your profile, get matched with jobs, and apply without any cost. We help you find your dream job without any hidden fees.

How does AI matching work?

Our AI analyzes your resume, skills, and experience to understand your professional profile. It then compares this against job requirements using natural language processing to calculate a match percentage. Higher matches mean better fit for the role.

Can I apply to jobs in other countries?

Absolutely. MisuJob features jobs from companies worldwide, including remote positions. Filter by location or look for remote opportunities to find jobs that match your preferences.

Ready to Apply?

Join thousands of job seekers using MisuJob's AI to find and apply to their dream jobs automatically.

Register to Apply