ARCHIVED
This job listing has been archived and is no longer accepting applications.
MisuJob - AI Job Search Platform

Senior Data Engineer - Airflow and PySpark (Hybrid)

Detroit Labs

Auburn Hills, Michigan, United States (Hybrid, Permanent)

Posted: January 28, 2026


Quick Summary

We're a fast-growing company in Auburn Hills, MI, looking for a Senior Data Engineer to join our team in a hybrid role, working with Airflow and PySpark to build scalable, efficient solutions that drive business results.

Job Description

This role is based in Auburn Hills, Michigan, and follows a hybrid work model, requiring in-office presence three days per week.

Please note: Due to the in-office requirement, we will only be considering candidates who are local to the Metro Detroit area at this time.

Detroit Labs was founded in 2011 with a vision for building digital products, services, and the teams that power them. We create digital solutions that transform the way our clients do business. We build genuine relationships based on respect, trust, and results. We foster a diverse and inclusive culture that values people, providing them with the tools, resources, and support they need to thrive professionally, exceed client expectations, and be themselves at work. We work with a variety of client teams, ranging from startups to Fortune 500 companies, so there are always new and exciting projects going on.

Detroit Labs is looking for an experienced Data Engineer with expertise in Airflow and PySpark to join an exciting project with an industry-leading automotive client. As a Senior Data Engineer & Technical Lead, you will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.

Our Application

At Detroit Labs, a member of our team will read every application (including yours) and will review your resume in addition to your responses to the application questions.

To help us get to know you better, we encourage you to answer these questions genuinely and honestly. We value each applicant and want to learn about the real you. Be yourself in your responses, and our team will look forward to understanding what you can bring to this role!


Requirements:
• 7+ years of Data Engineering experience building production-grade data pipelines using Python and PySpark

• Experience designing, deploying, and managing Airflow DAGs in enterprise environments
• Experience maintaining CI/CD pipelines for data engineering workflows, including automated testing and deployment
• Experience with cloud workflows and containerization, using Docker and cloud platforms (GCP preferred) for data engineering workloads
• Knowledge of and ability to follow twelve-factor app design principles
• Experience and ability to write object-oriented Python code, manage dependencies, and follow industry best practices
• Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows)
• Experience working at the command line in Unix/Linux environments
• Solid understanding of SQL for data ingestion and analysis
• Engineering mindset: able to write code with an eye for maintainability and testability
• Collaborative mindset: comfortable with code reviews, pair programming, and using remote collaboration tools effectively
• Detroit Labs is not currently able to hire candidates who will reside outside of the United States during their term of employment
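To give a sense of the twelve-factor principle named above, here is a minimal sketch of environment-driven configuration for a pipeline. The setting names (GCS_BUCKET, BATCH_SIZE, DEBUG) and defaults are hypothetical, not taken from the actual project:

```python
import os

# Twelve-factor style: all configuration comes from the environment,
# with explicit defaults for local development. Setting names here
# are illustrative, not from any real Detroit Labs project.
class PipelineConfig:
    def __init__(self, env=None):
        env = os.environ if env is None else env
        self.gcs_bucket = env.get("GCS_BUCKET", "local-dev-bucket")
        self.batch_size = int(env.get("BATCH_SIZE", "1000"))
        self.debug = env.get("DEBUG", "false").lower() == "true"

# Injecting a plain dict in place of os.environ keeps the class testable.
config = PipelineConfig({"BATCH_SIZE": "500"})
print(config.batch_size)   # 500
print(config.gcs_bucket)   # local-dev-bucket
```

Keeping config out of the code this way lets the same image run unchanged across local, staging, and production environments.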

Responsibilities

• Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads
• Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability
• Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions (GCP preferred)
• Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines
• Implement secure coding best practices and design patterns throughout the development lifecycle
• Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions
• Create and maintain technical documentation, including process/data flow diagrams and system design artifacts
• Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices
• Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks
• Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage
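The test-driven development practice listed above can be illustrated with a small, self-contained sketch; the transform and its field semantics are hypothetical, not part of the actual project:

```python
# Hypothetical pipeline transform: clean up a raw VIN string before loading.
# In TDD style, the assertions below would be written first, and the
# function grown until they pass.
def normalize_vin(raw: str) -> str:
    """Strip surrounding whitespace and uppercase the VIN."""
    return raw.strip().upper()

# Tests written ahead of the implementation:
assert normalize_vin("  1hgcm82633a004352 ") == "1HGCM82633A004352"
assert normalize_vin("WDBUF56X48B123456") == "WDBUF56X48B123456"
print("all tests passed")
```

In practice these assertions would live in a pytest or unittest suite and run automatically in the CI/CD pipeline on every commit.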


Benefits:
• The salary range for this role is $160,000 to $180,000, commensurate with experience
• Full medical, dental, and vision benefits
• 401(k) contribution options
• Quarterly outings and events
• Paid holidays and vacation time
• Parental leave program
• Monthly budgets for “team fun” bonding events
• Free lunch for various company meetings and Lunch & Learns
• Access to our mentorship program and employee resource groups (ERGs)
• Volunteer opportunities
• All-company remote-friendly activities
• Plenty of Detroit Labs swag

Why Apply Through MisuJob?

AI-Powered Job Matching: MisuJob uses advanced artificial intelligence to analyze your skills, experience, and career goals. Our matching algorithm compares your profile against thousands of job requirements to find positions where you have the highest chance of success. This saves you hours of manual job searching and ensures you only see relevant opportunities.

One-Click Applications: Once you create your profile, applying to jobs is effortless. Your resume and cover letter are automatically tailored to highlight the most relevant experience for each position. You can apply to multiple jobs in minutes, not hours.

Career Intelligence: Beyond job matching, MisuJob provides valuable career insights. See how your skills compare to market demands, identify skill gaps to address, and understand salary benchmarks for your experience level. Make data-driven decisions about your career path.

Frequently Asked Questions

How do I apply for this position?

Click the "Register to Apply" button above to create a free MisuJob account. Once registered, you can apply with one click and track your application status in your dashboard.

Is MisuJob free for job seekers?

Yes, MisuJob is completely free for job seekers. Create your profile, get matched with jobs, and apply without any cost. We help you find your dream job without any hidden fees.

How does AI matching work?

Our AI analyzes your resume, skills, and experience to understand your professional profile. It then compares this against job requirements using natural language processing to calculate a match percentage. Higher matches mean better fit for the role.

Can I apply to jobs in other countries?

Absolutely. MisuJob features jobs from companies worldwide, including remote positions. Filter by location or look for remote opportunities to find jobs that match your preferences.

Ready to Apply?

Join thousands of job seekers using MisuJob's AI to find and apply to their dream jobs automatically.

Register to Apply