Software Engineer 3 - Linux/Bash/Python/Apache Airflow/SQL/Jupyter Notebook/NumPy/JSON
Captivation
Posted: April 7, 2026
Quick Summary
Build software to drive business growth and improve efficiency.
Job Description
Build something to be proud of.
Captivation has built a reputation on providing customers exactly what is needed in a timely manner. Our engineers take pride in what they develop and constantly innovate to provide the best solution. Captivation is looking for software developers who can get stuff done while making a difference in support of the mission to protect our country.
Description
Captivation Software is looking for a senior-level software engineer with extensive expertise in dataflow design, data transport mechanisms, and Apache Spark-based distributed processing. In this role, the Software Engineer will design, implement, and optimize data ingress/egress pathways to ensure efficient, scalable, and reliable processing of the organization’s analytics workloads.
Requirements
Security Clearance:
• Must currently hold a Top Secret/SCI U.S. Government security clearance with a favorable polygraph; all candidates must therefore be U.S. citizens
Minimum Qualifications (one of the following):
• Master's degree in Computer Science or a related discipline from an accredited college or university, plus five (5) years of experience as a SWE in programs and contracts of similar scope, type, and complexity; or
• Bachelor's degree in Computer Science or a related discipline from an accredited college or university, plus seven (7) years of experience as a SWE in programs and contracts of similar scope, type, and complexity; or
• Nine (9) years of experience as a SWE in programs and contracts of similar scope, type, and complexity.
Required Skills:
• Experience using the Linux CLI and Linux tools
• Experience developing Bash scripts to automate manual processes
• Recent software development experience using Python and Java
• Experience using Apache Airflow (DAG design, scheduling, operators, sensors) to orchestrate, schedule, and monitor complex workflows
• Experience with Distributed Big Data processing engines including Apache Spark
• Familiarity with SQL technologies such as MySQL, MariaDB, and PostgreSQL for querying, joining, and aggregating large datasets
• Experience using Jupyter Notebook
• Experience with data wrangling and preprocessing using tools such as pandas and NumPy
• Experience working with structured, semi-structured, and unstructured data formats such as Parquet, JSON, CSV, and XML
• Familiarity with data quality concepts, data validation, and anomaly detection
• Experience with the Git source control system
Desired Skills:
• Familiarity with HPC job-scheduling tools such as Slurm
• Experience with the Atlassian Tool Suite including Jira and Confluence
This position is open for direct hires only. We will not consider candidates from third party staffing/recruiting firms.
Benefits
• Annual Salary: $130,000 - $270,000 (depending on years of experience)
• Up to 20% 401(k) contribution (no employee match required; vested from day 1)
• Above Market Hourly Rates
• $3,600 HSA Contribution
• 6 Weeks Paid Time Off
• Company Paid Employee Medical/Dental/Vision Insurance/Life Insurance/Short-Term & Long-Term Disability/AD&D