Jr./Mid Machine Learning Engineer – Time-Series & Inertial AI
Confidential
Posted: March 18, 2026
Quick Summary
As a Machine Learning Engineer for Smart Inertial Sensors and Systems, you will pioneer the use of Deep Learning to enhance the raw performance of Commercial Off-The-Shelf (COTS) MEMS sensors in Riyadh, Saudi Arabia.
Job Description
We are building a new Systems-Level Integration (SLI) team focused on Smart high-performance Inertial Sensors and Systems. As our first Machine Learning Engineer in the Riyadh office, you will pioneer the use of Deep Learning to enhance the raw performance of Commercial Off-The-Shelf (COTS) MEMS sensors.
In the initial phase of this role, you will focus entirely on software, simulation, and data-driven modeling. You will work with datasets provided by our core engineering team to train models that correct stochastic errors, denoise signals, and improve sensor performance.
What You Will Do
• Time-Series AI Development: Design, train, and validate neural networks (CNNs, LSTMs, TCNs, or Transformers) to denoise raw inertial sensor data and fuse it for better performance.
• Virtual Sensing: Develop AI models that enhance MEMS sensor outputs using self-supervised or supervised learning techniques.
• Data Pipeline Engineering: Build robust data processing pipelines to handle massive, high-frequency inertial systems datasets (filtering, normalization, augmentation, and windowing).
• Cross-Border Collaboration: Work closely with the Egypt-based Systems and Firmware teams to ensure your models are designed within the computational limits of edge microcontrollers (TinyML).
• Rapid Prototyping: Implement ML algorithms based on published academic research papers.
• Research: Prepare a detailed literature review on state-of-the-art uses of ML for improving inertial system performance.
• Model Optimization: Quantize and prune trained PyTorch/TensorFlow models for eventual deployment on resource-constrained embedded targets (e.g., ARM Cortex-M).