
Senior Cloud Platform Developer

Telesat

Ottawa, Ontario · Hybrid · Permanent

Posted: August 12, 2025


Job Description

Telesat (Nasdaq and TSX: TSAT) is a leading global satellite operator, providing reliable and secure satellite-delivered communications solutions worldwide to broadcast, telecommunications, corporate and government customers for over 55 years. Backed by a legacy of engineering excellence, reliability and industry-leading customer service, Telesat has grown to be one of the largest and most successful global satellite operators.

Telesat Lightspeed, our revolutionary Low Earth Orbit (LEO) satellite network scheduled to begin service in 2027, will transform global broadband connectivity for enterprise and government users by delivering a combination of high capacity, security, resiliency and affordability with ultra-low latency and fiber-like speeds. Telesat is headquartered in Ottawa, Canada, and has offices and facilities around the world.

The company’s state-of-the-art satellite fleet consists of 14 GEO satellites, the Canadian payload on ViaSat-1, and one LEO 3 demonstration satellite. For more information, follow Telesat on X and LinkedIn or visit www.telesat.com.

We are seeking a highly skilled Kafka Expert with deep expertise in Apache Kafka, Linux systems (Red Hat and Debian), and Kubernetes to join our data platform team. This is a hands-on engineering role focused on designing, deploying, and optimizing Kafka-based data streaming solutions that are scalable, secure, and production-ready. You will work closely with both the infrastructure and development teams to align Kafka and stream processing architectures with platform standards and to build robust use cases for real-time data streaming, observability, and microservices. Your contributions will be critical to ensuring high availability, performance, and operational excellence across distributed systems.


Key Responsibilities:
• Design, deploy, and manage Apache Kafka clusters in development, testing, and production environments.
• Deploy and manage Apache Spark and Apache Flink in production environments.
• Optimize Kafka performance, reliability, and scalability for high-throughput data pipelines.
• Ensure seamless integration of Kafka with other systems and services.
• Manage and troubleshoot Linux-based systems (Ubuntu) supporting Kafka infrastructure.
• Manage, fine-tune, deploy, and operate Kafka on Kubernetes clusters using Helm, Operators, or custom manifests.
• Collaborate with cross-functional teams to identify and implement Kafka use cases.
• Contribute to automation and Infrastructure as Code (IaC) practices through CI/CD pipelines with GitLab.
• Monitor system health, implement alerting, and ensure high availability.
• Participate in incident response and root cause analysis for Kafka and related systems.
• Evaluate and recommend Kafka ecosystem tools like Kafka Connect, Schema Registry, MirrorMaker, and Kafka Streams.
• Build automation and observability tools for Kafka using Prometheus, Grafana, Fluent Bit, etc.
• Deep understanding of streaming and batch processing architectures.
• Familiarity with Spark Structured Streaming and Flink DataStream API.
• Work with teams to build end-to-end Kafka-based pipelines for various applications (data integration, event-driven microservices, logging, monitoring).
• Experience running Spark and Flink on Kubernetes, YARN, or standalone clusters.
• Proficiency in configuring resource allocation, job scheduling, and cluster scaling.
• Knowledge of checkpointing, state management, and fault tolerance mechanisms.
• Ability to tune Spark and Flink jobs for low latency, high throughput, and resource efficiency.
• Experience with memory management, shuffle tuning, and parallelism settings.
• Familiarity with Spark UI, Flink Dashboard, and integration with Prometheus/Grafana.
• Ability to implement metrics collection, log aggregation, and alerting for job health and performance.
• Understanding of TLS encryption, Kerberos, and RBAC in distributed environments.
• Experience integrating with OAuth or other identity providers.
• Familiarity with time-series databases.


Required Qualifications:
• 5+ years of experience administering and supporting Apache Kafka in production environments.
• Strong expertise in Linux system administration (Red Hat and Debian).
• Solid experience with Kubernetes (CNCF distributions, OpenShift, Rancher, or upstream K8s).
• Proficiency in scripting (Bash, Python) and automation tools (Ansible, Terraform).
• Experience with Kafka security, monitoring (Prometheus, Grafana, Istio), and schema management.
• Familiarity with CI/CD pipelines and DevOps practices.
• Comfortable with Helm, YAML, Kustomize, and GitOps/GitLab principles.
• 4+ years of experience in Apache Spark development, including building scalable data pipelines and optimizing distributed processing.


The successful candidate must be able to work in Canada and obtain Canadian Reliability Clearance.


At Telesat, we take pride in being an equal opportunity employer that values equality in the workplace. We are committed to providing the best candidate experience possible including any required accommodations at every stage of our interview process. All qualified applicants that have been selected for an interview that require accommodations, are advised to inform the Telesat Talent team accordingly. We will work with you to meet your needs. All accommodation information provided will be treated as confidential.
