2026-0029 BMC Helix ITSM/SRM Engineering Services (NS) - MON 20 Apr
EMW, Inc.
Posted: April 7, 2026
Job Description
Deadline Date: Monday 20 April 2026
Requirement: Provision of BMC Helix ITSM/SRM Engineering Services
Location: Brussels, BE
Full Time On-Site: Yes
Period of Performance: 2026 BASE: as soon as possible, but not later than 1 June 2026, until 30 December 2026, with the possibility to exercise the following options:
2027 Option 1: from 1 January 2027 until 30 December 2027
2028 Option 2: from 1 January 2028 until 30 December 2028
Required Security Clearance: NATO SECRET
1 INTRODUCTION
1.1 NCIA – CSU BRU
Within the Agency, the CIS Support Unit (CSU) Brussels provides consistent, reliable and cost-effective ICT service delivery to all NATO customers located in the NATO compound in Brussels. This includes understanding and managing the interface with the Secretary General and the Deputy Director General International Military Staff (DG IMS), through his/her delegated representatives (ICTM/EXCO IMS), who act in the role of Intelligent Customer.
1.2 Coherence Branch
The Coherence (COH) Branch supports the Agency’s Demand Management (DM) organization. It is responsible for liaison with all customers in the CSU’s Area of Responsibility (AoR), supports the Commander CSU in the role of NCIA representative, and provides a single entry point for customers. The Coherence Branch contributes to and/or conducts monitoring and measurement of customer satisfaction. COH supports the management of all agreements concerning service provision, operations and exercises within the CSU AoR, and supports the Service Lines in the implementation and improvement of service management processes.
1.3 Service Management and Control (SMC) Team
Within the Coherence Branch, the Service Management and Control (SMC) Team is responsible for the ITSM and System Monitoring Toolsets.
Under the direction of the Head of the Branch and the SMC Toolsets Manager, the BMC SRM Engineering Service Contractor is responsible for maintaining, operating and upgrading the Service Request Management (SRM) system and datasets based on BMC Helix ITSM (a.k.a. BMC Remedy) and BMC Helix Service Request Management.
The ITSM systems are deployed on two networks of various classifications and comprise BMC ARS, BMC Mid-Tier, BMC RSSO, BMC ITSM, BMC SRM, BMC SLM, BMC CMDB and BMC Discovery, ver. 20.02 or later. The BMC SRM Engineering Service Contractor is responsible for creating and updating processes, Service Requests and reports.
2 OBJECTIVES
NCIA NHQ SMC requires specialist support to design, develop, test and deliver Service Request Definitions (SRDs), BPMN process designs, and reporting for BMC Helix ITSM / SRM (ver. 20.02+). The contract is based on a performance- and delivery-based model with monthly measurable deliverables, acceptance criteria, KPIs and payment triggers. The main objectives of this statement of work can be summarized as follows:
• Service Catalogue Development: Design, develop, test, update and support the BMC Service Request Definitions based on the current BMC Service Request Management ver.20.02 and future BMC ITSM versions (e.g. BMC Digital Workplace Catalogue ver. 25.x) following the best practice standards.
• ITSM Reporting: Design, develop, test, update and support the BMC ITSM Reports based on the current BMC Smart Reporting version 20.02 and future BMC ITSM Reporting toolsets (e.g. BMC Helix Dashboards v25.x) following the best practice standards.
• Deliver Processes definitions, process design (BPMN), SRD specifications, SRD development, testing/UAT, and required reports.
• Target up to 18 complexity-weighted points per month, equivalent to approximately 2–6 processes and SRDs monthly.
• Provide monthly evidence enabling the SMC Manager to validate and accept deliverables.
• ITSM Documentation: Documenting the developed and updated ITSM code, Service Request Definitions and Reports, and creating the Knowledge Articles.
3 SCOPE OF WORK
3.1 Monthly deliverables (per SRD)
• Meetings with the processes’ stakeholders.
• BPMN process diagram and process documentation uploaded to the portal.
• SRD design (fields, workflows, approvals, AOT/PDT definitions).
• Developed SRD deployed to test and production after acceptance.
• UAT report (testers, defects, remediation).
• Smart Report/dashboard item if required.
• Monthly summary report (points delivered, links to artifacts, blockers, next month plan).
3.2 Supporting activities
The Contractor shall perform ongoing supporting activities, including but not limited to:
• Facilitating and attending design workshops, UAT sessions, planning and weekly standups.
• Maintaining code/configuration documentation.
• Providing knowledge transfer and documentation updates.
• Meetings, workshops, documentation, and knowledge transfer are supporting activities (not billable units) and shall be included within the fixed price of the Deliverables.
• These activities are required to produce the outcome-based deliverables and do not trigger payment.
4 COMPLEXITY MODEL
4.1 SRDs Complexity Table
Simple SRD:
Questions: Up to 6
AOTs: 1
Approval Groups: Up to 1
Points: New: 3; Update: 1
Medium SRD:
Questions: Up to 12
AOTs: Up to 3
Approval Groups: Up to 3
Points: New: 6; Update: 2
Complex SRD:
Questions: Up to 30
AOTs: Up to 12
Approval Groups: Up to 6
Points: New: 12; Update: 4
4.2 Reports Complexity Table
Simple Report:
Columns: Up to 6
Calculated Fields: Up to 1
Data Sources / Joins: 1
Points: New: 1; Update: 0.5
Medium Report:
Columns: Up to 12
Calculated Fields: Up to 3
Data Sources / Joins: Up to 3
Points: New: 2; Update: 1
Complex Report:
Columns: Up to 30
Calculated Fields: Up to 12
Data Sources / Joins: Up to 6
Points: New: 4; Update: 2
4.3 Monthly target
The expected delivery rate is planned as an average of approximately eighteen (18) performance points per month, based on the agreed annual performance target.
Monthly delivery may vary due to operational priorities, dependencies, or other agreed factors. Variations in monthly delivery do not constitute non-performance, provided that the cumulative delivery over the applicable performance year meets the agreed annual performance target.
The composition of SRDs and reports contributing to the monthly delivery shall be agreed during the monthly planning meeting.
After the first three (3) months, the complexity model and planning assumptions may be adjusted by mutual written agreement between the Contractor and NCIA, represented by the SMC Manager.
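The complexity-weighted target above can be illustrated with a short sketch. The point values are taken directly from the tables in sections 4.1 and 4.2; the function and data names are hypothetical, chosen only to make the arithmetic concrete:

```python
# Point values per complexity tier, as (New, Update) pairs,
# copied from the SRD table (4.1) and the report table (4.2).
SRD_POINTS = {"simple": (3, 1), "medium": (6, 2), "complex": (12, 4)}
REPORT_POINTS = {"simple": (1, 0.5), "medium": (2, 1), "complex": (4, 2)}

def monthly_points(deliverables):
    """Sum the points for a month's mix of (kind, tier, is_new) deliverables."""
    total = 0.0
    for kind, tier, is_new in deliverables:
        new_pts, upd_pts = (SRD_POINTS if kind == "srd" else REPORT_POINTS)[tier]
        total += new_pts if is_new else upd_pts
    return total

# Hypothetical month: one complex new SRD (12) plus one medium new SRD (6)
# already reaches the ~18-point monthly target with only two deliverables.
plan = [("srd", "complex", True), ("srd", "medium", True)]
print(monthly_points(plan))  # 18.0
```

A month at the other end of the stated 2–6 range might instead combine several simple and medium items (e.g. two medium new SRDs, two simple new SRDs and updates) to reach the same total, which is why composition is agreed at the monthly planning meeting.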
4.4 Meetings
The Contractor shall participate in daily reporting and planning activities (daily stand-ups), as well as in workshops, events and conferences related to the supported services, up to 18 hours (3 days) per month as requested by the SMC Manager.
5 ACCEPTANCE CRITERIA and KPIs
5.1 Per SRD acceptance criteria
• Functional completeness: SRD implements all fields, workflows and integrations as specified.
• Process quality: Process documented in the NCIA Portal based on existing template.
• BPMN quality: BPMN diagram documented based on existing template and best practice standards.
• Defect rate: ≤ 2 critical defects in UAT; defects resolved before production.
• UAT sign-off: SMC Manager acceptance and approval.
5.2 Monthly KPIs
The following KPIs shall be used to monitor delivery progress and quality on a monthly basis:
• Delivery progress: Monthly delivery aligned with the agreed planning baseline, contributing toward the annual performance target.
• On-time delivery: ≥ 90% of accepted deliverables delivered within the agreed monthly planning window.
• UAT defect density: ≤ 2 critical defects per SRD.
• Stakeholder satisfaction: average ≥ 4/5.
5.2.1 KPI 1.1 – SRD Delivery Acceptance Rate
Target: ≥90% of SRDs accepted at first submission
Acceptance Criteria: SRD fully functional in BMC environment (no blocking defects); Request workflows correctly configured (approvals, tasks, SLAs); Request fields, validations, and dependencies implemented as per specification; Naming conventions and categorization aligned with standards
5.2.2 KPI 1.2 – SRD Quality Compliance
Target: 90% compliance
Acceptance Criteria: Conforms to BMC best practices (version-specific); No critical or high defects in UAT; Reusable components applied where applicable; Performance does not degrade catalogue usability
5.2.3 KPI 1.3 – SRD Deployment Success Rate
Target: ≥90% successful deployments
Acceptance Criteria: Successfully migrated to production without rollback; No Sev1/Sev2 incidents within 5 business days post-release
5.2.4 KPI 2.1 – Report Acceptance Rate
Target: ≥90% accepted at first submission
Acceptance Criteria: Data validated against source systems (≤2% variance); Filters, prompts, and drill-downs working correctly; Layout and visualization meet stakeholder requirements; Performance within agreed thresholds
5.2.5 KPI 2.2 – Reporting Quality Score
Target: ≥90% compliance
Acceptance Criteria: No broken queries or data inconsistencies; Follows naming conventions and governance standards; Security roles and permissions correctly applied
5.2.6 KPI 2.3 – UAT Completion Efficiency
Target: ≥90% passed within 2 cycles
Acceptance Criteria: All test cases executed and signed off; Defects resolved and retested successfully
5.2.7 KPI 3.1 – End-to-End Delivery Acceptance
Target: ≥90% accepted without major rework
Acceptance Criteria: BPMN diagrams compliant with BPMN 2.0 standards; Process flows validated and approved by stakeholders; SRDs aligned with process design; Test scripts executed with ≥90% pass rate
5.2.8 KPI 3.2 – UAT Acceptance Rate
Target: ≥90%
Acceptance Criteria: The critical test scenarios passed; No high/critical defects open at sign-off; Business approval documented
5.2.9 KPI 3.3 – Process Effectiveness
Target: ≥85% stakeholder satisfaction
Acceptance Criteria: Process meets defined business outcomes; No major usability issues reported within 2 weeks
Monthly KPIs are intended to support delivery planning, transparency, and continual improvement.
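The percentage-based KPIs above all reduce to the same simple calculation over the month's acceptance records. The sketch below shows KPI 1.1 (first-submission acceptance rate, target ≥ 90%) against a small hypothetical record set; the record structure and SRD identifiers are illustrative only:

```python
# Hypothetical monthly acceptance records; 'accepted_first' marks whether the
# SRD was accepted at first submission (KPI 1.1) without rework.
records = [
    {"srd": "SRD-001", "accepted_first": True},
    {"srd": "SRD-002", "accepted_first": True},
    {"srd": "SRD-003", "accepted_first": False},  # required rework before acceptance
    {"srd": "SRD-004", "accepted_first": True},
]

# First-submission acceptance rate as a percentage of all submitted SRDs.
rate = 100 * sum(r["accepted_first"] for r in records) / len(records)
print(f"{rate:.0f}%")  # 75% -> below the 90% target in this hypothetical month
```

The other ≥ 90% KPIs (deployment success, report acceptance, UAT completion) would follow the same accepted-over-submitted pattern, differing only in which events count as the numerator.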