Purpose: Why a Maturity Framework Matters
Regulators and auditors do not expect perfection. They expect demonstrable progress. A maturity assessment framework provides the structured basis for an organisation to understand where it stands, identify gaps, prioritise improvements, and — critically — prove to regulators that it is moving in the right direction.
Without a maturity framework, organisations face two common problems:
- Overestimation: Teams believe their DevSecOps practices are more mature than they actually are, leading to unpleasant surprises during audits
- Unfocused investment: Improvement efforts are scattered rather than targeted at the areas that matter most for regulatory compliance and risk reduction
This framework is designed for compliance officers, auditors, and risk managers — not engineers. It focuses on what to assess, what evidence to look for, and how to interpret findings in a regulatory context.
Maturity Levels Defined
Level 1: Initial / Ad-Hoc
Security is addressed reactively and inconsistently. There are no formal processes for integrating security into software delivery pipelines. Security activities depend on individual initiative rather than organisational policy.
Characteristics:
- No formal security integration in CI/CD pipelines
- Manual, ad-hoc security testing (if any)
- No documented security policies for software delivery
- Reactive incident response — issues found in production, not during development
- No systematic evidence collection for compliance purposes
Level 2: Defined
Basic security tools and processes are in place, and policies are documented. However, enforcement is inconsistent, and significant gaps remain in coverage and evidence generation.
Characteristics:
- Basic security scanning tools integrated into some pipelines
- Security policies documented but inconsistently enforced
- Some roles and responsibilities defined for security activities
- Vulnerability management exists but with incomplete coverage
- Evidence collection is manual and incomplete
Level 3: Managed
Security controls are embedded in pipelines as enforced policy gates. Controls are applied consistently, roles are clearly defined, and evidence is generated systematically to support compliance.
Characteristics:
- Policy-enforced security gates in all production pipelines
- Consistent control application across teams
- Automated evidence generation and retention
- Defined RACI matrix for DevSecOps activities
- Regular metrics reporting to management
- Exception management process with documented approvals
Level 4: Optimised
Continuous compliance is achieved through automation. Security controls are continuously refined based on metrics and threat intelligence. The organisation demonstrates predictive risk management and continuous improvement.
Characteristics:
- Continuous compliance monitoring and automated evidence collection
- Metrics-driven improvement cycles with documented outcomes
- Predictive risk management using trend analysis
- Advanced supply chain security and integrity controls
- Board-level reporting with clear risk appetite alignment
- Regular maturity reassessment with demonstrated progression
Assessment Dimensions
The maturity assessment covers ten dimensions. Each dimension is assessed independently, as organisations commonly have uneven maturity across areas.
Assessment Matrix
| Dimension | Level 1 (Initial) | Level 2 (Defined) | Level 3 (Managed) | Level 4 (Optimised) |
|---|---|---|---|---|
| Security testing integration | No automated security testing | Basic SAST or SCA in some pipelines | SAST, DAST, and SCA in all production pipelines with enforced gates | Comprehensive testing including IAST, container scanning, IaC scanning; continuously tuned for false positive reduction |
| Policy enforcement | No formal policies for pipeline security | Policies documented but manually enforced | Policies enforced as automated gates; bypass requires documented approval | Policy-as-code with version control, automated compliance checking, and continuous policy refinement |
| Access governance | Ad-hoc access with no regular review | Documented access policy; some role-based access | Role-based access enforced; regular access reviews; privileged access managed | Just-in-time access; automated access certification; continuous monitoring of access anomalies |
| Secrets management | Secrets in code or configuration files | Centralised secrets store for some applications | All secrets managed centrally; rotation policies enforced; no secrets in code (verified by scanning) | Automated rotation; dynamic secrets; comprehensive audit trail; secrets breach detection |
| Artifact integrity | No controls on artifact provenance | Basic artifact repository with some access controls | Signed artifacts; verified provenance; approved base images enforced | Full supply chain integrity (SLSA Level 3+); automated provenance verification; SBOM generation and monitoring |
| Vulnerability management | No systematic vulnerability tracking | Vulnerabilities tracked but no SLAs; inconsistent remediation | SLA-driven remediation; risk-based prioritisation; regular reporting | Predictive vulnerability management; automated remediation for known patterns; continuous SLA monitoring |
| Incident readiness | No incident response plan for pipeline security events | Basic incident response plan exists; not tested | Tested incident response plan; defined roles; post-incident review process | Automated incident detection in pipelines; playbook-driven response; regular exercises; continuous improvement from lessons learned |
| Compliance evidence | No systematic evidence collection | Manual evidence collection; incomplete coverage | Automated evidence generation; evidence mapped to control requirements; retention policy enforced | Continuous compliance monitoring; real-time evidence dashboards; automated regulatory reporting |
| Third-party governance | No visibility into third-party component risk | Basic dependency inventory; occasional review | Comprehensive SBOM; third-party risk assessment process; approved component policies | Continuous third-party monitoring; automated risk scoring; supplier security requirements enforced contractually and technically |
| Culture and training | No security training for development teams | Annual security awareness training; ad-hoc secure coding guidance | Role-specific training; security champions programme; training effectiveness measured | Continuous learning culture; gamified security training; knowledge sharing across teams; training tied to maturity improvement |
Self-Assessment Questionnaire
For each dimension, answer the following questions to determine your approximate maturity level. A level is achieved for a dimension only when the answers to all questions at that level — and at every lower level — are “yes”.
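For readers who prefer to see the scoring rule stated precisely, the following short sketch expresses it in code. The answer sets are hypothetical; in practice each level's list holds one boolean per questionnaire question.

```python
# Sketch: derive a dimension's maturity level from questionnaire answers.
# Rule: a level is achieved only if every question at that level AND at
# every lower level is answered "yes". Level 1 is the floor.

def maturity_level(answers_by_level: dict[int, list[bool]]) -> int:
    """Return the highest level for which all questions at that level
    and every lower level are answered True."""
    achieved = 1
    for level in sorted(answers_by_level):
        if all(answers_by_level[level]):
            achieved = level
        else:
            break  # a failed level caps the result, whatever passes above it
    return achieved

# Example: all Level 2 and 3 questions pass, one Level 4 question fails.
answers = {2: [True], 3: [True, True], 4: [False]}
print(maturity_level(answers))  # prints 3
```

Note that a “yes” at a higher level does not compensate for a “no” below it: passing Level 4 questions while failing Level 2 still yields Level 1.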
Security Testing Integration
- Are any automated security scanning tools integrated into CI/CD pipelines? (Level 2)
- Are SAST, DAST, and SCA scans executed in all production-bound pipelines? (Level 3)
- Do security scan failures block deployments unless explicitly approved? (Level 3)
- Are scanning tools continuously tuned based on false positive analysis and threat intelligence? (Level 4)
Policy Enforcement
- Are security policies for software delivery formally documented? (Level 2)
- Are policies enforced as automated gates in pipelines (not just documented)? (Level 3)
- Does every policy bypass require documented approval from an authorised person? (Level 3)
- Are policies managed as code, version-controlled, and subject to change management? (Level 4)
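To make “enforced as automated gates” concrete, the sketch below shows the decision a Level 3 gate encodes: a failed required check blocks deployment unless a documented, authorised bypass exists. The check names and the bypass record format are illustrative assumptions, not any specific tool's syntax.

```python
# Sketch: a pipeline policy gate (Level 3 characteristic).
# REQUIRED_CHECKS and the bypass representation are illustrative.

REQUIRED_CHECKS = {"sast", "sca", "secrets_scan"}

def gate(results: dict[str, bool], approved_bypasses: set[str]) -> bool:
    """Return True if the deployment may proceed."""
    for check in REQUIRED_CHECKS:
        passed = results.get(check, False)  # a missing result counts as a failure
        if not passed and check not in approved_bypasses:
            return False  # block: failed check with no documented approval
    return True

print(gate({"sast": True, "sca": True, "secrets_scan": True}, set()))    # True
print(gate({"sast": True, "sca": False, "secrets_scan": True}, set()))   # False
print(gate({"sast": True, "sca": False, "secrets_scan": True}, {"sca"})) # True: approved bypass
```

The key audit point is visible in the logic: the bypass path exists, but it is an explicit, recordable input rather than a silent override.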
Access Governance
- Is there a documented access control policy for CI/CD platforms? (Level 2)
- Is role-based access control enforced across all pipeline platforms? (Level 3)
- Are access reviews conducted at least quarterly with documented outcomes? (Level 3)
- Is just-in-time or time-limited privileged access implemented? (Level 4)
Secrets Management
- Is a centralised secrets management solution in use? (Level 2)
- Are all application and pipeline secrets managed centrally with no secrets in code? (Level 3)
- Are secret rotation policies defined and enforced? (Level 3)
- Are dynamic, short-lived secrets used where possible? (Level 4)
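The verification behind “no secrets in code” can be illustrated with a minimal sketch. The two patterns below are deliberately simplistic assumptions; production scanners apply far richer rule sets plus entropy analysis, and this is not a substitute for one.

```python
import re

# Sketch: the kind of check behind "no secrets in code (verified by
# scanning)". Patterns are illustrative only.

PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # shape of an AWS access key ID
    re.compile(r"(?i)(password|secret|api[_-]?key)\s*=\s*['\"][^'\"]+['\"]"),
]

def find_secrets(text: str) -> list[str]:
    """Return substrings that look like hardcoded credentials."""
    hits = []
    for pattern in PATTERNS:
        hits.extend(m.group(0) for m in pattern.finditer(text))
    return hits

sample = 'db_password = "hunter2"\nregion = "eu-west-1"\n'
print(find_secrets(sample))  # one finding: the hardcoded password assignment
```

For audit purposes, what matters is not the pattern list but that the scan runs on every commit and that findings block the pipeline under the same gate discipline as other checks.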
Vulnerability Management
- Are vulnerabilities from security scans tracked in a central system? (Level 2)
- Are remediation SLAs defined by severity and consistently enforced? (Level 3)
- Is vulnerability remediation status reported to management regularly? (Level 3)
- Are remediation trends analysed to identify systemic issues and drive preventive action? (Level 4)
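The SLA discipline these questions probe can be sketched as a simple overdue check. The SLA windows below are illustrative assumptions, not a regulatory standard; each organisation defines its own windows in its vulnerability management policy.

```python
from datetime import date, timedelta

# Sketch: SLA-driven remediation tracking (Level 3 characteristic).
# SLA windows by severity are illustrative.

SLA_DAYS = {"critical": 7, "high": 30, "medium": 90, "low": 180}

def is_overdue(severity: str, found: date, today: date) -> bool:
    """True when an open vulnerability has exceeded its remediation SLA."""
    return today > found + timedelta(days=SLA_DAYS[severity])

print(is_overdue("critical", date(2026, 1, 1), date(2026, 1, 10)))  # True: 9 days > 7-day SLA
print(is_overdue("high", date(2026, 1, 1), date(2026, 1, 10)))      # False: within 30-day SLA
```

Management reporting at Level 3 is then simply the aggregation of this check across all open findings, broken down by severity and team.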
Compliance Evidence
- Is some compliance evidence collected from CI/CD processes? (Level 2)
- Is evidence generation automated and mapped to specific control requirements? (Level 3)
- Is evidence retained according to a defined retention policy? (Level 3)
- Is compliance posture continuously monitored with real-time dashboards? (Level 4)
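What “automated evidence generation mapped to control requirements” produces can be illustrated with a minimal sketch. The control identifier and record fields are assumptions for illustration; a real implementation would map to the organisation's own control framework and write to tamper-evident storage.

```python
import hashlib
import json
from datetime import datetime, timezone

# Sketch: an automated, control-mapped evidence record (Level 3).
# Field names and control IDs are illustrative.

def evidence_record(pipeline: str, check: str, passed: bool,
                    control_ids: list[str]) -> dict:
    """Build an evidence record with an integrity digest for retention."""
    record = {
        "pipeline": pipeline,
        "check": check,
        "passed": passed,
        "controls": control_ids,  # mapping to the control framework
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["digest"] = hashlib.sha256(payload).hexdigest()  # detects later tampering
    return record

rec = evidence_record("payments-deploy", "sca", True, ["ISO27001-A.8.8"])
print(rec["controls"], rec["passed"])
```

The digest is what turns a log line into evidence: an auditor can recompute it and confirm the record has not been altered since collection.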
Gap Analysis Approach
Once the self-assessment is complete, the gap analysis should follow a risk-prioritised approach:
- Map current maturity by dimension — create a heat map showing Level 1 through Level 4 for each of the ten dimensions
- Identify regulatory minimum requirements — determine the minimum maturity level required by applicable regulations (see below)
- Prioritise gaps — focus first on dimensions where current maturity is below the regulatory minimum
- Assess effort and dependencies — some improvements require foundational capabilities (e.g., you cannot achieve Level 3 in compliance evidence without Level 3 in security testing integration)
- Define improvement roadmap — sequence improvements logically, with clear milestones and owners
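Steps 1 to 3 above reduce to a simple comparison of assessed levels against the regulatory minimum, surfacing the largest shortfalls first. The assessed levels below are hypothetical.

```python
# Sketch: risk-prioritised gap analysis. Compare per-dimension maturity
# with the regulatory minimum; largest shortfall first.

REGULATORY_MINIMUM = 3  # e.g. the DORA/NIS2 expectation across dimensions

def prioritise_gaps(current: dict[str, int],
                    minimum: int = REGULATORY_MINIMUM) -> list[tuple[str, int]]:
    """Return (dimension, shortfall) pairs, largest shortfall first."""
    gaps = [(dim, minimum - level)
            for dim, level in current.items() if level < minimum]
    return sorted(gaps, key=lambda g: -g[1])

assessed = {"Security testing": 2, "Policy enforcement": 1, "Access governance": 3}
print(prioritise_gaps(assessed))  # [('Policy enforcement', 2), ('Security testing', 1)]
```

Dimensions already at or above the minimum drop out of the list, which keeps improvement investment focused where the compliance exposure actually sits.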
Regulatory Minimum Expectations
| Regulatory Framework | Typical Minimum Maturity | Rationale |
|---|---|---|
| DORA (financial services) | Level 3 across all dimensions | DORA requires systematic ICT risk management, tested controls, incident response capabilities, and third-party governance — all Level 3 characteristics |
| NIS2 (critical infrastructure) | Level 3 across all dimensions | NIS2 Article 21 requires “appropriate and proportionate” measures covering supply chain, incident handling, and business continuity — achievable at Level 3 |
| ISO 27001 | Level 2–3 depending on scope | ISO 27001 requires documented policies and evidence of control effectiveness. Level 2 may suffice for initial certification; Level 3 for mature ISMS |
| SOC 2 | Level 2–3 depending on criteria | SOC 2 Trust Services Criteria require designed and operating controls. Level 3 provides the strongest evidence base |
| PCI DSS (CDE scope) | Level 3 for CDE-touching pipelines | PCI DSS requirements for change management, access control, and vulnerability management align with Level 3 characteristics |
Roadmap Template
Use the following template to structure the improvement roadmap. Each row represents one improvement initiative linked to a specific dimension and maturity gap.
| Dimension | Current Level | Target Level | Key Actions | Timeline | Owner | Dependencies |
|---|---|---|---|---|---|---|
| Security testing | 2 | 3 | Extend SAST/DAST/SCA to all production pipelines; implement enforced gates | Q2 2026 | DevSecOps Lead | Pipeline inventory completion |
| Policy enforcement | 1 | 3 | Document policies; implement as automated gates; establish exception process | Q3 2026 | Security Architect | Security testing at Level 3 |
| Compliance evidence | 1 | 3 | Implement automated evidence collection; map to control framework; define retention | Q3 2026 | Compliance Officer | Policy enforcement at Level 3 |
| Access governance | 2 | 3 | Enforce RBAC; implement quarterly reviews; manage privileged access | Q2 2026 | Platform Team Lead | None |
| Vulnerability management | 2 | 3 | Define SLAs by severity; implement tracking; establish management reporting | Q2 2026 | DevSecOps Lead | Security testing at Level 2+ |
This roadmap is illustrative; each organisation should populate the template based on its own assessment results.
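Because the Dependencies column constrains sequencing, the roadmap order can be derived mechanically with a topological sort. The initiative names below mirror the illustrative table above and are not prescriptive.

```python
from graphlib import TopologicalSorter

# Sketch: sequencing roadmap initiatives so that dependencies come first.
# Each entry maps an initiative to the initiatives it depends on.

deps = {
    "Policy enforcement L3": {"Security testing L3"},
    "Compliance evidence L3": {"Policy enforcement L3"},
    "Security testing L3": set(),
    "Access governance L3": set(),
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # dependency-free initiatives first, compliance evidence last
```

This also makes circular dependencies visible early: `static_order` raises an error if two initiatives each claim to depend on the other, which is itself a useful planning check.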
What Auditors Should Verify
Documented Maturity Assessments
- Has the organisation conducted a formal DevSecOps maturity assessment?
- Is the assessment documented with evidence supporting each dimension rating?
- Was the assessment conducted by someone independent of the DevSecOps team (or at minimum, validated independently)?
- Is the assessment dated and version-controlled?
Improvement Plans
- Is there a documented improvement roadmap linked to the maturity assessment?
- Are improvement actions assigned to specific owners with target dates?
- Is the roadmap prioritised based on regulatory requirements and risk?
- Is there budget allocation supporting the improvement plan?
Evidence of Progress
- Has the maturity assessment been repeated (at least annually)?
- Is there evidence of maturity improvement over time?
- Where maturity has not improved, is there a documented explanation and revised plan?
- Are improvement milestones tracked and reported to management?
Red Flags for Auditors and Compliance Officers
| Red Flag | Why It Matters | Likely Finding |
|---|---|---|
| No maturity assessment has ever been performed | Organisation cannot demonstrate awareness of its own security posture | Governance deficiency — no baseline for improvement |
| Maturity is stagnant over multiple assessment periods | Improvement plans are not effective or not resourced | Management commitment concern — DORA Art. 5 / NIS2 Art. 20 |
| Self-assessed maturity does not match evidence | Assessment is unreliable — potential overstatement of controls | Control design or operating effectiveness gap |
| No improvement roadmap despite identified gaps | Assessment without action provides no value | Risk management deficiency |
| Maturity below Level 3 for DORA/NIS2-regulated entity | Below regulatory minimum expectations | Non-compliance risk requiring urgent remediation |
| Assessment performed only by the DevSecOps team with no independent validation | Self-assessment bias — may overstate maturity | Objectivity concern |
Next Steps
A maturity assessment is not a one-time exercise. It is the foundation of a continuous improvement cycle that regulators and auditors expect to see. Organisations should:
- Conduct an initial assessment using this framework, ideally with independent validation
- Document the results and share them with management and the compliance function
- Develop a prioritised improvement roadmap aligned with regulatory requirements
- Reassess at least annually, retaining historical assessments as evidence of progress
- Report maturity trends to the board as part of the security posture reporting cycle
For further guidance, see our resources on DevSecOps program governance, security architecture, and audit and governance readiness.
Related for Auditors
- Glossary — Plain-language definitions of technical terms
- Core CI/CD Security Controls
- Executive Audit Briefing
- Dual-Compliance Architecture
New to CI/CD auditing? Start with our Auditor’s Guide.