Many organizations preparing for DORA Article 28 focus on collecting documents.
Auditors, however, do not evaluate compliance based on document volume — they evaluate control effectiveness, consistency, and traceability.
This article explains how auditors actually review an Article 28 evidence pack, what they expect to see behind each evidence category, and why CI/CD and third-party tooling play a central role in audit outcomes.
This content complements, rather than duplicates, the detailed checklist and evidence lists already provided in the DORA Article 28 Evidence Pack — What to Show Auditors.
How Auditors Approach an Article 28 Review
Auditors do not start with tools or architecture diagrams.
They start with risk questions:
- Do third-party ICT providers introduce unmanaged operational risk?
- Are controls enforceable beyond internal systems?
- Can the organization demonstrate continuous oversight?
- Is evidence objective and time-bound?
Evidence is assessed as proof of operational control, not as policy confirmation.
Third-Party Inventory: Completeness Over Formality
What auditors expect
Auditors first verify that the organization has a complete and accurate inventory of ICT third-party providers.
They expect:
- inclusion of CI/CD SaaS platforms, not just traditional vendors,
- visibility into cloud services, registries, and code hosting,
- linkage between suppliers and actual systems in use.
How they assess it
Auditors cross-check:
- the supplier inventory,
- CI/CD and cloud configurations,
- procurement and contract records.
Any mismatch is treated as a control failure, not a documentation issue.
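The cross-check above amounts to a set comparison between the documented inventory and the services actually observed in CI/CD and cloud configuration. A minimal sketch, with entirely hypothetical supplier names and a simplified data model:

```python
# Illustrative sketch: cross-check a supplier inventory against services
# actually referenced in CI/CD configuration. All supplier names are
# hypothetical; real pipelines would extract them from config files or APIs.

def find_inventory_gaps(inventory: set[str], cicd_services: set[str]) -> dict:
    """Compare the documented supplier inventory with services observed
    in CI/CD and cloud configuration."""
    return {
        # In use but missing from the inventory -> the mismatch auditors
        # treat as a control failure
        "undocumented": sorted(cicd_services - inventory),
        # Documented but no longer observed -> stale inventory entries
        "stale": sorted(inventory - cicd_services),
    }

# Example data (hypothetical suppliers)
inventory = {"github", "aws", "datadog"}
observed = {"github", "aws", "dockerhub"}

print(find_inventory_gaps(inventory, observed))
```

In practice the "observed" set would be harvested from pipeline definitions, registry pulls, and cloud billing, then diffed against procurement records on a schedule, so gaps surface before the audit rather than during it.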
Supplier Risk Classification: Meaningful, Not Generic
What auditors expect
Risk classification must be:
- documented,
- consistent,
- and used to drive controls.
Auditors expect to see:
- clear criteria for criticality,
- justification for high-risk suppliers,
- escalation paths for critical ICT providers.
Common audit observation
Risk classification that does not influence:
- contractual requirements,
- monitoring depth,
- exit strategy obligations
is considered theoretical.
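One way to make classification demonstrably drive controls is to encode each tier's required controls and flag suppliers that fall short. A sketch under assumed tier names and control identifiers (all hypothetical):

```python
# Illustrative sketch: make risk tiers drive concrete control requirements.
# Tier names and control identifiers are hypothetical examples.

REQUIRED_CONTROLS = {
    "critical": {"audit_rights", "exit_strategy", "continuous_monitoring", "incident_sla"},
    "high":     {"audit_rights", "incident_sla"},
    "standard": {"incident_sla"},
}

def missing_controls(tier: str, applied: set[str]) -> set[str]:
    """Return controls the tier requires but the supplier lacks.
    A non-empty result means the classification is theoretical:
    it is not driving contractual or monitoring obligations."""
    return REQUIRED_CONTROLS[tier] - applied

# A "critical" supplier covered only by a contractual SLA would be flagged:
print(sorted(missing_controls("critical", {"incident_sla"})))
```

A mapping like this doubles as evidence: it shows the auditor exactly how a criticality rating translates into contractual requirements, monitoring depth, and exit obligations.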
Contract Clauses: Enforceability Matters
What auditors expect
Auditors review contracts to confirm the presence of clauses covering:
- information security,
- audit and inspection rights,
- incident notification timelines,
- exit and termination conditions.
More importantly, they verify that these clauses are operationally enforceable.
Typical auditor challenge
“You have audit rights in the contract — how do you exercise them in practice?”
CI/CD logs, access controls, and monitoring outputs are often used as technical proof that contractual rights can be exercised.
Monitoring Evidence: Continuous, Not Periodic
What auditors expect
Auditors expect ongoing monitoring, not annual reviews.
They look for:
- live or historical monitoring data,
- alerts tied to third-party services,
- visibility into CI/CD platform availability and integrity.
Monitoring evidence must demonstrate that:
- third-party degradation would be detected,
- incidents would be escalated within defined timelines.
Static risk assessments alone are insufficient.
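To show that degradation would actually be detected, monitoring data needs a detection rule, not just stored samples. A minimal sketch assuming periodic health checks per provider; the window and failure threshold are hypothetical:

```python
# Illustrative sketch: detect third-party service degradation from
# periodic health-check samples. Thresholds and fields are hypothetical.
from datetime import datetime, timedelta

def detect_degradation(samples, window=timedelta(minutes=15), max_failures=3):
    """Flag a provider if it accumulates too many failed checks within a
    sliding window. Each sample is (timestamp, ok: bool), in time order.
    Returns (degraded, timestamp of first failure in the breaching window)."""
    failures = [t for t, ok in samples if not ok]
    for i in range(len(failures)):
        # Count failures inside the window starting at this failure
        in_window = [t for t in failures if failures[i] <= t <= failures[i] + window]
        if len(in_window) >= max_failures:
            return True, in_window[0]
    return False, None

t0 = datetime(2025, 1, 10, 9, 0)
samples = [(t0 + timedelta(minutes=m), m not in (2, 5, 9)) for m in range(12)]
print(detect_degradation(samples))
```

The timestamped detection result is the kind of system-generated, time-bound record auditors accept as proof that degradation would be noticed within defined timelines.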
Incident Notification SLAs: Tested in Reality
What auditors expect
Auditors verify that:
- incident notification SLAs exist contractually,
- internal processes can receive and act on notifications,
- timelines are realistic and tested.
They often request:
- past incident examples,
- timestamps showing notification delays,
- evidence of escalation and response.
An SLA that has never been exercised is treated as unproven.
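The timestamp evidence auditors request reduces to a simple comparison: detection time versus notification time versus the contractual limit. A sketch with a hypothetical 4-hour SLA:

```python
# Illustrative sketch: verify an incident notification SLA from timestamps.
# The 4-hour SLA and the example timestamps are hypothetical.
from datetime import datetime, timedelta

def sla_breached(detected_at: datetime, notified_at: datetime,
                 sla: timedelta = timedelta(hours=4)) -> bool:
    """True if the provider notified us later than the contractual SLA allows."""
    return (notified_at - detected_at) > sla

detected = datetime(2025, 3, 1, 10, 0)
notified = datetime(2025, 3, 1, 15, 30)   # 5.5 hours after detection
print(sla_breached(detected, notified))
```

Running this over real past incidents (even benign ones or tabletop exercises) produces exactly the "timestamps showing notification delays" an auditor asks for, and demonstrates the SLA has been exercised rather than merely signed.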
CI/CD Logs Involving Third-Party Tools
Why CI/CD logs matter to auditors
CI/CD platforms provide some of the strongest Article 28 evidence because they are:
- time-stamped,
- immutable (or at least append-only),
- system-generated.
Auditors expect logs showing:
- access to third-party platforms,
- pipeline executions involving external services,
- approvals and policy enforcement,
- artifact generation and distribution.
These logs demonstrate control in operation, not just control design.
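For these logs to satisfy the traceability expectation, each entry must be attributable to a specific supplier. A sketch of that attribution step; the log fields and supplier hostnames are hypothetical:

```python
# Illustrative sketch: link CI/CD log entries to the suppliers they touch,
# so each entry can serve as traceable Article 28 evidence.
# Log format and supplier hostnames are hypothetical.

SUPPLIER_HOSTS = {
    "registry.example.com": "ExampleRegistry Inc.",
    "api.scanvendor.io": "ScanVendor Ltd.",
}

def attribute_log_entry(entry: dict) -> dict:
    """Tag a pipeline log entry with the supplier behind the host it called.
    Unmapped hosts are flagged: they indicate an inventory gap."""
    supplier = SUPPLIER_HOSTS.get(entry.get("host"), "UNMAPPED")
    return {**entry, "supplier": supplier}

entry = {"ts": "2025-03-01T10:00:00Z", "host": "registry.example.com",
         "action": "artifact_push", "pipeline": "release-42"}
print(attribute_log_entry(entry)["supplier"])
```

Entries that come back "UNMAPPED" feed directly back into the inventory cross-check: a pipeline is calling a service the supplier register does not know about.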
Evidence Quality: How Auditors Judge Credibility
Auditors assess evidence against four implicit criteria:
- Objectivity – system-generated, not manually edited
- Traceability – linked to a specific supplier or control
- Continuity – produced consistently over time
- Integrity – protected against alteration
Evidence failing any of these criteria weakens the entire pack.
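The integrity criterion in particular has a simple technical underpinning: recording a content hash for each evidence artifact at collection time makes later alteration detectable. A minimal sketch with hypothetical evidence content:

```python
# Illustrative sketch: protect evidence integrity with content hashes,
# so later alteration is detectable. The evidence content is hypothetical.
import hashlib

def fingerprint(content: bytes) -> str:
    """SHA-256 digest recorded alongside each evidence artifact
    at the moment it is collected."""
    return hashlib.sha256(content).hexdigest()

original = b"pipeline run 42: approval by release manager"
recorded = fingerprint(original)

# Any later edit changes the digest, exposing the alteration:
tampered = b"pipeline run 42: approval by intern"
print(recorded == fingerprint(tampered))   # False
```

Storing the digests separately from the artifacts (or in an append-only log) lets the organization show the auditor that evidence has not been edited since collection, which supports the objectivity criterion as well.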
Common Reasons Evidence Packs Fail Audits
From audit experience, evidence packs often fail because:
- CI/CD SaaS platforms are excluded from supplier scope
- Evidence exists but cannot be linked to a control
- Logs are available but not retained long enough
- Incident SLAs exist but were never tested
- Exit strategies are documented but unsupported by technical reality
These issues usually surface during the audit, not during preparation.
How to Use This Article in Practice
This article should be read alongside:
- DORA Article 28 Evidence Pack — What to Show Auditors (what to collect)
- DORA Article 28 Architecture — Explained (where controls sit)
- DORA Article 28 — Auditor Checklist (how compliance is assessed)
Together, they provide:
- the what,
- the where,
- and the how of Article 28 compliance.
Key Takeaway
DORA Article 28 evidence packs are not judged by completeness alone.
They are judged by credibility.
Auditors expect evidence that proves:
- third-party risks are known,
- controls are enforced,
- and compliance is continuous.
CI/CD pipelines and third-party platforms are not supporting elements — they are central sources of audit evidence.
Related Reading
- DORA Article 28 Evidence Pack — What to Show Auditors
- DORA Article 28 — Tools → Controls → Evidence Mapping
- DORA Article 28 — Auditor Checklist (Yes / No / Evidence)
- CI/CD Audit Red Flags