DORA Dry Run Preparation Guide (2026)


A DORA dry run is not just a technical rehearsal. For most financial entities, it is the first time operational teams, compliance, procurement, and ICT risk functions try to produce a regulator-ready Register of Information (ROI) dataset end-to-end, under time pressure, with defensible evidence. If you treat the exercise as “just export XBRL,” you will typically discover late-stage issues: inconsistent service inventories, unclear criticality assessments, missing LEIs, or uncontrolled changes between reporting cycles. This guide explains how to prepare for an ESA-style DORA dry run using practical controls that align to the Digital Operational Resilience Act (DORA) and the Implementing Technical Standards (ITS) on the Register of Information. If you need a refresher on scope and intent, start with an overview of what DORA is.
What a DORA dry run is testing in practice
Even when a competent authority calls it a “pilot” or “dry run,” the exercise typically tests whether your institution can produce a complete, internally consistent ROI dataset that maps to the ESA taxonomy and technical requirements. In practice, the dry run pressures three areas at once: data quality and inventory consistency, governance and evidence for how the dataset was produced, and the technical mechanics of validation and export.
This is why it helps to treat the exercise as an operational resilience workflow, not a one-time reporting task. The underlying DORA requirement sits within ICT third-party risk oversight and governance expectations (see our overviews of the Digital Operational Resilience Act and the DORA regulation). Your ROI is also closely linked to day-to-day third-party contracting and service management, which is where many “hidden gaps” surface.
Define the reporting perimeter and ownership
Before you fix data fields, confirm what your dry run is actually covering. Depending on your structure, the hardest part is often consolidating across entities and ensuring consistent interpretations of “ICT service,” “function,” and “service provider.”
1) Fix your dry-run perimeter in writing
If your institution is still aligning terminology and inventory boundaries, revisit your internal definition of the DORA register and the supervisory intent behind the Register of Information.
2) Assign a single accountable owner for the ROI output
In most operating models, the owner is not purely “Compliance” or purely “IT.” A practical approach is a single accountable owner (for sign-off), with named data stewards per ROI module area (procurement/contracts, vendor management, ICT operations, risk, and business owners of functions).
3) Define the minimum evidence package for the dry run
A dry run is an opportunity to standardize evidence, so you can defend the process later. Typically, your evidence pack should include data sources and lineage, validation results and remediation actions, approvals and sign-offs for key classifications (especially criticality decisions), and the final export artifact with its production date.

ROI data readiness: the checks that usually fail first
Most dry-run failures are not “XBRL problems.” They are basic inventory and consistency issues that only become visible when the ROI is forced into a structured taxonomy. The following checks are high-yield for dry-run readiness.
Check A: Contract and service inventory alignment
A common gap is a mismatch between what procurement considers a contract and what ICT considers a service. The ITS ROI structure expects you to tie contracts, service providers, and ICT services together in a way that is consistent across reporting tables.
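As an illustration, a pre-assembly referential integrity check can surface these mismatches before the taxonomy does. This is a minimal sketch using hypothetical `contract_id`/`provider_id`/`service_id` fields, not actual ITS column names:

```python
def find_orphans(contracts, providers, services):
    """Return services whose contract or provider reference is unknown."""
    contract_ids = {c["contract_id"] for c in contracts}
    provider_ids = {p["provider_id"] for p in providers}
    orphans = []
    for s in services:
        problems = []
        if s.get("contract_id") not in contract_ids:
            problems.append("unknown contract_id")
        if s.get("provider_id") not in provider_ids:
            problems.append("unknown provider_id")
        if problems:
            orphans.append((s["service_id"], problems))
    return orphans

contracts = [{"contract_id": "C-001"}]
providers = [{"provider_id": "P-001"}]
services = [
    {"service_id": "S-001", "contract_id": "C-001", "provider_id": "P-001"},
    {"service_id": "S-002", "contract_id": "C-999", "provider_id": "P-001"},
]
print(find_orphans(contracts, providers, services))
# [('S-002', ['unknown contract_id'])]
```

Running a check like this at intake, before ROI assembly, turns a late XBRL failure into an early remediation task with a named owner.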
Check B: Service provider identifiers and LEI quality
Identifier quality matters because it affects matching, de-duplication, and supervisory analytics. If you are relying on manual entry, you typically see invalid or missing Legal Entity Identifiers (LEIs), or inconsistent country codes.
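A structural LEI check is cheap to run at the point of entry. The sketch below implements the ISO 17442 format rule: 18 alphanumeric characters plus two numeric check digits verified with ISO 7064 MOD 97-10 (the same scheme IBANs use). It validates structure only and says nothing about whether the LEI is actually registered or current:

```python
import re

def lei_is_valid(lei: str) -> bool:
    """Structural check per ISO 17442: 18 alphanumeric characters plus
    two numeric check digits, verified with ISO 7064 MOD 97-10."""
    lei = lei.strip().upper()
    if not re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", lei):
        return False
    # Letters map to two-digit numbers (A=10 ... Z=35); mod 97 must equal 1.
    numeric = "".join(str(int(ch, 36)) for ch in lei)
    return int(numeric) % 97 == 1

def lei_check_digits(base18: str) -> str:
    """Derive the two check digits for an 18-character LEI prefix."""
    numeric = "".join(str(int(ch, 36)) for ch in base18.upper() + "00")
    return f"{98 - int(numeric) % 97:02d}"

# Self-consistency demo: check digits derived by the scheme always validate.
prefix = "ABCDEFGHIJKLMNOPQR"
print(lei_is_valid(prefix + lei_check_digits(prefix)))  # True
```

Verifying registration status and retrieving reference data still requires a lookup against GLEIF, which is where enrichment tooling earns its keep.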
Based on DORApp documentation, the platform includes automatic LEI validation and enrichment using public GLEIF data sources during record creation and import workflows, which can reduce rework when providers are added late in the cycle.
Check C: Function mapping and criticality consistency
Supervisory review often focuses on whether your function mapping is coherent and consistently applied. Misalignment typically shows up as functions marked critical in one table while their supporting ICT services are classified as non-critical elsewhere, or as the same function receiving different criticality treatment across entities.
Check D: Supply chain representation (where applicable)
Many institutions discover they cannot represent ICT service supply chains in a structured way because their vendor processes stop at “direct providers.” If your operating model includes significant subcontracting or layered cloud services, dry-run preparation should include a pragmatic method for capturing supply chain linkages, even if imperfect. You can refine accuracy over reporting cycles, but you need a consistent approach.
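One pragmatic representation is a simple “provider → upstream provider” mapping that can be expanded into an ordered chain per service. This is a minimal sketch with hypothetical names, not a claim about how the ITS expects supply chain ranks to be encoded:

```python
def expand_chain(direct_provider, subcontracts):
    """Expand a direct provider into its ordered supply chain.

    subcontracts maps a provider to its known upstream provider (or None).
    """
    chain, seen = [], set()
    current = direct_provider
    while current is not None and current not in seen:
        seen.add(current)          # guard against accidental cycles in the data
        chain.append(current)
        current = subcontracts.get(current)
    return chain

subcontracts = {"SaaS-Vendor": "Cloud-Platform", "Cloud-Platform": "DC-Operator"}
print(expand_chain("SaaS-Vendor", subcontracts))
# ['SaaS-Vendor', 'Cloud-Platform', 'DC-Operator']
```

Even an imperfect mapping like this gives you a consistent method you can refine cycle over cycle, which is the point the dry run is testing.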
Check E: Validation rules and controlled remediation
Dry runs tend to become chaotic when validation issues are found late and resolved ad hoc. A more defensible approach is to treat validation findings like workflow-controlled remediation tasks with clear ownership and re-check gates.
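A minimal version of such a re-check gate can be modeled as a small state machine in which a finding cannot be closed without passing back through validation. The lifecycle and field names below are illustrative assumptions, not a description of any particular tool:

```python
class RemediationTask:
    """Validation finding with an explicit re-check gate: a task can only be
    closed from 'remediated', i.e. after the fix has been re-validated."""

    ALLOWED = {"open": {"remediated"}, "remediated": {"open", "closed"}}

    def __init__(self, finding, owner):
        self.finding, self.owner, self.status = finding, owner, "open"
        self.history = [("open", "created")]

    def transition(self, new_status, note):
        if new_status not in self.ALLOWED.get(self.status, set()):
            raise ValueError(f"cannot move {self.status} -> {new_status}")
        self.status = new_status
        self.history.append((new_status, note))   # audit trail of the task

task = RemediationTask("missing LEI for provider P-017", "vendor-management")
task.transition("remediated", "LEI added from GLEIF lookup")
task.transition("closed", "re-validated: rule passes")
print(task.status)  # closed
```

The useful property is that skipping the gate (open → closed) is impossible by construction, which is exactly the evidence you want when a late-cycle fix is questioned.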
DORApp documentation states that the platform validates ROI data across more than 250 points based on RTS and ITS references and provides record-level validation feedback. This can be useful, but you should still calibrate validation outcomes to your internal interpretations and supervisor feedback.
ESA reporting package readiness: data model, validation rules, and filing constraints
Here’s the thing: many DORA dry runs fail after “data is ready” because the reporting package and filing rules introduce constraints that your internal data model did not anticipate. Under DORA, the obligation to maintain the Register of Information sits within the ICT third-party risk framework, and the reporting mechanics are implemented through ITS developed by the European Supervisory Authorities (EBA, EIOPA, and ESMA) via the Joint Committee. DORA has been applicable since 17 January 2025, and most supervisors expect institutions to be able to produce a regulator-ready dataset at entity, sub-consolidated, and consolidated levels where relevant.
From a practical standpoint, align your dry run plan to three deliverables, not one:
1) Track the exact taxonomy and “data point” expectations used in the exercise
Dry run instructions typically point you to a specific taxonomy version, a data model, and a dictionary of data points. If your institution builds mappings against an older package, you can end up with “correct data” in the wrong place, or with outdated enumerations that fail validation. Treat the reporting package as version-controlled input to your program, with an accountable owner who can confirm the taxonomy version in use, the applicable validation rules, the data point dictionary, and any changes since the previous cycle.
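One lightweight way to version-control the package, assuming it is distributed as a file archive, is to record a checksum when the package is adopted and verify it at the start of each cycle. The file name and pinned value below are placeholders:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Recorded once by the package owner at adoption; placeholder value shown here.
PINNED = {"taxonomy_package.zip": "<sha256 recorded at adoption>"}

def package_unchanged(path: str) -> bool:
    """True only if the file still matches the digest pinned at adoption."""
    return sha256_of(path) == PINNED.get(path)
```

A silently swapped or re-downloaded taxonomy file then fails the check before anyone builds mappings against the wrong version.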
2) Pre-test validation rules before business sign-off
What many compliance teams overlook is sequencing: if business owners sign off on “their data,” then validation rules change the allowed values or tighten dependencies, you end up reopening approvals under time pressure. A better control is to run a “validation-first” pre-check before the formal internal sign-off stage, so approvals reflect what will actually be exportable and fileable.
In most cases, build a short dry run cadence: collect and import data, run validations early, remediate findings with clear ownership, re-validate, and only then take the dataset through formal business and compliance sign-off before export.
3) Understand consolidated reporting friction points early
If your dry run perimeter includes sub-consolidated or consolidated levels, you will typically hit friction in harmonization, not “missing fields.” Common examples include inconsistent provider de-duplication, differences in contract naming, and mismatched interpretations of ICT service boundaries across subsidiaries. These are governance issues that become visible as technical errors when you attempt to generate a single, consistent output.
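Provider de-duplication across subsidiaries can be sketched as merging on LEI where available and on a normalized name otherwise. The normalization rules below are illustrative only and should follow your group-level interpretation, not this sketch:

```python
import re

def normalize_name(name: str) -> str:
    """Crude name normalization: lowercase, strip punctuation and a few
    common legal-form suffixes. Rules are illustrative, not exhaustive."""
    name = re.sub(r"[.,]", "", name.lower().strip())
    name = re.sub(r"\b(gmbh|ag|sa|ltd|inc|doo)\b", "", name)
    return re.sub(r"\s+", " ", name).strip()

def dedupe(providers):
    """Group provider records by LEI when present, else by normalized name."""
    merged = {}
    for p in providers:
        key = p.get("lei") or "name:" + normalize_name(p["name"])
        merged.setdefault(key, []).append(p)
    return merged

records = [
    {"name": "Example Cloud GmbH", "lei": "LEI123"},
    {"name": "EXAMPLE CLOUD GMBH.", "lei": "LEI123"},
    {"name": "Example Cloud", "lei": None},
]
print(len(dedupe(records)))  # 2 groups: one keyed by LEI, one by name
```

Note that the LEI-less record ends up in its own group: a de-duplication pass like this makes missing identifiers visible as a concrete harmonization cost, not an abstract data quality issue.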
Consider formalizing a group-level “interpretation note” for the dry run, even if it is lightweight. The goal is to document how your group interprets recurring classification choices so you can apply them consistently and defend them if challenged by internal audit or supervisors.
XBRL readiness: validate output, not just data entry
In a DORA dry run, the most visible deliverable is usually the XBRL output aligned to the ESA taxonomy and technical requirements. Many programs over-focus on “populating fields” and under-focus on whether the exported structure validates correctly and is reproducible.
1) Treat XBRL export as a repeatable control
The control objective is not “we exported once.” The objective is “we can export reliably and re-run after remediation without breaking the dataset.”
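A reproducibility check for the export artifact can compare the member files and content hashes of two export runs, deliberately ignoring ZIP metadata such as timestamps. This is a generic sketch, not a description of any specific tool's output format:

```python
import hashlib
import zipfile

def zip_fingerprint(path):
    """Map each member file in a ZIP archive to the SHA-256 of its content."""
    with zipfile.ZipFile(path) as z:
        return {name: hashlib.sha256(z.read(name)).hexdigest()
                for name in z.namelist()}

def exports_match(path_a, path_b):
    """True if two export archives contain identical files, byte for byte."""
    return zip_fingerprint(path_a) == zip_fingerprint(path_b)
```

Running this after remediation, against the pre-fix export, also gives you a precise diff surface: only the files that changed should have changed.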
DORApp documentation describes a DORA Report workflow that exports the ROI dataset as a ZIP in XBRL format and will fail the export if validation errors remain, forcing remediation before a compliant output is produced.
2) Confirm snapshot logic and reporting date semantics
One subtle but important point is the meaning of the reporting date. DORApp documentation notes that the “Date of report” can represent the date to which ROI data is valid (a snapshot date), which may differ from the execution date of the export. In a dry run, you should document this explicitly and align it with internal change control and audit expectations.
3) Plan a conversion and review path for business stakeholders
XBRL is not designed for human review. You need a method to convert and check the output for plausibility. DORApp documentation describes a “DORA ROI XBRL Converter” module that can convert XBRL into XLSX and other formats. A conversion capability can materially speed up dry-run review cycles because business and compliance teams can spot anomalies without specialized tooling.
If you need a conceptual refresher for non-technical stakeholders, point them to an introduction to XBRL and document how your institution reviews and approves the resulting dataset.

Evidence and controls: prove the process, not only the dataset
Supervisors may ask not only for the ROI output but also for how you produced it. A dry run is the right moment to establish a lightweight control framework around ROI production.
Control 1: Workflow governance and sign-offs
For higher-risk institutions, a spreadsheet-based approach can struggle to prove who approved what and when. DORApp documentation describes an Execution Governance Engine with configurable review gates and single- or multi-user sign-off across modules. For a dry run, the practical benefit is defensible approvals, especially for criticality decisions and late-cycle fixes.
Control 2: Audit trail and change tracking
Dry runs commonly expose uncontrolled changes: a service provider renamed, a contract updated, a function reclassified, and the export no longer matches what teams thought they approved. DORApp documentation states that the platform provides a comprehensive audit trail of system activity, including record changes, workflow transitions, approvals, timestamps, and decision rationale. This can reduce debate later about what changed and why.
Control 3: Data import discipline
If you import ROI data from Excel or CSV, sequence and referential integrity matter. DORApp documentation describes a recommended data import sequence and predefined Excel templates with field names aligned to the system and validation rules. For dry-run preparation, standard templates can reduce mapping errors and speed up onboarding of distributed data owners.
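The idea of a dependency-aware import sequence can be sketched as a simple order check in which referenced entities must load first. The dependency map below (providers before contracts before services) is an assumption for illustration, not the documented DORApp sequence:

```python
# Referenced entities must be loaded before the records that point at them.
DEPENDS_ON = {
    "providers": [],
    "contracts": ["providers"],
    "services": ["contracts", "providers"],
}

def valid_order(order):
    """Check that every import step runs only after its dependencies."""
    loaded = set()
    for step in order:
        if any(dep not in loaded for dep in DEPENDS_ON[step]):
            return False
        loaded.add(step)
    return True

print(valid_order(["providers", "contracts", "services"]))  # True
print(valid_order(["services", "providers", "contracts"]))  # False
```

Encoding the sequence as data, rather than tribal knowledge, lets distributed data owners self-check their batches before handing files over.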
Control 4: Repeatability across cycles
A pilot exercise often becomes the foundation for ongoing reporting. The goal should be a repeatable cycle in which data changes are captured under change control, validations are re-run after every material change, approvals are recorded against the validated dataset, and the export can be reproduced from the approved state.
What supervisors and the ESAs learn from the dry run (and why it matters)
A DORA dry run is not only a test of your readiness. It is also a test of how usable your ROI is for supervision and for the ESAs’ broader DORA objectives. Under Chapter V of DORA, the ESAs have an oversight framework for critical ICT third-party service providers, and the Register of Information is a core input into that ecosystem. This is why supervisors and the ESAs tend to focus on whether your dataset supports reliable aggregation, comparison, and concentration analysis across the market.
Consider this as you prepare: if your ROI cannot be analyzed consistently, you should expect follow-up questions, even when the file technically validates.
1) Data quality is supervisory signal, not “administrative hygiene”
In most cases, supervisors care about internal consistency because it indicates how well your third-party risk management is governed. Typical red flags include providers duplicated under different names, missing or invalid LEIs, contracts that cannot be tied to services, and criticality classifications that contradict each other across tables or entities.
When these issues appear, the dry run becomes more than a technical exercise. It can trigger governance remediation plans, internal audit involvement, or additional supervisory dialogue.
2) The dry run supports market-wide objectives, including CTPP designation
One reason the ESAs emphasize the ROI reporting structure is that aggregated registers support the designation process for critical ICT third-party service providers. That designation process is run at EU level under the ESAs, typically coordinated through the Joint Committee. Your institution is not expected to “decide who is critical” at market level, but your data must be good enough to contribute to that determination.
This is also why identifier quality and de-duplication matter. Weak identifiers can distort concentration analysis and create noise in supervisory datasets.
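A simple concentration signal you can compute yourself is each provider's share of critical function mappings. The sketch below uses hypothetical tuples; a high share is not an error, but it is exactly the kind of pattern supervisory analytics will surface:

```python
from collections import Counter

def provider_concentration(mappings):
    """Share of critical function mappings per provider.

    mappings: iterable of (provider, function, is_critical) tuples.
    """
    critical = [provider for provider, _, crit in mappings if crit]
    counts = Counter(critical)
    total = sum(counts.values()) or 1   # avoid division by zero
    return {p: round(n / total, 2) for p, n in counts.most_common()}

rows = [
    ("Cloud-A", "payments", True),
    ("Cloud-A", "core-banking", True),
    ("Vendor-B", "hr-payroll", False),
    ("Vendor-C", "settlement", True),
]
print(provider_concentration(rows))
# {'Cloud-A': 0.67, 'Vendor-C': 0.33}
```

If weak identifiers split one real provider into several records, a metric like this understates concentration, which is precisely why de-duplication matters upstream.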
3) Expect iterative feedback and plan to operationalize it
Dry runs often produce feedback on data quality, interpretation choices, and recurring validation errors. Treat that feedback as program input, not as a one-off comment. A practical control is to maintain a dry run “findings register” that records each finding, its root cause, the accountable owner, the agreed remediation, and the re-check result.
This content is for informational purposes only and does not constitute legal advice. Financial institutions should seek qualified regulatory counsel for institution-specific DORA compliance guidance, particularly if supervisor feedback implies changes to governance, contracting, or third-party risk controls.
Commercial evaluation: what to look for in a dry-run capable ROI platform
If you are evaluating whether to run your dry run in a dedicated platform versus manual tooling, assess solutions against the operational realities above. A dry run rewards platforms that reduce ambiguity, enforce process, and produce a validated export.
Capability checklist (dry-run focused)
- Record-level validation with early, repeatable checks aligned to RTS/ITS requirements
- LEI validation and enrichment (for example against GLEIF data) to support de-duplication
- Predefined import templates and a defined import sequence for distributed data owners
- Configurable review gates and sign-offs, especially for criticality decisions and late-cycle fixes
- A comprehensive audit trail covering record changes, approvals, and decision rationale
- XBRL export that blocks on unresolved validation errors and is reproducible across runs
- Conversion of XBRL output (for example to XLSX) so business and compliance teams can review it
How DORApp fits this use case (based on published documentation)
DORApp is described as a modular DORA-focused platform with a dedicated ROI capability, DORA Report export to XBRL (ZIP output), record-level validation, and audit trail. It also includes automatic LEI validation and enrichment via public GLEIF data sources and provides predefined import templates and a defined import sequence. For institutions that need to make a dry run repeatable, these features can reduce manual reconciliation and improve evidence quality.
If you want to see how a controlled dry-run workflow could operate in practice, you can review DORApp modules and functions at DORApp Modules and DORApp Functions, or request a walkthrough via Book a Demo. Keep in mind that tooling supports compliance, but it does not replace your internal governance decisions and legal interpretation.

Frequently Asked Questions
What is a “DORA dry run” in practical terms?
A DORA dry run is typically a supervised or internally initiated rehearsal of producing the Register of Information in the format expected by competent authorities, often aligned to the ESA taxonomy and technical requirements. The point is not only to “create an XBRL file,” but to prove that your institution can collect, validate, approve, and reproduce the dataset under controlled processes. The exact scope may vary by supervisor and entity type.
Is an ESA pilot exercise the same as a formal submission?
Not necessarily. A pilot exercise is often described as non-binding, but it can still reveal issues that would block or weaken a formal submission later. Many financial entities treat it as a de facto readiness test because it simulates supervisory expectations around completeness, consistency, and operational control. You should document assumptions and keep evidence, because outcomes may influence later supervisory dialogue.
What usually causes dry-run failure for the DORA ROI?
The most common blockers are inconsistent inventories and missing linkages: contracts not tied to services, providers duplicated under different names, missing identifiers such as LEIs, and inconsistent classification of functions and criticality. XBRL validation failures are often symptoms of these underlying issues. Early validation and controlled remediation tend to be more effective than late-cycle “data cleanup sprints.”
How should we organize ownership for ROI preparation?
Most institutions benefit from a single accountable ROI owner (responsible for the export and sign-off) supported by data stewards across procurement, vendor management, ICT operations, risk, and business function owners. The key is to document who owns which fields and who approves which decisions. Without clear ownership, dry-run cycles can devolve into email-based reconciliation that is difficult to evidence.
How does XBRL affect the dry-run process?
XBRL is the machine-readable format used for structured reporting. It can be difficult to review manually, so you typically need a repeatable export control and a conversion or review method (often to XLSX) for plausibility checks by business and compliance teams. If you want a conceptual overview for stakeholders, reference an introduction to XBRL and document your internal review and approval approach.
Do we need perfect supply chain data for the first dry run?
That depends on supervisory scope and your ICT outsourcing model. In practice, many entities start with a pragmatic representation of supply chains and improve it across reporting cycles. The dry run is still valuable because it forces you to define a consistent method and identify the highest-risk gaps. You should avoid over-claiming completeness if subcontracting visibility is limited.
What evidence should we retain after the dry run?
Keep an evidence package that can explain how the ROI was produced: data sources and lineage, validation results and remediation actions, approvals and sign-offs for key classifications, and a record of the final export artifact. Tool-based audit trails can help, but you should also keep written decisions for higher-impact judgments. Evidence expectations may evolve as the European Supervisory Authorities issue further guidance.
How can DORApp support an ROI-focused dry run?
Based on DORApp documentation, the platform supports ROI maintenance with validation, predefined import templates, automated LEI enrichment via GLEIF, an audit trail, and DORA Report export to an XBRL ZIP output aligned to ESA technical requirements. It also describes an XBRL converter to XLSX for review. To evaluate fit, you should test the workflow on your own sample services and contracts and confirm how approvals and change control would operate in your model.
Do we still need legal review if we use an ROI tool?
Yes, in most cases. A platform can structure data, enforce validation, and produce evidence, but it cannot replace legal interpretation of DORA obligations, contractual requirements, or supervisory expectations for your specific entity classification. You should involve qualified legal counsel to confirm how your ROI perimeter, definitions, and disclosures align with DORA, ITS, and any national supervisory expectations.
Is there an official “DORA certification” we can obtain after a dry run?
No. DORA (Regulation (EU) 2022/2554) does not establish an official certification program for financial entities or individuals. In practice, institutions demonstrate alignment through governance, controls, evidence, and supervisory engagement, including the ability to produce accurate regulatory reporting outputs such as the Register of Information. Training and third-party attestations may support your internal capability building, but they do not replace DORA obligations or supervisor expectations.
What is the ESA “dry run procedure” for the Register of Information?
Dry run procedures can vary by competent authority and the specific exercise design, but they typically include instructions to produce ROI data aligned to the ITS and to submit or test a report package that validates against the ESA taxonomy and validation rules. The ESAs (EBA, EIOPA, and ESMA) have previously used dry runs to identify common data quality issues and provide feedback to participating financial entities. Your internal procedure should mirror that reality: version-control the reporting package, run validations early, and retain evidence of remediation and sign-off.
Which ESA guidance is most relevant for ROI dry run readiness?
For ROI reporting mechanics, the most relevant references are the ITS on the Register of Information and any ESA resources that accompany official reporting, such as validation rules and technical reporting packages. The ESAs have also published dry run findings and, later, guidance on DORA oversight activities for critical ICT third-party service providers. You should treat these materials as implementation guidance that supports, but does not replace, the legal requirements in DORA and the ITS.
What should we do if our XBRL file validates but the content still looks wrong?
Validation success means your output meets technical constraints. It does not guarantee that your classifications, mappings, and inventories reflect supervisory intent. In that situation, focus on plausibility checks that mirror how supervisors may analyze the data, for example provider de-duplication, concentration signals, and critical function consistency. A controlled review workflow, including business validation of converted outputs (often via XLSX), typically helps catch “valid but wrong” issues earlier.
Conclusion
A well-run DORA dry run is a structured operational exercise: define scope, assign ownership, validate ROI data early, and treat XBRL export as a repeatable control with evidence. That approach typically reduces late-cycle remediation, improves audit readiness, and makes future reporting cycles more predictable. If you are assessing tooling to support this, focus on validation discipline, traceable approvals, audit trails, and reliable export against the ESA technical requirements. If you want a practical walkthrough of an ROI dry-run workflow, you can explore DORApp Modules and request a consultative session via Book a Demo to test your own sample dataset and reporting process assumptions.
Disclaimer: This article is intended for informational purposes only and does not constitute legal advice. DORA compliance obligations vary depending on the classification and size of your financial institution. Consult qualified legal or regulatory counsel to assess your specific obligations under the Digital Operational Resilience Act and applicable regulatory technical standards.
About the Author
Matevž Rostaher is Co-Founder and Product Owner of DORApp. He brings deep experience in building secure and compliant ICT solutions for the financial sector. DORApp’s webinar materials list him as CEO and Co-Founder of Skupina Novum d.o.o. and CEO and Co-Founder of FJA OdaTeam d.o.o.