Due diligence refers to the structured process of investigation and risk evaluation undertaken before entering into agreements, launching products, or integrating third-party technologies. Within the scope of the EU AI Act, due diligence plays a pivotal role in ensuring that AI systems—especially those deemed high-risk—are assessed for compliance, ethical risks, legal obligations, and operational integrity before deployment or procurement.
1. Background and Establishment
Traditionally rooted in corporate law and mergers and acquisitions, due diligence has evolved into a critical function of AI governance. It entails the thorough examination of the legal, ethical, technical, and operational implications of acquiring, developing, or deploying an AI system.
Under the EU Artificial Intelligence Act, due diligence is not merely a business best practice—it is an implicit compliance expectation. Organizations that fail to vet AI systems or partners before deployment may face regulatory penalties, reputational loss, and ethical liability.
2. Purpose and Role in the EU AI Ecosystem
In the AI context, due diligence is a risk-prevention mechanism that supports:
- Compliance with the EU AI Act before contractual commitment or market entry
- Evaluation of AI system classification (e.g., high-risk, prohibited, general-purpose)
- Assessment of third-party providers and data sources
- Scrutiny of conformity assessment procedures and CE markings
- Review of ethical risks such as bias, discrimination, or opacity
By conducting structured due diligence, organizations ensure that they do not unknowingly become liable for the failures or shortcomings of upstream vendors, software components, or data pipelines.
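The classification step above can be sketched as a first-pass triage. This is a minimal illustration, not legal analysis: the `ANNEX_III_DOMAINS` shortlist and the `provisional_category` helper are hypothetical names invented here, and the real Annex III list is far longer and requires legal interpretation.

```python
from enum import Enum


class AIActCategory(Enum):
    """Risk tiers in the EU AI Act's classification scheme."""
    PROHIBITED = "prohibited"            # Article 5 practices
    HIGH_RISK = "high_risk"              # e.g., Annex III use cases
    GENERAL_PURPOSE = "general_purpose"  # GPAI models
    MINIMAL_RISK = "minimal_risk"        # everything else


# Hypothetical shortlist of Annex III domains, for illustration only.
ANNEX_III_DOMAINS = {
    "employment", "education", "credit_scoring",
    "law_enforcement", "critical_infrastructure",
}


def provisional_category(domain: str, is_gpai: bool = False) -> AIActCategory:
    """First-pass triage of a system; not a substitute for legal review."""
    if is_gpai:
        return AIActCategory.GENERAL_PURPOSE
    if domain in ANNEX_III_DOMAINS:
        return AIActCategory.HIGH_RISK
    return AIActCategory.MINIMAL_RISK
```

A triage like this flags which systems need the full diligence workflow; anything landing in `HIGH_RISK` or `GENERAL_PURPOSE` warrants escalation to legal counsel.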
3. Key Contributions and Impact
Effective due diligence can:
- Prevent non-compliant AI systems from entering operations
- Reduce the likelihood of administrative fines
- Identify systemic weaknesses before scaling deployment
- Enhance trust among customers, regulators, and partners
- Inform contract terms, liability allocation, and audit clauses
In an increasingly interconnected AI ecosystem—where components, models, and APIs often originate from third parties—due diligence provides visibility into the unknown and assurance against inherited risk.
4. Connection to the EU AI Act and the EU AI Safety Alliance
The EU AI Act does not use the term “due diligence” explicitly; instead, its function is embedded in several obligations:
- Articles 16–27 – Obligations of providers, importers, distributors, and deployers of high-risk AI systems
- Articles 51–55 – Obligations for providers of general-purpose AI models
- Annex IV – Technical documentation requirements supporting risk assessment
The EU AI Safety Alliance helps operationalize due diligence through:
- Pre-deployment risk templates
- Provider integrity and compliance scorecards
- Standardized checklists for vendor and technology evaluations
- Verification pathways for CE conformity and notified body certificates
Working with the EU AI Safety Alliance allows organizations to document their diligence and demonstrate good-faith efforts in regulatory audits.
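A compliance scorecard of the kind mentioned above can be sketched as a simple weighted checklist. The criteria names and weights below are invented for illustration; they do not reproduce any actual EU AI Safety Alliance template.

```python
# Hypothetical scorecard criteria and weights (must sum to 1.0).
CRITERIA_WEIGHTS = {
    "ce_conformity_verified": 0.30,
    "annex_iv_documentation": 0.25,
    "incident_history_clean": 0.20,
    "self_assessment_complete": 0.15,
    "third_party_certified": 0.10,
}


def scorecard(results: dict) -> float:
    """Weighted provider score in [0, 1].

    `results` maps each criterion name to True (satisfied) or
    False (unsatisfied); missing criteria count as unsatisfied.
    """
    return sum(w for c, w in CRITERIA_WEIGHTS.items() if results.get(c, False))
```

In practice, a threshold on the score (for example, requiring 0.8 or above before contract signing) turns the scorecard into a go/no-go gate in the procurement workflow.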
5. Stakeholder Roles in Due Diligence Processes
A robust due diligence workflow includes contributions from:
- Legal counsel – Reviews regulatory and contractual compliance
- Procurement and vendor management teams – Vet third-party partners
- Technical leads – Assess system capabilities and risks
- Ethics and risk officers – Evaluate societal impacts and reputational risks
- Data protection officers (DPOs) – Scrutinize data provenance and protection
Each stakeholder ensures that due diligence covers not only legality and cost but also integrity, safety, and accountability.
6. Core Areas to Examine in AI Due Diligence
Key focus areas for AI-specific due diligence include:
- Risk classification of the AI system
- CE conformity status and supporting documentation
- Transparency mechanisms (e.g., explainability features, disclaimers)
- Bias and fairness audits (especially in employment, finance, and healthcare)
- Human oversight structures
- Data governance, including origin, legality, and representativeness
- Cybersecurity and system resilience
- Vendor history, including prior compliance breaches or litigation
This process should result in a diligence report, retained internally and updated prior to significant AI system changes.
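The focus areas above can be captured as a structured checklist that feeds the diligence report. This is a minimal sketch: the area names, `Status` values, and `DiligenceReport` class are hypothetical, chosen to mirror the list in this section.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class Status(Enum):
    """Outcome recorded for each diligence area."""
    PASS = "pass"
    FAIL = "fail"
    PENDING = "pending"


# The focus areas listed above, used as a fixed checklist.
CORE_AREAS = [
    "risk_classification",
    "ce_conformity",
    "transparency",
    "bias_and_fairness",
    "human_oversight",
    "data_governance",
    "cybersecurity",
    "vendor_history",
]


@dataclass
class DiligenceReport:
    """Internal diligence report, updated before significant system changes."""
    system_name: str
    reviewed_on: date
    findings: dict = field(default_factory=dict)  # area -> (Status, notes)

    def record(self, area: str, status: Status, notes: str = "") -> None:
        if area not in CORE_AREAS:
            raise ValueError(f"Unknown diligence area: {area}")
        self.findings[area] = (status, notes)

    def open_areas(self) -> list:
        """Areas not yet reviewed, or still pending."""
        return [a for a in CORE_AREAS
                if a not in self.findings
                or self.findings[a][0] is Status.PENDING]

    def is_complete(self) -> bool:
        return not self.open_areas()
```

Rejecting unknown area names keeps the checklist fixed, so a report cannot appear complete while silently skipping one of the required focus areas.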
7. How to Implement an Effective Due Diligence Process
To execute effective due diligence under the EU AI Act:
- Initiate a formal diligence protocol for all AI acquisitions or deployments.
- Use standardized review templates provided by the EU AI Safety Alliance.
- Request documentation from vendors, including CE markings, Annex IV files, and audit records.
- Engage technical, legal, and ethical reviewers before contract signing.
- Require vendors to complete a compliance self-assessment or third-party certification.
- Incorporate diligence findings into contracts, especially regarding liability, updates, and incident notification.
- Periodically refresh due diligence to reflect regulatory or technological changes.
Due diligence is not a box-checking exercise—it is a dynamic process that fortifies the foundation of safe and lawful AI deployment.
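The periodic-refresh step above can be expressed as a simple policy check. The twelve-month interval and the `needs_refresh` helper are assumptions for illustration; an organization's actual review cadence should follow its own risk policy.

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical refresh policy: re-run diligence at least every 12 months,
# and immediately after a flagged regulatory or system change.
REVIEW_INTERVAL = timedelta(days=365)


def needs_refresh(last_review: date,
                  regulatory_change: bool = False,
                  system_change: bool = False,
                  today: Optional[date] = None) -> bool:
    """Return True when the diligence record should be re-run."""
    today = today or date.today()
    if regulatory_change or system_change:
        return True
    return today - last_review >= REVIEW_INTERVAL
```

A check like this can run on a schedule against the diligence register, turning the "periodically refresh" step into an automated trigger rather than a manual reminder.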