
Project Noor: Shedding Light on AI's Black Box for Financial Supervisors
Artificial Intelligence is both the biggest opportunity and the greatest risk for the financial sector today. While banks leverage AI for unprecedented efficiency, they are also creating "black boxes" that even they don't fully understand. What happens when these black boxes make biased decisions or fail unexpectedly?
This is the crisis that Project Noor, a landmark initiative by the BIS Innovation Hub, HKMA, and FCA, was created to solve. This analysis breaks down the 'perfect storm' of regulatory pressure and market failures that made Project Noor inevitable, explains its core mission, and outlines what it means for the future of financial compliance and Compliance-as-a-Service (CaaS).
The Inevitable Crisis: Why Regulators Were Forced to Act
The journey to Project Noor began long before its official launch. A confluence of regulatory pressures and market failures created an urgent need for supervisors to look inside the "black box" of AI models.
The "Black Box" Dilemma
The term "black box" isn't just technical jargon; it's a massive business liability. When a bank cannot explain why its AI model denied someone a loan or flagged a transaction, it faces regulatory fines, legal challenges, and severe reputational damage. The 2024 artificial intelligence survey by the Bank of England and FCA found that while 75% of financial firms use AI, only 34% fully understand how it works. [6, 7] This is an untenable risk.
The Regulatory Ticking Clock: The EU AI Act
The single most significant catalyst was the European Union's Artificial Intelligence Act, which entered into force on August 1, 2024. [2, 4] Due to the "Brussels Effect," this regulation is now the de facto global standard. Any international bank operating in Europe must comply, creating immense urgency. Its core mandate is clear: high-risk AI systems, including those in finance, must be explainable and auditable. This was the final push that forced the issue of AI transparency from a "nice-to-have" to a legal and compliance imperative.
The Erosion of Trust: Public Backlash and Algorithmic Bias
It's not just about compliance; it's about market trust. High-profile cases of algorithmic bias have shown that AI can perpetuate historical discrimination in lending and other financial decisions. [19] This has led to public outrage and a growing demand for supervisors to have tools to independently audit AI models for fairness, ensuring that customers are treated equitably.
The Response: Project Noor's Three-Pillar Solution
In response to this crisis, the BIS Innovation Hub, HKMA, and FCA launched Project Noor. [1] Their collaboration is critical for creating an international standard. The project's mission is built on three pillars designed to turn opaque AI into a transparent and trustworthy tool.
- Pillar 1: Driving Transparency (Countering the Black Box)
Project Noor is developing methods to translate complex AI logic into human-readable language and visuals. This directly solves the "black box" problem by empowering supervisors to understand and verify a model's decision-making process.
- Pillar 2: Enforcing Fairness (Rebuilding Trust)
The project will create practical tools and benchmarks to audit AI for algorithmic bias. This directly addresses the erosion of trust by providing a concrete way to test for and mitigate unfair outcomes.
- Pillar 3: Ensuring Robustness (Preventing Catastrophe)
Finally, the initiative will deliver tools to test how AI models react to unexpected market shocks or malicious attacks. This is crucial for ensuring the stability of an increasingly AI-driven financial system.
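Project Noor has not yet published its tooling, so the bias audits Pillar 2 describes can only be illustrated in general terms. One of the simplest checks in this family is a disparate-impact test: compare a model's approval rates across demographic groups and flag any ratio below the commonly cited four-fifths benchmark. The sketch below is an illustrative assumption of what such a check looks like, not Noor's actual methodology; the group labels, sample data, and threshold are all invented for the example.

```python
from collections import defaultdict

def disparate_impact(decisions):
    """Compute per-group approval rates and the disparate-impact ratio.

    decisions: iterable of (group_label, approved) pairs.
    Returns (rates, ratio) where ratio = lowest rate / highest rate.
    """
    counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
    for group, approved in decisions:
        counts[group][0] += int(approved)
        counts[group][1] += 1
    rates = {g: approved / total for g, (approved, total) in counts.items()}
    ratio = min(rates.values()) / max(rates.values())
    return rates, ratio

# Invented loan decisions: (applicant group, did the model approve?)
sample = ([("A", True)] * 80 + [("A", False)] * 20
          + [("B", True)] * 50 + [("B", False)] * 50)

rates, ratio = disparate_impact(sample)
print(rates)            # {'A': 0.8, 'B': 0.5}
print(round(ratio, 3))  # 0.625, below the 0.8 "four-fifths" benchmark
```

A supervisor-grade audit would of course use real outcomes, statistical confidence intervals, and legally defined protected characteristics; the four-fifths rule originates in US employment law and appears here only as a familiar yardstick.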
The Ripple Effect: What Project Noor Means for the Financial Industry
Project Noor is not an academic exercise; it is the starting gun for a new market reality.
For Banks: The End of "Compliance Theatre"
The era of simply ticking a box for AI governance is over. Banks will now be held to a higher, provable standard. They must invest in truly explainable AI (XAI) and robust, independent validation processes. Those who fail to adapt will face significant regulatory and business risks.
For Compliance-as-a-Service (CaaS): The Next Frontier
For CaaS firms, Project Noor represents a massive opportunity. The firms that thrive will be those that can offer specialized, high-value services to meet this new demand, including:
- AI Model Validation & Bias Auditing: Services to test client models against emerging regulatory standards for fairness and transparency.
- XAI Reporting & Governance Tools: Platforms that help banks automatically generate the transparency reports and governance documentation that regulators will demand.
- Regulatory Intelligence: Expertise to guide clients through the rapidly evolving landscape of global AI regulation.
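To make the "XAI reporting" opportunity concrete: for a transparent model such as a linear credit score, a human-readable report can be generated directly from the model's own coefficients. The sketch below is a hypothetical illustration; the feature names, weights, and report format are invented, and real bank models are typically nonlinear, requiring model-agnostic explanation techniques rather than this simple breakdown.

```python
def explain_linear_score(weights, baseline, applicant):
    """Render a plain-language breakdown of a linear risk score.

    weights: {feature: coefficient}, baseline: intercept,
    applicant: {feature: value}. All names are illustrative.
    """
    contributions = {f: weights[f] * applicant[f] for f in weights}
    score = baseline + sum(contributions.values())
    lines = [f"Score: {score:.1f} (baseline {baseline:.1f})"]
    # Largest absolute contribution first, so the report leads with
    # the factors that mattered most to this particular decision.
    for feat, c in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
        direction = "raised" if c > 0 else "lowered"
        lines.append(f"- {feat} {direction} the score by {abs(c):.1f}")
    return "\n".join(lines)

weights = {"income_to_debt": 12.0, "missed_payments": -8.0,
           "account_age_years": 1.5}
report = explain_linear_score(weights, baseline=50.0,
                              applicant={"income_to_debt": 2.0,
                                         "missed_payments": 3.0,
                                         "account_age_years": 4.0})
print(report)  # score of 56.0 plus one line per contributing factor
```

The design point this illustrates is the one regulators are pressing: the explanation is derived from the model itself, not written after the fact, so the report and the decision cannot drift apart.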
A Timeline of the Perfect Storm
August 1, 2024
The EU AI Act enters into force, creating a legal mandate for explainable AI. [4, 6]
September 2024
The HKMA publishes its guidance on generative AI in financial services.
December 2024
The BIS publishes its FSI Insights paper "Regulating AI in the financial sector". [3, 16]
January 2025
The FCA hosts its AI Sprint, where the industry calls for regulatory clarity and international collaboration. [9, 11]
February 2025
The UK Government publishes its AI innovation strategy, setting the stage for international initiatives.
August 18, 2025
Project Noor is officially launched by the BIS Innovation Hub, HKMA, and FCA. [1, 12]
Conclusion: A New Era of Accountable AI
A crisis of complexity and mistrust was brewing in finance, the EU AI Act lit the fuse, and Project Noor is the world's coordinated response. It is more than a technical project; it's a fundamental shift in financial regulation.
It signals the end of the "black box" era and the beginning of a new age of accountable, transparent, and fair AI. For firms in the financial ecosystem, the time to prepare is now.
"Clear, human-readable explanations can strengthen confidence and help keep digital finance fair for everyone." [1]
References
- Bank for International Settlements. (2025, August 18). Project Noor: explaining AI models for financial supervision. https://www.bis.org/about/bisih/topics/suptech_regtech/noor.htm
- European Commission. (2024, August 1). AI Act enters into force. https://commission.europa.eu/news-and-media/news/ai-act-enters-force-2024-08-01_en
- Goodwin Law. (2024, August 9). EU AI Act: Key Points for Financial Services Businesses. https://www.goodwinlaw.com/en/insights/publications/2024/08/alerts-practices-pif-key-points-for-financial-services-businesses
- GDPR.eu. Article 22 GDPR: Automated individual decision-making, including profiling. https://gdpr-info.eu/art-22-gdpr/
- Bank for International Settlements. (2024, December). Regulating AI in the financial sector: recent developments and main challenges. FSI Insights No 63. https://www.bis.org/fsi/publ/insights63.pdf
- Bank of England. (2024, November 21). Artificial intelligence in UK financial services - 2024. https://www.bankofengland.co.uk/report/2024/artificial-intelligence-in-uk-financial-services-2024
- Global Relay. (2024, November 26). FCA report shows 75% of firms use AI, only 34% know how it works. https://www.globalrelay.com/resources/thought-leadership/fca-report-shows-75-of-firms-are-now-using-ai-but-only-34-know-how-it-works/
- Financial Conduct Authority. (2025, April 23). AI Sprint summary. https://www.fca.org.uk/publications/corporate-documents/ai-sprint-summary
- Hong Kong Monetary Authority. (2024, September). Generative Artificial Intelligence in the Financial Services Space. https://brdr.hkma.gov.hk/eng/doc-ldg/docId/getPdf/20241118-4-EN/20241118-4-EN.pdf
- Financial Conduct Authority. AI Update. https://www.fca.org.uk/publication/corporate/ai-update.pdf