3 Mar 2026 | Blogs
The Global Financial Crisis (GFC) revealed extraordinary weaknesses in the ability of financial firms and their regulators to analyze and understand risks. When regulators can’t see the buildup of leverage, when banks can’t evaluate their own exposures, and when firms can’t reconcile their positions, systemic risk grows quietly.
The market tremors that led to the GFC began breaking the surface in the United States in June 2007 and spread internationally by September 2008, prompting policy and regulatory responses that shaped the financial sector for years afterward.
Hindsight is built on experience. What we know now, we might have known then. But we didn’t have a way to connect the dots.
In the two decades since the initial shock waves of the GFC, financial regulators and financial firms have built up oversight and reporting requirements. But “messy” or “bad” data—data that cannot transfer from one system to another, may be weeks or months out of date, and are riddled with errors—stymie insight and oversight while adding millions of dollars to firms’ compliance costs. Standardized data are essential for running financial businesses, managing risks, and monitoring systemic risk.
Standardized data deliver essential benefits: they make it easier to transmit context, meaning, and analysis, and to trace data lineage. Still, developing standards for financial data presents specific challenges:
Policymakers must take these concerns seriously in seeking to collect new data or promote new standards. In some cases, when incentives align, the private sector has developed standards without substantial public intervention. For example, the International Swaps and Derivatives Association (ISDA) has long provided data standards for derivatives contracts, including the Common Domain Model and Financial products Markup Language (FpML) formats. But there are limits.
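Identifier standards like these work at scale partly because validation is built into the format itself. As a minimal sketch, consider the Legal Entity Identifier (LEI, ISO 17442): its last two characters are check digits computed with ISO 7064 MOD 97-10, so any system can catch a mistyped or corrupted identifier without querying a registry. The 18-character prefix below is invented purely for illustration, not a registered entity:

```python
# Sketch: validating a Legal Entity Identifier (LEI, ISO 17442).
# The last two of its 20 characters are ISO 7064 MOD 97-10 check digits,
# so a mistyped code fails locally, with no central registry lookup.

def _to_int(s: str) -> int:
    # Read '0'-'9' as themselves and 'A'-'Z' as 10-35, concatenated into one integer.
    return int("".join(str(int(ch, 36)) for ch in s))

def lei_check_digits(prefix18: str) -> str:
    # Append "00", then take 98 minus the remainder mod 97, zero-padded to 2 digits.
    return f"{98 - _to_int(prefix18 + '00') % 97:02d}"

def lei_is_valid(lei: str) -> bool:
    # A well-formed LEI is 20 uppercase alphanumerics whose numeric value is 1 mod 97.
    return len(lei) == 20 and lei.isalnum() and _to_int(lei) % 97 == 1

# Hypothetical, unregistered prefix used only for illustration:
prefix = "5493001KJTIIGC8Y1R"
lei = prefix + lei_check_digits(prefix)
assert lei_is_valid(lei)
# A single-character typo breaks the check:
assert not lei_is_valid(lei[:-1] + ("0" if lei[-1] != "0" else "1"))
```

The same mod-97 scheme underpins IBAN validation. The point is that a well-designed standard lets every intermediate system verify the data it receives, rather than trusting that upstream systems got it right.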
Despite the challenges, there have been important successes in recent years:
These innovations demonstrate that the public and private sectors can collaborate on effective and valuable data standards.
Gaps remain, especially when it comes to emerging financial markets and innovations:
The mission of improving data and analysis has often received bipartisan support. The Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act), passed during the first Trump administration, strengthened federal data governance by requiring agencies to develop learning agendas, appoint chief data officers, and improve access to government data for policymaking.
Since 2018, the volume of data generated across the financial system has continued to grow exponentially. Yet financial reports are still published in only partially digital formats and in static documents, like PDFs, making data retrieval, review, and use labor-intensive and prone to error. Expanding the consistent use of data standards can improve data quality. The passage of the Financial Data Transparency Act of 2022 (FDTA) reflected a recognition that data modernization can reduce regulatory burden, improve oversight, and support more competitive markets.
On the flip side, when markets and regulators rely on poor financial data, investors misprice risk, regulators miss red flags, and, ultimately, taxpayers underwrite bailouts.
This blog was written by Greg Feldberg, a lecturer at the Yale School of Management and a member of the Open Data Standards Task Force, a collaborative initiative co-chaired by the Data Foundation and Bloomberg that brings together experts from government, businesses, non-profits, and academia to address challenges in implementing common business identifiers, entity identifiers, and data standards in the United States, particularly within financial services. He is a former senior associate director at the Office of Financial Research and Federal Reserve bank supervisor.
References
Bair, Sheila, and Larry Goodman (2025). "A Government Agency Worth Saving." Wall Street Journal. https://www.wsj.com/opinion/a-government-agency-worth-saving-research-ofr-0bbdaf2c?mod=commentary_more_article_pos27
Couillault, Bertrand, Jun Mizuguchi, and Matt Reed (2017). "Collective Action: Toward Solving a Vexing Problem to Build a Global Infrastructure for Financial Information." OFR Brief no. 17-01, February 2, 2017. https://www.financialresearch.gov/briefs/2017/02/02/collective-action-toward-solving-a-vexing-problem-to-build-global-infrastructure/
Data Foundation (2024). "Response to Joint FDTA Data Standards Proposed Rule." https://datafoundation.org/news/financial-data-transparency-hub/416/
Feldberg, Greg (2020). "Fixing Financial Data to Assess Systemic Risk." Brookings Institution. https://www.brookings.edu/articles/fixing-financial-data-to-assess-systemic-risk/
Financial Stability Oversight Council (2024). 2024 Annual Report. https://home.treasury.gov/system/files/261/FSOC2024AnnualReport.pdf
U.S. Congress (2019). Foundations for Evidence-Based Policymaking Act of 2018. https://www.congress.gov/bill/115th-congress/house-bill/4174
U.S. SEC et al. (2024). "FDTA Joint Data Standards Proposed Rule." https://www.sec.gov/files/rules/proposed/2024/33-11295.pdf
U.S. Senate Banking Committee (2025). "ICYMI: Yellen, Bernanke, Over 50 Bipartisan Experts Urge Congress to Prevent the Elimination of Key Financial Stability Watchdog." Minority Press Release, June 18, 2025. https://www.banking.senate.gov/newsroom/minority/icymi-yellen-bernanke-over-50-bipartisan-experts-urge-congress-to-prevent-the-elimination-of-key-financial-stability-watchdog