
The Value and Challenge of Data Standards in Finance

Greg Feldberg, a member of the Open Data Standards Task Force, explores the benefits of standardized financial data, success stories so far, and challenges to developing financial data standards.
3 Mar 2026

The Global Financial Crisis (GFC) revealed extraordinary weaknesses in the ability of financial firms and their regulators to analyze and understand risks. When regulators can’t see the buildup of leverage, when banks can’t evaluate their own exposures, and when firms can’t reconcile their positions, systemic risk grows quietly. 

The rumblings and market shifts that led to the GFC began breaking the surface in the United States in June 2007 and had spread internationally by September 2008, prompting policy and regulatory responses that shaped the financial sector for years afterward. 

Hindsight is built on experience. What we know now, we might have known then. But we didn’t have a way to connect the dots. 

In the nearly two decades since the initial shock waves of the GFC, financial regulators and financial firms have built up oversight and reporting requirements. But “messy” or “bad” data (data that cannot transfer from one system to another, may be weeks or months out of date, and are muddled with errors) stymie insight and oversight while costing firms millions in compliance. Standardized data are essential for running financial businesses, managing risks, and monitoring systemic risk.

Standardized data deliver essential benefits:

  • Comparability and Consistency: Markets function best when information is accurate and consistently defined. Standards eliminate ambiguity.
  • Operational Efficiency: Standards reduce costs and streamline compliance, freeing up capital for productive use.
  • Systemic Risk Monitoring: Standards allow regulators to aggregate exposures across firms, identify concentrated vulnerabilities, and assess interconnections.
  • Innovation Enablement: New technologies depend on clean, structured data. Fintech, RegTech, and AI tools all require data that can be parsed and processed.
  • Transparency and Accountability: Data standards support market discipline and regulatory oversight.

Challenges: Financial Data Are Especially Hard to Standardize

Standardized data can deliver high-value returns by making it easier to transmit context, meaning, and analysis, and to trace data lineage. Still, developing standards for financial data presents specific challenges:

  • Fragmented Regulatory Structure: Oversight is split among many agencies, a fragmentation that frustrates financial companies and their regulators alike. Regulators need to share data among themselves to assess systemic risks, and financial data must follow common standards for identifying legal entities, financial products, and financial transactions so that examiners and financial stability analysts can compare and aggregate them. Yet no single entity has the mandate to impose standards across the board.
  • Dependence on Market Participants: The data that regulators use for financial stability analysis come from the private sector. Many firms rely on legacy IT systems not designed for modern data formats. If financial firms don't have quality data to assess and manage their businesses and risks, regulators won't either.
  • Coordination Problems: Standardization is a collective action problem. Each institution bears local costs while benefits accrue system-wide. 
  • Continual Innovation: Innovation drives profits but creates new challenges for risk monitoring. Data standards and collections need to evolve to keep up.
  • Proprietary Standards: Private vendors may develop proprietary partial solutions that are very valuable to market participants but promote data balkanization.
  • Confidentiality Concerns: Market participants resist disclosure, even in anonymized form, citing competitive sensitivity. 

Policymakers must take these concerns seriously in seeking to collect new data or promote new standards. In some cases, when incentives align, the private sector has developed standards without substantial public intervention. For example, the International Swaps and Derivatives Association (ISDA) has long provided data standards for derivatives contracts, including the Common Domain Model and Financial products Markup Language (FpML) formats. But there are limits. 

Success Stories

Despite the challenges, there have been important successes in recent years:

  • Legal Entity Identifier (LEI): The G20, the Financial Stability Board, and the Office of Financial Research (OFR) supported the creation of the Legal Entity Identifier (LEI) to address fundamental information gaps in financial transactions. The LEI assigns a unique identifier to each legal entity engaged in financial transactions, enabling regulators to aggregate risk across affiliates and markets. As of early 2026, over 3 million entities have been assigned LEIs, and regulators in Europe and Asia require their use as a cost of doing business. In the United States, the Commodity Futures Trading Commission (CFTC) requires LEIs in swap transactions. The OFR, the Financial Stability Oversight Council (FSOC), and the CFTC have continued to encourage further adoption, and the LEI is included in the proposed joint rule on data standards under the Financial Data Transparency Act (FDTA).
  • XBRL for Financial Filings: The Securities and Exchange Commission’s (SEC) requirement that public companies file in XBRL has made corporate financials machine-readable and accessible for automated analysis. Adoption challenges remain, but the long-term benefits are real. The SEC’s December 2025 Semi-Annual Report to Congress on the agency’s use of machine-readable data for corporate disclosures lists benefits including improved market competition, lower information-processing costs, fewer reporting errors, and higher-quality information, including correct metadata, for training artificial intelligence (AI) and large language models (LLMs). 
  • ISO 20022 for Payments: ISO 20022 enables structured messaging in payment systems, facilitating faster settlements and better compliance. It illustrates how modern data standards can support both innovation and oversight.
  • TRACE and Bond Market Transparency: Post-trade transparency in corporate bond and Treasury markets—via FINRA’s Trade Reporting and Compliance Engine (TRACE) system and platforms like MTS, offering daily insights into Europe’s bond market—has improved price discovery and reduced trading costs.
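Part of what makes a standard like the LEI valuable is that it is machine-verifiable: each LEI is a 20-character alphanumeric code whose final two digits are check digits under ISO 7064 (MOD 97-10), so any system can catch a mistyped identifier before it propagates. A minimal sketch in Python (the 18-character prefix used below is made up for illustration, not a real registration):

```python
def lei_is_valid(lei: str) -> bool:
    """Check an LEI's ISO 7064 MOD 97-10 check digits (per ISO 17442).

    Letters map to numbers (A=10 ... Z=35); the full 20-character code,
    read as one large integer, must leave remainder 1 modulo 97.
    """
    if len(lei) != 20 or not lei.isalnum():
        return False
    numeric = "".join(str(int(c, 36)) for c in lei.upper())
    return int(numeric) % 97 == 1


def lei_check_digits(prefix18: str) -> str:
    """Compute the two check digits for an 18-character LEI prefix."""
    # Append "00", take the remainder mod 97, and subtract from 98,
    # so the completed code satisfies the remainder-1 rule above.
    n = int("".join(str(int(c, 36)) for c in (prefix18.upper() + "00")))
    return f"{98 - n % 97:02d}"


# Illustrative prefix only: completing it with its computed check
# digits produces a code that passes validation.
prefix = "5493001KJTIIGC8Y1R"
lei = prefix + lei_check_digits(prefix)
```

The same pattern (a fixed format plus an arithmetic check) underlies identifiers like IBANs and ISINs, which is one reason standardized identifiers reduce reconciliation errors across firms.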

These innovations demonstrate that the public and private sectors can collaborate on effective and valuable data standards.

Remaining Gaps

Gaps remain, especially when it comes to emerging financial markets and innovations:

  • Stablecoins and Digital Assets: Stablecoins, under the new legal framework passed by Congress, raise novel data and disclosure questions.
  • Private Credit: Nonbank lenders now originate a substantial share of corporate credit. These activities often occur outside traditional bank supervision and reporting frameworks. The absence of loan-level data and consistent identifiers makes it difficult to assess systemic exposure.
  • Other Nonbanks and Shadow Banking Activities: Other types of nonbanks, including hedge funds, private equity funds, and family offices, take on leverage with limited regulatory reporting requirements. These blind spots raise concerns about market functioning in stress events.

Momentum and Support 

The mission of improving data and analysis has often received bipartisan support. The Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act), passed during the Trump administration, strengthened federal data governance by requiring agencies to develop data agendas, appoint chief data officers, and improve access to government data for policymaking.

Since 2018, the generation and proliferation of data have continued to grow exponentially. Yet financial reports are still produced in only partially digital formats and in static documents, like PDFs, making data retrieval, review, and use labor-intensive and error-prone. Expanding the consistent use of data standards can improve data quality. The passage of the FDTA reflected a recognition that data modernization can reduce regulatory burden, improve oversight, and support more competitive markets. 

On the flip side, when markets and regulators rely on poor financial data, investors misprice risk, regulators miss red flags, and, ultimately, taxpayers underwrite bailouts.


This blog was written by Greg Feldberg, a lecturer at the Yale School of Management and a member of the Open Data Standards Task Force, a collaborative initiative co-chaired by the Data Foundation and Bloomberg that brings together experts from government, businesses, non-profits, and academia to address challenges in implementing common business identifiers, entity identifiers, and data standards in the United States, particularly within financial services. He is a former senior associate director at the Office of Financial Research and Federal Reserve bank supervisor.


References

Bair, Sheila, and Larry Goodman (2025). "A Government Agency Worth Saving." Wall Street Journal. https://www.wsj.com/opinion/a-government-agency-worth-saving-research-ofr-0bbdaf2c?mod=commentary_more_article_pos27

Couillault, Bertrand, Jun Mizuguchi, and Matt Reed (2017). “Collective Action: Toward Solving a Vexing Problem to Build a Global Infrastructure for Financial Information.” OFR Brief no. 17-01, February 2, 2017. https://www.financialresearch.gov/briefs/2017/02/02/collective-action-toward-solving-a-vexing-problem-to-build-global-infrastructure/

Feldberg, Greg (2020). "Fixing Financial Data to Assess Systemic Risk." Brookings Institution. https://www.brookings.edu/articles/fixing-financial-data-to-assess-systemic-risk/

Data Foundation (2024). "Response to Joint FDTA Data Standards Proposed Rule." https://datafoundation.org/news/financial-data-transparency-hub/416/

Financial Stability Oversight Council (2024). 2024 Annual Report. https://home.treasury.gov/system/files/261/FSOC2024AnnualReport.pdf

U.S. Congress (2019). Foundations for Evidence-Based Policymaking Act of 2018. https://www.congress.gov/bill/115th-congress/house-bill/4174

U.S. SEC et al. (2024). "FDTA Joint Data Standards Proposed Rule." https://www.sec.gov/files/rules/proposed/2024/33-11295.pdf

U.S. Senate Banking Committee (2025). "ICYMI: Yellen, Bernanke, Over 50 Bipartisan Experts Urge Congress to Prevent the Elimination of Key Financial Stability Watchdog." Minority Press Release, June 18, 2025. https://www.banking.senate.gov/newsroom/minority/icymi-yellen-bernanke-over-50-bipartisan-experts-urge-congress-to-prevent-the-elimination-of-key-financial-stability-watchdog
