21 Feb 2026 | Blogs
The following blog is the first in a new series that explores the value of data standards in financial services. The series draws from a compendium and accompanying white paper developed by the Open Data Standards Task Force, a collaborative initiative co-chaired by the Data Foundation and Bloomberg that brings together experts from government, businesses, non-profits, and academia to address challenges in implementing common business identifiers, entity identifiers, and data standards in the United States, particularly within financial services.
Common data standards lay the groundwork for consistent representation of information. By reducing miscommunication, standards facilitate understanding across diverse backgrounds, which is especially important in this era of digitalization and hyper-connectivity. Even where representations differ across fields, countries, and languages, adherence to consistent standards within each domain simplifies conversion and interpretation for people with little or no prior familiarity with another standardized system.
The standardization of time, distance, volume, weight, and temperature is generally agreed upon, even when the representation is expressed differently. While various groups and countries have established their own conventions for representing time (e.g., the military's 24-hour format vs. the civilian 12-hour format with a.m./p.m.), these internal standards are readily understood and converted. This interoperability is further cemented by universal standards such as ISO 8601, which allow consistent and unambiguous communication of time across diverse fields and international boundaries, despite the continued use of varied localized representations (e.g., 2026-02-17 13:00:00.000).
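As a minimal illustration (not drawn from the compendium), the sketch below parses the ISO 8601 timestamp shown above using Python's standard library and re-renders the same moment in both 24-hour and 12-hour conventions; the choice of Python and of the example timestamp is ours.

```python
from datetime import datetime

# Parse the unambiguous ISO 8601 representation used in the text above.
iso_value = "2026-02-17 13:00:00.000"
moment = datetime.fromisoformat(iso_value)

# The same moment can be re-rendered in localized conventions without any
# loss of meaning, because the source format is standardized.
print(moment.strftime("%H:%M"))      # 24-hour ("military") style -> 13:00
print(moment.strftime("%I:%M %p"))   # 12-hour civilian style     -> 01:00 PM
```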
Different standardized systems are used in different contexts. While most of the world uses the metric system, the United States, Liberia, and Myanmar continue to use the imperial system. These differences produce different representations: kilometers vs. miles, liters vs. gallons, kilograms vs. pounds, and Celsius vs. Fahrenheit. Yet even though three countries measure differently from the rest of the world, values expressed in either system can be understood by those who do not use it, because the conversions between the systems are simple and well defined.
These standards guide our decision-making. Knowing what 79°F feels like (or converting it to roughly 26°C) can inform what you choose to wear that day. Standardized data supports decisions like these without forcing people to rely on flawed or confusing information.
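To make the "easy convertibility" point concrete, here is a short illustrative sketch (the function names are ours, not from the compendium) showing the conversions referenced above, including the 79°F example:

```python
# Illustrative conversions between the two measurement conventions
# discussed above; function names and sample inputs are ours.

def fahrenheit_to_celsius(temp_f: float) -> float:
    """Convert a Fahrenheit temperature to Celsius: C = (F - 32) * 5/9."""
    return (temp_f - 32) * 5 / 9

def miles_to_kilometers(miles: float) -> float:
    """Convert miles to kilometers (1 mile = 1.609344 km exactly)."""
    return miles * 1.609344

print(round(fahrenheit_to_celsius(79)))   # -> 26, matching the example in the text
print(round(miles_to_kilometers(5), 2))   # -> 8.05
```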
Standardized systems also reduce errors. Straightforward interoperability makes it easy to verify that data is valid and to trust the resulting conversions. Just as the scientific method ensures high-quality experiments through well-documented steps and measurements, data should be easily accessible and reproducible to ensure its quality.
Additionally, standardization makes it easier to scale projects. Common characteristics enable new data to be integrated and interpreted efficiently at a larger scale. This adaptability allows the same data to serve different objectives, ultimately leading to more effective use. Just as a recipe written for four servings is easy to adjust for two or eight, a standardized dataset makes it simpler to scale a project up or down.
All of these benefits lead to cost savings through improved efficiency across sectors and stakeholders.
Unlike these widely agreed-upon units of measurement, the digital data that exists today in many fields is often fragmented, with no clear system behind it. Unclean, unconnected data is harder to interpret and harder to convert into data standards that different stakeholders can understand.
The Financial Data Transparency Act (FDTA) of 2022 aims to establish such standards for the U.S. financial sector. It directs the "covered agencies" of the Financial Stability Oversight Council (FSOC) to develop common data standards, including common identifiers, so that information reported to financial regulators is machine-readable, open and available for bulk download in a human-readable format, and interoperable. To meet these requirements for data quality, access, reliability, and transparency, the FDTA pushes for common data standards for financial reporting.
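As a deliberately simplified, hypothetical illustration (not taken from the FDTA or any regulator's actual schema), the sketch below shows what a filing record tagged with a common entity identifier in the style of a Legal Entity Identifier (LEI) might look like when it is machine-readable yet still legible to a person; all field names and values here are illustrative only.

```python
import json

# Hypothetical, simplified filing record; field names and the identifier value
# are illustrative only and do not reflect any agency's real reporting format.
filing = {
    "entity_identifier": "5493001KJTIIGC8Y1R12",  # LEI-style 20-character code (example value)
    "reporting_period": "2026-Q1",
    "filing_type": "quarterly_report",
    "total_assets_usd": 125_000_000,
}

# Machine-readable (JSON) and still human-readable when pretty-printed,
# the combination the FDTA requirements described above point toward.
print(json.dumps(filing, indent=2))
```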
Although strides are being made for U.S. financial sector data, many fields still keep their data in separate places and in different formats, making it difficult to transfer important information from one place to another. This fragmentation presents a challenge: without a shared structure for organizing and exchanging data, even advanced technologies struggle to operate efficiently across sectors.
The challenge with applying standards to digital data raises several questions:
Why are data standards important? Are they even needed in the age of AI? What elements help support valuable data standards? Are there examples of measurable value from the use of data standards? These questions and more are addressed in the Open Data Standards Task Force's compendium on the "Value of Data Standards."
As the authors featured in this compendium and the resulting blog series present the value of data standards from their own perspectives and experiences, each has agreed on the following attributes of quality, useful data standards:
"The Value of Data Standards: A Compendium from the Open Data Standards Task Force" is a collection of six chapters, written by members of the Open Data Standards Task Force, that each address the value of data standards in financial services. Chapters of the compendium will be published as individual blogs and include:
This blog was written by Ashley Nelle-Davis, Vice President at the Data Foundation, who leads the organization's efforts related to financial regulatory data and co-chairs the Open Data Standards Task Force.