Climate Week NYC brings together thousands of leaders from across the globe, working in different sectors and levels of government to piece together solutions to the climate puzzle. For our team at the Data Foundation, participating in this week's conversations reinforced a key insight: We need reliable and valid climate data to frame conversations, define the scope and scale of the problems we face, and inform decision-makers around the world about what to do. Given that a growing number of companies in the U.S. and abroad face reporting requirements related to their greenhouse gas emissions, the collection and sharing of climate data isn't a voluntary political act anymore: For many organizations, it's now a part of doing business.
On the first day of Climate Week NYC in 2025, a roundtable session convened by XBRL US brought together state legislators from both coasts, sustainability professionals, investors, and technology providers, all grappling with the same challenge—how do we create climate data that's not just compliant, but actually useful, reliable, and fit for purpose?
The conversation revealed something encouraging: despite the charged political climate around environmental policy, there's surprising consensus on what makes data work in practice. New York State Assemblymember Deborah Glick explained, "We use the Greenhouse Gas Protocol," a framework that has become the world's most widely used method of measuring and accounting for greenhouse gas emissions. "There's a standardized, recognized methodology," Glick said, adding, "that's important for business to know that it's not me and my colleagues making something up."
California State Sen. Scott Wiener echoed the importance of practical validity, explaining how his state’s Climate Corporate Data Accountability Act, which he co-authored, was designed "to be as user friendly and streamlined as possible."
Both legislators understand what we've learned through years of data policy work: Standards succeed when they produce information that's reliable enough to trust, valid enough to compare across entities, and useful enough to inform real decisions.
The roundtable also included experts in corporate reporting, assurance, accounting, investment, technology, and standard setting. The corporate perspective they provided revealed the core challenge: Companies aren't resisting climate data collection because they oppose environmental goals. Indeed, many investors are prioritizing sustainability as part of their investment goals. Companies are struggling because the data systems they're being asked to build often aren't fit for the purpose they're supposed to serve. As one participant noted during the XBRL US event, when you examine most corporate climate data infrastructure, there's often little that would meet basic reliability standards for financial reporting. That is a major issue we can address.
This gap between compliance requirements and data utility is where the work of the Data Foundation's Climate Data Collaborative is essential today and in the years ahead. We've seen this pattern before in financial transparency: fragmented systems that generate data that technically meet regulatory requirements but don't actually inform the decisions they're supposed to support.
The solution isn't to abandon data standards, as some have suggested. Instead, we should apply consensus data standards and make them work better by focusing on what makes data genuinely useful across users and communities. Data must be actionable: timely enough for decisions, granular enough for specific contexts, and accessible enough for the people who need it. It must be reliable: consistent in methodology, verifiable through independent sources, and transparent about limitations. And it must be valid: measuring what it claims to measure and comparable across similar entities.
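To make those three criteria concrete, here is a minimal sketch of what automated checks against a single emissions disclosure might look like. The record fields, thresholds, and flags below are illustrative assumptions for discussion, not drawn from the GHG Protocol, any XBRL taxonomy, or any state rule.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative sketch only: field names and thresholds are assumptions,
# not taken from the GHG Protocol, XBRL taxonomies, or any regulation.

@dataclass
class EmissionsRecord:
    entity_id: str              # reporting entity identifier
    period_end: date            # end of the reporting period
    scope1_tco2e: float         # direct emissions, metric tons CO2e
    scope2_tco2e: float         # purchased-energy emissions, metric tons CO2e
    scope3_tco2e: float | None  # value-chain emissions; may be unreported
    methodology: str            # e.g. "GHG Protocol Corporate Standard"
    third_party_assured: bool   # whether an independent party verified the figures


def quality_issues(record: EmissionsRecord, today: date) -> list[str]:
    """Return plain-language flags tied to the qualities discussed above."""
    issues = []

    # Valid: figures should be non-negative and use a recognized methodology.
    if record.scope1_tco2e < 0 or record.scope2_tco2e < 0:
        issues.append("valid?: negative emissions figures")
    if "GHG Protocol" not in record.methodology:
        issues.append("valid?: methodology is not a recognized standard")

    # Reliable: independent verification supports trust in the numbers.
    if not record.third_party_assured:
        issues.append("reliable?: no third-party assurance")

    # Actionable: stale data can't inform current decisions (assumed 18-month cutoff).
    if (today - record.period_end).days > 548:
        issues.append("actionable?: reporting period is more than 18 months old")

    # Transparent about limitations: missing scope 3 should be disclosed, not hidden.
    if record.scope3_tco2e is None:
        issues.append("note: scope 3 not reported")

    return issues


if __name__ == "__main__":
    record = EmissionsRecord(
        entity_id="EXAMPLE-CO",           # hypothetical reporter
        period_end=date(2024, 12, 31),
        scope1_tco2e=12_500.0,
        scope2_tco2e=8_300.0,
        scope3_tco2e=None,
        methodology="GHG Protocol Corporate Standard",
        third_party_assured=False,
    )
    for issue in quality_issues(record, today=date(2025, 9, 22)):
        print(issue)
```

In practice, checks like these would live inside reporting platforms and assurance workflows rather than ad hoc scripts, but the point stands: each of these qualities can be tested, not just asserted.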
California's decision to align with the GHG Protocol creates this kind of validity by establishing consistency with global reporting frameworks. When New York follows California’s example, it's building toward data that works across jurisdictions rather than creating compliance silos.
Sen. Wiener offered one example of how climate data designed to inform decisions, not just meet reporting requirements, could spur action even at the individual level: In the future, consumers could choose between products based on their carbon footprint, just as they might compare prices or nutritional information. But that future would require data infrastructure that connects project-level measurements to regional and national systems, creating information that's both locally relevant and globally comparable.
The technology providers in the room were already building these capabilities: software platforms that handle multi-jurisdictional reporting while maintaining data validity, AI-enabled systems that reduce compliance burden without sacrificing reliability, and digital standards that make information genuinely interoperable across contexts.
What struck me most from the session was the consensus on data quality principles. Republicans and Democrats, business leaders and environmental advocates—everyone wants information that's reliable enough to trust, useful enough to act on, and efficient enough to collect without duplicating effort unnecessarily. The data infrastructure needed for any effective climate strategy can advance through the same pragmatic approach that has worked for financial transparency and public health monitoring—focusing on utility, reliability, and validity rather than ideological positions.
States like California and New York are leading the way on developing policies and programs to address climate change, but they're also creating a template for climate data that works in context—transparent enough for public trust, standardized enough for cross-jurisdictional comparison, granular enough for local decisions, and reliable enough for long-term planning. When climate policy momentum returns at the national level, state efforts to lay the foundation for fit-for-purpose data will be essential for any strategy that actually delivers results.