

Why Long-Term Climate Observations and Innovation Matter for Risk Management

A recent event during Climate Week NYC explored how traditional long-term observations and new data sources complement each other in managing climate risk.
22 Oct 2025
Written by J.B. Wogan

As changes in climate and extreme weather events exacerbate physical asset and financial risks, climate data has become increasingly important for accurate risk assessment. At the same time, the continuity of traditional sources of long-term observational climate data is itself a growing concern. During Climate Week in New York City, the Data Foundation and the Keeling Curve Foundation convened insurance executives, venture capital investors, and climate scientists to examine how climate data informs decision-making across sectors. The event coincided with the one-year anniversary of the Climate Data Collaborative, a cross-sector initiative of the Data Foundation that is accelerating climate action by building connections between climate data and decision-makers.

Franklin Nutter, president of the Reinsurance Association of America, moderated the conversation with the following speakers: 

  • Ryan Alexander, executive director of the Data Foundation’s Climate Data Collaborative
  • Kieran Bhatia, senior vice president and head of climate and sustainability for North America at Guy Carpenter, a global risk and reinsurance company
  • Ralph Keeling, professor of geochemistry at Scripps Institution of Oceanography, as well as president and founder of the Keeling Curve Foundation
  • Nancy Pfund, founder and managing partner at DBL Partners, a venture capital fund
  • Charlie Sidoti, executive director of InnSure, a nonprofit innovation hub focused on creating novel insurance solutions to address climate risk challenges

Here are three key takeaways from the conversation.

1. Long-Term Observations Are Essential for Climate Model Reliability

Ralph Keeling, of the Scripps Institution of Oceanography and the Keeling Curve Foundation, noted that climate models don't stand alone as predictive tools. They are extrapolation mechanisms built on decades of consistent climate observations. Long-term observations provide the foundation for reliable global climate models that inform risk assessment at all scales.

These models help the business community and policymakers understand possible future scenarios, but their reliability depends entirely on the quality and continuity of the underlying data and observations.

"Those long-term datasets are the basis for reliability and unreliability of models," Keeling explained. For example, current models don't fully incorporate climate tipping points—the critical thresholds in the earth’s system or related processes which, if exceeded, can cause sudden, dramatic, or irreversible changes. If parts of the Earth system begin moving in unexpected directions, observations will provide the early warning signals needed to update models and prepare for significant changes.

Keeling highlighted a particular concern: it's difficult to see how commercial entities can provide this kind of public good. Long-term observational programs often depend on small teams with specialized skillsets maintaining high-calibration standards in laboratories. These teams and their institutional knowledge are among the most vulnerable parts of the climate data infrastructure. The U.S. has been a global leader in ocean temperature measurements, atmospheric composition monitoring, and other critical observations that form the backbone of international efforts. Losing this capacity would affect not just domestic risk management but global climate science.

2. Data Consistency Directly Impacts Insurance Pricing and Market Stability

The insurance industry's ability to price risk accurately depends on consistent data. Kieran Bhatia of Guy Carpenter illustrated this point by comparing hurricane and wildfire modeling. For hurricanes, the insurance industry has access to over 60 years of reliable satellite observations, which supports stable and predictable catastrophe model updates. The result is pricing stability and market confidence.

Wildfire presents a stark contrast. Because wildfire is a newer area of catastrophe modeling, with less detailed observations and less well-resolved climate model projections, recent wildfire model updates have shown double-digit and even triple-digit percentage changes in expected loss calculations.

"If you have consistent data, you get more consistency in your view of risk. You can price risk better. You can have more financial stability," Bhatia explained. Consistent, long-term data matters not just for insurance companies but for communities, property owners, and the broader economy that depends on stable insurance markets.

The vulnerability modeling component, which captures how buildings and infrastructure respond to hazards, remains less standardized across the industry because it relies more on proprietary data and methodologies. That variation leads to significant differences in loss projections for the same geographic area, forcing customers to navigate uncertainty about which models to trust. When hazard data is reliable and shared, the industry gains confidence. When it's not, uncertainty can erode the value of the modeling altogether.

3. Innovative Approaches to Climate Data Collection Are Enabling Better Risk Management

Nancy Pfund of DBL Partners brought an investor's perspective on how technology companies are developing solutions to work with available data while generating new information in real time that drives better outcomes. One company in her fund’s portfolio deploys autonomous helicopter systems during wildfire events that integrate multiple data sources—historical fire perimeters, USGS terrain models, 911 calls, mountaintop cameras, lightning detection, satellite imagery, and smart grid sensors—to create dynamic risk maps and improve firefighting effectiveness.

The results are striking. Although performance varies with wildfire conditions, the company’s analysis found that traditional helicopter water drops often hit their effective target only 40 percent of the time due to poor visibility, smoke, and turbulence. In an early demonstration, the company’s autonomous system achieved 80 percent accuracy.

“While we're never going to have the perfect dataset—we never do in any investment—we can't let perfect be the enemy of the good,” Pfund said. “We have to roll up our sleeves and do something.”

"We would love to have better long-term data collection. We all depend on that, but it's not an excuse," she added.

Pfund emphasized that while traditional climate data sources provide a critical foundation, innovation is essential for driving outcomes where data gaps exist. Private sector solutions can create entirely new data sources to address needs that existing infrastructure wasn't designed to meet. 

Regardless of whether the data comes from new or traditional sources, it needs to be affordable and accessible, said Charlie Sidoti of InnSure. “A lot of this innovation is done by entrepreneurs, so having easily accessible, low-cost access to data is critical for innovation to happen.” If the data “costs a lot of money,” he added, “that's going to slow innovation and make some of these problems worse.”

Looking Forward

The panel discussion revealed both challenges and opportunities in climate data sources and infrastructure. While federal programs face uncertainty, private sector innovation is accelerating, new technologies are making data collection more accessible, and collaborative efforts like the Greenhouse Gas Coalition and the Global Digital Single Market Data Alliance are working to bridge gaps between data and decision-makers.

For example, Ryan Alexander explained that the Data Foundation’s Climate Data Collaborative is focused on building the connections that ensure data reaches decision-makers and accelerates climate action. Climate data is often produced in silos, established by statute or regulation, with little consideration for data interoperability across agencies, programs, or companies.

“Nobody was thinking, how should the Environmental Protection Agency data match up with the data that the Internal Revenue Service collects for eligibility on tax credits with the data that National Aeronautics and Space Administration collects,” Alexander said. She also noted that current uncertainty over the future role of the federal government in long-term collection of climate data underscores the importance of acting now to develop a common vision for what a robust, cross-sector data ecosystem should look like. 

The consensus across perspectives was clear: As climate risks intensify, the quality of our decisions will depend on the quality of our data—and our ability to make that data accessible to everyone who needs it to protect communities, price risk accurately, and build resilience for the future. We need a robust ecosystem of climate data, including long-term climate observations and emerging insights from novel sources and new technologies. Neither can fully substitute for the other.
