
Measuring Progress: 2024 Survey of Federal Evaluation Officials

Measuring Progress: 2024 Survey of Federal Evaluation Officials from the Data Foundation, in partnership with the American Evaluation Association (AEA), assesses the growth of evaluation capacity and practice in federal agencies.

Download: Measuring Progress: 2024 Survey of Federal Evaluation Officials


Executive Summary

The Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act) was a sweeping effort to enhance the government’s capacity to produce and use evaluations. The Evidence Act formally created evaluation officers at each of the 24 largest federal agencies. Subsequent guidance encouraged this capacity in all federal agencies. The formally designated individuals, alongside the teams of federal evaluators who support them throughout agencies, have taken on the challenge of advancing and strengthening evidence-building cultures in their organizations. The evaluation community is making substantial progress in building, growing, and enhancing its culture and capability following enactment of the Evidence Act.

Measuring Progress: 2024 Survey of Federal Evaluation Officials from the Data Foundation, in partnership with the American Evaluation Association, assesses the growth of evaluation capacity and practice in federal agencies. This online survey of federal evaluation officials was conducted in March and April 2024 and asked about experiences implementing the Evidence Act. The findings indicate substantial progress since officials were first surveyed in 2021 by the Data Foundation and the American Evaluation Association, although opportunities for further growth in evidence capacity remain for federal agencies.

Key findings from the survey include:

  • Evaluation officials are optimistic about mission success, educating peers, and promoting equity. While responsibilities vary among officials, respondents reported increased influence on agency decision-making since 2021, particularly in educating peers and promoting equity. Evaluation findings most often informed regulatory actions (16%), strategic planning (20%), and operational decisions (20%), though resource sufficiency remains a concern for many.  
     
  • The Evidence Act greatly enhances evaluation activities, and needs continued support. While 48% of respondents reported that the Evidence Act “helped a lot” with achieving their evaluation mission, only half reported “sufficient” staffing resources, and even fewer (37%) reported “sufficient” non-staffing funding. Respondents who found the Evidence Act very helpful cited increased attention to evaluations, while others noted limited influence on budgeting and on building capacity for high-quality evaluations.
     
  • Learning agenda implementation faces challenges, yet yields clear benefits. Key challenges for formulating learning agendas include determining question scope, understanding purpose, managing timelines, and external collaboration. Respondents suggest better integrating strategic planning and budgeting, increasing resources, and gaining leadership support. Despite difficulties, the process encourages evidence use, fosters an evidence culture, and improves inter-agency collaboration.
     
  • Artificial Intelligence (AI) adoption is a low priority for respondents. Across a range of questions, respondents indicated a lack of involvement in AI adoption now and in the future. A majority (52%) reported only “a little” current use of AI in their organizations, with 74% “unsure” or “not applicable” when asked about adoption in the coming year. 
     
  • Evaluation officials are increasingly clear on their roles and responsibilities. Most respondents (81%) are clear about their responsibilities, despite only 39% being dedicated full-time to evaluation activities. However, 67% perceive that their agency peers are less clear about the evaluation official's role. Officials are highly experienced, with over half having 10 or more years in their organization, the evaluation field, and the federal government, though 50% have been in their current role for less than five years, consistent with the timing of the Evidence Act's adoption.

The findings of the survey suggest several key recommendations for those working to support the evaluation officers and leaders across the community:  

  1. Create strategic capacity to engage in evaluation activities, including targeted support for effective engagement. Evidence and evaluation capacity is multi-faceted and the individuals in key leadership roles need organizational support systems. Support comes through adequate staffing and resources, but also by ensuring that evaluation officers are full-time positions, when appropriate. 
  2. Develop and use tools for collaboration with both internal and external stakeholders to enable more effective engagement. Effective stakeholder engagement creates a positive feedback loop for evaluation and evidence activities, strengthening the use and usefulness of evaluation processes and products. A range of stakeholders can be engaged in producing and using learning agendas, evaluation plans, and evaluation results, but given limited capacity, improved tools and resources are urgently needed to help emerging evaluation leaders undertake these tasks efficiently and effectively, including collaboration with external stakeholders such as program partners and congressional officials. These activities can and should be facilitated through the Interagency Council on Evaluation Policy, the Evaluation Officers Council, interagency support infrastructures, professional association communities of practice, and third-party support systems. 
  3. Improve the integration of strategic planning and budget formulation with evidence planning activities. Evaluation officers and officials described emerging alignments among these processes that allow evaluative thinking and approaches to be incorporated into existing workflows, reducing the cost and burden on agency program partners while increasing their value for high-priority strategic topics. 
  4. Promote continuous improvement of Evidence Act products, and incorporate AI capabilities. Agencies should continue to improve the quality and usefulness of multi-year learning agendas, annual evaluation plans, periodic evidence capacity assessments, and other products required by the Evidence Act, and can align these improvements with emerging generative AI approaches and relevant tools to innovate in the years ahead. 

While substantial progress has been made in federal evaluation practices and capacity since our last survey in 2021, challenges persist in securing resources, building capacity, and sustaining evaluation priorities. Addressing these challenges will be crucial for continuing to strengthen the role of evaluation in federal decision-making and fostering a culture of evidence-informed practices across federal agencies.


Authors

  • Steve Mumford, Ph.D., Associate Professor of Political Science and Public Administration, University of New Orleans; Research Fellow, Data Foundation
  • Nathan Varnell, Policy and Research Analyst, Center for Evidence Capacity, Data Foundation
  • K. Adam Smith, Graduate Assistant, University of New Orleans
  • Sara Stefanik, Director, Center for Evidence Capacity, Data Foundation

Acknowledgments

The authors thank multiple reviewers from the American Evaluation Association’s Evaluation Policy Task Force and the Data Foundation Board of Directors. The authors also appreciate the participation and engagement from federal evaluation officials who responded to the survey.


Disclaimer 

This paper is a product of the Data Foundation, in partnership with the American Evaluation Association. The findings and conclusions expressed by the authors do not necessarily reflect the views or opinions of the Data Foundation or AEA, their funders and sponsors, or Boards of Directors.

 © 2024 Data Foundation. All rights reserved.

