1 Nov 2022
Written by ToucanTech Support
General
Federal agency officials, especially those in newly created leadership positions established by the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act), are engaging in activities to facilitate building and using evidence to inform decision-making. The Data Foundation’s 2022 Virtual Symposium, in partnership with The George Washington University’s Trachtenberg School of Public Policy and Public Administration, convened experts who described real-world experiences and research projects that focused on efforts to build federal agency evidence-building capacity. In the call for submissions, the Data Foundation asked experts to submit research related to four areas:
This compendium offers an overview of the 16 presentations and key takeaways.
Table of Contents
Day 1: Understanding Capacity for Evidence-Building Across Government
Day 2: Innovations in Evidence-Building Activities
Day 1: Understanding Capacity for Evidence-Building Across Government
The first day focused on efforts to assess evidence-building capacity and establish evidence-building processes. The presenters from both government and non-governmental organizations covered topics including capacity assessments, data silos and data hubs, workforce management, and how agencies can work with contractors and collaborate across different levels of government to build capacity for effective evidence-based policymaking.
Informing Infrastructure Improvement Decision-Making with the Best Available Science
Presented by Ed Kearns, Chief Data Officer, First Street Foundation and former Acting Chief Data Officer at the U.S. Department of Commerce
Risk Factor is a free tool created by the nonprofit First Street Foundation to make it easy to understand risks from a changing environment. Risk Factor (available at: riskfactor.com) allows users to search their address to see the location’s flood, wildfire, or heat risk now and 30 years into the future. The tool can determine various climate risk factors down to the level of a street, property, house, or 100-meter segment of roadway. It was developed using an open-source hazard model driven by open government data from various agencies – including the Federal Emergency Management Agency, the National Oceanic and Atmospheric Administration, and the U.S. Geological Survey at the Department of the Interior – and the First Street Foundation further expanded the tool’s capabilities using outputs from Intergovernmental Panel on Climate Change (IPCC) climate models as well as spatial and economic analysis models. The project demonstrates how federal government data can enable the development of innovative tools that inform public understanding and decision-making. By practicing open science and bringing together various data sources, the public and private sectors can take a more science-informed approach to preparing for the effects of climate change.
Key Takeaways & Recommendations:
An Iterative Approach to Developing, Conducting, and Using the Department of Homeland Security Capacity Assessment
Presented by Rebecca Kruse, Assistant Director for Evaluation, Department of Homeland Security; Brodi Kotila, Senior Political Scientist, RAND Homeland Security Operational Analysis Center (HSOAC); and Coreen Farris, Senior Behavioral Scientist, RAND HSOAC.
The Evidence Act requires agencies to assess their capacity to produce and use evidence for better decision-making and policymaking. The Department of Homeland Security (DHS) has conducted two iterations of its capacity assessment since 2020 – an initial assessment in fiscal year 2020, and the official, independent assessment in fiscal year 2021. The presentation covered methodology, findings and recommendations, lessons learned, and an update on DHS evaluation activities since the assessment was published.
Key Takeaways & Recommendations:
Understanding Human Capital Needs for Expanding Data and Evidence Culture Using a Federal Data and Digital Maturity Survey
Presented by Maddie Powder, Research Associate, Partnership for Public Service
The Partnership for Public Service and the Boston Consulting Group conducted a data maturity assessment at six federal agencies using the Federal Data and Digital Maturity Index (FDDMI) survey. To gauge data and digital maturity, respondents rated their agency’s current maturity on a sliding scale of 0 to 100 points, along with its target maturity in five years. Based on the survey responses, agency results were separated into four categories – starter, literate, performer, and leader. Findings revealed that human capital – a sub-category referring to an agency’s workforce and one component of the overall maturity score – ranked lower than the overall maturity score. Target maturity scores were higher than current maturity scores for all sub-categories, revealing a consistent ambition to improve.
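The scoring logic described above can be sketched in a few lines. This is an illustrative sketch only: the FDDMI’s actual category cut points were not published in this summary, so the thresholds below are assumptions.

```python
# Hypothetical sketch of bucketing a 0-100 maturity score into the four
# FDDMI categories. Threshold values are illustrative assumptions, not
# the survey's actual cut points.

def maturity_category(score: float) -> str:
    """Map a 0-100 maturity score to one of the four categories."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score < 25:
        return "starter"
    if score < 50:
        return "literate"
    if score < 75:
        return "performer"
    return "leader"

def maturity_gap(current: float, target: float) -> float:
    """The gap between current and five-year target maturity,
    which the survey read as ambition to improve."""
    return target - current
```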
Key Takeaways & Recommendations:
An FAA Experience: Applying Intervention Research as a Change Management Approach to Implement Evidence-Based Management
Presented by Robert Young, Senior Advisor, Strategy, Risk & Engagement, Federal Aviation Administration Security & Hazardous Materials Safety Organization
The Federal Aviation Administration (FAA) Security and Hazardous Materials Safety Organization developed a three-year strategic plan to implement evidence-based management with the objectives of delivering goods and services with more rigor and scientific integrity, better managing data, and building evidence-building capabilities. To execute this strategy, the agency used intervention research as a change management approach. The research team engaged participants with a “reflexive experiential lens” to gather a range of perspectives from the senior executive level to employees to bridge the gap between the academic theory and the realities of implementing an evidence-based management system in a federal agency.
Key Takeaways & Recommendations:
Advancing Equity through Evidence-Building, Data Integration, and Research Partnerships: A Local Government’s View from “The Other Washington”
Presented by Claire Evans, Research Specialist, King County Metro Transit; Truong Hoang, Deputy Regional Administrator, Region 2, Community Services Division, Economic Services Administration, Washington State Department of Social and Health Services; Maria Jimenez-Zepeda, ORCA Reduced Fare Project Program Manager, King County Metro Transit; Christina McHugh, Housing and Adult Services Evaluation Manager, King County Department of Community and Human Services, Performance Measurement and Evaluation Unit; and David Phillips, Associate Research Professor, Wilson Sheehan Lab for Economic Opportunities, University of Notre Dame
Representatives from King County and Washington State highlighted challenges, successes, and lessons learned as their organizations have been working to adopt and integrate evidence-based policymaking into their programs and day-to-day operations. Panelists described how partnerships and data sharing helped center equity and improve program decisions about transportation assistance and housing for low-income populations. The panelists touted leadership buy-in, partnerships between local, state, and nonprofits to conduct evaluation, and funding as critical for developing local evaluation capacity.
Key Takeaways & Recommendations:
Assessing the Quality of Impact Evaluations at USAID
Presented by Irene Velez, Director, Monitoring Evaluation Research Learning and Adaptation (MERLA), Panagora Group
The United States Agency for International Development’s (USAID) Bureau of Policy Planning and Learning commissioned a study that assessed the quality of the agency’s impact evaluations from 2011-2019. The study was intended to identify the strengths and shortcomings of USAID’s impact evaluation reports and inform the agency’s update to guidance for conducting impact evaluations. The research group compared impact evaluation reports with the definition outlined in USAID’s evaluation policy and developed a review instrument to assess the quality of the impact evaluations. Quality was based on six domains, including: sample size, conceptual framing, treatment characteristics and outcome measurements, data collection and analysis, threats to validity, and reporting. The researchers found that about half of the impact evaluation reports met the quality criteria, that USAID’s existing evaluation guidance was not tailored toward impact evaluations, and that the existing guidance lacked alignment with the organization’s broader scientific research policy.
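A review instrument of the kind described – rating each report against a fixed set of quality domains – could be sketched as follows. The six domain names come from the presentation; the pass/fail ratings and the overall cut-off are illustrative assumptions, not USAID’s actual instrument.

```python
# Illustrative sketch (not USAID's actual review instrument): rate an
# impact evaluation report pass/fail on each of the six quality domains
# named in the study, then apply an assumed overall threshold.

DOMAINS = [
    "sample_size",
    "conceptual_framing",
    "treatment_and_outcomes",
    "data_collection_and_analysis",
    "threats_to_validity",
    "reporting",
]

def meets_quality_criteria(ratings: dict, min_passing: int = 5) -> bool:
    """ratings maps each domain to True/False; min_passing is an
    assumed cut-off for illustration only."""
    missing = [d for d in DOMAINS if d not in ratings]
    if missing:
        raise KeyError(f"unrated domains: {missing}")
    return sum(ratings[d] for d in DOMAINS) >= min_passing
```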
Key Takeaways & Recommendations:
Approaches to Assessing Agency Capacity for Evidence-Building
Presented by Tania Alfonso, Senior Evaluation Specialist, USAID; Danielle Berman, Senior Evidence Analyst, Office of Management and Budget (Moderator); Susan Jenkins, Evaluation Officer, Department of Health and Human Services; and Christina Yancey, Chief Evaluation Officer, Department of Labor
Panelists from the Department of Health and Human Services, the Department of Labor, and the U.S. Agency for International Development demonstrated various approaches to conducting capacity assessments by sharing the models their agencies used to complete department-wide assessments.
Agencies tailored the capacity assessment to fit their unique organizational needs. Some utilized an evaluation council framework and others started from a more structured, centralized evaluation authority. Based on the first agency capacity assessments, the evaluation officers are planning activities to address capacity needs. Future efforts include developing a four-year plan with each year focusing on a different category of capacity, providing training and tools, focusing on knowledge management, and improving communication regarding the value of evidence.
Key Takeaways & Recommendations:
Opportunity for Partnership: A Budget and Program Perspective on the Learning Agenda and Evidence-Building Activities
Presented by Courtney Timberlake, President, American Association for Budget and Program Analysis (AABPA); Ed Brigham, Executive Consultant, Federal Consulting Alliance & AABPA Board Member; Jon Stehle, Councilmember, City of Fairfax VA; and Darreisha Bates, Federal Portfolio Manager, Tyler Technologies & Former Director of Intergovernmental Relations, U.S. Government Accountability Office
The panel covered the evolution of the budget process since analysts adopted more advanced technology and discussed how to improve upon that recent progress going forward. The panel divided data use in the budgeting process into two areas: how computerization and data have made the federal budget more accurate, and where and how data can be used to support budget and policy decisions.
Key Takeaways & Recommendations:
Day 2: Innovations in Evidence-Building Activities
The second day of the symposium focused on innovation. Presentations offered a glimpse into areas where government agencies and other organizations have expanded their capacity to build and use evidence – and the strategies, tools, and methods that have enabled them to do so. Panelists’ innovative approaches to evidence across multiple issue areas emphasized the importance of fostering partnerships, leveraging existing data sources, and the role of organizational leadership.
Education Research-Practice Partnerships: Innovative Structures to Build and Use Evidence
Presented by Rachel Anderson, Director, Policy and Research Strategy, Data Quality Campaign
State and local agency leaders have finite time, resources, capacity, and expertise to interpret data and conduct research. Research-practice partnerships can address these challenges by engaging education researchers and practitioners in long-term collaborations to develop research questions relevant to policies or practices, collect and analyze data, and interpret research findings. As an example, the presenter illustrated how research-practice partnerships can alert practitioners when students are at risk of not succeeding in school.
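An early-warning indicator of the kind such a partnership might build can be sketched simply. The indicators and thresholds below are hypothetical illustrations, not taken from the presentation.

```python
# Hypothetical early-warning sketch: flag students whose attendance or
# course performance falls below assumed thresholds. Both indicators and
# cut points are illustrative, not from the research-practice partnership.

def at_risk(attendance_rate: float, gpa: float,
            min_attendance: float = 0.90, min_gpa: float = 2.0) -> bool:
    """Return True if either indicator falls below its threshold,
    signaling that a practitioner should follow up."""
    return attendance_rate < min_attendance or gpa < min_gpa
```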
Key Takeaways & Recommendations:
Statewide Longitudinal Data Systems and Predictive Analytics: Understanding, Measuring, and Predicting K-12 Outcomes
Presented by Nancy Sharkey, Senior Program Officer, Statewide Longitudinal Data Systems Grant Program, National Center for Education Statistics; Robin Clausen, Stakeholder Liaison and Research Analyst, Statewide Longitudinal Data System, Montana Office of Public Instruction; Chris Stoddard, Professor, Montana State University; Mathew Uretsky, Professor, Portland State University; and Angie Henneberger, Research Assistant Professor, University of Maryland School of Social Work
The panel gave an overview of the Statewide Longitudinal Data Systems (SLDS) program with examples from several states. The SLDS program, authorized in 2002 by the Education Sciences Reform Act and the Educational Technical Assistance Act, provides cooperative agreements between the federal government and states to support evidence-building and research through grants administered by the Institute of Education Sciences at the Department of Education. To date, participating states and territories have received around $800 million over seven rounds of grants. Representatives from Montana and Maryland universities offered examples of how the SLDS program has aided in their data collection systems and research.
Key Takeaways & Recommendations:
Using Linked Administrative Data to Connect Families to Pandemic Stimulus Payments
Presented by Aparna Ramesh, Senior Research Manager, California Policy Lab, UC Berkeley
The California Policy Lab partnered with the California Department of Social Services, the California tax department, and a research center out of Harvard University to identify Californians who were eligible but had not yet received the Child Tax Credit or federal stimulus payments.
The research team hypothesized that combining tax filing data and the Supplemental Nutrition Assistance Program enrollment data could reveal what percent of low-income Californians receiving social services had not filed returns but qualified for tax credits or stimulus payments. The team identified multiple data sources that included eligibility information, including tax return data and enrollment data for low-income social programs. To address potential privacy and legal concerns with linking this administrative data, both agencies signed separate data use agreements and the California Policy Lab served as a trusted third party to link state tax and social services data.
Researchers found that using tax data helped deliver stimulus assistance quickly to most low-income families. Most families receiving social services file a return and automatically receive these credits. Among adults who didn’t receive the federal stimulus, the vast majority were ineligible because they did not have dependents. And of the children who did not receive the credit, most lived with a single adult and a quarter lived in a household with no adult.
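The trusted-third-party linkage pattern described above can be sketched as follows. Field names, the shared identifier, and the salting scheme are illustrative assumptions; the California Policy Lab’s actual linkage procedure was not detailed in this summary.

```python
# Minimal sketch of the linkage pattern: a trusted third party joins
# SNAP enrollment records with tax-filing records on a salted, one-way
# hash of a shared identifier, so neither agency handles the other's
# raw data. Record layouts here are hypothetical.

import hashlib

def pseudonymize(identifier: str, salt: str) -> str:
    """One-way, salted hash of a shared identifier."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()

def find_non_filers(snap_records, tax_records, salt="shared-secret"):
    """Return SNAP case IDs with no matching tax return - enrollees
    likely eligible for credits or payments they have not claimed."""
    filers = {pseudonymize(r["ssn"], salt) for r in tax_records}
    return [r["case_id"] for r in snap_records
            if pseudonymize(r["ssn"], salt) not in filers]
```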
Key Takeaways & Recommendations:
Using a Framework for Evidence Capacity to Strengthen Federal Program Offices
Presented by Heather Gordon, Managing Consultant, Mathematica
The Evidence Capacity Support Project is an initiative between the Department of Health and Human Services’ Administration for Children and Families (ACF), Office of Planning, Research, and Evaluation (OPRE), and Mathematica to deepen evidence capacity. In this partnership, Mathematica offers support to staff within ACF program offices to build and use evidence in their work. Mathematica reviewed existing research literature and ACF documentation and conducted interviews with program staff to develop a framework for evidence capacity to inform the initiative.
Mathematica’s framework has identified and defined five dimensions of evidence capacity, each of which includes related inputs, outputs, or activities that are involved in evidence capacity-building. The framework can be used by other federal agencies and organizations interested in assessing their capacity to build evidence and apply it to their programs.
Key Takeaways & Recommendations:
Critical Factors for Building Successful Data Science Teams
Presented by Robin Wagner, Senior Advisor, Epidemiology Branch, Division of Cardiovascular Sciences, National Heart, Lung, and Blood Institute, National Institutes of Health
The Health and Human Services Data Council established the Data-Oriented Workforce Subcommittee (DOWS) to implement the workforce priority of the 2018 HHS Data Strategy and to develop a plan to enhance the department’s data science capacity. The subcommittee conducted two research projects that identified training opportunities for existing staff, recruitment strategies and tools for new staff, and retention and succession planning strategies. The DOWS research produced several recommendations regarding skills and competencies of data science teams.
Key Takeaways & Recommendations:
Advocating for and Applying COVID-19 Equity Data: The Black Equity Coalition’s (Pittsburgh, PA) Efforts to Improve Public Sector Health Agencies’ Practices
Presented by Jason Beery, Director of Applied Research, UrbanKind Institute & Member, Black Equity Coalition Data Working Group; Ashley Hill, DrPH, Assistant Professor, Department of Epidemiology, School of Public Health, University of Pittsburgh & Member, Black Equity Coalition Community Health Working Group; Ruth Howze, Community Coordinator, Black Equity Coalition; and Stacey Pharrams, Community Researcher, Healthy Start Initiative & Member, Black Equity Coalition Data Working Group
Representatives from the Black Equity Coalition (BEC) discussed their efforts in Allegheny County, Pennsylvania to use data to address the disparate impact of the COVID-19 pandemic on the Black community. BEC had three main goals: to improve 1) access to testing, 2) data quality (for both testing and case data), and 3) vaccine uptake in Black communities.
BEC accomplished its goals by engaging stakeholders, including community members, public officials, and health departments, to discuss COVID-related data quality improvements. BEC also focused on collecting demographic data to understand how COVID impacts differed by race and ethnicity. Linking COVID data to demographic data, BEC built maps and a dashboard to present local officials with evidence of disparate outcomes by race, age, and other demographic characteristics.
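The aggregation step behind such a dashboard – linking case records to demographic data and computing rates by group – can be sketched briefly. The record layout and numbers are hypothetical; BEC’s actual pipeline was not described in this summary.

```python
# Illustrative sketch of computing case rates per 100,000 by
# race/ethnicity from linked case and population records, the kind of
# aggregate a disparity dashboard would display.

from collections import Counter

def case_rates_per_100k(case_groups, population_by_group):
    """case_groups: iterable of group labels, one per case record.
    population_by_group: dict mapping group -> population count."""
    counts = Counter(case_groups)
    return {group: 100_000 * counts[group] / pop
            for group, pop in population_by_group.items()}
```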
Key Takeaways & Recommendations:
A Dynamic, Inclusive Approach to Learning Agenda Development for the Centers for Disease Control and Prevention’s Center for State, Tribal, Local, and Territorial Support: Reflections on the Participant Engagement Process
Presented by Elizabeth Douglas, Senior Manager, ICF and Jessie Rouder, Lead Research Scientist, Behavioral Health, ICF
Representatives from the Center for State, Tribal, Local, and Territorial Support (CSTLTS) shared their experience as one of the first centers within the Centers for Disease Control and Prevention (CDC) to develop a learning agenda, describing the process and the lessons learned. CSTLTS presented the learning agenda as an opportunity to align efforts with other HHS, CDC, or center-wide strategic planning processes.
The center took a phased approach to learning agenda development, in part to ensure robust engagement. The team engaged appropriate participants, identified key learning agenda questions, and then designed evidence-building activities to answer priority questions. The presentation highlighted strategies for stakeholder engagement, including orientation meetings, virtual workshops, and consultations with leadership.
Key Takeaways & Recommendations:
Best Practices for Monitoring and Evaluating the ARP, IIJA, and Other Programs: Report of the Department of Commerce Data Governance Working Group
Presented by Carla Medalia, Assistant Division Chief for Business Development, Economic Reimbursable Surveys Division, U.S. Census Bureau; Ron Jarmin, Deputy Director, U.S. Census Bureau; Ryan Smith, Policy Advisor for the Office of Regional Affairs, Economic Development Administration; Oliver Wise, Chief Data Officer, Department of Commerce
The Department of Commerce received an influx of funding from the American Rescue Plan (ARP) and the Infrastructure Investment and Jobs Act (IIJA), significantly increasing its annual appropriations and opening opportunities to demonstrate value to taxpayers. To maximize the department’s ability to leverage program data and improve program outcomes, the department formed a data governance working group composed of bureaus that are managing or receiving ARP or IIJA funding. The working group identified three phases of collecting and using program performance data:
Developing a shared structure and quality standards to facilitate linkage across bureaus;
Developing common metadata standards to ensure departmental interoperability; and
Identifying and overcoming barriers to implementing the identified systems and standards to ensure data use.
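The metadata-standards phase above can be sketched as a minimal common record that every bureau attaches to a dataset so holdings can be discovered and linked department-wide. The fields and validation rule are illustrative assumptions, not the working group’s actual standard.

```python
# Hypothetical sketch of a common metadata record supporting
# cross-bureau interoperability. Field names are illustrative only.

from dataclasses import dataclass

@dataclass(frozen=True)
class DatasetMetadata:
    dataset_id: str      # unique across bureaus
    bureau: str          # owning bureau, e.g. "EDA" (assumed example)
    funding_source: str  # "ARP" or "IIJA"
    award_id_field: str  # column used for cross-bureau linkage
    updated: str         # ISO 8601 date of last refresh

def validate(meta: DatasetMetadata) -> bool:
    """Check the fields a shared standard would require."""
    return meta.funding_source in {"ARP", "IIJA"} and bool(meta.dataset_id)
```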
Data linkage with Census Bureau-collected survey and administrative data expanded the department’s capacity to understand how well recipients are using funding to carry out programs and achieve intended outcomes.
Key Takeaways & Recommendations:
Capacity-Building Going Forward
Forty speakers presented 16 different programs and projects over the course of the Data Foundation’s two-day event.
The presentations highlighted innovative approaches to improve government data collection, governance, and use.
Offering insight into strategies for planning and carrying out agency-wide capacity assessments and implementing sustainable changes to address agency data capacity, the symposium identified four core lessons:
The symposium coincided with the release of many federal agencies’ self-assessments of evidence-building capabilities and activities, including capacity assessments and learning agendas – all requirements of the Evidence Act.
The Data Foundation hopes this research will highlight the value of data and evidence for policymaking and encourage continued collaboration and learning between agencies and non-governmental organizations to support successful Evidence Act implementation.