An ongoing project at CASBS on evidence-informed decision making is launching an “Impact Evaluation Design Lab” to help policymakers assess large-scale public policy programs and engage researchers in advancing social scientific methodology through real-world evaluation design challenges.
The first partners: King County, WA and Stockton, CA.
Over the past 25 years, government officials, foundations, researchers, and the public have become increasingly interested in ensuring that social interventions produce their intended outcomes and are an effective use of public and private investments. With impetus from medical science, randomized controlled trials (RCTs), in which program benefits are administered to a treatment group and withheld from a control group before outcomes are compared, have become a primary vehicle for identifying evidence about the causal effects of policy interventions. The rigor of RCTs, and the clarity of their results, have encouraged this adoption. In turn, the increased use of RCTs in social interventions has spurred methodological development in the statistical analysis and design of randomized experiments.
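To make concrete why RCT results are comparatively transparent, here is a minimal sketch in Python using simulated data; the variables, sample sizes, and effect sizes are purely illustrative and not drawn from any program discussed here. Because treatment is randomly assigned, a simple difference in group means estimates the average causal effect:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated RCT: 500 people randomly assigned to receive a program
# (treatment) and 500 assigned not to (control). Outcomes are hypothetical.
treated = rng.normal(loc=1.2, scale=2.0, size=500)
control = rng.normal(loc=1.0, scale=2.0, size=500)

# Because assignment was random, the two groups are comparable in
# expectation, so the difference in mean outcomes estimates the
# average causal effect of the program.
effect = treated.mean() - control.mean()

# Normal-approximation 95% confidence interval for that difference.
se = np.sqrt(treated.var(ddof=1) / treated.size +
             control.var(ddof=1) / control.size)
print(f"Estimated effect: {effect:.2f}, 95% CI: "
      f"({effect - 1.96 * se:.2f}, {effect + 1.96 * se:.2f})")
```

That single subtraction, justified by randomization rather than by modeling assumptions, is a large part of the rigor and clarity noted above.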
But not every policy intervention lends itself to RCT evaluation. Evaluators and government officials alike recognize that randomizing a policy intervention is not always possible: it can be financially and logistically infeasible or ethically challenging, and it can produce results that replicate only in a narrow set of circumstances.
Statistical techniques known as observational analyses allow policymakers and scholars to build evidence about “what works” when randomization cannot be practically applied. Such methods, though, are often highly complex, involving advanced statistical techniques, large data sets, and substantial computational power. Furthermore, results can be sensitive to differing assumptions, and even researchers, much less decision makers and the public, may have difficulty understanding how results from such studies are produced.
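To illustrate both the promise and the fragility of such methods, here is a sketch of one common observational technique, nearest-neighbor matching on a measured confounder; everything here is simulated and illustrative, not a description of any analysis planned by the project. The estimate is credible only if the factors that drive both enrollment and outcomes are actually measured, which is exactly the kind of assumption results can be sensitive to:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated observational data: enrollment is NOT randomized.
# A confounder x (say, baseline income) influences both whether a person
# enrolls in the program and their outcome y.
n = 2000
x = rng.normal(size=n)
enrolled = rng.random(n) < 1.0 / (1.0 + np.exp(-x))  # enrollment depends on x
y = 0.5 * enrolled + 1.5 * x + rng.normal(size=n)    # true effect is 0.5

# A naive comparison of enrollees and non-enrollees is biased, because
# it mixes the program's effect with the effect of x.
naive = y[enrolled].mean() - y[~enrolled].mean()

# Nearest-neighbor matching: pair each enrollee with the non-enrollee
# whose confounder value is closest, then average the paired differences.
xt, yt = x[enrolled], y[enrolled]
xc, yc = x[~enrolled], y[~enrolled]
nearest = np.abs(xt[:, None] - xc[None, :]).argmin(axis=1)
matched = (yt - yc[nearest]).mean()

print(f"Naive difference in means: {naive:.2f}")
print(f"Matched estimate:          {matched:.2f}  (true simulated effect: 0.5)")
```

With one observed confounder the fix looks simple; real evaluations must adjust for many covariates at once and defend the assumption that nothing important was left unmeasured, which is where the complexity and sensitivity described above come from.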
Consequently, many governments, foundations, and other institutions doing important, valuable work in the evidence-informed policy space underutilize observational analyses for policy evaluation. The problem is compounded by a shortage of the kind of expertise sharing between social scientists and policy practitioners that could advance both the science of causal inference and real-world practice.
The CASBS Evidence-Informed Decision-Making Project
This is where the CASBS evidence-informed decision-making project steps in. At the initiative of then-CASBS board chair (and current board member) Paul Brest, in 2015 key CASBS players formally proposed a project in the hopes of making major contributions to the evidence-informed policy movement. The thinking was that, through both its fellowship program and its convening power, the Center could help speed the transmission of academic knowledge into practice and that engagement with policymakers could likewise help inform the development of the causal inference field.
With early funding from the Knight Foundation and with the leadership of board members Brest, Jennifer Eberhardt, Gary King, and Stanford Law School faculty (and current CASBS faculty fellow) Dan Ho, CASBS sponsored two workshops. A key recognition emerged: among the many constraints on this work, government policymakers and academic experts operate under incentives that are poorly aligned for rich collaboration. The outcome was the establishment of the first residential fellowship lines at CASBS dedicated to causal inference and policy, made possible in part by the generous support of CASBS board member Salar Kamangar. The fellowship year gives causal inference researchers and policy practitioners space and time away from day-to-day demands to grapple with integrating the work of social and behavioral scientists in a practical way that addresses government’s needs.
2017-18 Core Team
Enter Carrie S. Cihak, Graham Gottlieb, and Jake Bowers.
As Chief of Policy for King County Executive Dow Constantine, the highest-ranking elected official in King County, Washington, Cihak identifies high-priority policy areas and community outcomes for leadership focus and develops innovative solutions to complex issues, particularly those involving equity and social justice. Graham Gottlieb, a public policy practitioner who served most recently at the U.S. Agency for International Development, pursues organizational reforms and new policy mechanisms that improve how outside knowledge and funding flow to government, specifically for generating evidence-informed policy. Both Cihak and Gottlieb are 2017-18 policy fellows in residence at CASBS, with Cihak’s fellowship supported by Kamangar.
Their principal partner, Jake Bowers, is a political scientist and statistician based at the University of Illinois at Urbana-Champaign, as well as a fellow with the federal government’s Office of Evaluation Sciences. He’s a respected leader in developing shared languages and standards for scientific integrity and causal inference, improving projects via design critiques, and generally advancing the role of social and behavioral scientists in policy innovation and evaluation. He is working alongside Cihak and Gottlieb as a 2017-18 CASBS research affiliate and will be in residence as a causal inference fellow in 2018-19.
This CASBS core team, with support from the board, CASBS director Margaret Levi, and an advisory committee, is steering the evidence-informed decision-making project into its next phase by launching an annual CASBS Impact Evaluation Design Lab.
The CASBS Impact Evaluation Design Lab
By convening both academic experts and local government practitioners around evaluation design challenges for real-world policy interventions, the Impact Evaluation Design Lab seeks to advance the science of causal inference and provide policymakers with better information on whether programs achieve their intended impacts.
The first design lab launches with a two-day workshop in late March, followed by a summer institute in late June, with work occurring before, between, and after the gatherings. CASBS will host the design lab and connect experts and academics interested in conducting evaluations with government actors who will bring evaluation problems, data, and context.
The March workshop will identify the most important questions for evaluation, delve into data issues, and identify promising avenues for rigorous evaluation design that the government and academic partners will tackle together in the following months.
The June summer institute, sponsored principally by SAGE Publishing, will include a “design hack-a-thon,” where government teams will present evaluation design proposals for input, feedback, and critique. The government teams will leave with well-developed evaluation plans, and academic participants hope to leave with promising methodological innovations that address current limitations in causal inference theory and practice.
The CASBS core team already is developing plans for the 2019 design lab. It will add an “analysis hack-a-thon” that provides input, feedback, and critique of 2018’s evaluation projects before results are released publicly.
Bowers is excited by the potential. “By engaging a range of academic experts in building evaluation designs and interrogating the results, and by encouraging best practices such as the registration of pre-analysis plans, we can really enhance the credibility of, and confidence in, the results.”
Real-World Impact
By no means is the Design Lab an abstract exercise. For this first design lab, the CASBS team has partnered with two local governments, including Cihak’s home institution: King County, based in Seattle.
In 2015, King County’s Metro Transit department implemented ORCA LIFT – an innovative initiative that provides low-income riders with steeply discounted public transit fares. King County leaders hypothesize that ORCA LIFT better connects enrollees – now numbering over 60,000 – to a range of positive outcomes relating to employment, education, and health. But this hypothesis has not been rigorously evaluated.
“The county is focused on building equity through increasing access for people to the host of things that allow them to pursue opportunity and fulfill their potential,” says Cihak. “People in low-income households or those in communities that have been historically disadvantaged face a number of barriers to accessing opportunity, including transportation mobility. We’re trying to figure out how to reduce the structural barriers people face, and ORCA LIFT is part of our big theory of change.”
The CASBS team and their collaborators aspire to push the bounds of non-randomized, observational methods and improve evaluation of the causal effects of public policies like ORCA LIFT. But that is no easy task, at least for the complex social problems and policies they are considering. Paul Brest offers the insight that even the most rigorous evaluation design is unlikely to provide definitive, replicable answers.
“In addition to understanding the causal mechanisms,” Brest said recently, “we need to identify how much confidence we have in the relationship between a policy and its intended outcome. Moreover, the issue is not just whether a government program causes an outcome or not, but to what extent.”
“A rigorous, causal evaluation of ORCA LIFT will help us learn how and to what extent the program is contributing to better quality of life for people with low incomes in King County,” says Cihak. “Our participation in the CASBS design lab – where we are able to get advice from several top experts in causal inference – will give us confidence to use the results in our decision making.”
In addition, the city of Stockton, CA, is launching a guaranteed income initiative — the Stockton Economic Empowerment Demonstration (SEED) — and seeks evaluation of this demonstration and its effects. It is among the first significant guaranteed income tests in the United States.
Importantly, many local governments around the country are watching both King County and Stockton’s experiments closely. “Knowledge, learning, and results generated in the CASBS design lab could be instrumental in the decisions other governments make regarding their own programs,” says Gottlieb. “Part of what we want to accomplish through the design lab is not just an individual evaluation, but to influence the broader space.”
Next
If successful, the 2018 design lab, and the overall project of which it is a part, will grow into a bigger initiative that engages a wider circle of governments, academics, and other institutions, providing a foundation for further scaling.
“CASBS, moreover, will be in a position to exert influence on current institutional and organizational practices towards new standards of best practice for policy evaluation,” says Bowers.
“If you’re only taking existing evidence and methods and applying them, you’re not innovating,” says Cihak. “That’s not going to create a cycle of learning. And that’s not what the evidence-informed policy movement is about.
“We at CASBS — in interaction with researchers, governments, and other partners — have to take risks in order to innovate, and then be willing to rigorously evaluate the results. That’s what we’re doing not only with projects in the design lab, but also with the design lab itself.”
Learn more about the CASBS 2018 Impact Evaluation Design Lab