Table of contents


Vision

Mission

Value

1: Data & Results

1.1: Data Linking & Privacy

1.2: Provider Scorecards

2: Evaluations & Experiments

2.1: Behavioral Insights

2.2: Evaluation

2.3: Existing Programs & Datasets

3: Grants

3.1: Pay for Success

3.2: Tiered-Evidence

3.3: Performance Partnerships & Waiver Demonstrations

3.4: Competitive Grants

3.5: Multi-Phase Selection

4: Agency Capacity

4.1: Agency-Wide Evaluation Plans

4.2: Evidence Guidelines

4.3: Learning Networks

4.4: What Works Clearinghouses

5: Other Strategies



Strategic Plan

Next Steps in the Evidence and Innovation Agenda

To help agencies move forward in harnessing evidence and evaluation, this memo:

• Provides guidance for 2015 agency Budget submissions and describes plans to prioritize Budget requests that strengthen the use of evidence and innovation.

• Invites agencies to participate in a series of workshops and interagency collaborations organized by the Executive Office of the President to help agencies develop and strengthen proposals that catalyze innovation and learning. While much of the focus will be on proposals that can be implemented without additional resources, there will be limited funding available in the President's 2015 Budget for strong proposals that require some new funding.

The President recently asked his Cabinet to carry out an aggressive management agenda for his second term that delivers a smarter, more innovative, and more accountable government for citizens. An important component of that effort is strengthening agencies' abilities to continually improve program performance by applying existing evidence about what works, generating new knowledge, and using experimentation and innovation to test new approaches to program delivery. This is especially important given current fiscal challenges, as our nation recovers from a deep recession and agencies face tough choices about how to meet increased demand for services in a constrained resource environment.

Source:
http://www.whitehouse.gov/sites/default/files/omb/memoranda/2013/m-13-17.pdf

Publication Date: 2013-08-05

Submitter:

First name: Owen

Last name: Ambur

Email Address: Owen.Ambur@verizon.net

Organization:

Name: Office of Management and Budget

Acronym: OMB

Stakeholder(s):

Sylvia M. Burwell: Director, Office of Management and Budget

Cecilia Munoz: Director, Domestic Policy Council

John Holdren: Director, Office of Science and Technology Policy

Alan Krueger: Chairman, Council of Economic Advisers

Vision

A smarter, more innovative, and more accountable government

Mission

To strengthen agencies' abilities to continually improve program performance by applying existing evidence about what works, generating new knowledge, and using experimentation and innovation to test new approaches to program delivery.


Goal 1: Data & Results

Harness data to improve agency results.

Stakeholder(s):

Federal Agencies: Administrative data collected by Federal, State, or local agencies to run programs can be a valuable resource for program improvement and for helping agencies, consumers, and providers make more informed decisions.

State Agencies

Local Agencies

Consumers

Providers

Researchers

Objective(s):

1.1: Data Linking & Privacy

1.2: Provider Scorecards


Other Information:

Harnessing data to improve agency results: Proposals should enable agencies and/or researchers to access and utilize relevant data to answer important questions about program outcomes while fully protecting privacy. For example, by linking data on program participants to administrative data on earnings, college-going, health, or other outcomes, agencies may be able to improve their understanding of program performance and ultimately improve results. Projects should build on the recent Executive Order, "Making Open and Machine Readable the New Default for Government Information," as well as on the Memorandum "Sharing Data While Protecting Privacy" (M-11-02). We especially encourage proposals that use administrative data to track important outcome measures for federal grant programs, and we are open to proposals that substitute higher quality administrative data for existing grantee reporting requirements.

Objective 1.1: Data Linking & Privacy

Linking data across programs and levels of government while fully protecting privacy

Stakeholder(s):

Department of Housing and Urban Development: Example: The Department of Housing and Urban Development has partnered with the Department of Health and Human Services to match HUD administrative data with Centers for Medicare & Medicaid Services data. The two agencies recently completed a successful match that will improve understanding of the characteristics of seniors living in publicly subsidized housing and how supportive housing interventions may affect their health care use.

Department of Health and Human Services

Centers for Medicare & Medicaid Services

Other Information:

Linking data across programs can lower evaluation costs and improve their quality, streamline reporting requirements for program providers and participants, and answer important questions about program performance. A number of Federal agencies are currently developing or using protocols and processes to share personally identifiable data to permit such linkages in ways that fully adhere to laws, regulations, and policies designed to protect individual privacy and confidentiality.

Objective 1.2: Provider Scorecards

Use data from government agencies to create provider scorecards.

Stakeholder(s):

Service Providers

Department of Education: Example: The College Scorecard, launched earlier this year, highlights key indicators about the cost and value of colleges and universities to help high school students choose a post-secondary school that meets their needs. It is produced by the Department of Education and posted on its web site. The Scorecard includes data on costs, graduation rates, loan default rates, and average student debt, and average earnings of recent graduates will be added soon.

Colleges

Universities

Other Information:

Reliable data from government agencies can be used to create provider scorecards that compare how well different service providers perform. Scorecards are a tool for agencies and consumers to make more informed decisions and choices, and for providers to better understand and improve their performance. If data on participant characteristics are available, such as education level or income, scorecards can go a step further by enabling more detailed comparisons of alternative providers that serve people with similar characteristics.


Goal 2: Evaluations & Experiments

[Engage in] high-quality, low-cost evaluations and rapid, iterative experimentation

Stakeholder(s):

Innovative Companies: Many innovative companies use rapidly conducted randomized field trials to identify high-impact innovations and move them quickly into production.

Public Sector: In the public sector, low-cost, frequent field tests do not replace longer-term, rigorous evaluations - they supplement them.

Innovative Administrators: They allow innovative administrators to say: "Might this help boost results? Let's try it and see if it works."

Objective(s):

2.1: Behavioral Insights

2.2: Evaluation

2.3: Existing Programs & Datasets


Other Information:

High-quality, low-cost evaluations and rapid, iterative experimentation: Proposals should help agencies improve the quality and timeliness of evaluations, for example by building evaluation into ongoing program changes; reducing costs by measuring key outcomes in existing administrative data sets; and drawing on private sector approaches that use frequent, low-cost experimentation to test strategies to improve results and return on investment. Proposals should utilize randomized controlled trials or careful quasi-experimental techniques to measure the effect of interventions on important policy outcomes. We particularly welcome proposals that draw on behavioral insights to improve results and lower costs in direct operations.

Objective 2.1: Behavioral Insights

Apply behavioral insights to improve results and lower costs in direct operations.

Stakeholder(s):

Fiscal Service: Example: Research has revealed the power of "social norms" on behavior, meaning the influence of what others do on our decisions. Building on this insight, the Fiscal Service at the Treasury Department has recently updated the text and format of letters sent to individuals with delinquent debt to the federal government. The new letters, which will be tested against the older version using a randomized controlled trial, use simplified language, personalization, and a reference to social norms (i.e., the fact that 94% of outstanding debts are paid off on time and that the recipient is in the fraction that has not yet paid) to motivate a higher rate of debt repayment.

Treasury Department

Other Information:

Human decision making is central to many public policy interventions. Major advances have been made in research regarding the influences that drive people's decisions and choices, and these new insights can significantly improve policy outcomes at a lower cost.

Objective 2.2: Evaluation

Use high-quality evaluation to answer important policy and program questions.

Stakeholder(s):

Occupational Safety and Health Administration: Examples: Current Federal evaluations cover a diverse set of issues, including the Occupational Safety and Health Administration examining the effectiveness of on-site consultation, inspections, and corrective action letters on worker injury/illness rates; the Millennium Challenge Corporation examining the impact of road improvements in El Salvador and commercial training activities in Ghana; and the Department of Energy examining the effects of smart grids and dynamic pricing on household energy use.

Millennium Challenge Corporation

El Salvador

Ghana

Department of Energy

Other Information:

Rigorous impact evaluations, especially those using random assignment to program and control groups, can provide strong evidence on key policy or program questions within an agency. They can help determine whether a program works and whether an alternative practice might work better.

Objective 2.3: Existing Programs & Datasets

Conduct evaluations by drawing on existing data to measure outcomes and on program changes that are being implemented anyway.

Stakeholder(s):

Hawaii's Opportunity Probation with Enforcement (HOPE) Program: Example: Hawaii's Opportunity Probation with Enforcement (HOPE) Program is a supervision program for drug-involved probationers. The program was evaluated using a randomized controlled trial at a cost of about $150,000 for the evaluation. The low cost for this rigorous evaluation was achieved by measuring outcomes using administrative data (e.g., arrest records) that the state already collected for other purposes, rather than doing costly new data collection. The study found that HOPE group members were 55 percent less likely than control group members to be re-arrested during the first year.

Other Information:

High-quality, low-cost evaluations that piggy-back on existing programs and datasets -- By drawing on existing data to measure outcomes and on program changes that are being implemented anyway, agencies can conduct high-quality randomized evaluations at low cost. For example, when a program change is being phased in gradually or a program is oversubscribed, participants could in some cases be selected based on random assignment, allowing for rigorous evaluation.


Goal 3: Grants

Use innovative outcome-focused grant designs.

Stakeholder(s):

Federal Grant-Making Agencies: Because many Federal dollars flow to States, localities, and other entities through competitive and formula grants, grant reforms are an important component of strengthening the use of evidence in government. The goals include encouraging a greater share of grant funding to be spent on approaches with strong evidence of effectiveness and building more evaluation into grant-making so we keep learning more about what works.

States

Localities

Grant Recipients

Objective(s):

3.1: Pay for Success

3.2: Tiered-Evidence

3.3: Performance Partnerships & Waiver Demonstrations

3.4: Competitive Grants

3.5: Multi-Phase Selection


Other Information:

Using innovative outcome-focused grant designs: Proposals should expand or improve the use of grant program designs that focus Federal dollars on effective practices while also encouraging innovation in service delivery. These include tiered-evidence grants, Pay for Success initiatives and other pay for performance approaches, Performance Partnerships allowing blended funding, waiver demonstrations, incentive prizes, competitive incentive funds that encourage the use of evidence-based practices in formula grants, or other strategies to make grant programs more evidence focused.

Objective 3.1: Pay for Success

Partner with philanthropic and private investors to fund proven and promising practices and to significantly enhance the return on taxpayer investments.

Stakeholder(s):

Philanthropic Investors

Private Investors

Taxpayers

Department of Justice: Examples: The Department of Justice is coordinating PFS projects to use more effective prisoner re-entry interventions to reduce recidivism and its associated costs.

Department of Labor: And the Department of Labor has launched an effort to test new and more effective strategies for delivering workforce development and preventative social services that cut across existing program silos, increase job placement, and improve job retention.

Other Information:

Pay for Success offers innovative ways for the government to partner with philanthropic and private investors to fund proven and promising practices and to significantly enhance the return on taxpayer investments. Under this model, investors provide the up-front capital for social services with a strong evidence base that, when successful, achieve measurable outcomes that improve the lives of families and individuals and reduce their need for future services. Government pays when these measurable results are achieved. The PFS model is particularly well-suited to cost-effective interventions that produce government savings, since those savings can be used to pay for results.

Objective 3.2: Tiered-Evidence

Focus resources on practices with the strongest evidence.

Stakeholder(s):

Department of Education: Example: The Department of Education's Investing in Innovation Fund (i3) invests in high-impact, potentially transformative education interventions, ranging from new ideas with significant potential to those with strong evidence of effectiveness that are ready to be scaled up. Based on the success of i3, the Department recently issued proposed regulations that would allow its other competitive grant programs to adopt this three-tiered model.

Other Information:

Tiered-evidence grant designs -- "Tiered-evidence" or "innovation fund" grant designs focus resources on practices with the strongest evidence, but still allow for new innovation. In a three-tiered grant model, for example, grantees can qualify for 1) the "scale up" tier and receive the most funding; 2) the "validation" tier and receive less funding but evaluation support; or 3) the "proof of concept" tier and receive the least funding, but also support for evaluation. With a tiered-evidence approach, potential grantees know that to be considered for funding, they must provide demonstrated evidence behind their approach and/or be ready to subject their models to evaluation. The goal is that, over time, interventions move up tiers as evidence becomes stronger. So far five agencies have launched or proposed 13 tiered grant programs in areas such as education, teenage pregnancy prevention, home visitation programs, workforce, international assistance, and more.

Objective 3.3: Performance Partnerships & Waiver Demonstrations

Enable States and localities to demonstrate better ways to use resources, by giving them flexibility to pool discretionary funds across multiple Federal programs serving similar populations and communities.

Stakeholder(s):

States

Localities

Federal Programs: Example: The 2014 Budget would authorize up to 13 State or local performance partnership pilots to improve outcomes for disconnected youth. Pilot projects would support innovative, efficient, outcome-focused strategies using blended funding from separate youth-serving programs in the Departments of Education, Labor, Health and Human Services, Housing and Urban Development, Justice, and other agencies.

Youth-Serving Programs

Department of Education

Department of Labor

Department of Health and Human Services

Department of Housing and Urban Development

Department of Justice

Other Information:

Performance Partnership pilots enable States and localities to demonstrate better ways to use resources, by giving them flexibility to pool discretionary funds across multiple Federal programs serving similar populations and communities in exchange for greater accountability for results. With waiver demonstrations, Federal agencies suspend certain programmatic requirements in discretionary or mandatory programs to support State and local innovations that are then rigorously evaluated to learn what works and what is cost effective.

Objective 3.4: Competitive Grants

Use competitive grants to promote use of evidence in formula grants.

Stakeholder(s):

Formula Grant Programs

States

HHS: Example: For HHS, the 2014 Budget proposes to require that States use five percent of their mental health block grant allocation for grants that use the most effective evidence-based prevention and treatment approaches. The Senate Appropriations Committee adopted this policy in its recent bill.

Senate Appropriations Committee

Other Information:

Formula grant programs are often the largest grant programs in government, so they are a critical area for advancing more results-focused government. Agencies can improve the effectiveness of formula grant programs by using competitive grants to encourage adoption of evidence-based approaches within formula grants. For instance, agency competitions can give preference points to State and local applicants implementing evidence-based practices with their formula funds. And formula grants to States can include set-asides for States to award competitively to promote use of evidence.

Objective 3.5: Multi-Phase Selection

Enhance grant-funded projects by conducting a multi-phase selection process.

Stakeholder(s):

PROMISE: Example: The Promoting Readiness of Minors in Supplemental Security Income (PROMISE) program began with coordinated planning by the Departments of Education, HHS, Labor, and the Social Security Administration to review existing research and gather input from experts to develop an integrated service delivery model that was incorporated into the grant solicitation. The next phases are grantee selection and rigorous evaluation of grantees' approaches.

Department of Education

HHS

Department of Labor

Social Security Administration

Other Information:

Multi-phase grant competitions -- The quality of grant-funded projects can be enhanced by conducting a multi-phase selection process. In the first phase, before selection, agencies can share research findings with potential applicants to ensure they are integrated into project designs and implementation strategies. Expert input can also be used to develop program models or variations within models that the grant program could test and evaluate. Moreover, preference points can be given to applicants that implement research-informed models and agree to participate in a rigorous evaluation. Multi-phase designs are particularly useful when there are many applications of varying quality, where a streamlined pre-application process can identify leading proposals.


Goal 4: Agency Capacity

Strengthen agency capacity to use evidence.

Stakeholder(s):

Agencies

Managers

Program Officers

Review Panels

Decision Makers: Evaluation is useful only to the extent that it is being used for decision making. An evaluation plan that focuses evidence-building resources on the most relevant and actionable issues helps generate useful knowledge. Common evidence standards and What Works Clearinghouses, meanwhile, help make existing evidence more useful to decision makers.

Objective(s):

4.1: Agency-Wide Evaluation Plans

4.2: Evidence Guidelines

4.3: Learning Networks

4.4: What Works Clearinghouses


Other Information:

Strengthening agency capacity to use evidence: Proposals should strengthen agency capacity by promoting knowledge-sharing among government decision-makers and practitioners through clearinghouses that help translate strong research into practice; enhancing the skills of managers, program officers, and review panels to assess and use available evidence; and developing common evidence frameworks to better distinguish strong from weak evidence and measure cost effectiveness.

Objective 4.1: Agency-Wide Evaluation Plans

Focus evaluation resources on high priority issues.

Stakeholder(s):

Department of Labor: Example: The Department of Labor has a Chief Evaluation Office (CEO) that works closely with program offices to develop and implement evaluation agendas set by policy officials. It also promotes high standards for data systems; monitors and reviews research and evaluation plans initiated by DOL agencies to ensure they are consistent with departmental goals and the highest standards of empirical rigor; works to institutionalize an evidence-based culture through seminars and forums on evaluation topics and findings; and maintains an active connection with outside experts to ensure that the Department is aware of relevant research and evaluation findings and activities.

Chief Evaluation Office (CEO)

Other Information:

An agency-wide evaluation plan developed with senior policy and program officials can focus evaluation resources on high priority issues (for example, questions that are most important for improving program results) and on rigorous methodologies that produce actionable insights.

Objective 4.2: Evidence Guidelines

[Provide] common research standards and evidence frameworks across agencies.

Stakeholder(s):

Evaluation Officials: Example: Evaluation officials from the Departments of Education, Labor, Health and Human Services, and the National Science Foundation are jointly developing common evidence guidelines for research studies that can be a resource for improving the quality of studies throughout the Federal Government.

Department of Education

Department of Labor

Department of Health and Human Services

National Science Foundation

Other Information:

Common evidence guidelines for various types of research studies -- Common research standards and evidence frameworks across agencies can facilitate evaluation contracting, information collection clearance, and the strengthening or creation of research clearinghouses and repositories about "what works." They also help agencies use results from different types of high quality studies to identify effective programs, improve programs, and encourage innovative new approaches.

Objective 4.3: Learning Networks

Share best practices on procurement, evidence guidelines, and evaluation and performance measurement.

Stakeholder(s):

Learning Networks

Small Business Administration: Example: The Small Business Administration and the Departments of Agriculture and Commerce, with guidance from OMB and CEA, are working together with the Census Bureau to find more robust ways to evaluate the impact of Federal business technical assistance programs. The goal of the working group is to develop a standard methodology for measuring the impact of these types of technical assistance programs across the Federal Government.

Department of Agriculture

Department of Commerce

OMB

CEA

Census Bureau

Federal Business Technical Assistance Programs

Other Information:

Cross-agency learning networks -- Inter-agency working groups of evaluation and program officials within the Federal Government can share best practices, including helping spread effective procurement practices, developing common evidence guidelines, and better integrating evaluation and performance measurement efforts. Other cross-agency groups are forming learning networks around specific policy issues in order to share relevant research and develop shared evaluation strategies.

Objective 4.4: What Works Clearinghouses

[Provide] repositories that synthesize evaluation findings in ways that make research useful to decision-makers, researchers, and practitioners.

Stakeholder(s):

Decision-Makers

Researchers

Practitioners

Federal Innovation Programs: Examples: Current "what works" clearinghouses include the Department of Justice's CrimeSolutions.gov, the Department of Education's What Works Clearinghouse, the Substance Abuse and Mental Health Services Administration's National Registry of Evidence-based Programs and Practices, and the Department of Labor's new Clearinghouse of Labor Evaluation and Research.

Department of Justice

CrimeSolutions.gov

Department of Education

What Works Clearinghouse

Substance Abuse and Mental Health Services Administration

National Registry of Evidence-based Programs and Practices

Department of Labor

Clearinghouse of Labor Evaluation and Research

Other Information:

"What works" clearinghouses are repositories that synthesize evaluation findings in ways that make research useful to decision-makers, researchers, and practitioners. Moreover, as Federal innovation funds and other programs provide financial incentives for using and building evidence, these repositories provide useful tools for understanding what interventions are ready for replication or expansion and disseminating results.


Goal 5: Other Strategies

Propose other strategies that would significantly improve agency capacity to use or build evidence to achieve better results or increase cost-effectiveness in high priority programs.

Stakeholder(s):

Agencies: While agencies are encouraged to submit proposals that can be implemented within current statutory authorities, legislative changes will also be considered. (Please note where a proposal would require legislative changes.) Agencies may also propose new investments in evidence-building infrastructure for high-priority areas in cases where the benefits substantially outweigh the costs. Agencies may wish to consider new financing approaches; set-asides that designate a small fraction of funding for evaluation and evidence development; and partnerships with other federal agencies, state and local governments, non-profit organizations, and academic institutions. We particularly encourage proposals that cross agency boundaries or other functional silos. Agencies should work with their OMB contacts to agree on a format within their 2015 budget submissions to: (1) explain agency progress in using evidence and (2) present their plans to build new knowledge of what works and is cost-effective. An example of a template that could be used to provide this information to Resource Management Offices is available at https://max.gov/omb/evidence.

Objective(s):


Other Information:

Other agency-specific needs: Agencies may propose other strategies that would significantly improve their capacity to use or build evidence to achieve better results or increase cost-effectiveness in high priority programs. In addition to developing strategies to use evidence to promote continuous, incremental improvement, agencies are also encouraged to submit proposals that would test higher-risk, higher-return innovations with the potential to lead to more dramatic improvements in results or reductions in cost.
