Evidence-based policy

From Wikipedia, the free encyclopedia

Background and Definition

Evidence-based policy (EBP) is an idea in public policy proposing that policy decisions should be based on, or informed by, rigorously established objective evidence. The implied contrast is with policymaking based on ideology or 'common sense'. Evidence-based policy has two core components. First, it uses a rigorous research method, such as a randomized controlled trial (RCT), to test a hypothesis and establish evidence. Second, the evidence obtained from such trials needs to be used to determine whether interventions are effective.[1] Good data, analytical skills, and political support for the use of scientific information are, as such, typically seen as the important elements of an evidence-based approach.[2]

Some have promoted particular types of evidence as 'best' for policymakers to consider, including scientifically rigorous evaluation studies such as randomized controlled trials to identify programs and practices capable of improving policy-relevant outcomes. However, some areas of policy-relevant knowledge are not well served by quantitative research, leading to debate about the methods and instruments that are considered critical for the collection of relevant evidence. For instance, policies concerned with human rights, public acceptability, or social justice may require evidence other than what randomized trials provide, or may require moral philosophical reasoning in addition to considerations of evidence of intervention effect (which randomized trials are principally designed to provide[3]). It is also assumed that social goals are best served when scientific evidence is used rigorously and comprehensively to inform decisions, rather than in a piecemeal, manipulated, or cherry-picked manner.

Some policy scholars now avoid the term evidence-based policy, preferring alternatives such as evidence-informed. This language shift allows continued thinking about the underlying desire to improve evidence use in terms of its rigor or quality, while avoiding some of the key limitations or reductionist ideas at times seen with the evidence-based language. Still, the language of evidence-based policy is widely used and, as such, can be interpreted to reflect a desire for evidence to be used well or appropriately in one way or another – such as by ensuring systematic consideration of rigorous and high-quality policy-relevant evidence, or by avoiding biased and erroneous applications of evidence for political ends.[4]

History

Although the term evidence-based policy did not gain currency in medicine until the 1990s[5] and in social policy until the early 2000s, the emergence of evidence-based practice can be traced to the early 1900s.[1] The move towards modern evidence-based policy has its roots in the larger movement towards evidence-based practice, which was prompted by the rise of evidence-based medicine in the 1980s. It is, nevertheless, a controversial idea.[6] The earliest form of evidence-based policy was tariff-making in Australia, which legislation required to be informed by public reports issued by the Tariff Board. These reports initially covered only the impacts of tariffs but were later expanded to cover their effects on industries and the economy.[7]

History of Evidence-Based Medicine

The phrase evidence-based medicine (EBM) was coined by Gordon Guyatt,[8] although earlier forms of EBM can be traced back to the early 1900s. Some argue that the earliest form of EBM occurred in the 11th century, when the Ben Cao Tu Jing of the Song dynasty advised: "In order to evaluate the efficacy of ginseng, find two people and let one eat ginseng and run, the other run without ginseng. The one that did not eat ginseng will develop shortness of breath sooner."[9] Many scholars see the term evidence-based policy as evolving from "evidence-based medicine", in which research findings are used to support clinical decisions and evidence is gathered through randomized controlled trials (RCTs), which compare a treatment group with a placebo group to measure results.[10] Although the earliest published RCTs in medicine appeared during the wartime and post-war era of the 1940s and 1950s,[1] the term 'evidence-based medicine' did not appear in published medical research until 1993.[5] In 1993, the Cochrane Collaboration was established in the UK; it works to keep all RCTs up to date and produces "Cochrane reviews", which provide systematic reviews of primary research in human health and health policy.[10][11] Use of the term EBM has grown since the 2000s, and its influence has expanded across the field of medicine.[12]
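The treatment/placebo comparison that underpins an RCT can be illustrated with a small simulation. This is a minimal sketch, not taken from any actual study: the sample size, outcome distribution, and treatment effect below are hypothetical assumptions chosen only to show how random assignment lets the placebo group act as a comparison for the treatment group.

```python
import random
import statistics

# Minimal sketch of the RCT logic described above: randomly assign
# participants to a treatment or placebo group, then compare group means.
# All numbers are hypothetical assumptions for illustration.
random.seed(0)

def simulated_outcome(treated):
    baseline = random.gauss(50, 10)   # individual variation
    effect = 5.0 if treated else 0.0  # hypothetical treatment effect
    return baseline + effect

treatment_group = [simulated_outcome(True) for _ in range(50)]
placebo_group = [simulated_outcome(False) for _ in range(50)]

# The difference in group means estimates the treatment effect.
estimated_effect = (statistics.mean(treatment_group)
                    - statistics.mean(placebo_group))
print(f"Estimated treatment effect: {estimated_effect:.1f}")
```

Because assignment is random, systematic differences between the groups other than the treatment itself tend to average out as the sample grows, which is why RCT evidence is often placed at the top of evidence hierarchies.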

History of Evidence-Based Policy Making

RCTs were late to appear in social policy compared with medicine. Although evidence-based policy can be traced as far back as the fourteenth century, it was more recently popularized by the Blair Government in the United Kingdom.[7] The Blair Government said it wanted to end ideologically driven decision-making in policy making.[7] For example, a UK Government white paper published in 1999 ("Modernising Government") noted that Government must "produce policies that really deal with problems, that are forward-looking and shaped by evidence rather than a response to short-term pressures; that tackle causes not symptoms".[13] There was then an increase in researchers and policy activists pushing for more evidence-based policy-making, which led to the formation of the Campbell Collaboration, a sister organization to the Cochrane Collaboration, in 1999.[10][14] The Campbell Collaboration conducts reviews of the best evidence analyzing the effects of social and educational policies and practices.

The Economic and Social Research Council (ESRC) became involved in the push for more evidence-based policymaking with its £1.3 million grant to the Evidence Network in 1999. The Evidence Network is a centre for evidence-based policy and practice, similar to both the Campbell and Cochrane Collaborations.[10] More recently the Alliance for Useful Evidence has been established, with funding from the ESRC, Big Lottery and Nesta, to champion the use of evidence in social policy and practice. The Alliance is a UK-wide network that promotes the use of high-quality evidence to inform decisions on strategy, policy and practice through advocacy, publishing research, sharing ideas and advice, and holding events and training.

Recently questions have been raised about the conflicts of interest inherent in evidence-based decision-making used in public policy development. In a study of vocational education in prisons operated by the California Department of Corrections, Andrew J. Dick, William Rich, and Tony Waters found that political considerations inevitably intruded into "evidence-based decisions" that were ostensibly technocratic in nature. They argue that where evidence is paid for by policymakers who have a vested interest in having past political judgments confirmed, evidence-based research is likely to be corrupted, leading to policy-based evidence making.[15]

Methodology

There are many methodologies for evidence-based policy but they all share the following characteristics:

  • Tests a theory as to why the policy will be effective and what the impacts of the policy will be if it is successful
  • Includes a counterfactual: what would have occurred if the policy had not been implemented
  • Incorporates some measurement of the impact
  • Examines both direct and indirect effects that occur because of the policy
  • Separates the uncertainties and controls for other influences outside of the policy that may have an effect on the outcome
  • Should be able to be tested and replicated by a third party

The methodologies used with evidence-based policy fit within a cost-benefit framework and are designed to estimate a net payoff if the policy were to be implemented. Because some effects and outcomes of a policy are difficult to quantify, the focus is mostly on whether or not benefits will outweigh costs, rather than on specific values.[7]
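The counterfactual comparison and net-payoff estimate described above can be sketched in a few lines. This is an illustrative sketch only: the function name and all figures are hypothetical assumptions, not values from any actual evaluation.

```python
# Illustrative sketch of the cost-benefit logic described above.
# The function name and all figures are hypothetical assumptions.

def net_payoff(outcome_with_policy, outcome_counterfactual, policy_cost):
    """Estimate the net payoff of a policy: the measured impact
    (the outcome with the policy minus the counterfactual outcome
    had it not been implemented) minus the cost of the policy."""
    impact = outcome_with_policy - outcome_counterfactual
    return impact - policy_cost

# Hypothetical program (values in millions of dollars):
payoff = net_payoff(outcome_with_policy=12.0,
                    outcome_counterfactual=7.5,
                    policy_cost=3.0)
print(f"Estimated net payoff: {payoff:.1f}M")  # prints "Estimated net payoff: 1.5M"
print("Benefits outweigh costs" if payoff > 0 else "Costs outweigh benefits")
```

In practice, as the text notes, the inputs are often too uncertain to pin down as single numbers, so analysts typically ask only whether the sign of the net payoff is positive rather than trusting its precise value.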

Critiques

Several critiques have emerged. Paul Cairney, professor of politics and public policy at the University of Stirling in Scotland, argues[16] that supporters of the idea underestimate the complexity of policy-making and misconstrue the way that policy decisions are usually made. Cartwright and Hardie[17] oppose the emphasis on randomized controlled trials (RCTs), showing that evidence from RCTs is not always sufficient for making decisions. In particular, they argue that extrapolating experimental evidence into a policy context requires understanding which necessary conditions were present in the experimental setting and establishing that these factors also operate in the target of the considered intervention. Furthermore, given its prioritization of RCTs, evidence-based policy can be accused of being preoccupied with narrowly understood 'interventions', meaning surgical actions on one causal factor to influence its effect.

The definition of intervention presupposed by the evidence-based policy movement overlaps with James Woodward's[18] interventionist theory of causality. However, policy-making also encompasses other types of decisions, such as institutional reforms and actions based on predictions. These other types of evidence-based decision-making do not require having at hand evidence that the causal relation is invariant under intervention. Therefore, mechanistic evidence and observational studies suffice for introducing institutional reforms and undertaking actions that do not modify the causes of a causal claim.[19]

Moreover, evidence has emerged[20] of front-line public servants, like hospital managers, making decisions that actually worsen patients' care in order to hit pre-ordained targets. This argument was put forward by Professor Jerry Muller of the Catholic University of America in a book called The Tyranny of Metrics.[21] According to articles published in Futures, evidence-based policy, in the form of cost-based or risk analyses, may entail forms of compression and exclusion of the issues under analysis,[22] also in relation to power asymmetries among different actors in their capacity to produce evidence.[23] A comprehensive list of critiques, including the fact that policies shown to be successful in one place often fail in others, despite reaching a gold standard of evidence, has been compiled by the policy platform Apolitical.[24]

Types of Evidence for Evidence-Based Policy Making

Any information gathered through research, data collection, and analysis related to development policy and practice can be considered evidence.[25] Not all evidence, however, is regarded as equal. Government departments and the policymakers within them have the ultimate power in choosing how evidence is ranked hierarchically for use in the development of policies.[26]

Quantitative Evidence

Numerical data from peer-reviewed journals, public surveillance systems, or individual programs are considered quantitative evidence for policymaking. Quantitative data can also be collected by the government or policymakers themselves through surveys.[25] Quantitative evidence is widely used in EBM and in the construction of evidence-based public health policy.

Qualitative Evidence

Qualitative evidence includes nonnumerical data collected by methods such as observations, interviews, or focus groups. Qualitative evidence is widely used to create persuasive narratives to influence those with decision-making authority.[25] Although evidence can be divided by type, there is no inherent hierarchy between qualitative and quantitative data; each is more effective as evidence in some areas than in others. Often, qualitative and quantitative evidence are combined in the process of policymaking.[26]

Evidence-based development policy

The Overseas Development Institute (ODI) has pioneered the RAPID Outcome Mapping Approach (ROMA) over the past five years as a means to help aid donors and partners better transform research into policy initiatives.[27]

RAPID Outcome Mapping Approach

The ROMA approach takes these lessons into account and has been field-tested through more than 40 workshops and training courses worldwide. It is an eight-step approach, and for each step the ODI has developed resources and policy tools to ensure it is comprehensively addressed:

  1. Define a clear, overarching policy objective.
  2. Map the policy context around that issue and identify the key factors that may influence the policy process. The RAPID framework provides a useful checklist of questions.
  3. Identify the key influential stakeholders. RAPID’s Alignment, Interest and Influence Matrix (AIIM) can be used to map actors along three dimensions: the degree of alignment (i.e. agreement) with the proposed policy, their level of interest in the issue, and their ability to exert influence on the policy process.
  4. Develop a theory of change – identify the changes needed among these stakeholders if they are to support the desired policy outcome.
  5. Develop a strategy to achieve the milestone changes in the process – Force Field Analysis is a flexible tool that can be used to further understand the forces supporting and opposing the desired policy change and suggest concrete responses.
  6. Ensure the engagement team has the competencies required to operationalize the strategy.
  7. Establish an action plan for meeting the desired policy objective – useful tools include the RAPID Information matrix, DFID’s log frame and IDRC’s Outcome Mapping Strategy Map among them.
  8. Develop a monitoring and learning system, not only to track progress, make any necessary adjustments and assess the effectiveness of the approach, but also to learn lessons for the future.

An example of the ROMA approach can be seen in the Wildlife Enforcement Monitoring System (WEMS) Initiative,[28] where a systematic approach to building agreement led to its implementation in Africa.

Results

This has resulted in:[27]

  1. Compilation of over 50 case studies on successful evidence-based policy engagement.
  2. Development and facilitation of the Evidence-Based Policy in Development Network (ebpdn), which links more than 20 institutional partners and thousands of practitioners working on evidence-based policy processes.
  3. Creation of an array of practical toolkits designed with civil society organisations, researchers and progressive policymakers in mind. For example, at the recent Tokyo Conference on Combating Wildlife Crime, the United Nations University and ESRI presented the first case of evidence-based policy-making maps on enforcement and compliance with the CITES convention.[29]
  4. Direct support to civil society organizations (CSOs) to provide training in policy influencing and strategic communication.
  5. Strengthening the capacity for the UK Department for International Development (DFID) to influence other actors.

Key lessons

Six key lessons have been developed, which are:

  1. Policy processes are complex and rarely linear or logical and simply presenting information to policy-makers and expecting them to act upon it is very unlikely to work. Policy processes are not purely linear as they have various stages that each take varying lengths of time to complete and may, in fact, be conducted simultaneously. Strategies must be fluid.
  2. Policy is often only weakly informed by research-based evidence due to information gaps, secrecy, the need for speedy responses, political expediency and the fact that policymakers are rarely scientists.
  3. Research-based evidence can contribute to policies that have a dramatic impact on lives. Success stories quoted in the UK's Department for International Development's (DFID) new research strategy include a 22% reduction in neonatal mortality in Ghana as a result of helping women begin breastfeeding within one hour of giving birth and a 43% reduction in deaths among HIV positive children using a widely available antibiotic.
  4. The need for a holistic understanding of the context in which the policy is to be implemented.
  5. Policy entrepreneurs need additional skills to influence policy.[30] They need to be political fixers, able to understand the politics and identify the key players. They need to be good storytellers, able to synthesize simple compelling stories from the results of the research. They need to be good networkers to work effectively with all the other stakeholders, and they need to be good engineers, building a program that pulls all of this together.
  6. Policy entrepreneurs need clear intent – they need to really want to do it. Turning a researcher into a policy entrepreneur, or a research institute or department into a policy-focused think tank involves a fundamental re-orientation towards policy engagement rather than academic achievement; engaging much more with the policy community; developing a research agenda focusing on policy issues rather than academic interests; acquiring new skills or building multidisciplinary teams; establishing new internal systems and incentives; spending much more on communications; producing a different range of outputs; and working more in partnerships and networks.

These lessons show that the relationship between research, policy and practice is complex, multi-factorial, non-linear, and highly context specific.[27] What works in one situation may not work in another. Developing effective strategies in complex environments is not straightforward. Simple tools such as cost–benefit analysis, logical frameworks, traditional project management tools and others may not work on their own, as they fail to take into account the existing complexity.

Based on research conducted in six Asian and African countries, the Future Health Systems consortium has identified a set of key strategies for improving uptake of evidence into policy,[31] including: improving the technical capacity of policy-makers; better packaging of research findings; use of social networks; and establishment of fora to assist in linking evidence with policy outcomes.[32][33]

The Pew Charitable Trust

The Pew Charitable Trust is a global non-governmental organization that seeks to improve public policy, inform the public, and invigorate civic life.[34] Established in 1948 and headquartered in Philadelphia, Pennsylvania, Pew has since strived to make a difference on public issues based on data, science, and facts to serve the public good.[35] Pew's Results First initiative works with US states to implement evidence-based policymaking in the development of their policies.[34] This initiative has developed a framework with five key components to help governments make better choices in their policymaking decisions.

The five key components are:[35]

  • Program assessment. Systematically review available evidence on the effectiveness of public programs.
    • Develop an inventory of funded programs.
    • Categorize programs by their evidence of effectiveness.
    • Identify programs’ potential return on investment.
  • Budget development. Incorporate evidence of program effectiveness into budget and policy decisions, giving funding priority to those that deliver a high return on investment of public funds.
    • Integrate program performance information into the budget development process.
    • Present information to policymakers in user-friendly formats that facilitate decision-making.
    • Include relevant studies in budget hearings and committee meetings.
    • Establish incentives for implementing evidence-based programs and practices.
    • Build performance requirements into grants and contracts.
  • Implementation oversight. Ensure that programs are effectively delivered and are faithful to their intended design.
    • Establish quality standards to govern program implementation.
    • Build and maintain capacity for ongoing quality improvement and monitoring of fidelity to program design.
    • Balance program fidelity requirements with local needs.
    • Conduct data-driven reviews to improve program performance.
  • Outcome monitoring. Routinely measure and report outcome data to determine whether programs are achieving desired results.
    • Develop meaningful outcome measures for programs, agencies, and the community.
    • Conduct regular audits of systems for collecting and reporting performance data.
    • Regularly report performance data to policymakers.
  • Targeted evaluation. Conduct rigorous evaluations of new and untested programs to ensure that they warrant continued funding.
    • Leverage available resources to conduct evaluations.
    • Target evaluations to high-priority programs.
    • Make better use of administrative data—information typically collected for operational and compliance purposes—to enhance program evaluations.
    • Require evaluations as a condition for continued funding for new initiatives.
    • Develop a centralized repository for program evaluations.

The Coalition for Evidence-Based Policy

The Coalition for Evidence-Based Policy was a nonprofit, nonpartisan organization whose mission was to increase government effectiveness through the use of rigorous evidence about "what works." From 2001, the Coalition worked with U.S. Congressional and Executive Branch officials to advance evidence-based reforms in U.S. social programs, which were enacted into law and policy. The Coalition claimed to have no affiliation with any programs or program models, and no financial interest in the policy ideas it supported, enabling it to serve as an independent, objective source of expertise to government officials on evidence-based policy.[36][unreliable source]

Major new policy initiatives enacted into law through the Coalition's work with congressional and executive branch officials include:[37]

  • Evidence-Based Home Visitation Program for at-risk families with young children (Department of Health and Human Services – HHS, $1.5 billion over 2010-2014)
  • Evidence-Based Teen Pregnancy Prevention Program (HHS, $109 million in FY14)
  • Investing in Innovation Fund, to fund development and scale-up of evidence-based K-12 educational interventions (Department of Education, $142 million in FY14)
  • First in the World Initiative, to fund development and scale-up of evidence-based interventions in postsecondary education (Department of Education, $75 million in FY14)
  • Social Innovation Fund, to support public/private investment in evidence-based programs in low-income communities (Corporation for National and Community Service, $70 million in FY14)
  • Trade Adjustment Assistance Community College and Career Training Grants Program, to fund development and scale-up of evidence-based education and career training programs for dislocated workers (Department of Labor – DOL, $2 billion over 2011-2014)
  • Workforce Innovation Fund, to fund development and scale-up of evidence-based strategies to improve education/employment outcomes for U.S. workers (DOL, $47 million in FY14).

Their website now says "The Coalition wound down its operations in the spring of 2015, and the Coalition's leadership and core elements of the group's work have been integrated into the Laura and John Arnold Foundation".[38] In 2003 the Coalition published a guide on educational evidence-based practices.[39]

References

  1. ^ a b c Baron, Jon (1 July 2018). "A Brief History of Evidence-Based Policy". The Annals of the American Academy of Political and Social Science. 678 (1): 40–50. doi:10.1177/0002716218763128. ISSN 0002-7162. S2CID 149924800.
  2. ^ Head, Brian. (2009). Evidence-based policy: principles and requirements Archived 28 November 2010 at the Wayback Machine. University of Queensland. Retrieved 4 June 2010.
  3. ^ Petticrew, M (2003). "Evidence, hierarchies, and typologies: Horses for courses". Journal of Epidemiology & Community Health. 57 (7): 527–9. doi:10.1136/jech.57.7.527. PMC 1732497. PMID 12821702.
  4. ^ Parkhurst, Justin (2017). The Politics of Evidence: from Evidence Based Policy to the Good Governance of Evidence (PDF). London: Routledge. doi:10.4324/9781315675008. ISBN 9781138939400.[page needed]
  5. ^ a b Guyatt, G. H. (1 December 1993). "Users' guides to the medical literature. II. How to use an article about therapy or prevention. A. Are the results of the study valid? Evidence-Based Medicine Working Group". JAMA: The Journal of the American Medical Association. 270 (21): 2598–2601. doi:10.1001/jama.270.21.2598. ISSN 0098-7484. PMID 8230645.
  6. ^ Bridges, D., Smeyers, P. and Smith, R. (eds) (2009) Evidence-Based Education Policy: What evidence? What basis? Whose policy?, Chichester, Wiley-Blackwell. Hammersley, M. (2013) The Myth of Research-Based Policy and Practice, London, Sage.
  7. ^ a b c d Banks, Gary (2009). Evidence-based policy making: What is it? How do we get it? Archived 23 January 2010 at the Wayback Machine. Australian Government, Productivity Commission. Retrieved 4 June 2010
  8. ^ Guyatt, Gordon H. (1 March 1991). "Evidence-based medicine". ACP Journal Club. 114 (2): A16. doi:10.7326/ACPJC-1991-114-2-A16. ISSN 1056-8751.
  9. ^ Payne-Palacio, June R.; Canter, Deborah D. (9 August 2016). The Profession of Dietetics: A Team Approach. Jones & Bartlett Learning. ISBN 978-1-284-12635-8.
  10. ^ a b c d Marston & Watts. Tampering with the Evidence: A Critical Appraisal of Evidence-Based Policy-Making Archived 23 March 2012 at the Wayback Machine. RMIT University. Retrieved 10 September 2014.
  11. ^ The Cochrane Collaboration Retrieved 10 September 2014.
  12. ^ Claridge, Jeffrey A.; Fabian, Timothy C. (1 May 2005). "History and Development of Evidence-based Medicine". World Journal of Surgery. 29 (5): 547–553. doi:10.1007/s00268-005-7910-1. ISSN 1432-2323. PMID 15827845. S2CID 21457159.
  13. ^ Department for Environment, Food and Rural Affairs (21 September 2006). "Evidence-based policy making". Archived from the original on 14 January 2011. Retrieved 6 March 2010.
  14. ^ The Campbell Collaboration Retrieved 10 September 2014.
  15. ^ See pp. 11–40 and 281–306 in "Prison Vocational Education and Policy in the United States" by Andrew J. Dick, William Rich, and Tony Waters. New York: Palgrave Macmillan 2016.[ISBN missing]
  16. ^ Cairney, Paul (11 April 2016). The politics of evidence-based policy making. New York. ISBN 9781137517814. OCLC 946724638.
  17. ^ Cartwright, Nancy; Hardie, Jeremy (20 September 2012). Evidence-Based Policy: A Practical Guide to Doing It Better. Oxford University Press. ISBN 978-0-19-998670-5.
  18. ^ Woodward, James (27 October 2005). Making Things Happen: A Theory of Causal Explanation. Oxford University Press. ISBN 978-0-19-803533-6.
  19. ^ Maziarz, Mariusz (2020). The Philosophy of Causality in Economics: Causal Inferences and Policy Proposals. London & New York: Routledge.
  20. ^ "Government by numbers: how data is damaging our public services | Apolitical". Apolitical. Retrieved 10 April 2018.
  21. ^ Muller, Jerry Z. (16 December 2017). The tyranny of metrics. Princeton. ISBN 9780691174952. OCLC 1005121833.
  22. ^ Andrea Saltelli, Mario Giampietro, 2017, What is wrong with evidence based policy, and how can it be improved? Futures, 91, 62–71.
  23. ^ Andrea Saltelli, 2018, Why science's crisis should not become a political battling ground, Futures, 104, 85–90, https://doi.org/10.1016/j.futures.2018.07.006.
  24. ^ "Evidence-based policymaking: is there room for science in politics? | Apolitical". Apolitical. Retrieved 10 April 2018.
  25. ^ a b c Brownson, Ross C.; Chriqui, Jamie F.; Stamatakis, Katherine A. (2009). "Understanding Evidence-Based Public Health Policy". American Journal of Public Health. 99 (9): 1576–1583. doi:10.2105/AJPH.2008.156224. ISSN 0090-0036. PMC 2724448. PMID 19608941.
  26. ^ a b Court, Julius; Sutcliffe, Sophie (November 2005). "Evidence-Based Policymaking: What is it? How does it work? What relevance for developing countries?" (PDF). Overseas Development Institute(ODI).
  27. ^ a b c Young, John and Mendizabal, Enrique (2009) Helping researchers become policy entrepreneurs: How to develop engagement strategies for evidence-based policy-making Archived 6 August 2010 at the Wayback Machine London: Overseas Development Institute
  28. ^ World Wildlife Day (3 March 2014). Evidence Based Policy-Making in Addressing Wildlife Crime Archived 17 July 2017 at the Wayback Machine. Wildlife Enforcement Monitoring System Initiative. Retrieved 10 September 2014.
  29. ^ Evidence Based Policy-Making in Addressing Wildlife Crime. United Nations University. Retrieved 10 September 2014.
  30. ^ "Policy Entrepreneurs: Their Activity Structure and Function in the Policy Process". Journal of Public Administration Research and Theory. 1991. doi:10.1093/oxfordjournals.jpart.a037081. hdl:10945/53405.
  31. ^ Syed, Shamsuzzoha B; Hyder, Adnan A; Bloom, Gerald; Sundaram, Sandhya; Bhuiya, Abbas; Zhenzhong, Zhang; Kanjilal, Barun; Oladepo, Oladimeji; Pariyo, George; Peters, David H (2008). "Exploring evidence-policy linkages in health research plans: A case study from six countries". Health Research Policy and Systems. 6: 4. doi:10.1186/1478-4505-6-4. PMC 2329631. PMID 18331651.
  32. ^ Hyder, A; et al. (14 June 2010). "National Policy-Makers Speak Out: Are Researchers Giving Them What They Need?". Health Policy and Planning. Retrieved 26 May 2012.
  33. ^ Hyder, A; Syed, S; Puvanachandra, P; Bloom, G; Sundaram, S; Mahmood, S; Iqbal, M; Hongwen, Z; Ravichandran, N; Oladepo, O; Pariyo, G; Peters, D (2010). "Stakeholder analysis for health research: Case studies from low- and middle-income countries". Public Health. 124 (3): 159–66. doi:10.1016/j.puhe.2009.12.006. PMID 20227095.
  34. ^ a b "About The Pew Charitable Trusts". pew.org. Retrieved 8 December 2021.
  35. ^ a b "Evidence-Based Policymaking" (PDF). The Pew Charitable Trusts. 2014.
  36. ^ The Coalition for Evidence-Based Policy. Retrieved 18 September 2014.
  37. ^ "Coalition for Evidence-Based Policy". Retrieved 8 December 2021.
  38. ^ "Coalition for Evidence-Based Policy".
  39. ^ "Identifying and Implementing Educational Practices Supported By Rigorous Evidence: A User-Friendly Guide, 2003" (PDF).

Further reading

  • Cartwright, Nancy; Stegenga, Jacob (2011). "A theory of evidence for evidence-based policy". Proceedings of the British Academy. 171: 291–322.
  • Davies, H. T. O, Nutley, S. M. and Smith, P. C. (eds) (2000) What Works? Evidence-based Policy and Practice in the Public Services, Bristol, Policy Press.
  • Hammersley, M. (2002) Educational Research, Policymaking and Practice, Paul Chapman/Sage.
  • Hammersley, M. (2013) The Myth of Research-Based Policy and Practice, London, Sage.
  • McKinnon, Madeleine C; Cheng, Samantha H; Garside, Ruth; Masuda, Yuta J; Miller, Daniel C (2015). "Sustainability: Map the evidence". Nature. 528 (7581): 185–7. Bibcode:2015Natur.528..185M. doi:10.1038/528185a. PMID 26659166.
