Systematic Reviews of Evidence and How They Influence Health Policy


Why are diverse sources of evidence of value to health policy makers? To start this discussion we need to understand what we mean by "evidence based public health". This is defined as "the development, implementation and evaluation of effective programmes and policies in public health through the application of principles of scientific reasoning, including systematic uses of data and information systems, and the appropriate use of behavioural science, theory and programme planning models" (Brownson et al., 2003). This encourages us to use evidence that is least likely to be biased as the basis for decision making (Kelly, Morgan & Ellis, 2009): the conscientious, explicit and judicious use of the best evidence in decision making (Sackett, Rosenberg & Gray, 1996). The wording "best" emphasises that it is the quality of the evidence, not the quantity, that matters most; it is the best information that can be found to address the research question being asked. The benefits to practitioners of operating in this manner are numerous, as it ensures that the best interventions are taken up and that these decisions can be supported by scientific research.

From this we understand that public health is a multidisciplinary field as far as research and study are concerned, and this is why a wide and diverse source of evidence is important. Evidence based public health provides reassurance that interventions and treatments are based on up to date, reliable scientific evidence, which is used to decide what does and does not work for public health. It also means that, by using the "best" possible information, a reviewer uses resources such as time and staff in the most efficient manner. Evidence based public health is best employed when supporting decisions, when evaluating the cost:benefit of interventions, and when putting in place new health initiatives or implementing new policies.
The intervention I have chosen to look at with respect to these questions is: what are the proven interventions that address obesity in children, and how are successful programmes in this area designed? As resources become ever scarcer, there is a need for open and transparent methods of reviewing evidence to feed into the policy making process (NICE, 2009). Policy makers must often make difficult decisions regarding health improvements and the minimisation of health inequalities; this is especially the case with childhood obesity, where there is an ever shifting research base available to inform policy. Initially all clinical effectiveness assessments were based on Randomised Controlled Trials (RCTs), but issues arise here because the questions asked to address public health problems such as childhood obesity are not answered by RCTs, and so other forms of evidence are needed before an answer can be reached. The evidence required here is broad and wide ranging, and goes beyond medical evidence to include the social sciences. If the question being asked related to drug trials, then RCTs would provide the best evidence with more certainty than other methods, as such a question is grounded in biomedicine and needs appropriate evidence to support it; the Cochrane Collaboration reviews research in this area. However, my question is a public health one that functions in a very different arena. Public health, in relation to obesity in children, needs to look at collective social behaviour and the individual human mind; therefore, as Kelly et al (2009) state, the delivery of its interventions is at a community, population and societal level. This involves operating with different epistemological ideas and with a variety of methods that will produce the best evidence, which is very different to that produced by RCTs.
Research in public health, and so in childhood obesity, is not strictly analytical in focus but is more diffuse, working at a number of levels simultaneously. The outcomes do not always occur at the same level as the intervention put in place to address the issue. For example, an intervention at an individual level may produce an outcome at the community or societal level, and interventions at a population level may affect outcomes at an individual level; an example of the latter would be legislation controlling fat and sugar levels in all school meals (Abraham et al., 2009). The problem here arises when reviewing and synthesising evidence on childhood obesity, where the gathering, interpretation and implementation involve operating at numerous levels and crossing a number of boundaries. The evidence base needed is diverse not only methodologically but also epistemologically; it can be derived from political science, psychology and social science, and these evidence sources have not in the past been examined and synthesised in the same manner as the evidence for biomedical interventions. The Campbell Collaboration is trying to achieve for social science research and evidence what Cochrane has achieved for the synthesis of biomedical evidence. The World Health Organisation is giving increased emphasis to the role of health systems in improving health and minimising inequalities, and is focussing its attention on policy making as an important means of achieving effective systems (Murray & Frenk, 2000), while Davies (2006) tells us that policy makers are influenced by sources other than research. These influences include political beliefs and ideology, pressure groups, think tanks, lobbyists, media influences, and political and policy timetables, and so evidence synthesis is critical in identifying, appraising and summarising what is known in social and political science research (Davies, 2006).
To ignore the wealth of information from these sources would create the possibility that the answer found to the research question would not lead to the best interventions to address childhood obesity, and would produce results that were neither reproducible nor transparent. Systematic reviews therefore attempt to assess as objectively as possible all forms of evidence on a particular topic; a systematic review can be defined as "a formulated, replicable and current summary that collates, assesses and appraises all empirical evidence related to a specific research question" (Alliance for Health Policy and Systems Research, 2009). The pyramid of evidence (Guide to research methods: Evidence pyramid) clearly shows the move away from the RCT as the gold standard in research evidence; is this because RCTs cannot answer all the questions posed, especially those relating to public health?

What are the challenges facing those wishing to review these sources of evidence systematically? If we move on to how we carry out this review and the synthesis process, we can see that a number of challenges arise as we do so. The steps for reviewing the literature to answer my question on childhood obesity interventions are as follows (Brownson et al., 2003):

* What is the public health problem, and what is the question that best defines it?
* What information sources are appropriate?
* What are the key concepts, terms etc.?
* Conduct the search.
* Choose which documents to review.
* Pick out the relevant data from those documents.
* Summarise the review.
* What are the results?

The problem is: proven interventions to address childhood obesity, and how successful programmes are designed. So the question I am asking is: what effective interventions and programmes are available for obesity reduction in school age children? The question is very important in this process; the review needs to answer one specific, well formulated question.
The question needs to be well structured and specific to the issue being examined. Pia et al (2004) suggest using the PICO system; that is, every question needs to address the following:

* Population group
* Intervention
* Comparison of interventions
* Outcomes

One of the challenges in reviewing the research data is feeling confident that you are indeed asking the right question; it may be that the question needs to change as the review proceeds. Systematic reviews in public health rely heavily on qualitative and experimental data, as is the case for this question, where we are looking not only at the effectiveness of the intervention but also at design and implementation. Many studies may also look at cost/benefit and at how those using the interventions experience them; these types of investigation do not lend themselves to quantitative research. A lot of data relevant to the question resides in so called "fugitive" literature (Sechrest, White & Brown, 1979), such as Government reports, dissertations, models and theses, conference papers and other documents; if we do not review all the relevant data we may miss important research. Having decided on the question, we move on to choosing the data sources that are relevant. Here the data will come from diverse sources: Government reports; PubMed and other medical databases, covering the biomedical, mainly quantitative, side of the research; and sources of educational and psychological research, covering the mainly qualitative side, among others. The challenge here lies in ensuring that all relevant sources are included; having a wide range of experience and expert knowledge on the review team can aid this process. Identifying the key concepts and terms for inclusion can likewise be challenging, in ensuring that all those relevant to the question are covered; a team with wide experience can again aid this.
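The PICO structure and the derivation of search terms from it can be sketched in code. The following is a minimal, hypothetical illustration only: the terms listed and the simple boolean query builder are assumptions for this example, not part of Pia et al's method or of any database's actual query syntax.

```python
# Hypothetical sketch: encode the review question in PICO form, then
# build a boolean search string (OR within an element, AND across elements).
pico = {
    "Population":   ["school age children", "adolescents"],
    "Intervention": ["school-based programme", "exercise", "nutrition education"],
    "Comparison":   ["usual curriculum", "no intervention"],
    "Outcome":      ["obesity reduction", "BMI", "bodyweight"],
}

def to_search_string(pico):
    """Combine PICO elements into a single boolean query string."""
    clauses = ["(" + " OR ".join(f'"{t}"' for t in terms) + ")"
               for terms in pico.values()]
    return " AND ".join(clauses)

print(to_search_string(pico))
```

Writing the question down this explicitly has the side benefit the essay notes: if the question changes as the review proceeds, the change is visible and reproducible.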
For the question being asked here, key terms would include: exercise, bodyweight, child nutrition, epidemiology, parent/child relations, primary prevention, RCT and school health services. The search would be conducted across the identified data sources, and the documents chosen for synthesis using strict inclusion/exclusion criteria agreed by a number of researchers. Another challenge arises here, as it is often difficult and very time consuming to pull out the information required from the data: it involves trawling through the abstracts and conclusions of the documents to ascertain the results of each study, whether those results are valid, and how they could be applied in public health intervention or practice. The next stage in the process is to carry out a synthesis of the results. There are a number of emerging synthesis methods, such as:

* Narrative analysis
* Meta ethnography
* Realist synthesis
* Meta synthesis
* Meta narrative review
* Thematic analysis

It can be challenging to decide which type of synthesis best fits the question being asked, so choosing the approach will depend heavily on the aim of the review, the question being asked and the balance of the evidence available: is it mostly quantitative or qualitative? For this question there is a good balance of both types of evidence available. A review to develop theory may best be carried out using narrative synthesis, whereas if cost/benefit is important, modelling and meta ethnography may be more appropriate approaches. For supporting decisions, narrative and Bayesian synthesis can encompass research and non-research sources well (Mays, Pope & Popay, 2005). When meta-analysis is used, estimates of the average impact across studies, as well as the degree of variation and the reasons for that variation, can be observed.
It can generate reasons as to why some programmes are more effective than others, and it can rule out chance when combining results (Hedges & Olkin, 1985); this is therefore one method that would address the question being asked. Again, the challenge here is having reviewers on the team who have experience of all the types of approach commonly used and who feel comfortable deciding which approach to use. Reviews may by their nature comprise a combination of approaches, and again this is where an experienced team of reviewers will know when this applies. It can be challenging to manage the array of approaches that can be used, but it can also bring the advantage of multiple viewpoints, which can be of great significance in bringing together the views and experiences of researchers working in numerous disciplines. There are other challenges related to this process: as there is limited data from RCTs and we needed to look wider for information, we relied heavily on other study types, and how we combined these was a challenge for the review, because the tools to do this are not yet developed to the point where a standardised way of conducting such reviews is available, as is the case for RCTs. Systematic reviews must be reproducible and must encapsulate a wide range of inputs to answer the question being asked; to achieve this they need to include details about all stages of the decision making process, including the choice of question, the criteria for inclusion/exclusion, how the synthesis was carried out and how the conclusions were reached. As there are at present no definitive, accepted quality criteria for these reviews, they can vary considerably in quality.
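The pooling step of a meta-analysis can be sketched briefly. The following is a minimal illustration of inverse-variance fixed-effect pooling with Cochran's Q and the I-squared heterogeneity statistic, in the general spirit of the methods Hedges and Olkin describe; the three effect sizes are hypothetical numbers invented for this example, not results from any actual childhood obesity studies.

```python
import math

def fixed_effect_pool(effects, ses):
    """Inverse-variance fixed-effect pooling.
    effects: per-study effect estimates; ses: their standard errors.
    Returns the pooled effect, its standard error, and I^2 (%)."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    # Cochran's Q measures between-study variation; I^2 expresses the
    # share of total variation attributable to heterogeneity.
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, pooled_se, i2

# Hypothetical BMI-change effects (kg/m^2) from three school-based programmes
effects = [-0.40, -0.25, -0.10]
ses = [0.15, 0.10, 0.20]
pooled, se, i2 = fixed_effect_pool(effects, ses)
print(f"pooled effect {pooled:.2f} (SE {se:.2f}), I^2 {i2:.0f}%")
```

The weighting is the point: precise studies (small standard errors) dominate the pooled estimate, which is how meta-analysis "rules out chance" more effectively than any single study can.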
However, Mays, Pope and Popay (2005) suggest the following questions could be asked:

* Is the aim of the review clear?
* Are the review questions relevant to the concerns of the practitioner or policymaker?
* Are the methods used explicit and comprehensive?
* Is there a good enough fit between the types of evidence being used and the question being asked?
* Where judgements are made, is the reasoning for them clearly stated?
* Was there an appropriate range of skills, knowledge and expertise in the review team?

For me this last point is a major challenge when carrying out reviews: if, as we have said, the evidence is diverse, the researchers may not have enough expertise to interpret the evidence from a non-research source. To reach a clear judgement on the evidence, therefore, a broad range of skills, expertise and experience is needed (Mays, Pope & Popay, 2005). Getting this type of group together can be challenging, and the time required to carry out even such a small review as this one on childhood obesity is also a challenge. The reviews are resource intensive, and a further challenge is that they are mostly one-off exercises conducted as time or funding allows; this raises the difficulty of keeping them up to date and incorporating new research into them. Once the review is complete and answers have been found for the question we asked on childhood obesity interventions, the next step is to look at the findings and ask whether there are differences in the way the health care system is structured that may mean the intervention would not perform in the same way as it did in the area where the research was carried out.
Also, are there differences in the perspectives or views of those who hold political influence in this area, and are there on-the-ground constraints that could hinder replication of the findings? Existing evidence shows that different parts of the population respond differently to identical interventions (Killoran & Kelly, 2004), and interventions may actually increase inequalities in the population (White & Adams, 2009). This can act as a barrier to the use of research evidence, along with the perceptions of the decision makers, the gulf that may exist between researchers and decision makers, competing influences and practical constraints (Orton et al., 2011). Others also challenge the ideals of systematic reviews: Black (2001) states that the relationship between research evidence and policy is weakened by competing pressures such as social relations and the electoral considerations of policy makers. He challenges the assumed linear relationship between research and policy, finding that research evidence has most influence in central policymaking but less so at a local level. Local health commissioners and managers therefore need to build the critical use of research evidence and evaluation into local implementation in order to provide consistent and effective health interventions (Evans & Snooks, 2013). Elliot and Popay (2000) also find that factors such as financial constraints, shifting timescales and decision makers' own experiences temper the direct influence of research evidence on decision making. Research is therefore more likely to impact on policy indirectly, by shaping debate and mediating dialogue between practitioners and policy makers, and it is this sustained dialogue that will increase the use of evidence based policy.
The challenge for reviewers lies in incorporating and trying to balance non-experimental evidence against the possible lack of experimental data; this may show where there are gaps in the research. There is also a need for standardised, plain language summaries of reviews, and one suggestion has been to adopt a one page of take-home messages, three page executive summary and full review approach, to make reviews more user friendly. From this we can see that the inputs from systematic reviews are extremely valuable, as they identify, assess, appraise and synthesise all relevant evidence in a methodical, reproducible way. When analysing the effects of competing options they can reduce bias in effectiveness estimates by looking at all relevant studies. We have identified a number of challenges in carrying these out, such as time, the skills required, the lack of standardised procedures and the lack of quality control, among others. Rigorous quality controls and protocols, as implemented by the Campbell Collaboration, will attempt to gain the same level of transparency as the reviews carried out by the Cochrane Collaboration, and will maintain reviews that take global evidence into account. Systematic reviews will not resolve all academic and political conflicts in the area of research and policy, but they will better enlighten us as to what is known from scientific evidence. Systematic reviews could incorporate the voice and needs of many audiences; they could balance the needs of science with the priorities of policymakers and so contribute to better policy making that will strengthen health systems going forward.

References

Abraham, C., Kelly, M.P., West, R. & Michie, S. (2009). The UK National Institute for Health and Clinical Excellence (NICE) public health guidance on behaviour change: a brief introduction. Psychology, Health and Medicine, 14, 1-8.

Alliance for Health Policy and Systems Research (2009). Systematic reviews in health policy and systems research. www.who.int/alliance-hpsr/resources/AllianceHPSR_Brief_Note4_ENG.pdf. Accessed 15th January 2013.

Black, N. (2001). Evidence based policy: proceed with care. BMJ, 323, 275-278.

Brownson, R.C., Baker, E.A., Leet, L.T. & Gillespie, K. (eds.) (2003). Evidence Based Public Health. New York: Oxford University Press.

Davies, P. (2006). What is needed from research synthesis from a policy making perspective? In Popay, J. (ed.) Moving Beyond Effectiveness: Methodological Issues in the Synthesis of Diverse Sources of Evidence. London: NICE.

Dixon-Woods, M., Agarwal, S., Jones, D., Young, B. & Sutton, A. (2005). Synthesising qualitative and quantitative evidence: a review of possible methods. Journal of Health Services Research & Policy, 10(1), 45-53.

Donald, A. (2001). Commentary: research must be taken seriously. BMJ, 323, 278-279.

Elliot, U. & Popay, J. (2000). How are policy makers using evidence? Models of research utilisation and local NHS policy making. Journal of Epidemiology and Community Health, 54, 461-468.

Evans, A., Snooks, H., Howson, H. & Davis, M. (2013). How hard can it be to include research evidence and evaluation in local health policy implementation? Results from a mixed methods study. Implementation Science, 8:17. doi:10.1186/1748-5908-8-17

Guide to research methods: evidence pyramid. http://libguides.methodistcollege.edu/content.php?pid=175181&sid=1474687

Hedges, L.V. & Olkin, I. (1985). Statistical Methods for Meta-Analysis. New York: Academic Press.

Kelly, M., Morgan, A., Ellis, S., Younger, T., Huntley, J. & Swann, C. (2010). Evidence based public health: a review of the experience of the National Institute of Health and Clinical Excellence (NICE) of developing public health guidance in England. Social Science & Medicine, 71(6), 1056-1062.

Killoran, A. & Kelly, M. (2004). Towards an evidence based approach to tackling health inequalities: the English experience. Health Education Journal, 63(1), 7-14.

Mays, N., Pope, C. & Popay, J. (2005). Systematically reviewing qualitative and quantitative evidence to inform management and policy making in the health field. Journal of Health Services Research & Policy, 10 (Suppl. 1), July 2005.

Murray, C. & Frenk, J. (2000). A framework for assessing the performance of health systems. Bulletin of the World Health Organization, 78, 717-731.

NICE (2009). review2009-2010.nice.org.uk/public_health. Accessed 20th January 2013.

Nutley, S., Davis, H.T. & Tilley, N. (2000). Letter to the editor. Public Money & Management, 20, 3-6.

Orton, L., Lloyd Williams, F., Taylor, T., et al. (2011). The use of research evidence in public health decision making processes: systematic review. PLOS ONE, 6(7), e21704. doi:10.1371/journal.pone.0021704

Pia, M., et al. (2004). Systematic reviews and meta-analyses: an illustrated step by step guide. The National Medical Journal of India, 17(2), 86-95.

Sackett, D.L., Rosenberg, W.M., Gray, J.A., Haynes, R.B. & Richardson, W.S. (1996). Evidence based medicine: what it is and what it isn't. BMJ, 312(7023), 71-72.

Sechrest, L., White, S. & Brown, E. (1979). The Rehabilitation of Criminal Offenders: Problems and Prospects. Washington, DC: National Academy of Sciences.

White, M., Adams, J. & Heywood, P. (2009). How and why do interventions that increase health overall widen inequalities within populations? In Social Inequality and Public Health. Bristol: Policy Press, 65-83.

Postings

Re: Discussion Task 1 by Margaret McGuinness - Monday, 25 February 2013, 11:37

The area I work in relies very heavily on following European legislation that is for the most part evidence based, but with huge safety limits built in. One area of legislation that is increasingly causing us major issues is that of pesticide levels in drinking water, which are currently set at 0.1ug/l; there is no medical or scientific basis for this particular standard, and as such it is seen very much as a "surrogate zero" level.
While I fully understand the emotive feelings pesticide residues can raise when found in food or drinking water, the fact that we have to work to a level which has no evidence base behind it means that we will have to spend many millions of pounds installing treatment to remove these from drinking water. Many of the new pesticides that are becoming problematic have no removal method that currently removes all or nearly all of the residue, so we can only reduce levels. On speaking to health colleagues in Health Protection Scotland, they realise that this leaves water companies in a very difficult position if they have a works that is failing for pesticides at a low level but above the standard, and they are supportive of any evidence based research that could show whether these levels are correct or not. However, there is great reluctance to change this in Europe, and to date there is no indication that these levels will be reviewed even in the next 10 years. Maybe we need to get evidence based policy more firmly established in this area, so that at least we know what we are doing has some basis in fact. Margaret

Re: Discussion Task 2 by Margaret McGuinness - Monday, 25 February 2013, 15:22

Hi, there have been some really enlightening replies to this, and many mirror my feelings in this area. I too feel that qualitative research has as great a role to play as quantitative, and as such should have the same "weight" research wise. Only by doing this, as Jennie has said, can we hope to get a better understanding of the feelings of those involved and how those feelings can be directed or used to improve evidence based treatments and therapies.
I feel we ignore qualitative research at our peril, as it could show us so much if there were standardised ways of reviewing the research carried out, especially as to whether interventions can be improved and how people actually implement them out in the real world. It seems that this is where the problems lie: as others have said, how do we control researcher bias if the review questions can be revised? I don't have the answers to this, but it feels like there is a clear and evident need for qualitative research reviews to be included in evidence based research. Margaret

Popay, J. (2005). Moving beyond floccinaucinihilipilification: enhancing the utility of systematic reviews. Journal of Clinical Epidemiology, 58(11), 1079-1080.

Re: Discussion Task 1 by Margaret McGuinness - Wednesday, 27 February 2013, 15:53

Hi Janice and Deborah, thanks for your really clear summaries of the two papers. I found both these papers really interesting, as they raised issues I hadn't appreciated before and so have led me to think in much greater depth about all the possible evidence that could and should be involved in policy making. The main point that I took away from the papers was the large number of areas that evidence can be gained from, not just research and analysis. This means that having suitable methods to synthesise that evidence is extremely important if we are to get the best evidence base for policy making. I can see the need to develop methods to appraise these differing evidence types, and like others I hoped there would be one clear, easy framework to follow to achieve this; however this is not to be, and I can fully appreciate that the type of analysis and synthesis of qualitative and quantitative data used will inevitably depend on the questions being asked, the aims of the review and the nature of the available evidence.
One point that was raised that I had not thought about was the extended length of time these reviews can take, and that often the timescale for a full review would not fit in with the policy making timetable, meaning that an interim review would be necessary. The descriptions that Janice gave of the four basic approaches again showed me that which one was used was very much a best fit to the available information, and would again be case dependent. I thought the point raised regarding the need for a wide range of experience to be available within the review team was a very valid one, but, like others, that would not be my general experience, and I wonder how many of us could say that it held true. Since reading these two papers I can see even more that Cochrane style reviews, while having an important role to play, are not enough to allow realistic decisions to be made. The picture is much more complex, and finding ways to evaluate all types of available evidence is how true evidence based policy making will hopefully come about. Margaret

Re: Discussion Task 1 by Margaret McGuinness - Thursday, 14 March 2013, 13:57

Hi, I agree that the use of appropriate search strategies is paramount to carrying out a full and successful review. Popay and Roberts (2006) raise many questions for me about the fullness of reviews and the difficulties of including all relevant data sources. I think the review question is very important in deciding what data to include or not, and I can see that the question will develop as the data is explored. For me, difficulties will arise in trying to synthesise data from all the various databases, as I feel I wouldn't be adequately trained to feel comfortable using all of them, and it would take time for that comfort to develop. Time, as we all know, is something very precious and not readily available.
I think the question from Janice, regarding whether the same research question tackled with three different approaches would produce the same outcomes or not, is really interesting. Does this mean that without one clear methodological choice we run the risk of producing weak reviews, or reviews that are not as complete as we would wish? Does this drive a desire for one clear methodology or not? I also feel that Deborah's point regarding the depth of knowledge, skills, bias etc. is a really important one: is a review a review even if the quality is questionable or bias is involved, and indeed how would we address this? So really, again, for me more questions than answers, and again this highlights the difficulties of carrying out full reviews of all relevant data.

Popay, J. and Roberts, H. (2006). Introduction: methodological issues in the synthesis of diverse sources of evidence. In Popay, J. (ed.) Moving Beyond Effectiveness: Methodological Issues in the Synthesis of Diverse Sources of Evidence. Margaret

Re: Evidence Synthesis - Tanning Bed Use by Margaret McGuinness - Tuesday, 19 March 2013, 12:58

Hi Rajesh, I found your study really interesting, and I think you came up against many of the issues we all felt while trying to conduct a review. I carried out my review on the quality of life impacts of bariatric surgery for the obese. I found plenty of information, both qualitative and quantitative, but the two answered the question very differently, and this showed me early on that my question needed to be more specific: it needed to be targeted at particular areas of quality of life, not just a generalisable "quality of life", as this covered too wide an area.
The challenges for me were getting the right question, one that addressed the information I wanted to get out of the review, and the fact that trying to mix qualitative and quantitative data was extremely difficult: making the data similar enough to answer the question was a major problem, as was, at times, finding the outcomes within the body of the papers. I found it difficult trying to make qualitative findings look like quantitative ones, and wonder if I would have found it easier to do it the other way round (I will try this to see). Picking out the relevant figures in the studies was also an issue at times. The fact that even four papers took a long time was a further challenge, and I can see that carrying out a full review would be a real challenge time wise and resource wise. I think the main reason I found this a challenging process was that I felt I did not have the necessary experience, either in the area I was looking at or in the process I was trying to carry out. This really brought home to me the importance of having experienced reviewers and expert knowledge to hand when carrying out such reviews. To these ends, I think my experiences of attempting this review really reflected all I have learnt through this module. Margaret

Re: Evidence Synthesis - Tanning Bed Use by Margaret McGuinness - Tuesday, 19 March 2013, 13:31

Papers for review.docx

Apologies, I forgot to attach my table of papers. Margaret