Quality and reporting standards, resources, training materials and information for realist evaluation: the RAMESES II project
Background: Many of the problems confronting policy- and decision-makers, evaluators and researchers today are complex, as are the interventions designed to tackle them. Their success depends both on individuals’ responses and on the wider context of people’s lives. Realist evaluation tries to make sense of these complex interventions. It is a form of theory-driven evaluation, based on realist philosophy, that aims to understand why these complex interventions work, how, for whom, in what context and to what extent.

Objectives: Our objectives were to develop (a) quality standards, (b) reporting standards, (c) resources and training materials and (d) information and resources for patients and other lay participants, and (e) to build research capacity among those interested in realist evaluation.

Methods: To develop the quality and reporting standards, we undertook a thematic review of the literature, supplemented by our content expertise and feedback from presentations and workshops. We synthesised findings into briefing materials for realist evaluations for the Delphi panel (a structured method using experts to develop consensus). To develop our resources and training materials, we drew on our experience in developing and delivering education materials, feedback from the Delphi panel, the RAMESES JISCMail e-mail list, training workshops and feedback from training sessions. To develop information and resources for patients and other lay participants in realist evaluation, we convened a group consisting of patients and the public. We built research capacity by running workshops and training sessions.

Results: Our literature review identified 152 realist evaluations, and when 37 of these had been analysed we were able to develop our briefing materials for the Delphi panel. The Delphi panel comprised 35 members from 27 organisations across six countries and five disciplines. Within three rounds, the panel had reached a consensus on 20 key reporting standards. The quality standards consist of eight criteria for realist evaluations. We developed resources and training materials for 15 theoretical and methodological topics. All resources are available online (www.ramesesproject.org). We provided methodological support to 17 projects, and delivered presentations or workshops to 29 organisations to help build research capacity in realist evaluations. Finally, we produced a generic patient information leaflet for lay participants in realist evaluations.

Limitations: Our project had ambitious goals that created a substantial workload, leading to the need to prioritise objectives. For example, we truncated the literature review and focused on standards and training material development.

Conclusions: Although realist evaluation holds much promise, misunderstandings and misapplications of it are common. We hope that our project’s outputs and activities will help to address these problems. Our resources are the start of an iterative journey of refinement and development of better resources for realist evaluations. The RAMESES II project seeks not to produce the last word on these issues, but to capture current expertise and establish an agreed state of the science. Much methodological development is needed in realist evaluation, but this can take place only if there is a sufficient pool of highly skilled realist evaluators. Capacity building is the next key step in realist evaluation.

Funding: The National Institute for Health Research Health Services and Delivery Research programme.
Main Authors: Geoff Wong, Gill Westhorp, Joanne Greenhalgh, Ana Manzano, Justin Jagosh, Trisha Greenhalgh
Format: Article
Language: English
Published: National Institute for Health Research, 2017-10-01
Series: Health Services and Delivery Research
ISSN: 2050-4349; 2050-4357
Subjects: realist evaluation; quality standards; reporting standards; training materials; Delphi process; RAMESES project
Online Access: https://doi.org/10.3310/hsdr05280
Author affiliations:
Geoff Wong: Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
Gill Westhorp: Realist Research Evaluation and Learning Initiative, Charles Darwin University, Darwin, NT, Australia
Joanne Greenhalgh: Sociology and Social Policy, University of Leeds, Leeds, UK
Ana Manzano: Sociology and Social Policy, University of Leeds, Leeds, UK
Justin Jagosh: Centre for Advancement in Realist Evaluation and Syntheses (CARES), University of Liverpool, Liverpool, UK
Trisha Greenhalgh: Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK