Abstract
Introduction: Realist evaluation is an increasingly popular methodology in health services research. For realist evaluations (RE), this project aims to: develop quality and reporting standards and training materials; build capacity for undertaking and critically evaluating them; and produce resources and training materials for lay participants and those seeking to involve them.
Methods: To achieve our aims, we will: (1) Establish management and governance infrastructure; (2) Recruit an interdisciplinary Delphi panel of 35 participants with diverse relevant experience of RE; (3) Summarise current literature and expert opinion on best practice in RE; (4) Run an online Delphi panel to generate and refine items for quality and reporting standards; (5) Capture ‘real world’ experiences and challenges of RE, for example by providing ongoing support to realist evaluations, hosting the RAMESES JISCmail list on realist research, and feeding problems and insights from these into the deliberations of the Delphi panel; (6) Produce quality and reporting standards; (7) Collate examples of the learning and training needs of researchers, students, reviewers and lay members in relation to RE; (8) Develop and evaluate training materials for RE and deliver training workshops; (9) Develop and evaluate information and resources for patients and other lay participants in RE (eg, draft template information sheets and model consent forms); and (10) Disseminate training materials and other resources.
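The abstract does not specify how panel ratings in step (4) will be analysed. Purely as an illustrative sketch, the snippet below tallies panellists' ratings of candidate reporting-standard items and flags those reaching a consensus threshold; the 70% threshold, the rating categories and the `summarise_round` helper are hypothetical assumptions for illustration and are not taken from the RAMESES II protocol.

```python
from __future__ import annotations
from collections import Counter

# Hypothetical consensus threshold: proportion of panellists rating an
# item "essential" for it to be retained in the next Delphi round.
# (Assumption for illustration only; not specified in the protocol.)
CONSENSUS_THRESHOLD = 0.70

def summarise_round(ratings: dict[str, list[str]]) -> dict[str, dict]:
    """Tally ratings per candidate item and flag items reaching consensus.

    `ratings` maps each candidate standard item to the panellists' ratings
    ("essential", "desirable" or "not needed").
    """
    summary = {}
    for item, votes in ratings.items():
        counts = Counter(votes)
        agreement = counts["essential"] / len(votes) if votes else 0.0
        summary[item] = {
            "counts": dict(counts),
            "agreement": round(agreement, 2),
            "retain": agreement >= CONSENSUS_THRESHOLD,
        }
    return summary

# Example round with three made-up candidate items and made-up votes.
round_1 = {
    "State the programme theory": ["essential"] * 8 + ["desirable"] * 2,
    "Describe context-mechanism-outcome configurations": ["essential"] * 6 + ["desirable"] * 4,
    "List all data sources": ["essential"] * 9 + ["not needed"],
}
for item, result in summarise_round(round_1).items():
    print(item, result)
```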
Planned outputs: (1) Quality and reporting standards and training materials for RE. (2) Methodological support for RE. (3) Increase in capacity to support and evaluate RE. (4) Accessible, plain-English resources for patients and the public participating in RE.
Discussion: Realist evaluation is a relatively new approach to evaluation, and its overall place is not yet fully established. As with all primary research approaches, guidance on quality assurance and uniform reporting is an important step towards improving quality and consistency.
Original language | English |
---|---|
Pages (from-to) | 1-10 |
Number of pages | 10 |
Journal | BMJ Open |
Volume | 5 |
Issue number | 8 |
DOIs | |
Publication status | Published - Aug 2015 |