Variation in Research Designs Used to Test the Effectiveness of Dissemination and Implementation Strategies: A Review

Background: The need for optimal study designs in dissemination and implementation (D&I) research is increasingly recognized. Despite the wide range of study designs available for D&I research, little is known about the types of designs and methodologies routinely used in the field. This review assesses the designs and methodologies in recently proposed D&I studies and provides resources to guide design decisions.

Methods: We reviewed 404 study protocols published in the journal Implementation Science from February 2006 to September 2017. Eligible studies tested the efficacy or effectiveness of D&I strategies (i.e., not the effectiveness of the underlying clinical or public health intervention), had a comparison by group and/or time, and used at least one quantitative measure. Several design elements were extracted: design category (e.g., randomized); design type [e.g., cluster randomized controlled trial (RCT)]; data type (e.g., quantitative); D&I theoretical framework; levels of treatment assignment, intervention, and measurement; and country in which the research was conducted. Each protocol was double-coded, and discrepancies were resolved through discussion.

Results: Of the 404 protocols reviewed, 212 studies (52%), reported across 208 manuscripts, tested one or more implementation strategies and therefore met the inclusion criteria. Of the included studies, 77% used randomized designs, primarily cluster RCTs, and the use of alternative designs (e.g., stepped wedge) increased over time. Fewer studies were quasi-experimental (17%) or observational (6%). Many design categories (e.g., controlled pre–post, matched-pair cluster designs) were represented by only one or two studies. Most articles proposed both quantitative and qualitative methods (61%); the remaining 39% proposed only quantitative methods. Half of the protocols (52%) reported using a theoretical framework to guide the study. The four most frequently reported frameworks were the Consolidated Framework for Implementation Research and RE-AIM (n = 16 each), followed by Promoting Action on Research Implementation in Health Services and the Theoretical Domains Framework (n = 12 each).

Conclusion: While several novel designs for D&I research have been proposed (e.g., stepped wedge, adaptive designs), the majority of studies in our sample employed RCT designs. Alternative study designs are increasing in use but may be underutilized for a variety of reasons, including funder preferences or a lack of awareness of these designs. Promisingly, the prevalent combination of quantitative and qualitative methods reflects methodological innovation in newer D&I research.

Bibliographic Details
Main Authors: Stephanie Mazzucca, Rachel G. Tabak, Meagan Pilar, Alex T. Ramsey, Ana A. Baumann, Emily Kryzer, Ericka M. Lewis, Margaret Padek, Byron J. Powell, Ross C. Brownson
Author Affiliations: Prevention Research Center in St. Louis, Brown School, Washington University in St. Louis, St. Louis, MO, United States (Mazzucca, Tabak, Pilar, Padek, Brownson); Department of Psychiatry, Washington University School of Medicine, St. Louis, MO, United States (Ramsey); Brown School of Social Work, Washington University in St. Louis, St. Louis, MO, United States (Baumann, Kryzer); School of Social Work, University of Maryland, Baltimore, MD, United States (Lewis); Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States (Powell); Department of Surgery, Alvin J. Siteman Cancer Center, Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, United States (Brownson)
Format: Article
Language: English
Published: Frontiers Media S.A., 2018-02-01
Series: Frontiers in Public Health
ISSN: 2296-2565
DOI: 10.3389/fpubh.2018.00032
Subjects: research study design; research methods; review; implementation research; dissemination research
Online Access: http://journal.frontiersin.org/article/10.3389/fpubh.2018.00032/full