Research article · Open Access · Open Peer Review
Evaluating a train-the-trainer approach for improving capacity for evidence-based decision making in public health
BMC Health Services Research, volume 15, Article number: 547 (2015)
Abstract
Evidence-based public health gives public health practitioners the tools they need to make choices based on the best and most current evidence. An evidence-based public health training course, developed in 1997 by the Prevention Research Center in St. Louis, has been taught by a transdisciplinary team multiple times with positive results. To scale up evidence-based practices, a train-the-trainer initiative was launched in 2010.
This study examines the outcomes achieved among participants of courses led by trained state-level faculty. Participants from trainee-led courses in four states (Indiana, Colorado, Nebraska, and Kansas) over three years were asked to complete an online survey. Attempts were made to contact 317 past participants; of the 283 who were reachable, 144 (50.9 %) completed the survey and were included in the analysis. Outcomes measured included the frequency of use of materials, resources, and other skills or tools from the course; reasons for not using the materials and resources; and benefits from attending the course. Survey responses were tabulated and compared using Chi-square tests.
Among the most commonly reported benefits, 88 % of respondents agreed that they acquired knowledge about a new subject, 85 % saw applications for the knowledge to their work, and 78 % agreed that the course improved their ability to make scientifically informed decisions at work. The most commonly reported reasons for not using course content as much as intended included not having enough time to implement evidence-based approaches (42 %), other staff/peers lacking training (34 %), and not enough funding for continued training (34 %). The findings suggest that utilization of course materials and teachings remains relatively high across practitioner groups, whether participants were taught by the original trainers or by state-based trainers.
The findings of this study suggest that train-the-trainer is an effective method for broadly disseminating evidence-based public health principles. Train-the-trainer is less costly than the traditional method and allows for courses to be tailored to local issues, thus making it a viable approach to dissemination and scale up of new public health practices.
Background
Public health is a diverse field, employing people from a variety of backgrounds in a wide range of occupations. The occupations are as varied as the levels of education, which range from high school diplomas to doctoral degrees. Available data suggest that less than half of the public health workforce has formal training in public health [2, 3]. Further limiting the standardization of skills across roles is the lack of formal core competencies or certification criteria for most practitioners. Long-term solutions for filling this gap in preparedness include on-the-job training as a way to disseminate knowledge and enhance the skills of public health practitioners. Yet training opportunities vary widely by region and face myriad challenges, including high staff turnover, a lack of available local trainers, and restrictions on travel that would allow participation in continuing education [5–7].
Workforce capacity building in public health has been an area of focus for decades since attention was drawn to the inadequate public health infrastructure [3, 4, 8, 9]. Strengthening the public health infrastructure was a driving force behind the formation of the Public Health Accreditation Board, which developed accreditation standards and measures for public health agencies [10, 11]. Assuring workforce competence is one of the 10 domains. Part of this domain focuses on assessing knowledge and skill gaps and providing appropriate training. The final domain specifically addresses contributing to and applying the evidence base of public health.
Beginning in the 1990s and following the lead established in medicine, public health recognized the need to identify the evidence of effectiveness for different interventions, translate that evidence into recommendations for practice, and increase the extent to which that evidence is used [12–14]. Evidence-based public health (EBPH) has been described as the integration of science-based interventions with community preferences to improve population health. By its nature, EBPH is an iterative and dynamic process, as it takes place in natural settings rather than in controlled experimental situations. Because EBPH is a relatively new approach to public health practice, many practitioners, regardless of educational background, have not received formal training on this topic. One of the most widely disseminated training efforts to improve evidence-based decision making in public health has been a course developed in 1997 in Missouri. The EBPH course, offered in a 2.5- to 4-day format, includes nine modules that cover the core principles of evidence-based public health, from problem definition through program development to evaluation [7, 12, 16–18]. It is designed to provide tools and information that will improve skills for evidence-based decision making among public health practitioners.
Initially, the EBPH course was taught exclusively in Missouri for state and local public health practitioners. In an attempt to broaden the reach of training, the Centers for Disease Control and Prevention began its support of a national course in 2002. This annual training draws 25–35 participants from state and local government, non-governmental organizations (e.g., American Cancer Society, YMCA), and other sectors. In the first long-term evaluation of EBPH, which included Missouri and national participants from 2001–2004, 90 % of respondents reported that the course helped them make informed decisions in the workplace, and 82 % of participants reported that the course content helped them communicate better with co-workers. While participants value the course and content, its reach into the public health workforce is not complete due to a number of barriers. Based on both qualitative and quantitative evaluations, one of the leading barriers to applying the skills taught in the course is not having co-workers who are also trained in EBPH [5, 19, 20].
To ensure a critical mass of workers with a common language and understanding of EBPH, training must be “scaled up.” Scalability is the process by which an intervention shown to be efficacious on a small scale (under controlled conditions) is expanded under real world conditions to reach a broader practice or policy audience [21, 22]. Scaling-up a public health innovation like the EBPH training program would improve coverage and access to the training and its intended benefits by reducing cost, utilizing in-state trainers who are knowledgeable of local issues, and encouraging collaboration among researchers and staff from neighboring universities and local and state public health departments. The process of scaling-up requires an implementation plan that considers the context, delivery mechanisms, and resource requirements of the program [23, 24].
In an effort to scale up the EBPH training course, the program was expanded in 2010 to begin taking the training to states with the aim of building EBPH capacity within those health departments and leveraging their expertise to train co-workers and others. The approach, funded by the National Association of Chronic Disease Directors (NACDD), was a train-the-trainer program.
Train-the-trainer programs are used in a wide variety of fields for workforce development, including public health preparedness; occupational safety; nutrition education; health care issues [28–32]; and a variety of clinical interventions [33, 34]. Train-the-trainer approaches have been used extensively in HIV prevention and education to train clinicians and peers [35–40]. There are a number of potential advantages to train-the-trainer approaches, the most obvious being the ability to reach larger audiences through subsequent training activities led by those who were trained initially. Assuming the trainees are local to the audiences they will train, they may have more direct access to those communities and a better understanding of contextual issues affecting application of the training. Building capacity at the local level also has the potential to enhance collaboration and networking among those trained and to sustain the training.
Despite its widespread use and potential benefits, the literature on the effectiveness of train-the-trainer approaches is limited. A contributing factor is that many of those who participate in train-the-trainer programs do not replicate training sessions at the local level. For example, only 20 % of those trained in disaster preparedness conducted a replication training 6 months after they were trained. Similarly, in a study of perinatal HIV prevention and care training, only 20 % went on to conduct training after being trained.
In the EBPH train-the-trainer program, state chronic disease units were invited to apply for on-site training by faculty from the Prevention Research Center in St. Louis (PRC-StL); they were encouraged to involve faculty from local schools of public health in the process. A condition of award was that states agree to replicate the course at least once in the subsequent year, taught by in-state trainers who had completed the training course.
Each year, one to two states were selected to have training on site. Twenty-five to forty participants, including public health practitioners from state and local government along with their partners from academic centers and community organizations, were trained in each state. Between 2010 and 2015, ten states received training. To date, six states have replicated the course two or more times; three have offered it once, and two are in the planning stages for their first or second replication.
Among the participants in the initial training were people who had been previously identified as potential future trainers. After the initial training, local trainers were provided support materials, including guidance on adult learning techniques and dialogue education, as well as technical assistance from a NACDD contractor, who also worked closely with a local coordinator throughout the process and served as a liaison between state-based faculty and the original training team.
Several steps were taken to maximize course fidelity. All replication courses included the same nine core modules as the original training. While the objectives, framework, and essential content remained the same, the state-based trainers were encouraged to consider their state’s priorities and incorporate local data and relevant program and policy examples wherever appropriate. The NACDD contractor and course developers collaborated with states on tailoring and any other proposed changes to the content or format. At the time of their first replication, new trainers were also observed and provided constructive feedback.
Innovative products, programs, and practices often fall short of realizing their full impact due to scaling-up challenges. Fortunately, research interest in scale-up and spread is increasing [21, 42]. However, much of the literature on scaling up innovations to date has focused on barriers and facilitators [23, 43]. In this paper, we evaluate a train-the-trainer approach to scaling up. We describe the application of training concepts and tools by participants of courses led by trained state-level staff and the reach of training by those states. As EBPH is a complex, iterative approach to decision making in public health, assessing gains in knowledge and skills from individual modules provides a limited picture. Instead, implementation of core concepts is measured and used as a proxy for gains in knowledge and skills.
Methods
This research was approved by the Saint Louis University Institutional Review Board.
In this evaluation, we surveyed public health practitioners who attended a state-sponsored EBPH course between 2011 and 2013 in Colorado, Indiana, Kansas, or Nebraska. Total replications per state ranged from one to five during that time period. In total, 317 past attendees were contacted via email and invited to take a brief (10 min on average) survey in Qualtrics. To increase the response rate, participants received two reminder emails, a phone call, and a final reminder email. The survey remained open for 3 months. The 34 course attendees who could not be reached by email, had no working phone number, and/or no longer worked at the health department were deemed unreachable, leaving 283 possible respondents. The final response rate was 50.9 % (144/283).
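As a quick check, the response-rate bookkeeping in this paragraph can be reproduced directly (all figures are taken from the text):

```python
# Response-rate arithmetic, using the figures reported in the methods.
contacted = 317            # past attendees invited by email
unreachable = 34           # no working email/phone, or no longer at the agency
eligible = contacted - unreachable   # 283 possible respondents
completed = 144            # surveys completed and included in analysis

response_rate = completed / eligible
print(f"{completed}/{eligible} = {response_rate:.1%}")  # 144/283 = 50.9%
```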
Along with background characteristics, the survey included questions on the frequency of use of materials, resources, and other skills or tools from the course; reasons for not using the materials and resources as much as intended; and benefits from attending the course. Use of materials/skills was measured on a four-point frequency scale (seldom/never, quarterly, monthly, or weekly). Course benefits and reasons for less-than-intended material/skill use were measured on a 5-point Likert scale (from strongly disagree to strongly agree). The survey also included open-ended questions in which participants were invited to describe the most useful parts of the training and what could have been done differently to improve the course. The survey instruments are available from the last author and in Additional file 1.
We calculated frequencies and conducted descriptive statistics to explore participant characteristics and responses. Similar to other work [7, 19, 45], we compared data across three mutually exclusive groups: state health department, local health department (county or city), and participants from an agency other than a health department, such as a university or community organization. We conducted Chi-square tests to determine statistical differences in proportions across the three participant groups. Statistical significance was set at p < 0.05. No clustering effects by state were found; therefore, no adjustments were made for state membership, as the survey assessed individual-level opinions on personal competencies. For qualitative analysis of open-ended items, responses were grouped and coded for main themes. Direct quotes were then selected to represent the main themes that emerged.
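To illustrate the analytic approach, the sketch below runs a Pearson Chi-square test of independence on a 3 × 2 table of the three participant groups against a binary outcome. The counts are invented for illustration only (they loosely echo the group sizes reported later) and are not the study's raw data:

```python
import math

# Hypothetical counts (NOT the study's data): rows are the three participant
# groups, columns are "uses a course skill at least monthly" vs. "less often".
observed = {
    "local health dept.": (19, 62),
    "state health dept.": (17, 20),
    "other agency":       (12, 14),
}

rows = list(observed.values())
row_totals = [a + b for a, b in rows]
col_totals = [sum(r[j] for r in rows) for j in range(2)]
grand = sum(row_totals)

# Pearson Chi-square statistic: sum of (O - E)^2 / E over all cells,
# where E = (row total * column total) / grand total.
stat = sum(
    (rows[i][j] - row_totals[i] * col_totals[j] / grand) ** 2
    / (row_totals[i] * col_totals[j] / grand)
    for i in range(3) for j in range(2)
)

# A 3x2 table has df = (3-1)*(2-1) = 2, and for df = 2 the chi-square
# survival function has the closed form p = exp(-x/2). (This shortcut
# holds only for df = 2; in general use scipy.stats.chi2_contingency.)
p_value = math.exp(-stat / 2)
print(f"chi2 = {stat:.2f}, p = {p_value:.3f}")
```

With these illustrative counts the difference across groups is significant at p < 0.05, mirroring the kind of group comparison reported in the results.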
Results
Most respondents (56 %) were from local city or county health departments (Table 1). In addition, a little over a quarter (26 %) were from state health departments, and another 13 % were from various other organizations, such as universities (5 %), community-based organizations (5 %), and other non-profit and health-related entities (3 %). Program manager or coordinator was the most frequently reported job position (35 %), followed by health educator or community health worker (25 %). Eleven percent of the sample was considered upper agency management (e.g., division or bureau heads, directors, deputies). Almost half (49 %) held at least a master's degree as their highest degree earned (16 % with a Master of Public Health), while just over a fourth (26 %) held a bachelor's degree or less. Areas of specialization among participants varied widely. More than one-third (36 %) specialized in health promotion. Other common areas of specialization included obesity, physical activity and/or nutrition (25 %); epidemiology or evaluation (24 %); tobacco (19 %); and communicable diseases (18 %). Participants reported a mean of 10 years (SD = 7.2 years) working in public health.
Benefits from attending the course
Participants reported numerous benefits from attending the EBPH course (Table 2). Among the most commonly reported benefits, 88 % of respondents agreed that they acquired knowledge about a new subject, 85 % saw applications for the knowledge to their work, and 78 % agreed the course improved their ability to make scientifically informed decisions at work. Approximately one-third agreed that the course helped them prepare policy briefings (32 %) or obtain funding for programs (31 %). Two benefits from the course varied by type of agency. Local and state health participants were less likely than participants from other agencies to report that the course helped them to adapt an evidence-based intervention to a community's needs (53 %, 68 %, and 81 %, respectively, p = .02). In addition, those from other agencies and local health participants were less likely than those from state health departments to agree that the course helped them to implement evidence-based practices in CDC cooperative agreements or other federal programs (27 %, 40 %, and 58 %, respectively, p = .04).
Use of course materials, resources and skills
The frequency with which core materials and skills from the course were used varied (Table 3). One-third (33 %) reported searching the scientific literature at least once per month. This proportion was lower among local health participants (23 %) than among those from state health departments (45 %) and other agencies (46 %) (p = .02). In addition, local health participants were less likely than state health participants and those from other agencies to report using materials and skills from the course at least monthly to evaluate a program (12 %, 37 %, and 31 %, respectively, p = .004). The most commonly reported reasons for not using course content as much as intended included not having enough time to implement EBPH approaches (42 %), other staff/peers lacking EBPH training (34 %), and not enough funding for continued training (34 %) (data not shown). No associations were found between use of course materials, resources, and skills and participants' education level or degree type.
Responses to open-ended questions
Two open-ended questions were included in the survey. The first asked respondents what was the most useful part of the training (Table 4). One group of responses clustered into themes about content—learning about EBPH, learning about available resources, and gaining knowledge in specific areas. One respondent wrote, “Having a tangible reference as to what evidence-based public health strategies meant and how they could be used in our everyday work lives.” Another said, “Utilizing data and information to select evidence-based strategies/ programs.” Another set of responses centered on the learning process. Comments included, “Small group discussion and group work developing examples of EBPH,” and “Interaction with other team members to discuss ways to improve or work with EBPH methods.” Other comments related to specific content areas, e.g., the value of the module on economic evaluation and the concept of return on investment.
The second qualitative question asked how the training could be improved (Table 5). The main themes in responses related to the need for more course follow-up, more examples from practice, and more group and hands-on work. There was also a group of responses regarding the length and level of individual modules and the course overall. The desire for continued learning through follow-up sessions was cited most often, as illustrated by these comments: “Maybe offer continuing or follow-up training to keep us fresh,” and “Refresher courses one time per year where each participant could perhaps bring an example to present to others.” The latter comment also speaks to the theme of wanting more examples of how EBPH has been used in practice. Another respondent wrote, “Possibly more real life examples of how programs and various job positions can incorporate it in their work.”
Comparison to traditional PRC-led courses
A comparison of train-the-trainer respondents to those taught by the original trainers showed few differences in reported outcomes (Table 6). The train-the-trainer group was significantly more likely to agree or strongly agree that they had acquired new knowledge (88 vs. 78 %) and that they could adapt an intervention to a community's needs while keeping it evidence-based (62 vs. 51 %). The traditionally trained group reported higher agreement regarding the ability to implement evidence-based practices in a CDC cooperative agreement or other federal program (60 vs. 42 %). There were no significant differences between groups in the utilization of EBPH course materials and resources.
Discussion
This evaluation provides support for the effectiveness of a train-the-trainer method for improving skills and capacity to practice EBPH. Nearly 80 % of respondents who took a state-based course taught by in-state trainers reported that the course had helped them to make scientifically based decisions at work. Additionally, four out of five participants agreed or strongly agreed that the course had helped them become a better leader who promotes evidence-based decision making.
Previous studies have assessed the impact of the EBPH courses, both domestically and abroad. One such study utilized a follow-up survey of participants taking the course between 2001 and 2004. Another evaluation surveyed those taking the course from 2008 to 2011. These evaluations followed up with participants who had been trained by the original PRC-StL faculty. The surveys assessed whether or not participants used certain skills and tools from the course on at least a monthly basis. The 2008–2011 cohort also included international participants, but those participants are excluded from this discussion to allow for better comparability between groups.
Comparing results of the current evaluation with the most recent evaluation of the traditional course format allows us to compare benefits of training and follow-up use of course materials and concepts (Table 6). The frequencies of most benefits were comparable for the traditional vs. train-the-trainer format, suggesting similar effectiveness of the train-the-trainer model. Two benefits were more often cited among train-the-trainer participants (acquire new knowledge, adapt to a community's needs), whereas one benefit was more common among traditional format respondents (implement evidence-based practices in a CDC/federally funded program). Participants reporting that they used EBPH materials and skills in planning a new program at least monthly fluctuated only slightly (nonsignificantly) between the traditional and train-the-trainer formats.
Barriers to EBPH, as noted in the current study, provide the context for developing and scaling up public health training programs. Although the rankings and percentages vary slightly among the studies of the EBPH program to date, three barriers have consistently been among those most commonly cited: not having enough time, not having funding, and not having co-workers who are also trained in EBPH [5, 7, 19, 20, 46]. To address the issue of adequate time for applying EBPH concepts, the course seeks to identify user-friendly tools that are readily available to practitioners (e.g., the Community Guide, the National Network of Libraries of Medicine). The lack of co-workers trained in EBPH points to the need for a “critical mass” of committed staff and a social network in support of evidence-based decision making [48, 49]. Having trainers who live and work in-state provides local EBPH experts in the workplace, allowing for more rapid spread of EBPH processes through enhanced communication and ongoing collaboration among colleagues.
A few limitations of the current evaluation deserve note. First, the data collected are self-reported, measuring respondents' perceptions of learning and impacts. It is possible that participants over- or under-rated their skills and knowledge when responding to survey items. Second, the time gap between delivery of the course and data collection meant that a sizable proportion of participants (14 %) had changed jobs since taking the course, making them more difficult to contact. Although several attempts were made, another 11 % of eligible course participants were not reachable by email or phone. Additionally, the data collected did not allow for subanalyses examining the time between course participation and survey response as an independent variable. Further research is needed to determine whether skills and/or benefits from the course change over time, particularly as replications continue and the time since training widens. Finally, we do not have relevant data about non-respondents, and thus cannot conclude that respondents constitute a representative sample.
Conclusions
Based on the data presented here, a train-the-trainer model is a viable method for expanding the reach of EBPH training. An EBPH train-the-trainer program can effectively improve numerous skills essential to evidence-based decision making among public health practitioners. To maximize efficiency and take advantage of advances in technology, several sites have expressed interest in electronic or virtual platforms for training. To date, one site has implemented an online-only option, and another has offered a hybrid version of the course in which select content is provided via webinar followed by 2 days of face-to-face training. Traditionally, course participants have rated highly the aspects of the course that enable working together during training and networking with peers. Thus, any potential benefits of online modalities will need to be carefully balanced against the loss of face-to-face interaction. Future research will be needed to evaluate the effectiveness of these modalities.

To date, ten states (Kansas, Colorado, Indiana, Nebraska, Florida, New York, Texas, Vermont, Oklahoma, and Tennessee) have been a part of the train-the-trainer program. Future analyses will be needed to compare these training outcomes to those measured in this study. Of future research interest is also the longer-term impact of training on program and policy development and the development of strategic collaborations. Although these are beyond the scope of our training goals, other training programs have noted positive impacts on public health policy and development of research networks as indirect benefits of training at the network and organizational levels [50, 51].
This evaluation and related literature [25, 29, 31] suggest many benefits and lessons of the train-the-trainer model, including: 1) the advantage of local trainers who are more familiar with contextual issues, allowing tailoring of the training; 2) enhanced collaboration among practice and academic partners, creating a forum for networking and new partnership opportunities; 3) a more convenient and less costly method of training that eliminates the need to bring in external trainers or for participants to travel out of state; and 4) specific examples of how to improve the course in the future. This evaluation suggests that the train-the-trainer method has increased the capacity of practitioners trained in EBPH while maintaining fidelity with the original objectives and framework of the course.
References
1. Koo D, Miner K. Outcome-based workforce development and education in public health. Annu Rev Public Health. 2010;31:253–69.
2. Centers for Disease Control and Prevention. Fact sheet: public health infrastructure. Atlanta, GA: Centers for Disease Control and Prevention; 2001.
3. Turnock BJ. Public health: what it is and how it works. 4th ed. Sudbury, MA: Jones and Bartlett Publishers; 2009.
4. Institute of Medicine. Who will keep the public healthy? Educating public health professionals for the 21st century. Washington, DC: National Academies Press; 2003.
5. Baker EA, Brownson RC, Dreisinger M, McIntosh LD, Karamehic-Muratovic A. Examining the role of training in evidence-based public health: a qualitative study. Health Promot Pract. 2009;10(3):342–8.
6. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30:175–201.
7. Dreisinger M, Leet TL, Baker EA, Gillespie KN, Haas B, Brownson RC. Improving the public health workforce: evaluation of a training course to enhance evidence-based decision making. J Public Health Manag Pract. 2008;14(2):138–43.
8. Institute of Medicine. The future of public health. Washington, DC: National Academy Press; 1988.
9. Institute of Medicine. The future of the public's health in the 21st century. Washington, DC: National Academies Press; 2003.
10. Bender K, Halverson PK. Quality improvement and accreditation: what might it look like? J Public Health Manag Pract. 2010;16(1):79–82.
11. Public Health Accreditation Board. Standards and measures, version 1.5. Alexandria, VA: Public Health Accreditation Board; 2013. Available from: http://www.phaboard.org/wp-content/uploads/SM-Version-1.5-Board-adopted-FINAL-01-24-2014.docx.pdf. Accessed August 25, 2015.
12. Brownson RC, Gurney JG, Land GH. Evidence-based decision making in public health. J Public Health Manag Pract. 1999;5(5):86–97.
13. Glasziou P, Longbottom H. Evidence-based public health practice. Aust N Z J Public Health. 1999;23(4):436–40.
14. Jenicek M. Epidemiology, evidence-based medicine, and evidence-based public health. J Epidemiol Commun Health. 1997;7:187–97.
15. Kohatsu ND, Robinson JG, Torner JC. Evidence-based public health: an evolving concept. Am J Prev Med. 2004;27(5):417–21.
16. Brownson RC, Baker EA, Leet TL, Gillespie KN, True WR. Evidence-based public health. 2nd ed. New York: Oxford University Press; 2011.
17. Brownson RC, Diem G, Grabauskas V, Legetic B, Potemkina R, Shatchkute A, et al. Training practitioners in evidence-based chronic disease prevention for global health. Promot Educ. 2007;14(3):159–63.
18. O'Neall MA, Brownson RC. Teaching evidence-based public health to public health practitioners. Ann Epidemiol. 2005;15(7):540–4.
19. Gibbert WS, Keating SM, Jacobs JA, Dodson E, Baker E, Diem G, et al. Training the workforce in evidence-based public health: an evaluation of impact among US and international practitioners. Prev Chronic Dis. 2013;10:E148.
20. Jacobs JA, Duggan K, Erwin P, Smith C, Borawski E, Compton J, et al. Capacity building for evidence-based decision making in local health departments: scaling up an effective training approach. Implement Sci. 2014;9(1):124.
21. Milat AJ, King L, Bauman AE, Redman S. The concept of scalability: increasing the scale and potential adoption of health promotion interventions into policy and practice. Health Promot Int. 2012;28(3):285–98.
22. Milat AJ, King L, Newson R, Wolfenden L, Rissel C, Bauman A, et al. Increasing the scale and adoption of population health interventions: experiences and perspectives of policy makers, practitioners, and researchers. Health Res Policy Syst. 2014;12:18.
23. Mangham LJ, Hanson K. Scaling up in international health: what are the key issues? Health Policy Plan. 2010;25(2):85–96.
24. Diem G, Brownson RC, Grabauskas V, Shatchkute A, Stachenko S. Prevention and control of noncommunicable diseases through evidence-based public health: implementing the NCD 2020 action plan. Glob Health Promot. 2015.
25. Orfaly RA, Frances JC, Campbell P, Whittemore B, Joly B, Koh H. Train-the-trainer as an educational model in public health preparedness. J Public Health Manag Pract. 2005;Suppl:S123–7.
26. Trabeau M, Neitzel R, Meischke H, Daniell WE, Seixas NS. A comparison of “Train-the-Trainer” and expert training modalities for hearing protection use in construction. Am J Ind Med. 2008;51(2):130–7.
27. McClelland JW, Irving LM, Mitchell RE, Bearon LB, Webber KH. Extending the reach of nutrition education for older adults: feasibility of a Train-the-Trainer approach in congregate nutrition sites. J Nutr Educ Behav. 2002;34 Suppl 1:S48–52.
28. Assemi M, Mutha S, Hudmon KS. Evaluation of a train-the-trainer program for cultural competence. Am J Pharm Educ. 2007;71(6):110.
29. Bess CA, LaHaye C, O'Brien CM. Train-the-Trainer Project meets organization's strategic initiative for retention and continuous learning. J Nurses Staff Dev. 2003;19(3):121–7.
30. Green ML. A train-the-trainer model for integrating evidence-based medicine training into podiatric medical education. J Am Podiatr Med Assoc. 2005;95(5):497–504.
31. Levine SA, Brett B, Robinson BE, Stratos GA, Lascher SM, Granville L, et al. Practicing physician education in geriatrics: lessons learned from a train-the-trainer model. J Am Geriatr Soc. 2007;55(8):1281–6.
32. Stratos GA, Katz S, Bergen MR, Hallenbeck J. Faculty development in end-of-life care: evaluation of a national train-the-trainer program. Acad Med. 2006;81(11):1000–7.
33. Campbell NR, Petrella R, Kaczorowski J. Public education on hypertension: a new initiative to improve the prevention, treatment and control of hypertension in Canada. Can J Cardiol. 2006;22(7):599–603.
34. Corelli RL, Fenlon CM, Kroon LA, Prokhorov AV, Hudmon KS. Evaluation of a train-the-trainer program for tobacco cessation. Am J Pharm Educ. 2007;71(6):109.
35. Booth-Kewley S, Gilman PA, Shaffer RA, Brodine SK. Evaluation of a sexually transmitted disease/human immunodeficiency virus prevention train-the-trainer program. Mil Med. 2001;166(4):304–10.
36. Burr CK, Storm DS, Gross E. A faculty trainer model: increasing knowledge and changing practice to improve perinatal HIV prevention and care. AIDS Patient Care STDS. 2006;20(3):183–92.
37. Gabel LL, Pearsol JA. The twin epidemics of substance use and HIV: a state-level response using a train-the-trainer model. Fam Pract. 1993;10(4):400–5.
38. Hiner CA, Mandel BG, Weaver MR, Bruce D, McLaughlin R, Anderson J. Effectiveness of a training-of-trainers model in a HIV counseling and testing program in the Caribbean Region. Hum Resour Health. 2009;7:11.
39. Nyamathi A, Vatsa M, Khakha DC, McNeese-Smith D, Leake B, Fahey JL. HIV knowledge improvement among nurses in India: using a train-the-trainer program. J Assoc Nurses AIDS Care. 2008;19(6):443–9.
40. Tobias CR, Downes A, Eddens S, Ruiz J. Building blocks for peer success: lessons learned from a train-the-trainer program. AIDS Patient Care STDS. 2011;26(1):53–9.
41. Hahn EJ, Noland MP, Rayens MK, Christie DM. Efficacy of training and fidelity of implementation of the life skills training program. J Sch Health. 2002;72(7):282–7.
Milat AJ, King L, Bauman A, Redman S. Scaling up health promotion interventions: an emerging concept in implementation science. Health Promot J Austr. 2012;22(3):238.
Norton W, Mittman B. Scaling up health promotion/disease prevention programs in community settings: Barriers, facilitators, and initial recommendations. Hartford, CT: Patrick and Catherine Weldon Donaghue Medical Research Foundation; 2010. Contract No.: Document Number|.
Qualtrics. Qualtrics: Survey Research Suite. 2014 [updated 2014; cited June 8, 2014]; Available from: http://www.qualtrics.com/.
Jacob RR, Baker EA, Allen P, Dodson EA, Duggan K, Fields R, et al. Training needs and supports for evidence-based decision making among the public health workforce in the United States. BMC Health Serv Res. 2014;14(1):564.
Maylahn C, Bohn C, Hammer M, Waltz E. Strengthening epidemiologic competencies among local health professionals in New York: teaching evidence-based public health. Public Health Rep. 2008;123 Suppl 1:35–43.
Kaplan GE, Juhl AL, Gujral IB, Hoaglin-Wagner AL, Gabella BA, McDermott KM. Tools for identifying and prioritizing evidence-based obesity prevention strategies, Colorado. Prev Chronic Dis. 2013;10, E106.
Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.
Klein K, Sorra J. The challenge of innovation implementation. Acad Manag Rev. 1996;21(4):1055–80.
Bennett S, Paina L, Ssengooba F, Waswa D, M’Imunya JM. The impact of Fogarty International Center research training programs on public health policy and program development in Kenya and Uganda. BMC Public Health. 2013;13:770.
Paina L, Ssengooba F, Waswa D, M’Imunya JM, Bennett S. How does investment in research training affect the development of research networks and collaborations? Health Res Policy Syst. 2013;11:18.
Acknowledgements
Katie Duggan provided scripts for the follow-up emails and calls; Derek Hashimoto, Courtney Faust, and Anna Hardy assisted with making phone calls. John Robitscher of the National Association of Chronic Disease Directors provided leadership and support of the training program.

Funding
The Evidence-Based Public Health training program was supported in part by the National Association of Chronic Disease Directors (contract numbers 482012 and 312016) and by Cooperative Agreement Number U48/DP001903 from the Centers for Disease Control and Prevention, Prevention Research Centers Program. Additional support for the preparation of this project came from the National Cancer Institute at the National Institutes of Health (5R01CA160327); the National Institute of Diabetes and Digestive and Kidney Diseases (grant number 1P30DK092950); and the Dissemination and Implementation Research Core of Washington University in St. Louis' Institute of Clinical and Translational Sciences (5U54CA155496-04).

The findings and conclusions in this article are those of the authors and do not necessarily represent the official positions of the National Institutes of Health or the Centers for Disease Control and Prevention.

Competing interests
The authors declare that they have no competing interests.

Authors' contributions
Conceptualization and design: all authors. Survey instrument development: Ross Brownson and Laura Yarber. Data collection: Laura Yarber. Data management and analyses: Rebekah Jacob and Laura Yarber. Manuscript revisions: all authors. All authors read and approved the final manuscript.

Additional file
Instrument for evaluating a train-the-trainer approach for improving capacity for evidence-based decision making. (DOCX 21 kb)