What Did They Learn? Objective Assessment Tools Show Mixed Effects of Training on Science Communication Behaviors
There is widespread agreement about the need to assess the success of programs training scientists to communicate more effectively with non-professional audiences. However, there is little agreement about how that should be done. What do we mean when we talk about “effective communication”? What should we measure? How should we measure it?
Main Authors: | Robert S. Capers, Anne Oeldorf-Hirsch, Robert Wyss, Kevin R. Burgio, Margaret A. Rubega |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2022-02-01 |
Series: | Frontiers in Communication |
Subjects: | graduate training, education, evidence-based, audience, jargon, narrative |
Online Access: | https://www.frontiersin.org/articles/10.3389/fcomm.2021.805630/full |
_version_ | 1819280870098337792 |
author | Robert S. Capers Anne Oeldorf-Hirsch Robert Wyss Kevin R. Burgio Margaret A. Rubega |
author_facet | Robert S. Capers Anne Oeldorf-Hirsch Robert Wyss Kevin R. Burgio Margaret A. Rubega |
author_sort | Robert S. Capers |
collection | DOAJ |
description | There is widespread agreement about the need to assess the success of programs training scientists to communicate more effectively with non-professional audiences. However, there is little agreement about how that should be done. What do we mean when we talk about “effective communication”? What should we measure? How should we measure it? Evaluation of communication training programs often incorporates the views of students or trainers themselves, although this is widely understood to bias the assessment. We recently completed a 3-year experiment to use audiences of non-scientists to evaluate the effect of training on STEM (Science, Technology, Engineering and Math) graduate students’ communication ability. Overall, audiences rated STEM grad students’ communication performance no better after training than before, as we reported in Rubega et al. 2018. However, audience ratings do not reveal whether training changed specific trainee communication behaviors (e.g., jargon use, narrative techniques), even if the changes were too small to affect trainees’ overall success. Here we measure trainee communication behavior directly, using multiple textual analysis tools and analysis of trainees’ body language during videotaped talks. We found that student use of jargon declined after training but that use of narrative techniques did not increase. Flesch Reading Ease and Flesch-Kincaid Grade Level scores, used as indicators of complexity of sentences and word choice, were no different after instruction. Trainees’ movement of hands and hesitancy during talks were correlated negatively with audience ratings of credibility and clarity; smiling, on the other hand, was correlated with improvement in credibility, clarity and engagement scores given by audience members.
We show that objective tools can be used to measure the success of communication training programs, that non-verbal cues are associated with audience judgments, and that an intensive communication course does change some, if not all, communication behaviors. |
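The Flesch Reading Ease and Flesch-Kincaid Grade Level scores named in the abstract are simple closed-form formulas over words per sentence and syllables per word. A minimal sketch using the standard published coefficients might look like the following; the vowel-group syllable counter is a naive approximation for illustration, not the authors' actual analysis tooling:

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count contiguous vowel groups; every word gets at least one.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_scores(text: str) -> tuple[float, float]:
    """Return (Flesch Reading Ease, Flesch-Kincaid Grade Level) for a text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)   # mean words per sentence
    spw = syllables / len(words)        # mean syllables per word
    ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade = 0.39 * wps + 11.8 * spw - 15.59
    return ease, grade
```

With such a function, a short plain-language passage scores higher Reading Ease (and lower Grade Level) than a jargon-dense one, which is how these metrics serve as proxies for sentence and word-choice complexity.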
first_indexed | 2024-12-24T00:50:40Z |
format | Article |
id | doaj.art-f914028cf0254424bb26574a9adedf4a |
institution | Directory Open Access Journal |
issn | 2297-900X |
language | English |
last_indexed | 2024-12-24T00:50:40Z |
publishDate | 2022-02-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Communication |
spelling | doaj.art-f914028cf0254424bb26574a9adedf4a (indexed 2022-12-21T17:23:36Z). English. Frontiers Media S.A., Frontiers in Communication, ISSN 2297-900X, 2022-02-01, vol. 6, article 805630, doi: 10.3389/fcomm.2021.805630. What Did They Learn? Objective Assessment Tools Show Mixed Effects of Training on Science Communication Behaviors. Robert S. Capers, Kevin R. Burgio, and Margaret A. Rubega (Department of Ecology and Evolutionary Biology, University of Connecticut, Storrs, CT, United States); Anne Oeldorf-Hirsch (Department of Communication, University of Connecticut, Storrs, CT, United States); Robert Wyss (Department of Journalism, University of Connecticut, Storrs, CT, United States). Abstract and keywords as in the description and topic fields; full text: https://www.frontiersin.org/articles/10.3389/fcomm.2021.805630/full |
spellingShingle | Robert S. Capers Anne Oeldorf-Hirsch Robert Wyss Kevin R. Burgio Margaret A. Rubega What Did They Learn? Objective Assessment Tools Show Mixed Effects of Training on Science Communication Behaviors Frontiers in Communication graduate training education evidence-based audience jargon narrative |
title | What Did They Learn? Objective Assessment Tools Show Mixed Effects of Training on Science Communication Behaviors |
title_full | What Did They Learn? Objective Assessment Tools Show Mixed Effects of Training on Science Communication Behaviors |
title_fullStr | What Did They Learn? Objective Assessment Tools Show Mixed Effects of Training on Science Communication Behaviors |
title_full_unstemmed | What Did They Learn? Objective Assessment Tools Show Mixed Effects of Training on Science Communication Behaviors |
title_short | What Did They Learn? Objective Assessment Tools Show Mixed Effects of Training on Science Communication Behaviors |
title_sort | what did they learn objective assessment tools show mixed effects of training on science communication behaviors |
topic | graduate training education evidence-based audience jargon narrative |
url | https://www.frontiersin.org/articles/10.3389/fcomm.2021.805630/full |
work_keys_str_mv | AT robertscapers whatdidtheylearnobjectiveassessmenttoolsshowmixedeffectsoftrainingonsciencecommunicationbehaviors AT anneoeldorfhirsch whatdidtheylearnobjectiveassessmenttoolsshowmixedeffectsoftrainingonsciencecommunicationbehaviors AT robertwyss whatdidtheylearnobjectiveassessmenttoolsshowmixedeffectsoftrainingonsciencecommunicationbehaviors AT kevinrburgio whatdidtheylearnobjectiveassessmenttoolsshowmixedeffectsoftrainingonsciencecommunicationbehaviors AT margaretarubega whatdidtheylearnobjectiveassessmenttoolsshowmixedeffectsoftrainingonsciencecommunicationbehaviors |