Quantitative assessment of students' revision processes
© American Society for Engineering Education 2020. Communication is a crucial skillset for engineers, yet graduates [1]-[3] and their employers [4]-[8] continue to report their lack of preparation for effective communication upon completion of their undergraduate or graduate programs. Thus, technical communication training merits deeper investigation and creative solutions. At the 2017 ASEE Meeting, we introduced the MIT School of Engineering Communication Lab, a discipline-specific technical communication service that is akin to a writing center, but embedded within engineering departments [9]. By using the expertise of graduate student and postdoctoral peer coaches within a given discipline, the Communication Lab provides a scalable, content-aware solution with the benefits of just-in-time, one-on-one [10], and peer [11] training. When we first introduced this model, we offered easy-to-record metrics for the Communication Lab's effectiveness (such as usage statistics and student and faculty opinion surveys), as are commonly used to assess writing centers [12], [13]. Here we present a formal quantitative study of the effectiveness of Communication Lab coaching. We designed a pre-post test study for two related tasks: personal statements for applications to graduate school and graduate fellowships. We designed an analytic rubric with seven categories (strategic alignment, audience awareness, context, evidence, organization/flow, language mechanics, and visual impact) and tested it to ensure inter-rater reliability. Over one semester, we collected and anonymized 119 personal statement drafts from 47 unique Communication Lab clients across four different engineering departments. Peer coaches rubric-scored the drafts, and we developed a statistical model based on maximum likelihood to identify significant score changes in individual rubric categories across trajectories of sequential drafts. In addition, post-session surveys of clients and their peer coaches provided insight into clients' qualitative experiences during coaching sessions. Taken together, our quantitative and qualitative findings suggest that our peer coaches are most effective in supporting the skills of organization/flow, strategic alignment, and providing appropriate evidence; this aligns with our program's emphasis on supporting high-level communication skills. Our results also suggest that a major factor in coaching efficacy is coach-client discussion of major takeaways from a session: rubric category scores were more likely to improve across a drafting trajectory when a category had been identified as a takeaway. Hence, we show quantitative evidence that through collaborative conversations, technical peer coaches can guide clients to identify and effectively revise key areas for improvement. Finally, since we have gathered a sizable dataset and developed analytical tools, we have laid the groundwork for future quantitative writing assessments by both our program and others. We argue that although inter-rater variability poses a challenge, statistical methods and skill-based assessments of authentic communication tasks can provide both insights into student writing/revision ability and direction for improvement of communication resources.
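The abstract names a maximum-likelihood model for detecting significant score changes across trajectories of sequential drafts, but the paper's actual model is not reproduced in this record. As a rough illustration only, the Python sketch below shows one standard construction for that kind of test: a likelihood-ratio comparison of a linear-trend Gaussian model against a constant-mean null for one rubric category's scores. The function name and the sample scores are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch (not the paper's model): likelihood-ratio test for a
# trend in one rubric category's scores across a client's sequential drafts.
import numpy as np
from scipy import stats

def loglik_gaussian(y, mu, sigma):
    """Gaussian log-likelihood of scores y given mean(s) mu and spread sigma."""
    return np.sum(stats.norm.logpdf(y, loc=mu, scale=sigma))

def trend_lr_test(scores):
    """LR test: constant mean (null) vs. linear trend over draft index (alt)."""
    y = np.asarray(scores, dtype=float)
    x = np.arange(len(y))  # draft index: 0, 1, 2, ...

    # Null model: a single mean for all drafts; ML estimates of mu and sigma.
    mu0 = y.mean()
    sigma0 = y.std() or 1e-9  # ML (ddof=0) estimate; guard against zero spread
    ll0 = loglik_gaussian(y, mu0, sigma0)

    # Alternative model: mean varies linearly with draft index (ML fit = OLS).
    slope, intercept = np.polyfit(x, y, 1)
    fitted = intercept + slope * x
    sigma1 = (y - fitted).std() or 1e-9
    ll1 = loglik_gaussian(y, fitted, sigma1)

    # LR statistic is asymptotically chi-square with 1 df
    # (the alternative has one extra parameter: the slope).
    lr = 2.0 * (ll1 - ll0)
    p = stats.chi2.sf(lr, df=1)
    return slope, lr, p

# Invented trajectory: one client's "organization/flow" scores over four drafts.
slope, lr, p = trend_lr_test([2, 3, 3, 4])
print(f"slope={slope:.2f}, LR={lr:.2f}, p={p:.3f}")
```

A positive slope with a small p-value would flag a category whose scores improved significantly over the drafting trajectory; the paper's own model may differ in its distributional assumptions and in how it pools clients.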
Main Authors: | Volpatti, LR; Hanson, AJ; Schall, JM; Dunietz, JN; Chen, AX; Chitnis, R; Alm, EJ; Takemura, AF; Chien, DM |
---|---|
Other Authors: | Massachusetts Institute of Technology. Department of Chemical Engineering; Massachusetts Institute of Technology. Department of Nuclear Science and Engineering; Massachusetts Institute of Technology. Department of Biological Engineering; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. School of Engineering |
Format: | Article |
Language: | English |
Published: | ASEE Conferences, 2021 |
Published in: | ASEE Annual Conference and Exposition, Conference Proceedings |
DOI: | 10.18260/1-2--35117 |
Online Access: | https://hdl.handle.net/1721.1/133507 |