A Comparative Study of Automated Refactoring Tools

Bibliographic Details
Main Authors: Maha Alharbi, Mohammad Alshayeb
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Access
Subjects: Comparative study; refactoring approaches; clustering-based; rule-based; search-based; text-based
Online Access: https://ieeexplore.ieee.org/document/10418470/
Collection: DOAJ (Directory of Open Access Journals)
Description: Researchers have proposed several refactoring approaches supported by automated and semi-automated refactoring tools. However, the abundance of such tools makes it difficult for developers to choose the one that best fits their needs. Moreover, the performance of existing refactoring tools has not been empirically evaluated against other available tools that target the same refactoring opportunities. The objective of this research is therefore to conduct a comparative study that systematically compares and evaluates refactoring tools belonging to different categories of refactoring approaches. To this end, we propose an evaluation framework based on the DESMET methodology and use it to empirically compare and evaluate four refactoring tools, namely MultiRefactor, JDeodorant, jSparrow, and Spartenizer, on five open-source projects. The results show that jSparrow outperforms the other investigated tools by supporting the highest number of quantitative and qualitative features, suggesting that it is the best choice from several perspectives. In contrast, Spartenizer demonstrated the least favorable outcomes on both quantitative and qualitative features, including introducing new code smells after applying a refactoring opportunity. The findings of this comparative study will help developers understand the characteristics and capabilities of the studied refactoring tools, and will help researchers focus their efforts on addressing the identified limitations to improve the smell detection and refactoring process.
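To make the kind of "refactoring opportunity" discussed above concrete, here is a minimal illustrative sketch (not taken from the paper; the class and method names are hypothetical) of a rule-based transformation of the sort such tools automate: rewriting an index-based loop into its behavior-preserving for-each form. A correct refactoring must leave observable behavior unchanged, which is exactly the property the comparative evaluation probes.

```java
import java.util.Arrays;
import java.util.List;

public class LoopRefactoring {
    // Before: index-based iteration, a common target of rule-based refactoring tools.
    static int sumBefore(List<Integer> values) {
        int total = 0;
        for (int i = 0; i < values.size(); i++) {
            total += values.get(i);
        }
        return total;
    }

    // After: the equivalent for-each form a tool would mechanically produce.
    static int sumAfter(List<Integer> values) {
        int total = 0;
        for (int v : values) {
            total += v;
        }
        return total;
    }

    public static void main(String[] args) {
        List<Integer> data = Arrays.asList(1, 2, 3, 4);
        // The refactoring is behavior-preserving: both forms must agree.
        System.out.println(sumBefore(data) == sumAfter(data)); // prints true
    }
}
```

The study's point is that automation of even such simple rewrites varies in quality across tools, and a poorly designed rule can trade one code smell for another.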
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3361314
Volume 12, pp. 18764-18781
Author Affiliations: Maha Alharbi (ORCID: 0009-0007-2605-3011) and Mohammad Alshayeb (ORCID: 0000-0001-7950-0099), Information and Computer Science Department, King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia