Open Access
ARTICLE
Implementation of Rapid Code Transformation Process Using Deep Learning Approaches
1 Department of Computer Science and Information Engineering, National University of Kaohsiung, Kaohsiung, 811, Taiwan
2 Department of Fragrance and Cosmetic Science, Kaohsiung Medical University, Kaohsiung, 811, Taiwan
* Corresponding Author: Hsiu-Fen Tsai. Email:
Computer Modeling in Engineering & Sciences 2023, 136(1), 107-134. https://doi.org/10.32604/cmes.2023.024018
Received 21 May 2022; Accepted 05 September 2022; Issue published 05 January 2023
Abstract
Our previous work introduced a code transformation model based on GPT-2 and verified the generated program code with the simhash (SH) and longest common subsequence (LCS) algorithms. However, the end-to-end code transformation process was time-consuming. The objective of this study is therefore to speed up the code transformation process significantly. This paper proposes deep learning approaches that modify SH into a variational simhash (VSH) algorithm and replace LCS with a piecewise longest common subsequence (PLCS) algorithm to accelerate verification in the test phase. Besides the code transformation model GPT-2, this study also introduces Microsoft MASS and Facebook BART for a comparative analysis of their performance. In addition, an explainable AI technique, local interpretable model-agnostic explanations (LIME), is applied to interpret the decision-making of the AI models. The experimental results show that VSH reduces the number of qualified programs by 22.11% and PLCS reduces the execution time of selecting pocket programs by 32.39%. As a result, the proposed approaches speed up the entire code transformation process by 1.38 times on average compared with our previous work.

Graphical Abstract
![Implementation of Rapid Code Transformation Process Using Deep Learning Approaches](https://cdn.techscience.cn/uploads/imgs/202301/c4fd73811fb908255875903b0eb20836.png?t=20220620)
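The paper's VSH and PLCS algorithms are not reproduced here. As a point of reference, a minimal sketch of the two baseline verification primitives the abstract builds on, simhash fingerprint comparison and a dynamic-programming LCS, might look like the following. The use of MD5 for token hashing and the whitespace tokenization are illustrative assumptions, not the paper's exact setup.

```python
import hashlib

def simhash(tokens, bits=64):
    """Standard simhash: each token's hash casts signed votes per bit;
    the sign of each accumulated vote forms the final fingerprint."""
    votes = [0] * bits
    for tok in tokens:
        h = int(hashlib.md5(tok.encode()).hexdigest(), 16)  # illustrative hash choice
        for i in range(bits):
            votes[i] += 1 if (h >> i) & 1 else -1
    fp = 0
    for i in range(bits):
        if votes[i] > 0:
            fp |= 1 << i
    return fp

def hamming(a, b):
    """Similarity measure between two simhash fingerprints."""
    return bin(a ^ b).count("1")

def lcs_len(a, b):
    """Classic O(len(a) * len(b)) dynamic-programming LCS length."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]

# Toy comparison of a reference program and a generated variant.
ref = "def add(a, b): return a + b".split()
gen = "def add(x, y): return x + y".split()
print(hamming(simhash(ref), simhash(gen)))  # small distance => similar code
print(lcs_len(ref, gen))                    # shared-token subsequence length
```

The paper's PLCS variant reportedly splits the comparison into pieces to cut execution time; the plain DP above is only the unoptimized baseline it improves on.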