Search Results (1)
  • Open Access

    ARTICLE

    Code Transform Model Producing High-Performance Program

    Bao Rong Chang1,*, Hsiu-Fen Tsai2, Po-Wen Su1

    CMES-Computer Modeling in Engineering & Sciences, Vol.129, No.1, pp. 253-277, 2021, DOI:10.32604/cmes.2021.015673 - 24 August 2021

    Abstract: This paper introduces a novel transform method that produces newly generated programs through a code transform model, the second generation of the Generative Pre-trained Transformer (GPT-2), significantly improving program execution performance. In addition, a theoretical estimate in statistics gives the minimum number of generated programs required to guarantee that the best one is found among them. The proposed approach can help voice assistant machines resolve the problem of inefficient execution of application code. In addition to GPT-2, this study develops the variational Simhash algorithm to check the code similarity between sample program …
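    The abstract's "variational Simhash" is not detailed here, but it builds on classic SimHash fingerprinting for near-duplicate detection. Below is a minimal, illustrative sketch of plain SimHash, assuming whitespace tokenization and MD5 as the per-token hash; both are hypothetical choices for demonstration, not taken from the paper:

    ```python
    import hashlib

    def simhash(tokens, bits=64):
        """Compute a SimHash fingerprint: each token's hash votes on each bit."""
        votes = [0] * bits
        for tok in tokens:
            h = int(hashlib.md5(tok.encode("utf-8")).hexdigest(), 16)
            for i in range(bits):
                # +1 vote if the token's hash has bit i set, -1 otherwise
                votes[i] += 1 if (h >> i) & 1 else -1
        fingerprint = 0
        for i in range(bits):
            if votes[i] > 0:
                fingerprint |= 1 << i
        return fingerprint

    def hamming_distance(a, b):
        """Number of differing bits; small distance means similar inputs."""
        return bin(a ^ b).count("1")

    # Similar code snippets tend to yield fingerprints at a small Hamming distance.
    snippet_a = "for i in range(10): total += i".split()
    snippet_b = "for j in range(10): total += j".split()
    d = hamming_distance(simhash(snippet_a), simhash(snippet_b))
    ```

    Because the fingerprint is a per-bit majority vote over token hashes, identical token multisets always map to the same fingerprint (distance 0), and a similarity check reduces to comparing a Hamming distance against a threshold.
    
    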
