Search Results (4)
  • Open Access

    ARTICLE

    Enhancing Relational Triple Extraction in Specific Domains: Semantic Enhancement and Synergy of Large Language Models and Small Pre-Trained Language Models

    Jiakai Li, Jianpeng Hu*, Geng Zhang

    CMC-Computers, Materials & Continua, Vol.79, No.2, pp. 2481-2503, 2024, DOI:10.32604/cmc.2024.050005 - 15 May 2024

    Abstract In the process of constructing domain-specific knowledge graphs, the task of relational triple extraction plays a critical role in transforming unstructured text into structured information. Existing relational triple extraction models face multiple challenges when processing domain-specific data, including insufficient utilization of semantic interaction information between entities and relations, difficulties in handling challenging samples, and the scarcity of domain-specific datasets. To address these issues, our study introduces three innovative components: Relation semantic enhancement, data augmentation, and a voting strategy, all designed to significantly improve the model’s performance in tackling domain-specific relational triple extraction tasks. We first…
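
    The abstract mentions a voting strategy over the outputs of several models. As a purely illustrative sketch (not the authors' implementation; the function, threshold, and example triples are invented), a majority vote over candidate relational triples could look like this in Python:

        # Hypothetical majority-voting sketch over relational triple predictions
        # from several model variants; names and the vote threshold are assumptions,
        # not details taken from the paper.
        from collections import Counter

        def vote_on_triples(predictions_per_model, min_votes=2):
            """Keep a (subject, relation, object) triple only if at least
            `min_votes` of the candidate models predicted it."""
            counts = Counter(t for triples in predictions_per_model for t in set(triples))
            return [triple for triple, n in counts.items() if n >= min_votes]

        # Example: three model variants, one of which misses a triple.
        model_a = [("aspirin", "treats", "headache"), ("aspirin", "is_a", "drug")]
        model_b = [("aspirin", "treats", "headache")]
        model_c = [("aspirin", "treats", "headache"), ("aspirin", "is_a", "drug")]
        print(vote_on_triples([model_a, model_b, model_c]))  # both triples receive >= 2 votes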

  • Open Access

    ARTICLE

    Classification of Conversational Sentences Using an Ensemble Pre-Trained Language Model with the Fine-Tuned Parameter

    R. Sujatha, K. Nimala*

    CMC-Computers, Materials & Continua, Vol.78, No.2, pp. 1669-1686, 2024, DOI:10.32604/cmc.2023.046963 - 27 February 2024

    Abstract Sentence classification is the process of categorizing a sentence based on its context. Sentence categorization requires more semantic highlights than other tasks, such as dependency parsing, which requires more syntactic elements. Most existing strategies focus on the general semantics of a conversation without involving the context of the sentence, recognizing the progress and comparing impacts. An ensemble pre-trained language model was employed here to classify the conversational sentences in the conversation corpus. The conversational sentences are classified into four categories: information, question, directive, and commission. These classification label sequences are for…
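
    The four-way labeling described above lends itself to a simple soft-voting ensemble. The snippet below is a minimal, hypothetical sketch, not the paper's code: the hard-coded probability vectors merely stand in for the outputs of fine-tuned language models.

        # Soft-voting ensemble over per-model class probabilities for the four
        # sentence categories named in the abstract. The probability vectors are
        # placeholders for what fine-tuned models would produce.
        import numpy as np

        LABELS = ["information", "question", "directive", "commission"]

        def ensemble_predict(prob_vectors):
            """Average class probabilities from several models and return the winning label."""
            avg = np.mean(np.stack(prob_vectors), axis=0)
            return LABELS[int(np.argmax(avg))], avg

        # Placeholder outputs from three hypothetical fine-tuned models.
        bert_probs = [0.10, 0.70, 0.15, 0.05]
        roberta_probs = [0.20, 0.60, 0.10, 0.10]
        albert_probs = [0.15, 0.55, 0.20, 0.10]
        label, avg = ensemble_predict([bert_probs, roberta_probs, albert_probs])
        print(label, avg.round(2))  # -> question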

  • Open Access

    ARTICLE

    Personality Trait Detection via Transfer Learning

    Bashar Alshouha, Jesus Serrano-Guerrero*, Francisco Chiclana, Francisco P. Romero, Jose A. Olivas

    CMC-Computers, Materials & Continua, Vol.78, No.2, pp. 1933-1956, 2024, DOI:10.32604/cmc.2023.046711 - 27 February 2024

    Abstract Personality recognition plays a pivotal role when developing user-centric solutions such as recommender systems or decision support systems across various domains, including education, e-commerce, or human resources. Traditional machine learning techniques have been broadly employed for personality trait identification; nevertheless, the development of new technologies based on deep learning has led to new opportunities to improve their performance. This study focuses on the capabilities of pre-trained language models such as BERT, RoBERTa, ALBERT, ELECTRA, ERNIE, or XLNet, to deal with the task of personality recognition. These models are able to capture structural features from textual…
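
    As a rough illustration of the transfer-learning setup this abstract alludes to, the snippet below reuses a pre-trained encoder with a small classification head via the Hugging Face transformers library. The model name and the five-label (Big Five traits) setting are assumptions made for the example, not details taken from the paper.

        # Hedged sketch: reuse of a pre-trained encoder with a classification head
        # for personality prediction. Model choice and label count are assumptions.
        import torch
        from transformers import AutoTokenizer, AutoModelForSequenceClassification

        MODEL_NAME = "bert-base-uncased"  # RoBERTa, ALBERT, etc. could be swapped in
        tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
        model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=5)

        text = "I love meeting new people and trying new things."
        inputs = tokenizer(text, return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = model(**inputs).logits  # shape (1, 5): one score per assumed trait class
        print(torch.softmax(logits, dim=-1))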

  • Open Access

    ARTICLE

    Vulnerability Detection of Ethereum Smart Contract Based on SolBERT-BiGRU-Attention Hybrid Neural Model

    Guangxia Xu*, Lei Liu, Jingnan Dong

    CMES-Computer Modeling in Engineering & Sciences, Vol.137, No.1, pp. 903-922, 2023, DOI:10.32604/cmes.2023.026627 - 23 April 2023

    Abstract In recent years, with the great success of pre-trained language models, the pre-trained BERT model has gradually been applied to the field of source code understanding. However, the time cost of training a language model from scratch is very high, and how to transfer a pre-trained language model to the field of smart contract vulnerability detection is currently an active research direction. In this paper, we propose a hybrid model to detect common vulnerabilities in smart contracts, based on a lightweight pre-trained BERT language model connected to a bidirectional gated recurrent unit (BiGRU) model.
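
    For orientation only, a skeletal version of the "pre-trained encoder -> BiGRU -> attention" pipeline this abstract describes might look like the PyTorch module below. Layer sizes are arbitrary placeholders, and this is not the authors' SolBERT-BiGRU-Attention code.

        # Minimal architectural sketch: contextual token embeddings (e.g., from a
        # BERT-like encoder) -> bidirectional GRU -> attention pooling -> classifier.
        import torch
        import torch.nn as nn

        class EncoderBiGRUAttention(nn.Module):
            def __init__(self, hidden=768, gru_hidden=128, num_classes=2):
                super().__init__()
                self.bigru = nn.GRU(hidden, gru_hidden, batch_first=True, bidirectional=True)
                self.attn = nn.Linear(2 * gru_hidden, 1)       # scalar attention score per token
                self.classifier = nn.Linear(2 * gru_hidden, num_classes)

            def forward(self, token_embeddings):               # (batch, seq_len, hidden)
                states, _ = self.bigru(token_embeddings)       # (batch, seq_len, 2*gru_hidden)
                weights = torch.softmax(self.attn(states), dim=1)
                context = (weights * states).sum(dim=1)        # attention-weighted sequence vector
                return self.classifier(context)                # vulnerable / not-vulnerable logits

        # Dummy forward pass with random tensors standing in for encoder output.
        logits = EncoderBiGRUAttention()(torch.randn(1, 64, 768))
        print(logits.shape)  # torch.Size([1, 2])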
