Search Results (1)
  • Open Access

    ARTICLE

    Enhancing Deep Learning Semantics: The Diffusion Sampling and Label-Driven Co-Attention Approach

Chunhua Wang, Wenqian Shang*, Tong Yi*, Haibin Zhu

    CMC-Computers, Materials & Continua, Vol.79, No.2, pp. 1939-1956, 2024, DOI:10.32604/cmc.2024.048135

Abstract The advent of self-attention mechanisms within Transformer models has significantly propelled the advancement of deep learning, yielding outstanding results across diverse domains. Nonetheless, self-attention falters on datasets with intricate semantic content and long-range dependency structures. In response, this paper introduces a Diffusion Sampling and Label-Driven Co-attention Neural Network (DSLD), which adopts a diffusion sampling method to capture more comprehensive semantic information from the data. Additionally, the model leverages the joint correlation between labels and data when computing text representations, correcting semantic representation biases in the data …
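For context on the mechanism the abstract builds on: the standard scaled dot-product self-attention that the paper identifies as faltering on long-range dependencies can be sketched in a few lines of NumPy. This is a generic illustration of vanilla self-attention, not the paper's DSLD model; the matrix shapes and random weights are illustrative assumptions.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a token matrix X."""
    # Project the input tokens to queries, keys, and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Pairwise attention scores, scaled by sqrt(key dimension)
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all value vectors
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))     # 5 tokens, 8-dim embeddings (illustrative sizes)
Wq = rng.normal(size=(8, 8))
Wk = rng.normal(size=(8, 8))
Wv = rng.normal(size=(8, 8))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8): one contextualized vector per token
```

Because every token attends over every other in a single weighted sum, distant dependencies must compete with all nearby ones for attention mass, which is the limitation the DSLD approach targets with diffusion sampling and label-driven co-attention.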
