Search Results (1)
  • Open Access

    ARTICLE

    Guided Dropout: Improving Deep Networks Without Increased Computation

Yifeng Liu, Yangyang Li*, Zhongxiong Xu, Xiaohan Liu, Haiyong Xie, Huacheng Zeng

    Intelligent Automation & Soft Computing, Vol.36, No.3, pp. 2519-2528, 2023, DOI:10.32604/iasc.2023.033286 - 15 March 2023

    Abstract Deep convolutional neural networks are growing ever deeper, and this increasing model complexity makes them prone to overfitting during training. Dropout, one of the crucial regularization tricks, prevents units from co-adapting too much by randomly dropping neurons during training. It effectively improves the performance of deep networks but ignores the differences between neurons. To address this issue, this paper presents a new dropout method called guided dropout, which selects the neurons to switch off according to the differences between convolution kernels and preserves the informative neurons. It uses an unsupervised clustering algorithm…
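    The truncated abstract names the main ingredients of guided dropout: measure differences between convolution kernels, cluster them with an unsupervised algorithm, and switch off redundant neurons while preserving informative ones. The paper's exact procedure is not reproduced here, so the following is only a minimal PyTorch sketch of that idea; `guided_channel_mask`, the k-means clustering, the keep-the-most-central rule, and all parameter values are illustrative assumptions, not the authors' method.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

def guided_channel_mask(conv: nn.Conv2d, drop_rate: float = 0.5,
                        n_clusters: int = 8) -> torch.Tensor:
    """Hypothetical per-channel keep mask built by clustering kernels.

    Kernels in the same cluster are treated as near-redundant: the one
    closest to its cluster centre is always kept (the "informative"
    neuron); the rest are dropped with probability `drop_rate`.
    Requires n_clusters <= conv.out_channels.
    """
    # Flatten each output channel's kernel into a feature vector.
    w = conv.weight.detach().cpu().flatten(1).numpy()   # (C_out, C_in*k*k)
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(w)
    # Distance of each kernel to its assigned cluster centre.
    dists = np.linalg.norm(w - km.cluster_centers_[km.labels_], axis=1)

    keep = np.ones(w.shape[0], dtype=bool)
    for c in range(n_clusters):
        members = np.flatnonzero(km.labels_ == c)
        if members.size == 0:
            continue
        central = members[np.argmin(dists[members])]    # always kept
        others = members[members != central]
        keep[others] = np.random.rand(others.size) >= drop_rate
    return torch.from_numpy(keep)

# Training-time usage: zero the dropped feature maps after the conv.
# conv = nn.Conv2d(3, 64, 3); x = torch.randn(8, 3, 32, 32)
# y = conv(x)
# y = y * guided_channel_mask(conv).view(1, -1, 1, 1)
```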
