
Author:

Chen, Xiao-Yun (陈晓云) [1] | Chen, Yuan [2]

Indexed by:

EI PKU CSCD

Abstract:

To deal with the clustering problem of high-dimensional complex data, it is usually required to reduce the dimensionality first and then cluster. However, common dimensionality reduction methods consider neither the clustering characteristics of the data nor the correlation between samples, so it is difficult to ensure that the dimensionality reduction method matches the clustering algorithm, which leads to the loss of clustering information. The nonlinear unsupervised dimensionality reduction method extreme learning machine autoencoder (ELM-AE) has been widely used in recent years for dimensionality reduction and denoising because of its fast learning speed and good generalization performance. To maintain the original subspace structure when high-dimensional data are projected into a low-dimensional space, the dimensionality reduction method ML-SELM-AE is proposed. This method captures deep features of the sample set with a multi-layer extreme learning machine autoencoder while preserving the multi-subspace structure of the clustered samples through a self-representation model. Experimental results show that the method effectively improves clustering accuracy and achieves higher learning efficiency on UCI data, EEG data and gene expression data. Copyright ©2022 Acta Automatica Sinica. All rights reserved.
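
As a reading aid, the sketch below illustrates the ELM-AE building block referred to in the abstract: random, orthogonalized input weights, a regularized least-squares solve for the output weights, and stacking of several such layers to obtain a deep low-dimensional embedding. It is not the authors' ML-SELM-AE implementation; the self-representation (subspace-preserving) term is omitted, and all function names, layer sizes and the regularization constant C below are illustrative assumptions.

```python
# Minimal ELM-AE sketch for dimensionality reduction, stacked into a
# multi-layer encoder. Illustrative only; not the paper's ML-SELM-AE code.
import numpy as np


def elm_ae_layer(X, hidden_dim, C=1.0, rng=None):
    """Train one ELM-AE layer and return the reduced representation.

    X          : (n_samples, n_features) input data
    hidden_dim : target dimensionality of this layer
    C          : ridge regularization strength (assumed hyperparameter)
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape

    # Random, orthogonalized input weights and biases, fixed after initialization.
    W = rng.standard_normal((d, hidden_dim))
    W, _ = np.linalg.qr(W)
    b = rng.standard_normal(hidden_dim)

    # Hidden-layer activations.
    H = np.tanh(X @ W + b)

    # Output weights by regularized least squares: reconstruct X from H.
    beta = np.linalg.solve(H.T @ H + np.eye(hidden_dim) / C, H.T @ X)

    # beta maps hidden -> input; its transpose projects the data
    # into the hidden_dim-dimensional space.
    return np.tanh(X @ beta.T)


def multilayer_elm_ae(X, layer_dims=(64, 32, 10), C=1.0, seed=0):
    """Stack several ELM-AE layers for a deep low-dimensional embedding."""
    Z = X
    for i, dim in enumerate(layer_dims):
        Z = elm_ae_layer(Z, dim, C=C, rng=seed + i)
    return Z


if __name__ == "__main__":
    # Toy usage: reduce 100-dimensional samples to 10 dimensions, then feed
    # the embedding to any clustering algorithm (e.g. k-means).
    X = np.random.default_rng(0).standard_normal((200, 100))
    Z = multilayer_elm_ae(X, layer_dims=(64, 32, 10))
    print(Z.shape)  # (200, 10)
```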

Keyword:

Clustering algorithms; Data reduction; Gene expression; Knowledge acquisition; Machine learning; Reduction

Community:

  • [ 1 ] [Chen, Xiao-Yun] College of Mathematics and Computer Science, Fuzhou University, Fuzhou 350116, China
  • [ 2 ] [Chen, Yuan] College of Mathematics and Computer Science, Fuzhou University, Fuzhou 350116, China

Source:

Acta Automatica Sinica

ISSN: 0254-4156

CN: 11-2109/TP

Year: 2022

Issue: 4

Volume: 48

Page: 1091-1104

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 3

ESI Highly Cited Papers on the List: 0

