ADA-Tucker: Compressing deep neural networks via adaptive dimension adjustment Tucker decomposition

Z. Zhong, F. Wei, Z. Lin, C. Zhang

Neural Networks. 2019; 110: 104-115. [pub] 20181113

Language English Country United States

Document type Journal Article

Despite the recent success of deep learning models in numerous applications, their widespread use on mobile devices is seriously impeded by storage and computational requirements. In this paper, we propose a novel network compression method called Adaptive Dimension Adjustment Tucker decomposition (ADA-Tucker). With learnable core tensors and transformation matrices, ADA-Tucker performs Tucker decomposition of arbitrary-order tensors. Furthermore, we propose that weight tensors in networks with proper order and balanced dimensions are easier to compress. Therefore, the high flexibility in the choice of decomposition distinguishes ADA-Tucker from all previous low-rank models. To compress further, we extend the model to Shared Core ADA-Tucker (SCADA-Tucker) by defining a shared core tensor for all layers. Our methods require no overhead for recording indices of non-zero elements. Without loss of accuracy, our methods reduce the storage of LeNet-5 and LeNet-300 by ratios of 691× and 233×, respectively, significantly outperforming the state of the art. The effectiveness of our methods is also evaluated on three other benchmarks (CIFAR-10, SVHN, ILSVRC12) and on modern deep networks (ResNet, Wide-ResNet).
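
To make the compression idea concrete, the sketch below illustrates the general scheme behind a Tucker-factorized layer as described in the abstract: the weight tensor is reshaped to a chosen ("dimension-adjusted") order, and only a small core tensor plus one transformation matrix per mode is stored. This is a minimal NumPy sketch, not the authors' implementation; the helper mode_n_product, the layer shape, the adjusted order, and the ranks are hypothetical examples chosen only to show how the parameter count shrinks.

# Illustrative sketch (not the paper's code): a Tucker-factorized layer stores a
# core tensor plus one transformation matrix per mode; "dimension adjustment" is
# shown here as reshaping a 4-D conv weight into a more balanced 3-D tensor.
import numpy as np

def mode_n_product(tensor, matrix, mode):
    """Multiply `tensor` by `matrix` (shape: out_dim x in_dim) along axis `mode`."""
    moved = np.moveaxis(tensor, mode, 0)            # bring the target mode to the front
    front, rest = moved.shape[0], moved.shape[1:]
    flat = moved.reshape(front, -1)                 # mode-n unfolding
    result = (matrix @ flat).reshape((matrix.shape[0],) + rest)
    return np.moveaxis(result, 0, mode)             # restore the axis order

# A conv-layer weight: 64 filters, 32 input channels, 3x3 kernels (hypothetical).
full_shape = (64, 32, 3, 3)
full_params = int(np.prod(full_shape))              # 18432 parameters

# Adaptive dimension adjustment: reshape to a balanced order-3 tensor.
adjusted_shape = (24, 24, 32)                       # 24 * 24 * 32 == 18432
ranks = (6, 6, 8)                                   # hypothetical Tucker ranks

# Learnable parameters: one core tensor and one transformation matrix per mode.
core = np.random.randn(*ranks)
factors = [np.random.randn(dim, r) for dim, r in zip(adjusted_shape, ranks)]

# Reconstruct the adjusted tensor, then reshape it back into the conv weight.
approx = core
for mode, U in enumerate(factors):
    approx = mode_n_product(approx, U, mode)
weight = approx.reshape(full_shape)

stored = core.size + sum(U.size for U in factors)
print(f"dense: {full_params} params, Tucker: {stored} params, "
      f"ratio ~{full_params / stored:.1f}x")

Because only the core tensor and the per-mode matrices are trained and stored, no indices of non-zero elements need to be recorded, which is the storage advantage the abstract contrasts with pruning-style methods.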

000      
00000naa a2200000 a 4500
001      
bmc19012077
003      
CZ-PrNML
005      
20190416121445.0
007      
ta
008      
190405s2019 xxu f 000 0|eng||
009      
AR
024    7_
$a 10.1016/j.neunet.2018.10.016 $2 doi
035    __
$a (PubMed)30508807
040    __
$a ABA008 $b cze $d ABA008 $e AACR2
041    0_
$a eng
044    __
$a xxu
100    1_
$a Zhong, Zhisheng $u Key Laboratory of Machine Perception (MOE), School of EECS, Peking University, PR China. Electronic address: zszhong@pku.edu.cn.
245    10
$a ADA-Tucker: Compressing deep neural networks via adaptive dimension adjustment Tucker decomposition / $c Z. Zhong, F. Wei, Z. Lin, C. Zhang,
520    9_
$a Despite the recent success of deep learning models in numerous applications, their widespread use on mobile devices is seriously impeded by storage and computational requirements. In this paper, we propose a novel network compression method called Adaptive Dimension Adjustment Tucker decomposition (ADA-Tucker). With learnable core tensors and transformation matrices, ADA-Tucker performs Tucker decomposition of arbitrary-order tensors. Furthermore, we propose that weight tensors in networks with proper order and balanced dimensions are easier to compress. Therefore, the high flexibility in the choice of decomposition distinguishes ADA-Tucker from all previous low-rank models. To compress further, we extend the model to Shared Core ADA-Tucker (SCADA-Tucker) by defining a shared core tensor for all layers. Our methods require no overhead for recording indices of non-zero elements. Without loss of accuracy, our methods reduce the storage of LeNet-5 and LeNet-300 by ratios of 691× and 233×, respectively, significantly outperforming the state of the art. The effectiveness of our methods is also evaluated on three other benchmarks (CIFAR-10, SVHN, ILSVRC12) and on modern deep networks (ResNet, Wide-ResNet).
650    _2
$a benchmarking $7 D019985
650    _2
$a data compression $x methods $x trends $7 D044962
650    12
$a deep learning $x trendy $7 D000077321
650    _2
$a humans $7 D006801
650    _2
$a machine learning $7 D000069550
650    12
$a neural networks $7 D016571
655    _2
$a journal articles $7 D016428
700    1_
$a Wei, Fangyin $u Key Laboratory of Machine Perception (MOE), School of EECS, Peking University, PR China. Electronic address: weifangyin@pku.edu.cn.
700    1_
$a Lin, Zhouchen $u Key Laboratory of Machine Perception (MOE), School of EECS, Peking University, PR China. Electronic address: zlin@pku.edu.cn.
700    1_
$a Zhang, Chao $u Key Laboratory of Machine Perception (MOE), School of EECS, Peking University, PR China. Electronic address: chzhang@cis.pku.edu.cn.
773    0_
$w MED00011811 $t Neural networks : the official journal of the International Neural Network Society $x 1879-2782 $g Vol. 110, No. - (2019), pp. 104-115
856    41
$u https://pubmed.ncbi.nlm.nih.gov/30508807 $y Pubmed
910    __
$a ABA008 $b sig $c sign $y a $z 0
990    __
$a 20190405 $b ABA008
991    __
$a 20190416121511 $b ABA008
999    __
$a ok $b bmc $g 1391387 $s 1050382
BAS    __
$a 3
BAS    __
$a PreBMC
BMC    __
$a 2019 $b 110 $c - $d 104-115 $e 20181113 $i 1879-2782 $m Neural networks $n Neural Netw $x MED00011811
LZP    __
$a Pubmed-20190405
