NEURAL ARCHITECTURE SEARCH VIA SIMILARITY-BASED OPERATOR RANKING
ABSTRACT:
Neural architecture search (NAS) has received considerable attention. The supernet-based differentiable approach is popular because it effectively shares weights and leads to a more efficient search. However, the mismatch between architectures and weights caused by weight sharing still exists. Moreover, the coupling effects among different operators are also neglected. To alleviate these problems, embodiments of an effective NAS methodology using similarity-based operator ranking are presented herein. With the aim of approximating each layer's output in the supernet, a similarity-based operator ranking based on statistical random comparison is used. In one or more embodiments, the operator that likely causes the least change to the feature distribution discrepancy is then pruned. In one or more embodiments, a fair sampling process may be used to mitigate the Matthew effect among operators that frequently occurs in previous supernet approaches.
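The following is a minimal illustrative sketch, not the claimed implementation: it assumes a supernet layer that mixes several candidate operators, scores each operator by how similar the layer's output remains when that operator is removed (using cosine similarity over randomly sampled inputs as a stand-in for the statistical random comparison), and prunes the operator whose removal changes the output least. All class and function names (MixedLayer, rank_operators) are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedLayer(nn.Module):
    """One supernet layer holding several candidate operators."""

    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),   # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2),   # 5x5 conv
            nn.MaxPool2d(3, stride=1, padding=1),          # max pooling
            nn.Identity(),                                  # skip connection
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Uniform mixture of all candidate operators (simplified supernet output).
        return sum(op(x) for op in self.ops) / len(self.ops)


def rank_operators(layer: MixedLayer, channels: int, num_samples: int = 8) -> list:
    """Score each operator by the cosine similarity between the layer output
    without that operator and the full mixed output, averaged over random inputs."""
    scores = torch.zeros(len(layer.ops))
    with torch.no_grad():
        for _ in range(num_samples):
            x = torch.randn(2, channels, 16, 16)           # random probe input
            full = layer(x).flatten(1)
            for i in range(len(layer.ops)):
                kept = [op for j, op in enumerate(layer.ops) if j != i]
                reduced = sum(op(x) for op in kept) / len(kept)
                scores[i] += F.cosine_similarity(reduced.flatten(1), full, dim=1).mean()
    return (scores / num_samples).tolist()


if __name__ == "__main__":
    layer = MixedLayer(channels=8)
    scores = rank_operators(layer, channels=8)
    # The operator whose removal leaves the output most similar to the full
    # mixture is the one causing the least change, so it is pruned first.
    prune_idx = max(range(len(scores)), key=scores.__getitem__)
    print(f"similarity scores: {scores}")
    print(f"prune operator index: {prune_idx}")

In a full search, this ranking step would be repeated per layer as the supernet is progressively shrunk; the fair sampling mentioned above would govern how often each surviving operator is trained so that early winners do not monopolize weight updates.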