Invention Grant
- Patent Title: Method and device for providing compression and transmission of training parameters in distributed processing environment
- Application No.: US16772557
- Application Date: 2018-12-13
- Publication No.: US11663476B2
- Publication Date: 2023-05-30
- Inventor: Seung-Hyun Cho , Youn-Hee Kim , Jin-Wuk Seok , Joo-Young Lee , Woong Lim , Jong-Ho Kim , Dae-Yeol Lee , Se-Yoon Jeong , Hui-Yong Kim , Jin-Soo Choi , Je-Won Kang
- Applicant: Electronics and Telecommunications Research Institute
- Applicant Address: KR Daejeon
- Assignee: Electronics and Telecommunications Research Institute
- Current Assignee: Electronics and Telecommunications Research Institute
- Current Assignee Address: KR Daejeon
- Agency: William Park & Associates Ltd.
- Priority: KR 20170172827 (2017-12-15); KR 20180160774 (2018-12-13)
- International Application: PCT/KR2018/015845 (2018-12-13)
- International Announcement: WO2019/117646A (2019-06-20)
- Date entered country: 2020-06-12
- Main IPC: G06V10/75
- IPC: G06V10/75 ; G06N3/08 ; G06N3/04 ; G06V10/82 ; G06F18/214

Abstract:
Disclosed herein are a method and apparatus for compressing the learning parameters used to train a deep-learning model and transmitting the compressed parameters in a distributed processing environment. Multiple electronic devices in the distributed processing system train a neural network, and this training updates the parameters. An electronic device may share its updated parameters with the other electronic devices. To share a parameter efficiently, the residual of the parameter, rather than the parameter itself, is provided to the other electronic devices, which then update their copies of the parameter using the residual.
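The residual-sharing idea in the abstract can be illustrated with a minimal sketch. This is not the patented method itself, only an assumed toy scheme: each device holds a shared reference parameter, computes the residual (updated minus reference) after local training, coarsely quantizes it for compression (the quantization scheme here is hypothetical), and transmits only the residual; the receiver adds the residual back onto its reference copy.

```python
import numpy as np

def compute_residual(updated, reference):
    """Residual between the locally updated parameter and the shared reference."""
    return updated - reference

def quantize(residual, scale=100.0):
    """Coarsely quantize the residual so it compresses well (assumed scheme)."""
    return np.round(residual * scale).astype(np.int8)

def dequantize(q, scale=100.0):
    """Recover an approximate residual from its quantized form."""
    return q.astype(np.float32) / scale

def apply_residual(reference, residual):
    """Receiving device reconstructs the parameter from the shared residual."""
    return reference + residual

# One round of sharing between two devices:
reference = np.zeros(4, dtype=np.float32)                        # parameter both devices hold
updated = np.array([0.05, -0.12, 0.30, 0.0], dtype=np.float32)   # after local training

q = quantize(compute_residual(updated, reference))   # transmitted instead of the full parameter
reconstructed = apply_residual(reference, dequantize(q))
```

Because consecutive updates to a parameter tend to be small, the residual typically has lower magnitude and entropy than the parameter itself, which is what makes quantizing and transmitting the residual cheaper than transmitting the full tensor.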