Invention Grant
- Patent Title: Parameter server and method for sharing distributed deep learning parameter using the same
- Application No.: US17216322
- Application Date: 2021-03-29
- Publication No.: US11487698B2
- Publication Date: 2022-11-01
- Inventor: Shin-Young Ahn , Eun-Ji Lim , Yong-Seok Choi , Young-Choon Woo , Wan Choi
- Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
- Applicant Address: KR Daejeon
- Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
- Current Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
- Current Assignee Address: KR Daejeon
- Priority: KR 10-2017-0068445, 2017-06-01
- Main IPC: G06F15/173
- IPC: G06F15/173 ; G06N3/08 ; H04L67/10 ; G06F9/50

Abstract:
Disclosed herein are a parameter server and a method for sharing distributed deep-learning parameters using the parameter server. The method includes initializing a global weight parameter in response to an initialization request by a master process; performing an update by receiving a learned local gradient parameter from a worker process, which performs deep-learning training after updating its local weight parameter using the global weight parameter; accumulating the gradient parameters in response to a request by the master process; and performing an update by receiving the global weight parameter from the master process, which calculates the global weight parameter using the accumulated gradient parameters of the one or more worker processes.
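The cycle the abstract describes (master initializes global weights, workers pull them, train locally, push gradients, and the master folds the accumulated gradients back into the global weights) can be sketched as follows. This is a minimal illustrative model, not the patented implementation; all class and method names here are hypothetical.

```python
import numpy as np

class ParameterServer:
    """Hypothetical sketch of the parameter-sharing cycle from the abstract."""

    def initialize(self, shape, seed=0):
        # Master process requests initialization of the global weight parameter.
        rng = np.random.default_rng(seed)
        self.global_weights = rng.normal(size=shape)
        self.accumulated_grads = np.zeros(shape)
        self.num_grads = 0

    def pull_weights(self):
        # A worker updates its local weight parameter from the global one.
        return self.global_weights.copy()

    def push_gradient(self, local_grad):
        # A worker sends its learned local gradient; the server accumulates it.
        self.accumulated_grads += local_grad
        self.num_grads += 1

    def apply_update(self, lr=0.1):
        # Master averages the accumulated worker gradients and updates
        # the global weight parameter.
        self.global_weights -= lr * self.accumulated_grads / self.num_grads
        self.accumulated_grads[:] = 0
        self.num_grads = 0

# One synchronous round with two workers minimizing ||w||^2 (toy objective).
server = ParameterServer()
server.initialize(shape=(4,))
for _ in range(2):                  # two worker processes
    w = server.pull_weights()       # local weights <- global weights
    server.push_gradient(2 * w)     # gradient of ||w||^2 is 2w
server.apply_update(lr=0.1)
```

With both workers pushing the same gradient 2w, the averaged update shrinks the global weights toward zero, illustrating one accumulate-and-update round.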
Public/Granted literature
- US20210216495A1, PARAMETER SERVER AND METHOD FOR SHARING DISTRIBUTED DEEP LEARNING PARAMETER USING THE SAME, published 2021-07-15