Invention Grant
- Patent Title: Activation layers for deep learning networks
- Application No.: US15894867
- Application Date: 2018-02-12
- Publication No.: US10366313B2
- Publication Date: 2019-07-30
- Inventors: Son Dinh Tran, Raghavan Manmatha
- Applicant: A9.com, Inc.
- Applicant Address: Palo Alto, CA, US
- Assignee: A9.COM, INC.
- Current Assignee: A9.COM, INC.
- Current Assignee Address: Palo Alto, CA, US
- Agency: Hogan Lovells US LLP
- Main IPC: G06K9/66
- IPC: G06K9/66 ; G06N3/08 ; G06K9/62

Abstract:
Tasks such as object classification from image data can take advantage of a deep learning process using convolutional neural networks. These networks can include a convolutional layer followed by an activation layer, or activation unit, among other potential layers. Improved accuracy can be obtained by using a generalized linear unit (GLU) as an activation unit in such a network, where a GLU is linear for both positive and negative inputs, and is defined by a positive slope, a negative slope, and a bias. These parameters can be learned for each channel or a block of channels, and stacking those types of activation units can further improve accuracy.
Public/Granted literature
- US20180197049A1 ACTIVATION LAYERS FOR DEEP LEARNING NETWORKS Public/Granted day: 2018-07-12