Invention Application
- Patent Title: ACTIVATION LAYERS FOR DEEP LEARNING NETWORKS
- Application No.: US15894867
- Application Date: 2018-02-12
- Publication No.: US20180197049A1
- Publication Date: 2018-07-12
- Inventors: Son Dinh Tran, Raghavan Manmatha
- Applicant: A9.com, Inc.
- Main IPC: G06K9/66
- IPC: G06K9/66 ; G06K9/62 ; G06N3/08

Abstract:
Tasks such as object classification from image data can take advantage of a deep learning process using convolutional neural networks. These networks can include a convolutional layer followed by an activation layer, or activation unit, among other potential layers. Improved accuracy can be obtained by using a generalized linear unit (GLU) as an activation unit in such a network, where a GLU is linear for both positive and negative inputs, and is defined by a positive slope, a negative slope, and a bias. These parameters can be learned for each channel or a block of channels, and stacking those types of activation units can further improve accuracy.
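The abstract describes the generalized linear unit (GLU) as linear on both sides of zero, with a learnable positive slope, negative slope, and bias per channel (or block of channels). A minimal sketch of that piecewise-linear form, with illustrative function and parameter names not taken from the patent itself:

```python
import numpy as np

def glu(x, pos_slope, neg_slope, bias):
    """Generalized linear unit as described in the abstract:
    linear for positive and negative inputs, with a bias.
    In a network, pos_slope, neg_slope, and bias would be
    learned parameters (per channel or block of channels)."""
    return np.where(x >= 0, pos_slope * x, neg_slope * x) + bias

# Example: slope 1.0 for positive inputs, 0.25 for negative, bias 0.1
out = glu(np.array([-2.0, 3.0]), pos_slope=1.0, neg_slope=0.25, bias=0.1)
```

Setting `pos_slope=1`, `neg_slope=0`, `bias=0` would recover a standard ReLU, which illustrates how the GLU generalizes it.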
Public/Granted literature
- US10366313B2 Activation layers for deep learning networks, granted 2019-07-30