-
Publication No.: US11157764B2
Publication Date: 2021-10-26
Application No.: US16489084
Filing Date: 2017-03-27
Applicant: INTEL CORPORATION
Inventor: Libin Wang , Anbang Yao , Jianguo Li , Yurong Chen
Abstract: An example apparatus for semantic image segmentation includes a receiver to receive an image to be segmented. The apparatus also includes a gated dense pyramid network comprising a plurality of gated dense pyramid (GDP) blocks to be trained to generate semantic labels for each pixel in the received image. The apparatus further includes a generator to generate a segmented image based on the generated semantic labels.
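To make the gated dense pyramid idea concrete, here is a minimal PyTorch sketch of a gated, densely connected multi-rate block. The layer widths, dilation rates, and softmax-based gating are illustrative assumptions, not the patented GDP block structure.

import torch
import torch.nn as nn

class GatedPyramidBlock(nn.Module):
    # Illustrative block: densely connected multi-rate convolutions whose outputs
    # are mixed by a learned gate before producing per-pixel semantic logits.
    def __init__(self, in_ch, mid_ch, num_classes, rates=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList()
        ch = in_ch
        for r in rates:
            self.branches.append(nn.Sequential(
                nn.Conv2d(ch, mid_ch, 3, padding=r, dilation=r),
                nn.ReLU(inplace=True)))
            ch += mid_ch  # dense connectivity: each branch sees all earlier outputs
        self.gate = nn.Conv2d(ch, len(rates), 1)       # one gate map per branch
        self.classifier = nn.Conv2d(mid_ch, num_classes, 1)

    def forward(self, x):
        feats, outs = x, []
        for branch in self.branches:
            y = branch(feats)
            outs.append(y)
            feats = torch.cat([feats, y], dim=1)
        g = torch.softmax(self.gate(feats), dim=1)     # soft gating over branches
        fused = sum(g[:, i:i + 1] * outs[i] for i in range(len(outs)))
        return self.classifier(fused)                  # per-pixel class scores

# Example: semantic labels for each pixel of a 3-channel image, 8 classes.
logits = GatedPyramidBlock(3, 16, 8)(torch.randn(1, 3, 64, 64))
segmented = logits.argmax(dim=1)   # label map from which a segmented image can be generated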
-
Publication No.: US10929977B2
Publication Date: 2021-02-23
Application No.: US16320944
Filing Date: 2016-08-25
Applicant: Intel Corporation
Inventor: Libin Wang , Anbang Yao , Yurong Chen
IPC: G06K9/00 , G06T7/10 , G06K9/46 , G06N3/04 , G06N3/08 , G06K9/34 , G06T7/11 , G06T7/143 , G06F16/55 , G06N5/04
Abstract: Techniques related to implementing fully convolutional networks for semantic image segmentation are discussed. Such techniques may include combining feature maps from multiple stages of a multi-stage fully convolutional network to generate a hyper-feature corresponding to an input image, up-sampling the hyper-feature and summing it with a feature map of a previous stage to provide a final set of features, and classifying the final set of features to provide semantic image segmentation of the input image.
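As a rough illustration of the hyper-feature scheme described above, the PyTorch sketch below fuses feature maps from several stages of a small fully convolutional backbone, up-samples the fused hyper-feature, sums it with an earlier-stage map, and classifies the result per pixel. The backbone depth and channel widths are assumptions for readability, not the patented network.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperFeatureFCN(nn.Module):
    def __init__(self, num_classes=21):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU())
        self.stage3 = nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())
        self.reduce2 = nn.Conv2d(64, 32, 1)    # project stage outputs to a common width
        self.reduce3 = nn.Conv2d(128, 32, 1)
        self.classifier = nn.Conv2d(32, num_classes, 1)

    def forward(self, x):
        f1 = self.stage1(x)                    # 1/2 resolution
        f2 = self.stage2(f1)                   # 1/4 resolution
        f3 = self.stage3(f2)                   # 1/8 resolution
        # Combine feature maps from multiple stages into a hyper-feature.
        hyper = self.reduce3(f3) + F.interpolate(self.reduce2(f2), size=f3.shape[2:],
                                                 mode="bilinear", align_corners=False)
        # Up-sample the hyper-feature and sum it with a previous-stage feature map.
        final = f1 + F.interpolate(hyper, size=f1.shape[2:],
                                   mode="bilinear", align_corners=False)
        logits = self.classifier(final)        # classify the final set of features
        return F.interpolate(logits, size=x.shape[2:], mode="bilinear", align_corners=False)

print(HyperFeatureFCN()(torch.randn(1, 3, 128, 128)).shape)   # (1, 21, 128, 128)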
-
Publication No.: US11704894B2
Publication Date: 2023-07-18
Application No.: US17510013
Filing Date: 2021-10-25
Applicant: Intel Corporation
Inventor: Libin Wang , Anbang Yao , Jianguo Li , Yurong Chen
IPC: G06V10/44 , G06F18/214 , G06F18/2413 , G06N3/04
CPC classification number: G06V10/454 , G06F18/2148 , G06F18/24143 , G06N3/04
Abstract: An example apparatus for semantic image segmentation includes a receiver to receive an image to be segmented. The apparatus also includes a gated dense pyramid network including a plurality of gated dense pyramid (GDP) blocks to be trained to generate semantic labels for respective pixels in the received image. The apparatus further includes a generator to generate a segmented image based on the generated semantic labels.
-
Publication No.: US11635943B2
Publication Date: 2023-04-25
Application No.: US16475080
Filing Date: 2017-04-07
Applicant: Intel Corporation
Inventor: Yiwen Guo , Anbang Yao , Dongqi Cai , Libin Wang , Lin Xu , Ping Hu , Shandong Wang , Wenhua Cheng
Abstract: Described herein is hardware acceleration of random number generation for machine learning and deep learning applications. An apparatus (700) includes a uniform random number generator (URNG) circuit (710) to generate uniform random numbers and an adder circuit (750) that is coupled to the URNG circuit (710). The adder circuit (750) hardware-accelerates the generation of Gaussian random numbers for machine learning.
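The abstract suggests a central-limit-style design in which an adder accumulates URNG outputs to approximate Gaussian samples. The numpy sketch below only illustrates that underlying principle in software (the sum of twelve uniform draws has unit variance); it is not the claimed circuit.

import numpy as np

def gaussian_from_uniform(n_samples, n_adds=12, seed=0):
    # Sum n_adds uniform(0, 1) draws per sample (Irwin-Hall distribution).
    # With n_adds = 12 the sum has mean 6 and variance 1, so subtracting 6
    # yields an approximately standard normal sample.
    rng = np.random.default_rng(seed)          # stands in for the URNG circuit
    u = rng.random((n_samples, n_adds))
    return u.sum(axis=1) - n_adds / 2.0        # stands in for the adder circuit

samples = gaussian_from_uniform(100_000)
print(samples.mean(), samples.std())           # close to 0 and 1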
-
Publication No.: US11790223B2
Publication Date: 2023-10-17
Application No.: US16475076
Filing Date: 2017-04-07
Applicant: INTEL CORPORATION
Inventor: Libin Wang , Yiwen Guo , Anbang Yao , Dongqi Cai , Lin Xu , Ping Hu , Shandong Wang , Wenhua Cheng , Yurong Chen
CPC classification number: G06N3/08 , G06F18/217 , G06F18/2148 , G06N3/045 , G06N3/063 , G06T1/20
Abstract: Methods and systems are disclosed for boosting deep neural networks for deep learning. In one example, in a deep neural network including a first shallow network and a second shallow network, a first training sample is processed by the first shallow network using equal weights. A loss for the first shallow network is determined based on the processed training sample. Weights for the second shallow network are adjusted based on the determined loss for the first shallow network. A second training sample is processed by the second shallow network using the adjusted weights. In another example, in a deep neural network including a first weak network and a second weak network, a first subset of training samples is processed by the first weak network using initialized weights. A classification error for the first weak network on the first subset of training samples is determined. The second weak network is boosted by adjusting weights based on the determined classification error of the first weak network. A second subset of training samples is processed by the second weak network using the adjusted weights.
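As a rough software analogue of the second example, the sketch below trains one weak learner on equally weighted samples, computes its weighted classification error, boosts the weights of misclassified samples, and trains a second weak learner on data drawn with the adjusted weights. The use of scikit-learn MLPs and resampling in place of direct sample weighting are assumptions made for brevity.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=400, n_features=20, random_state=0)
w = np.full(len(y), 1.0 / len(y))                 # equal initial sample weights

net1 = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
net1.fit(X, y)                                    # first weak network
miss = net1.predict(X) != y
err = float(np.sum(w[miss]))                      # weighted classification error
alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
w *= np.exp(alpha * np.where(miss, 1.0, -1.0))    # boost misclassified samples
w /= w.sum()

# MLPClassifier has no sample_weight argument, so draw a weighted resample instead.
idx = np.random.default_rng(0).choice(len(y), size=len(y), p=w)
net2 = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=1)
net2.fit(X[idx], y[idx])                          # second weak network, adjusted weights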
-
Publication No.: US11594010B2
Publication Date: 2023-02-28
Application No.: US17510013
Filing Date: 2021-10-25
Applicant: Intel Corporation
Inventor: Libin Wang , Anbang Yao , Jianguo Li , Yurong Chen
Abstract: An example apparatus for semantic image segmentation includes a receiver to receive an image to be segmented. The apparatus also includes a gated dense pyramid network including a plurality of gated dense pyramid (GDP) blocks to be trained to generate semantic labels for respective pixels in the received image. The apparatus further includes a generator to generate a segmented image based on the generated semantic labels.
-
Publication No.: US11551335B2
Publication Date: 2023-01-10
Application No.: US16474848
Filing Date: 2017-04-07
Applicant: Intel Corporation
Inventor: Lin Xu , Liu Yang , Anbang Yao , Dongqi Cai , Libin Wang , Ping Hu , Shandong Wang , Wenhua Cheng , Yiwen Guo , Yurong Chen
Abstract: Methods and systems are disclosed for using camera devices for deep channel and Convolutional Neural Network (CNN) images and formats. In one example, image values are captured by a color sensor array in an image capturing device or camera. The image values provide color channel data. The image values captured by the color sensor array are input to a CNN having at least one CNN layer. The CNN provides CNN channel data for each layer. The color channel data and the CNN channel data form a deep channel image that is stored in a memory. In another example, image values are captured by a sensor array. The image values captured by the sensor array are input to a CNN having a first CNN layer. An output is generated at the first CNN layer using the image values captured by the color sensor array. The output of the first CNN layer is stored as a feature map of the captured image.
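A minimal PyTorch sketch of the deep channel image idea follows: the sensor's color channels are concatenated with the feature maps produced by a first CNN layer, and the combined tensor is stored. The layer size and file-based storage are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

rgb = torch.rand(1, 3, 64, 64)                        # color channel data from the sensor array
first_layer = nn.Conv2d(3, 8, kernel_size=3, padding=1)
feature_map = F.relu(first_layer(rgb))                # CNN channel data from the first layer

deep_channel_image = torch.cat([rgb, feature_map], dim=1)   # 3 color + 8 CNN channels
torch.save(deep_channel_image, "deep_channel_image.pt")     # store the deep channel image
print(deep_channel_image.shape)                       # torch.Size([1, 11, 64, 64])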
-
Publication No.: US11341368B2
Publication Date: 2022-05-24
Application No.: US16475079
Filing Date: 2017-04-07
Applicant: INTEL CORPORATION
Inventor: Anbang Yao , Shandong Wang , Wenhua Cheng , Dongqi Cai , Libin Wang , Lin Xu , Ping Hu , Yiwen Guo , Liu Yang , Yuqing Hou , Zhou Su , Yurong Chen
Abstract: Methods and systems for advanced and augmented training of deep neural networks (DNNs) using synthetic data and innovative generative networks. A method includes training a DNN using synthetic data, training a plurality of DNNs using context data, associating features of the DNNs trained using context data with features of the DNN trained with synthetic data, and generating an augmented DNN using the associated features.
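One possible reading of the feature-association step is sketched below in PyTorch: a feature extractor assumed to be trained on synthetic data is combined with extractors assumed to be trained on context data, their features are associated by concatenation, and a new head on top forms the augmented DNN. The concatenation-based fusion and the tiny extractors are assumptions, not the patented method.

import torch
import torch.nn as nn

def feature_extractor():
    return nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                         nn.AdaptiveAvgPool2d(1), nn.Flatten())   # -> 16-d features

synthetic_net = feature_extractor()                      # assume: trained on synthetic data
context_nets = [feature_extractor() for _ in range(2)]   # assume: trained on context data

class AugmentedDNN(nn.Module):
    def __init__(self, synthetic_net, context_nets, num_classes=10):
        super().__init__()
        self.extractors = nn.ModuleList([synthetic_net, *context_nets])
        self.head = nn.Linear(16 * len(self.extractors), num_classes)

    def forward(self, x):
        feats = torch.cat([f(x) for f in self.extractors], dim=1)  # associate features
        return self.head(feats)

print(AugmentedDNN(synthetic_net, context_nets)(torch.rand(2, 3, 32, 32)).shape)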
-
Publication No.: US11107189B2
Publication Date: 2021-08-31
Application No.: US16474927
Filing Date: 2017-04-07
Applicant: INTEL CORPORATION
Inventor: Shandong Wang , Yiwen Guo , Anbang Yao , Dongqi Cai , Libin Wang , Lin Xu , Ping Hu , Wenhua Cheng , Yurong Chen
IPC: G06K9/00 , G06T3/40 , G06N20/20 , G06N20/10 , G06K9/62 , G06N3/04 , G06N3/08 , G06N5/04 , G06T1/20
Abstract: Methods and systems are disclosed using improved Convolutional Neural Networks (CNN) for image processing. In one example, an input image is down-sampled into smaller images with a smaller resolution than the input image. The down-sampled smaller images are processed by a CNN whose last layer has fewer nodes than the last layer of a full CNN used to process the input image at full resolution. A result is output based on the down-sampled smaller images processed by the CNN with the reduced last layer. In another example, shallow CNN networks are built randomly. The randomly built shallow CNN networks are combined to imitate a trained deep neural network (DNN).
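The sketch below illustrates the first example in PyTorch: the input image is down-sampled and processed by a CNN whose last layer has fewer nodes than the full-resolution model's. Node counts and the down-sampling factor are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

def make_cnn(last_nodes, num_classes=10):
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(16, last_nodes), nn.ReLU(),        # node count of the "last layer"
        nn.Linear(last_nodes, num_classes))

full_cnn = make_cnn(last_nodes=256)                  # full-resolution path
small_cnn = make_cnn(last_nodes=64)                  # reduced last layer

image = torch.rand(1, 3, 224, 224)
small = F.interpolate(image, scale_factor=0.5, mode="bilinear", align_corners=False)
print(small_cnn(small).shape)                        # result from the down-sampled path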
-
Publication No.: US20200026988A1
Publication Date: 2020-01-23
Application No.: US16475075
Filing Date: 2017-04-07
Applicant: INTEL CORPORATION
Inventor: Yiwen Guo , Anbang Yao , Dongqi Cai , Libin Wang , Lin Xu , Ping Hu , Shandong Wang , Wenhua Cheng , Yurong Chen
Abstract: Methods and systems are disclosed using improved training and learning for deep neural networks. In one example, a deep neural network includes a plurality of layers, and each layer has a plurality of nodes. For each L layer in the plurality of layers, the nodes of each L layer are randomly connected to nodes in an L+1 layer. For each L+1 layer in the plurality of layers, the nodes of each L+1 layer are connected to nodes in a subsequent L layer in a one-to-one manner. Parameters related to the nodes of each L layer are fixed. Parameters related to the nodes of each L+1 layer are updated, where L is an integer starting with 1. In another example, a deep neural network includes an input layer, an output layer, and a plurality of hidden layers. Inputs for the input layer and labels for the output layer are determined for a first sample. Similarity between pairs of inputs and labels of a second sample and the first sample is estimated using a Gaussian regression process.
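A hedged PyTorch sketch of the first example follows: a fixed, randomly connected layer alternates with a trainable one-to-one layer, so only the one-to-one parameters (and a final head) are updated during training. The sparsity mask and layer widths are assumptions.

import torch
import torch.nn as nn

class RandomFixedLayer(nn.Module):
    # Nodes randomly connected to the next layer; these weights are never updated.
    def __init__(self, in_features, out_features, sparsity=0.5):
        super().__init__()
        weight = torch.randn(out_features, in_features)
        weight *= (torch.rand_like(weight) > sparsity)             # random connectivity mask
        self.weight = nn.Parameter(weight, requires_grad=False)    # fixed parameters

    def forward(self, x):
        return torch.relu(x @ self.weight.t())

class OneToOneLayer(nn.Module):
    # Each node connects only to its counterpart; these weights are trained.
    def __init__(self, features):
        super().__init__()
        self.scale = nn.Parameter(torch.ones(features))
        self.bias = nn.Parameter(torch.zeros(features))

    def forward(self, x):
        return x * self.scale + self.bias

net = nn.Sequential(
    RandomFixedLayer(32, 64), OneToOneLayer(64),
    RandomFixedLayer(64, 64), OneToOneLayer(64),
    nn.Linear(64, 10))
trainable = sum(p.numel() for p in net.parameters() if p.requires_grad)
print(trainable)   # only the one-to-one layers and the final head are updated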