-
Publication No.: US11829848B2
Publication Date: 2023-11-28
Application No.: US15617703
Application Date: 2017-06-08
Applicant: Microsoft Technology Licensing, LLC
Inventor: Yuxiao Hu, Lei Zhang, Christopher J Buehler, Anna Roth, Cornelia Carapcea
IPC: G06N20/00, G06F18/24, G06F18/214, G06V10/764, G06V10/774
CPC classification number: G06N20/00, G06F18/214, G06F18/24, G06V10/764, G06V10/774
Abstract: A method includes obtaining training data for a classifier, the training data comprising one or more target classes; obtaining candidate background classes; selecting negative classes from the candidate background classes, wherein the negative classes exclude candidate background classes that are too close to the target classes, exclude candidate background classes that are very different from the target classes, and include candidate background classes that are similar to the target classes; and training the classifier on a combined set of the selected negative classes and the target classes.
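To make the selection step concrete, here is a minimal sketch that keeps only candidate background classes whose embedding centroids are similar, but not too close, to the target classes. The embeddings, the cosine-similarity measure, and the two thresholds are assumptions for illustration; they are not taken from the patent.

```python
# Illustrative sketch only: select "negative" background classes whose
# embedding centroids are similar, but not too close, to the target classes.
import numpy as np

def select_negative_classes(target_embs, candidate_embs, low=0.3, high=0.9):
    """target_embs / candidate_embs: dicts of class name -> (n, d) arrays of
    example embeddings. Returns candidate class names whose best similarity
    to any target class falls strictly between `low` and `high`."""
    target_centroids = [e.mean(axis=0) for e in target_embs.values()]

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    selected = []
    for name, embs in candidate_embs.items():
        centroid = embs.mean(axis=0)
        sim = max(cosine(centroid, t) for t in target_centroids)
        # Drop classes nearly identical to a target (sim >= high) and classes
        # unrelated to every target (sim <= low); keep the middle band.
        if low < sim < high:
            selected.append(name)
    return selected

# The classifier is then trained on the target classes plus `selected`.
```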
-
Publication No.: US20190050689A1
Publication Date: 2019-02-14
Application No.: US15676077
Application Date: 2017-08-14
Applicant: Microsoft Technology Licensing, LLC
Inventor: Yandong Guo, Yuxiao Hu, Christopher J Buehler, Cornelia Carapcea, Lei Zhang
Abstract: Methods, systems, and computer programs are presented for training a deep neural network (DNN). One method includes an operation for training a predecessor network defined for image recognition of items, where parameters of a predecessor classifier are initialized with random numbers sampled from a predetermined distribution, and the predecessor classifier utilizes an image-classification probability function without bias. The method further includes an operation for training a successor network defined for image recognition of items in a plurality of classes, where parameters of a successor classifier are initialized with parameters learned from the predecessor network, and the successor classifier utilizes the image-classification probability function without bias. Further, the method includes operations for receiving an image for recognition, and recognizing the image utilizing the successor classifier.
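A minimal PyTorch sketch of the predecessor/successor idea described in the abstract: both classifiers end in a bias-free linear layer followed by softmax, the predecessor starts from random initialization, and the successor is initialized from the trained predecessor before recognizing images. The backbone, layer sizes, class counts, and the mapping of predecessor classes onto the first successor classes are illustrative assumptions, not details from the patent.

```python
import torch
import torch.nn as nn

class Recognizer(nn.Module):
    def __init__(self, backbone, feat_dim, num_classes):
        super().__init__()
        self.backbone = backbone
        # Classification layer deliberately has no bias term, mirroring the
        # "probability function without bias" described in the abstract.
        self.classifier = nn.Linear(feat_dim, num_classes, bias=False)

    def forward(self, x):
        return torch.softmax(self.classifier(self.backbone(x)), dim=-1)

def make_backbone():
    # Toy feature extractor; a real system would use a deep CNN.
    return nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 256), nn.ReLU())

# Predecessor: parameters start from PyTorch's default random initialization
# (sampled from a fixed distribution); training itself is omitted here.
predecessor = Recognizer(make_backbone(), feat_dim=256, num_classes=1000)
# ... train predecessor ...

# Successor: covers more classes; its parameters are initialized from the
# trained predecessor before fine-tuning.
successor = Recognizer(make_backbone(), feat_dim=256, num_classes=5000)
successor.backbone.load_state_dict(predecessor.backbone.state_dict())
with torch.no_grad():
    # Assumes the first 1000 successor classes match the predecessor classes.
    successor.classifier.weight[:1000] = predecessor.classifier.weight

# Recognition: the successor assigns the most probable class to a new image.
image = torch.randn(1, 3, 224, 224)
predicted = successor(image).argmax(dim=-1)
```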
-
Publication No.: US10691981B2
Publication Date: 2020-06-23
Application No.: US16298910
Application Date: 2019-03-11
Applicant: Microsoft Technology Licensing, LLC
Inventor: Yandong Guo, Yuxiao Hu, Christopher J Buehler, Cornelia Carapcea, Lei Zhang
Abstract: Methods, systems, and computer programs are presented for training a deep neural network (DNN). One method includes an operation for training a predecessor network defined for image recognition of items, where parameters of a predecessor classifier are initialized with random numbers sampled from a predetermined distribution, and the predecessor classifier utilizes an image-classification probability function without bias. The method further includes an operation for training a successor network defined for image recognition of items in a plurality of classes, where parameters of a successor classifier are initialized with parameters learned from the predecessor network, and the successor classifier utilizes the image-classification probability function without bias. Further, the method includes operations for receiving an image for recognition, and recognizing the image utilizing the successor classifier.
-
Publication No.: US20190205705A1
Publication Date: 2019-07-04
Application No.: US16298910
Application Date: 2019-03-11
Applicant: Microsoft Technology Licensing, LLC
Inventor: Yandong Guo, Yuxiao Hu, Christopher J. Buehler, Cornelia Carapcea, Lei Zhang
Abstract: Methods, systems, and computer programs are presented for training a deep neural network (DNN). One method includes an operation for training a predecessor network defined for image recognition of items, where parameters of a predecessor classifier are initialized with random numbers sampled from a predetermined distribution, and the predecessor classifier utilizes an image-classification probability function without bias. The method further includes an operation for training a successor network defined for image recognition of items in a plurality of classes, where parameters of a successor classifier are initialized with parameters learned from the predecessor network, and the successor classifier utilizes the image-classification probability function without bias. Further, the method includes operations for receiving an image for recognition, and recognizing the image utilizing the successor classifier.
-
Publication No.: US10262240B2
Publication Date: 2019-04-16
Application No.: US15676077
Application Date: 2017-08-14
Applicant: Microsoft Technology Licensing, LLC
Inventor: Yandong Guo, Yuxiao Hu, Christopher J Buehler, Cornelia Carapcea, Lei Zhang
Abstract: Methods, systems, and computer programs are presented for training a deep neural network (DNN). One method includes an operation for training a predecessor network defined for image recognition of items, where parameters of a predecessor classifier are initialized with random numbers sampled from a predetermined distribution, and the predecessor classifier utilizes an image-classification probability function without bias. The method further includes an operation for training a successor network defined for image recognition of items in a plurality of classes, where parameters of a successor classifier are initialized with parameters learned from the predecessor network, and the successor classifier utilizes the image-classification probability function without bias. Further, the method includes operations for receiving an image for recognition, and recognizing the image utilizing the successor classifier.
-
Publication No.: US20180330272A1
Publication Date: 2018-11-15
Application No.: US15616655
Application Date: 2017-06-07
Applicant: Microsoft Technology Licensing, LLC
Inventor: Yuxiao Hu, Lei Zhang, Christopher Buehler, Cha Zhang, Anna Roth, Cornelia Carapcea
IPC: G06N99/00
CPC classification number: G06N99/005
Abstract: A method includes obtaining a first classifier trained on a first dataset having a first dataset class, the first classifier having a plurality of first parameters, obtaining a second dataset having a second dataset class, loading the first parameters into a second classifier, merging a subset of the first dataset class and the second dataset class into a merged class, and training the second classifier using the merged class.
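As a rough sketch of the merging step, the code below builds a single merged label space from a chosen subset of the first dataset's classes plus the second dataset's classes, under the assumption that this is what the abstract means by the merged class; the class names and the chosen subset are hypothetical.

```python
# Illustrative sketch only: merge a subset of the first dataset's classes with
# the second dataset's classes into one label space for the second classifier.
def build_merged_label_space(first_classes, second_classes, reuse_subset):
    """first_classes / second_classes: lists of class names from the two
    datasets. reuse_subset: first-dataset classes carried into the merged set."""
    assert set(reuse_subset) <= set(first_classes)
    merged = list(dict.fromkeys(list(reuse_subset) + list(second_classes)))
    return {name: idx for idx, name in enumerate(merged)}

# Example: keep two of the original classes and add the new dataset's classes.
first_classes = ["cat", "dog", "car", "tree"]
second_classes = ["truck", "bicycle"]
label_map = build_merged_label_space(first_classes, second_classes,
                                     reuse_subset=["car", "tree"])
print(label_map)  # {'car': 0, 'tree': 1, 'truck': 2, 'bicycle': 3}

# The second classifier would then be created with len(label_map) outputs,
# its shared parameters loaded from the first classifier, and trained on
# examples from both datasets relabeled through label_map.
```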