-
Publication Number: US20210034977A1
Publication Date: 2021-02-04
Application Number: US16945898
Filing Date: 2020-08-02
Applicant: Google LLC
Inventor: Sercan Omer Arik , Tomas Jon Pfister
Abstract: A method of interpreting tabular data includes receiving, at a deep tabular data learning network (TabNet) executing on data processing hardware, a set of features. For each of multiple sequential processing steps, the method also includes: selecting, using a sparse mask of the TabNet, a subset of relevant features of the set of features; processing, using a feature transformer of the TabNet, the subset of relevant features to generate a decision step output and information for a next processing step in the multiple sequential processing steps; and providing the information to the next processing step. The method also includes determining a final decision output by aggregating the decision step outputs generated for the multiple sequential processing steps.
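Below is a minimal NumPy sketch of the sequential, mask-based processing the abstract describes: at each step a (soft) mask selects a subset of features, a feature transformer splits its output into a decision component and information carried to the next step, and the final decision aggregates the per-step outputs. The layer shapes, the softmax stand-in for a truly sparse mask, and the random initializations are illustrative assumptions, not the patented TabNet architecture.

```python
# Hedged sketch only: softmax approximates the sparse mask; sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class ToyTabNet:
    def __init__(self, num_features, hidden_dim, num_steps):
        self.num_steps = num_steps
        self.hidden_dim = hidden_dim
        # One mask-producing layer and one feature transformer per processing step.
        self.mask_w = [rng.normal(0, 0.1, (hidden_dim, num_features)) for _ in range(num_steps)]
        self.feat_w = [rng.normal(0, 0.1, (num_features, 2 * hidden_dim)) for _ in range(num_steps)]

    def forward(self, features):
        batch = features.shape[0]
        info = np.zeros((batch, self.hidden_dim))      # information passed to the next step
        decision = np.zeros((batch, self.hidden_dim))  # aggregated decision output
        for step in range(self.num_steps):
            # 1) Mask selects a subset of relevant features.
            mask = softmax(info @ self.mask_w[step])          # (batch, num_features)
            selected = features * mask
            # 2) Feature transformer yields a decision step output and next-step info.
            transformed = relu(selected @ self.feat_w[step])  # (batch, 2 * hidden_dim)
            step_decision, info = np.split(transformed, 2, axis=-1)
            # 3) Aggregate the decision step outputs into the final decision.
            decision += step_decision
        return decision

net = ToyTabNet(num_features=16, hidden_dim=8, num_steps=3)
x = rng.normal(size=(4, 16))
print(net.forward(x).shape)  # (4, 8) aggregated decision embedding
```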
-
Publication Number: US12125265B2
Publication Date: 2024-10-22
Application Number: US17809798
Filing Date: 2022-06-29
Applicant: Google LLC
Inventor: Sercan Omer Arik , Jinsung Yoon , Tomas Jon Pfister
IPC: G06V10/774 , G06F18/21 , G06F18/2115 , G06F18/214 , G06N3/006 , G06N3/02 , G06N3/045 , G06N3/084 , G06N3/088 , G06N5/01 , G06N5/045 , G06N7/01 , G06N20/20 , G06V30/19
CPC classification number: G06V10/774 , G06F18/2115 , G06F18/2148 , G06F18/2193 , G06N3/006 , G06N3/02 , G06N3/045 , G06N3/084 , G06N3/088 , G06N5/045 , G06N7/01 , G06V30/19147 , G06N5/01 , G06N20/20
Abstract: A method for training a locally interpretable model includes obtaining a set of training samples and training a black-box model using the set of training samples. The method also includes generating, using the trained black-box model and the set of training samples, a set of auxiliary training samples and training a baseline interpretable model using the set of auxiliary training samples. The method also includes training, using the set of auxiliary training samples and the baseline interpretable model, an instance-wise weight estimator model. For each auxiliary training sample in the set of auxiliary training samples, the method also includes determining, using the trained instance-wise weight estimator model, a selection probability for the auxiliary training sample. The method also includes selecting, based on the selection probabilities, a subset of auxiliary training samples and training the locally interpretable model using the subset of auxiliary training samples.
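A hedged scikit-learn sketch of this pipeline follows: a random forest stands in for the black-box model, ridge regression for the baseline and the locally interpretable models, and a simple residual-based score replaces the learned instance-wise weight estimator. The model choices, the weighting heuristic, and the 50% selection ratio are illustrative assumptions, not the patented method.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=500)

# 1) Train the black-box model on the original training samples.
black_box = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# 2) Generate auxiliary training samples: inputs paired with black-box predictions.
y_aux = black_box.predict(X)

# 3) Train a baseline interpretable model on the auxiliary samples.
baseline = Ridge(alpha=1.0).fit(X, y_aux)

# 4) Stand-in "instance-wise weight estimator": auxiliary samples the baseline
#    already fits well get higher selection probabilities (the patent trains a
#    separate model for this step; a residual-based score is used here for brevity).
residual = np.abs(y_aux - baseline.predict(X))
selection_prob = np.exp(-residual) / np.exp(-residual).sum()

# 5) Select a subset of auxiliary samples according to the selection probabilities.
idx = rng.choice(len(X), size=len(X) // 2, replace=False, p=selection_prob)

# 6) Train the locally interpretable model on the selected subset.
local_model = Ridge(alpha=1.0).fit(X[idx], y_aux[idx])
print("local model coefficients:", np.round(local_model.coef_, 3))
```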
-
Publication Number: US11907850B2
Publication Date: 2024-02-20
Application Number: US17454516
Filing Date: 2021-11-11
Applicant: Google LLC
Inventor: Rui Zhang , Jia Li , Tomas Jon Pfister
IPC: G06N3/084 , G06F18/214 , G06F18/22 , G06F18/21 , G06F18/2413 , G06F18/2134 , G06N3/045 , G06N3/047 , G06V10/74 , G06V10/764 , G06V10/774 , G06V10/82
CPC classification number: G06N3/084 , G06F18/214 , G06F18/2148 , G06F18/2193 , G06F18/21347 , G06F18/22 , G06F18/2413 , G06N3/045 , G06N3/047 , G06V10/761 , G06V10/764 , G06V10/774 , G06V10/82
Abstract: A method includes obtaining a source training dataset that includes a plurality of source training images and obtaining a target training dataset that includes a plurality of target training images. For each source training image, the method includes translating, using a forward generator neural network G, the source training image to a respective translated target image according to current values of forward generator parameters. For each target training image, the method includes translating, using a backward generator neural network F, the target training image to a respective translated source image according to current values of backward generator parameters. The method also includes training the forward generator neural network G jointly with the backward generator neural network F by adjusting the current values of the forward generator parameters and the backward generator parameters to optimize an objective function.
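Below is a minimal PyTorch sketch of jointly training a forward generator G (source to target) and a backward generator F (target to source) against a shared objective. The generators are single linear layers and the objective keeps only cycle-consistency terms; the adversarial terms of a full image-to-image objective are omitted. Everything here is an illustrative assumption rather than the patented training procedure.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
dim = 16
G = nn.Linear(dim, dim)   # forward generator: source domain -> target domain
F = nn.Linear(dim, dim)   # backward generator: target domain -> source domain
optimizer = torch.optim.Adam(list(G.parameters()) + list(F.parameters()), lr=1e-2)
mse = nn.MSELoss()

source_images = torch.randn(128, dim)   # flattened toy "source training images"
target_images = torch.randn(128, dim)   # flattened toy "target training images"

for step in range(200):
    # Translate source -> target with G and target -> source with F.
    translated_target = G(source_images)
    translated_source = F(target_images)
    # Cycle-consistency objective: reconstructing each domain after a round trip.
    cycle_loss = mse(F(translated_target), source_images) + mse(G(translated_source), target_images)

    # Jointly adjust the current values of both generators' parameters.
    optimizer.zero_grad()
    cycle_loss.backward()
    optimizer.step()

print(f"final cycle-consistency loss: {cycle_loss.item():.4f}")
```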
-
Publication Number: US20220067441A1
Publication Date: 2022-03-03
Application Number: US17454516
Filing Date: 2021-11-11
Applicant: Google LLC
Inventor: Rui Zhang , Jia Li , Tomas Jon Pfister
Abstract: A method includes obtaining a source training dataset that includes a plurality of source training images and obtaining a target training dataset that includes a plurality of target training images. For each source training image, the method includes translating, using a forward generator neural network G, the source training image to a respective translated target image according to current values of forward generator parameters. For each target training image, the method includes translating, using a backward generator neural network F, the target training image to a respective translated source image according to current values of backward generator parameters. The method also includes training the forward generator neural network G jointly with the backward generator neural network F by adjusting the current values of the forward generator parameters and the backward generator parameters to optimize an objective function.
-
Publication Number: US20210279517A1
Publication Date: 2021-09-09
Application Number: US17031144
Filing Date: 2020-09-24
Applicant: Google LLC
Inventor: Sercan Omer Arik , Chen Xing , Zizhao Zhang , Tomas Jon Pfister
Abstract: A method for jointly training a classification model and a confidence model includes receiving a training data set including a plurality of training data subsets. From two or more training data subsets in the training data set, the method includes selecting a support set of training examples and a query set of training examples. The method includes determining, using the classification model, a centroid value for each respective class. For each training example in the query set of training examples, the method includes generating, using the classification model, a query encoding, determining a class distance measure, determining a ground-truth distance, and updating parameters of the classification model. For each training example in the query set of training examples identified as being misclassified, the method further includes generating a standard deviation value, sampling a new query encoding, and updating parameters of the confidence model based on the new query encoding.
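A hedged PyTorch sketch of this episodic training loop follows: a support set defines per-class centroids, query examples are classified by distance to those centroids, and a separate confidence model predicts a standard deviation used to resample encodings for misclassified queries. The network sizes, the Gaussian resampling scheme, and the loss functions are illustrative assumptions, not the patented method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
num_classes, feat_dim, emb_dim = 3, 8, 4
classifier = nn.Linear(feat_dim, emb_dim)   # classification (encoder) model
confidence = nn.Linear(emb_dim, emb_dim)    # confidence model: predicts per-dim log-std
opt_cls = torch.optim.SGD(classifier.parameters(), lr=0.1)
opt_conf = torch.optim.SGD(confidence.parameters(), lr=0.1)

# Toy episode: a support set and a query set drawn from the training subsets.
support_x = torch.randn(num_classes * 5, feat_dim)
support_y = torch.arange(num_classes).repeat_interleave(5)
query_x = torch.randn(num_classes * 4, feat_dim)
query_y = torch.arange(num_classes).repeat_interleave(4)

# Centroid value for each class from the support encodings.
support_enc = classifier(support_x)
centroids = torch.stack([support_enc[support_y == c].mean(0) for c in range(num_classes)])

# Query encodings and class distance measures (distance to every centroid).
query_enc = classifier(query_x)
dists = torch.cdist(query_enc, centroids)   # (num_queries, num_classes)

# Update the classification model: pull queries toward their ground-truth centroid.
cls_loss = F.cross_entropy(-dists, query_y)
opt_cls.zero_grad()
cls_loss.backward()
opt_cls.step()

# For misclassified queries, sample a new query encoding with the predicted
# standard deviation and update the confidence model on that sample.
misclassified = dists.argmin(dim=1) != query_y
if misclassified.any():
    enc = query_enc[misclassified].detach()
    std = torch.exp(confidence(enc))                 # predicted standard deviation
    new_query = enc + std * torch.randn_like(enc)    # reparameterized sample
    conf_loss = (new_query - centroids.detach()[query_y[misclassified]]).pow(2).mean()
    opt_conf.zero_grad()
    conf_loss.backward()
    opt_conf.step()
    print(f"confidence loss: {conf_loss.item():.4f}")
```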
-
Publication Number: US20210089964A1
Publication Date: 2021-03-25
Application Number: US17026225
Filing Date: 2020-09-19
Applicant: Google LLC
Inventor: Zizhao Zhang , Sercan Omer Arik , Tomas Jon Pfister , Han Zhang
Abstract: A method for training a model comprises obtaining a set of labeled training samples each associated with a given label. For each labeled training sample, the method includes generating a pseudo label and estimating a weight of the labeled training sample indicative of an accuracy of the given label. The method also includes determining whether the weight of the labeled training sample satisfies a weight threshold. When the weight of the labeled training sample satisfies the weight threshold, the method includes adding the labeled training sample to a set of cleanly labeled training samples. Otherwise, the method includes adding the labeled training sample to a set of mislabeled training samples. The method includes training the model with the set of cleanly labeled training samples using corresponding given labels and the set of mislabeled training samples using corresponding pseudo labels.
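A hedged scikit-learn sketch of this label-cleaning procedure follows. A preliminary classifier's out-of-fold class probabilities stand in both for the pseudo-label generator and for the per-sample weight estimating how accurate each given label is; the 0.5 weight threshold and the logistic regression models are illustrative assumptions, not the patented method.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X, y_true = make_classification(n_samples=400, n_features=10, random_state=0)

# Simulate noisy "given labels" by flipping 20% of them.
y_given = y_true.copy()
flip = rng.random(len(y_given)) < 0.2
y_given[flip] = 1 - y_given[flip]

# Out-of-fold class probabilities under the given labels.
proba = cross_val_predict(LogisticRegression(max_iter=1000), X, y_given,
                          cv=5, method="predict_proba")

pseudo_labels = proba.argmax(axis=1)                  # pseudo label for every sample
weights = proba[np.arange(len(y_given)), y_given]     # estimated accuracy of the given label

# Split into cleanly labeled and mislabeled sets by a weight threshold.
weight_threshold = 0.5
clean = weights >= weight_threshold
mislabeled = ~clean

# Train the model: given labels for the clean set, pseudo labels for the rest.
train_labels = np.where(clean, y_given, pseudo_labels)
model = LogisticRegression(max_iter=1000).fit(X, train_labels)
print(f"clean samples: {clean.sum()}, relabeled samples: {mislabeled.sum()}")
```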
-
Publication Number: US20210056417A1
Publication Date: 2021-02-25
Application Number: US17000094
Filing Date: 2020-08-21
Applicant: Google LLC
Inventor: Zizhao Zhang , Tomas Jon Pfister , Sercan Omer Arik , Mingfei Gao
Abstract: A method for active learning includes obtaining a set of unlabeled training samples and for each unlabeled training sample, perturbing the unlabeled training sample to generate an augmented training sample. The method includes generating, using a machine learning model, a predicted label for both samples and determining an inconsistency value for the unlabeled training sample that represents variance between the predicted labels for the unlabeled and augmented training samples. The method includes sorting the unlabeled training samples based on the inconsistency values and obtaining, for a threshold number of samples selected from the sorted unlabeled training samples, a ground truth label. The method includes selecting a current set of labeled training samples including each selected unlabeled training sample paired with the corresponding ground truth label. The method includes training, using the current set and a proper subset of unlabeled training samples, the machine learning model.
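A hedged scikit-learn sketch of one consistency-based acquisition round follows: Gaussian input noise stands in for the perturbation, the L1 gap between predicted class probabilities stands in for the inconsistency value, and the oracle is simulated with the true labels. The model choice, noise scale, and query budget are illustrative assumptions, not the patented method.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y_oracle = make_classification(n_samples=300, n_features=10, random_state=0)

# Start with a small seed of labeled samples; the rest are "unlabeled".
labeled_idx = np.arange(20)
unlabeled_idx = np.arange(20, len(X))
model = LogisticRegression(max_iter=1000).fit(X[labeled_idx], y_oracle[labeled_idx])

# Perturb each unlabeled sample and measure prediction inconsistency.
X_unlabeled = X[unlabeled_idx]
X_augmented = X_unlabeled + 0.3 * rng.normal(size=X_unlabeled.shape)
p_orig = model.predict_proba(X_unlabeled)
p_aug = model.predict_proba(X_augmented)
inconsistency = np.abs(p_orig - p_aug).sum(axis=1)

# Sort by inconsistency and query ground-truth labels for the top samples.
budget = 30
query = unlabeled_idx[np.argsort(-inconsistency)[:budget]]

# The current labeled set now includes the newly queried samples; retrain.
labeled_idx = np.concatenate([labeled_idx, query])
model = LogisticRegression(max_iter=1000).fit(X[labeled_idx], y_oracle[labeled_idx])
print(f"labeled pool size after one acquisition round: {len(labeled_idx)}")
```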
-
Publication Number: US12271822B2
Publication Date: 2025-04-08
Application Number: US17000094
Filing Date: 2020-08-21
Applicant: Google LLC
Inventor: Zizhao Zhang , Tomas Jon Pfister , Sercan Omer Arik , Mingfei Gao
IPC: G06N3/044 , G06F7/24 , G06F18/211 , G06F18/214 , G06N3/045 , G06N3/08 , G06N3/084 , G06N7/01 , G06N20/00
Abstract: A method for active learning includes obtaining a set of unlabeled training samples and for each unlabeled training sample, perturbing the unlabeled training sample to generate an augmented training sample. The method includes generating, using a machine learning model, a predicted label for both samples and determining an inconsistency value for the unlabeled training sample that represents variance between the predicted labels for the unlabeled and augmented training samples. The method includes sorting the unlabeled training samples based on the inconsistency values and obtaining, for a threshold number of samples selected from the sorted unlabeled training samples, a ground truth label. The method includes selecting a current set of labeled training samples including each selected unlabeled training sample paired with the corresponding ground truth label. The method includes training, using the current set and a proper subset of unlabeled training samples, the machine learning model.
-
Publication Number: US12039443B2
Publication Date: 2024-07-16
Application Number: US18045722
Filing Date: 2022-10-11
Applicant: Google LLC
Inventor: Sercan Omer Arik , Chen Xing , Zizhao Zhang , Tomas Jon Pfister
IPC: G06N3/08 , G06F18/214 , G06F18/2413 , G06F18/2431 , G06N3/04
CPC classification number: G06N3/08 , G06F18/2148 , G06F18/2413 , G06F18/2431 , G06N3/04
Abstract: A method includes receiving a training data set including a plurality of training data subsets. From two or more training data subsets in the training data set, the method includes selecting a support set of training examples and a query set of training examples. The method includes determining, using the classification model, a centroid value for each respective class. For each training example in the query set of training examples, the method includes generating, using the classification model, a query encoding, determining a class distance measure, determining a ground-truth distance, and updating parameters of the classification model. For each training example in the query set of training examples identified as being misclassified, the method further includes generating a standard deviation value, sampling a new query encoding, and updating parameters of the confidence model based on the new query encoding.
-
Publication Number: US12026614B2
Publication Date: 2024-07-02
Application Number: US16945898
Filing Date: 2020-08-02
Applicant: Google LLC
Inventor: Sercan Omer Arik , Tomas Jon Pfister
Abstract: A method of interpreting tabular data includes receiving, at a deep tabular data learning network (TabNet) executing on data processing hardware, a set of features. For each of multiple sequential processing steps, the method also includes: selecting, using a sparse mask of the TabNet, a subset of relevant features of the set of features; processing, using a feature transformer of the TabNet, the subset of relevant features to generate a decision step output and information for a next processing step in the multiple sequential processing steps; and providing the information to the next processing step. The method also includes determining a final decision output by aggregating the decision step outputs generated for the multiple sequential processing steps.