-
Publication Number: US11347975B2
Publication Date: 2022-05-31
Application Number: US17235992
Filing Date: 2021-04-21
Applicant: Google LLC
Inventor: Dilip Krishnan, Prannay Khosla, Piotr Teterwak, Aaron Yehuda Sarna, Aaron Joseph Maschinot, Ce Liu, Phillip John Isola, Yonglong Tian, Chen Wang
Abstract: The present disclosure provides an improved training methodology that enables supervised contrastive learning to be simultaneously performed across multiple positive and negative training examples. In particular, example aspects of the present disclosure are directed to an improved, supervised version of the batch contrastive loss, which has been shown to be very effective at learning powerful representations in the self-supervised setting. Thus, the proposed techniques adapt contrastive learning to the fully supervised setting and also enable learning to occur simultaneously across multiple positive examples.
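The loss these abstracts describe is a supervised batch contrastive objective: each anchor is pulled toward every other example in the batch that shares its label (multiple positives) and pushed away from all remaining examples (multiple negatives). Below is a minimal NumPy sketch of such a loss, assuming L2-normalized embeddings; the function name, temperature default, and handling of anchors without positives are illustrative assumptions, not details taken from the patent itself.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Supervised batch contrastive loss (SupCon-style sketch).

    embeddings: (N, D) array of L2-normalized representations.
    labels:     (N,)   integer class labels.
    """
    n = embeddings.shape[0]
    # Pairwise similarities between all batch examples, scaled by temperature.
    logits = embeddings @ embeddings.T / temperature
    # An anchor never contrasts against itself.
    self_mask = np.eye(n, dtype=bool)
    logits = np.where(self_mask, -np.inf, logits)
    # Numerically stable log-softmax over the rest of the batch
    # (the contrastive denominator spans all non-anchor examples).
    logits -= logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positives for an anchor: every other example with the same label.
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    pos_counts = pos_mask.sum(axis=1)
    # Average log-probability over each anchor's positives; anchors with
    # no positive in the batch are skipped (an assumed convention here).
    sum_log_prob_pos = np.where(pos_mask, log_prob, 0.0).sum(axis=1)
    valid = pos_counts > 0
    return -(sum_log_prob_pos[valid] / pos_counts[valid]).mean()

# Example: four normalized embeddings in two classes.
z = np.random.randn(4, 16)
z /= np.linalg.norm(z, axis=1, keepdims=True)
y = np.array([0, 0, 1, 1])
print(supervised_contrastive_loss(z, y))
```

Because the positive mask is built from labels rather than from augmented views of a single image, learning proceeds simultaneously across multiple positives and negatives per anchor, which is the adaptation to the fully supervised setting that the abstract describes.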
-
Publication Number: US20230153629A1
Publication Date: 2023-05-18
Application Number: US17920623
Filing Date: 2021-04-12
Applicant: Google LLC
Inventor: Dilip Krishnan, Prannay Khosla, Piotr Teterwak, Aaron Yehuda Sarna, Aaron Joseph Maschinot, Ce Liu, Phillip John Isola, Yonglong Tian, Chen Wang
IPC: G06N3/09, G06V10/74, G06V10/776, G06V10/82
CPC classification number: G06N3/09, G06V10/761, G06V10/776, G06V10/82
Abstract: The present disclosure provides an improved training methodology that enables supervised contrastive learning to be simultaneously performed across multiple positive and negative training examples. In particular, example aspects of the present disclosure are directed to an improved, supervised version of the batch contrastive loss, which has been shown to be very effective at learning powerful representations in the self-supervised setting. Thus, the proposed techniques adapt contrastive learning to the fully supervised setting and also enable learning to occur simultaneously across multiple positive examples.
-
Publication Number: US20210326660A1
Publication Date: 2021-10-21
Application Number: US17235992
Filing Date: 2021-04-21
Applicant: Google LLC
Inventor: Dilip Krishnan, Prannay Khosla, Piotr Teterwak, Aaron Yehuda Sarna, Aaron Joseph Maschinot, Ce Liu, Phillip John Isola, Yonglong Tian, Chen Wang
Abstract: The present disclosure provides an improved training methodology that enables supervised contrastive learning to be simultaneously performed across multiple positive and negative training examples. In particular, example aspects of the present disclosure are directed to an improved, supervised version of the batch contrastive loss, which has been shown to be very effective at learning powerful representations in the self-supervised setting. Thus, the proposed techniques adapt contrastive learning to the fully supervised setting and also enable learning to occur simultaneously across multiple positive examples.