MAPPING MICRO-VIDEO HASHTAGS TO CONTENT CATEGORIES

    Publication Number: US20240346604A1

    Publication Date: 2024-10-17

    Application Number: US18683171

    Filing Date: 2021-08-13

    Applicant: eBay Inc.

    CPC classification number: G06Q50/01 G06F16/7867

    Abstract: Technologies are shown for mapping micro-video hashtags to content categories that involve collecting content categories from a content service, collecting micro-video, hashtag, and user interaction semantic data from a micro-video service, determining a correlation of a content category to the micro-video, hashtag, and user interaction semantic data using a multi-layer graph convolution network, and providing the hashtags correlated with the content category to the content service. The correlation can be determined by processing the semantic data with a concatenation layer and a fully connected layer to produce user-specific micro-video and hashtag representations. Similarity scores for determining correlation can be calculated from category content and a dot product of the representations. A content service can process a hashtag received from a micro-video application by identifying a content category correlated with the received hashtag, identifying content from the correlated category, and providing the identified content to the micro-video application for presentation.
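
    As an illustration only, and not the patented implementation, the concatenation-plus-fully-connected fusion and the dot-product category scoring described in the abstract can be sketched roughly as below; the class, layer, and parameter names are assumptions.

import torch
import torch.nn as nn

class HashtagCategoryScorer(nn.Module):
    """Minimal sketch: fuse micro-video, hashtag, and user-interaction
    embeddings, then score the fused representation against content-category
    embeddings with a dot product, loosely following the abstract."""

    def __init__(self, video_dim, hashtag_dim, user_dim, hidden_dim, num_categories):
        super().__init__()
        # Concatenation followed by a fully connected layer produces
        # user-specific micro-video and hashtag representations.
        self.fc = nn.Linear(video_dim + hashtag_dim + user_dim, hidden_dim)
        # Hypothetical learned embeddings standing in for category content.
        self.category_embeddings = nn.Embedding(num_categories, hidden_dim)

    def forward(self, video_emb, hashtag_emb, user_emb):
        fused = torch.cat([video_emb, hashtag_emb, user_emb], dim=-1)
        representation = torch.relu(self.fc(fused))                    # (batch, hidden)
        # Dot product against every category embedding yields per-category
        # similarity scores used to map hashtags to content categories.
        scores = representation @ self.category_embeddings.weight.T    # (batch, num_categories)
        return scores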

    METADATA TAG IDENTIFICATION
    Invention Application

    Publication Number: US20230115897A1

    Publication Date: 2023-04-13

    Application Number: US17500455

    Filing Date: 2021-10-13

    Applicant: eBay Inc.

    Abstract: A method for automatic metadata tag identification for videos is described. Content features are extracted from a video into respective data structures. The extracted content features are from at least two different feature modalities. The respective data structures are encoded into a common data structure using an encoder of a recurrent neural network (RNN) model. The common data structure is decoded using a decoder of the RNN model to identify content platform metadata tags to be associated with the video on a social content platform. Decoding is based on group tag data for users of the social content platform that identifies groups of the users and corresponding group metadata tags of interest for the groups of users.
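
    A rough sketch of the encoder/decoder idea from this abstract follows: per-modality feature sequences are encoded into a common hidden state, which a decoder unrolls into metadata-tag predictions biased by a group-level prior. All names, sizes, and the group_bias signal are illustrative assumptions.

import torch
import torch.nn as nn

class TagSeq2Seq(nn.Module):
    """Minimal sketch of an RNN encoder/decoder for metadata tag
    identification from multi-modal video features."""

    def __init__(self, feature_dim, hidden_dim, tag_vocab_size):
        super().__init__()
        self.encoder = nn.GRU(feature_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.tag_head = nn.Linear(hidden_dim, tag_vocab_size)

    def forward(self, visual_feats, text_feats, group_bias, max_tags=5):
        # Two feature modalities (e.g. visual and textual) are concatenated
        # along the time axis and encoded into a common hidden state.
        features = torch.cat([visual_feats, text_feats], dim=1)
        _, hidden = self.encoder(features)
        # Decode step by step; group_bias is a per-tag prior standing in for
        # the group tag data mentioned in the abstract (an assumption here).
        inputs = hidden.transpose(0, 1)            # (batch, 1, hidden)
        tag_logits = []
        for _ in range(max_tags):
            out, hidden = self.decoder(inputs, hidden)
            tag_logits.append(self.tag_head(out.squeeze(1)) + group_bias)
            inputs = out
        return torch.stack(tag_logits, dim=1)      # (batch, max_tags, vocab)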

    MULTI-MODAL HYPERGRAPH-BASED CLICK PREDICTION

    Publication Number: US20240380822A1

    Publication Date: 2024-11-14

    Application Number: US18569132

    Filing Date: 2021-08-26

    Applicant: eBay Inc.

    Abstract: One of the important signals that online platforms rely upon is click-through rate prediction. This allows a platform, such as a video platform, to provide items, such as videos, to users based on how likely the user is to interact with the item. A hypergraph model is provided to exploit temporal user-item interactions to guide representation learning with multi-modal features, and further predict the user click-through rate of an item. The hypergraph model is built upon the hyperedge notion of hypergraph neural networks. In this way, item modalities, such as visual, acoustic, and textual aspects, can be used to enhance the click-through rate prediction and, thus, enhance the likelihood that the online platform will provide relevant content. The technology leverages hypergraphs, including interest-based hypergraphs and item hypergraphs, that uniquely capture the relationships between users and items. The hypergraph model described demonstrably outperforms various state-of-the-art methods.
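
    To make the hyperedge notion concrete, the sketch below shows a generic hypergraph convolution (node features aggregated onto hyperedges and scattered back to nodes) and a toy dot-product click-through head. This is a sketch of the general technique, not the patented model; the names and the CTR head are assumptions.

import torch
import torch.nn as nn

class HypergraphConv(nn.Module):
    """Minimal sketch of a hypergraph convolution in the spirit of
    hypergraph neural networks: aggregate node features onto hyperedges,
    then scatter the hyperedge messages back to the nodes."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, incidence):
        # incidence: (num_nodes, num_hyperedges), 1 where a node (a user or
        # an item modality) belongs to a hyperedge, 0 otherwise.
        edge_deg = incidence.sum(dim=0).clamp(min=1)                 # hyperedge degrees
        node_deg = incidence.sum(dim=1).clamp(min=1)                 # node degrees
        edge_msg = (incidence.T @ x) / edge_deg.unsqueeze(-1)        # node -> hyperedge
        node_msg = (incidence @ edge_msg) / node_deg.unsqueeze(-1)   # hyperedge -> node
        return torch.relu(self.linear(node_msg))

def click_probability(user_vec, item_vec):
    """Toy CTR head: sigmoid of the user/item representation dot product."""
    return torch.sigmoid((user_vec * item_vec).sum(dim=-1))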
