Adjusting heavy users' affinity for heavy user entity-pairs in a social network

    Publication No.: US10673965B2

    Publication Date: 2020-06-02

    Application No.: US14839531

    Filing Date: 2015-08-28

    IPC Classification: H04L29/08 H04L12/58 G06Q50/00

    Abstract: A system and method of adjusting an affinity score between an entity pair in a social network is disclosed. The method may include determining, with a processor, whether a first member of the entity pair is a heavy user member. If the first member is the heavy user member, the method further includes determining, with the processor, an affinity adjustment factor between the first member and the second member, and adjusting, with the processor, the affinity score between the first member and the second member of the entity pair in accordance with the adjustment factor to determine an adjusted affinity score. The method may include determining, with the processor, whether a number of interactions on content items indicates that the first member is the heavy user member. The second member is associated with a content item that is being considered for display to the first member.
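
    The claimed flow can be pictured with a short sketch: classify the first member as a heavy user from an interaction count, derive an adjustment factor, and rescale the affinity score used when ranking the second member's content. The Python sketch below is a minimal illustration; the threshold, the damping formula, and all names are assumptions rather than the patented method.

        HEAVY_USER_THRESHOLD = 500  # assumed cutoff on content-item interactions

        def is_heavy_user(interaction_count: int) -> bool:
            """Decide heavy-user status from the number of interactions on content items."""
            return interaction_count >= HEAVY_USER_THRESHOLD

        def affinity_adjustment_factor(interaction_count: int) -> float:
            """Illustrative damping factor that shrinks as the interaction count grows."""
            return HEAVY_USER_THRESHOLD / max(interaction_count, HEAVY_USER_THRESHOLD)

        def adjusted_affinity(raw_affinity: float, interaction_count: int) -> float:
            """Adjusted affinity score for a (first member, second member) entity pair."""
            if not is_heavy_user(interaction_count):
                return raw_affinity
            return raw_affinity * affinity_adjustment_factor(interaction_count)

        # A heavy user's affinity toward the second member is damped before ranking.
        print(adjusted_affinity(raw_affinity=0.82, interaction_count=2000))  # 0.205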

    MONITORING AND COMPARING FEATURES ACROSS ENVIRONMENTS

    Publication No.: US20190325351A1

    Publication Date: 2019-10-24

    Application No.: US15958999

    Filing Date: 2018-04-20

    IPC Classification: G06N99/00 G06F17/30

    Abstract: The disclosed embodiments provide a system for processing data. During operation, the system selects a set of entity keys associated with reference feature values used with one or more machine learning models, wherein the reference feature values are generated in a first environment. Next, the system matches the set of entity keys to feature values from a second environment. The system then compares the feature values and the reference feature values to assess a consistency of a feature across the first and second environments. Finally, the system outputs a result of the assessed consistency for use in managing the feature in the first and second environments.
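
    As a rough illustration of the comparison step, the sketch below matches entity keys between a reference (first) environment and a second environment and reports how consistent a single feature is. The coverage and consistency metrics, the tolerance, and the key format are assumptions for illustration; the patent does not fix a particular metric.

        def compare_feature_across_environments(
            reference_values: dict[str, float],   # entity key -> value from the first environment
            candidate_values: dict[str, float],   # entity key -> value from the second environment
            tolerance: float = 1e-6,
        ) -> dict[str, float]:
            """Match entity keys across environments and report feature consistency."""
            matched_keys = reference_values.keys() & candidate_values.keys()
            if not matched_keys:
                return {"coverage": 0.0, "consistency": 0.0}
            consistent = sum(
                1 for key in matched_keys
                if abs(reference_values[key] - candidate_values[key]) <= tolerance
            )
            return {
                "coverage": len(matched_keys) / len(reference_values),
                "consistency": consistent / len(matched_keys),
            }

        offline = {"member:1": 0.4, "member:2": 0.9, "member:3": 0.1}   # reference feature values
        online = {"member:1": 0.4, "member:2": 0.7}                     # second environment
        print(compare_feature_across_environments(offline, online))
        # {'coverage': 0.666..., 'consistency': 0.5}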

    EARLY FEEDBACK OF SCHEMATIC CORRECTNESS IN FEATURE MANAGEMENT FRAMEWORKS

    Publication No.: US20190325258A1

    Publication Date: 2019-10-24

    Application No.: US15958997

    Filing Date: 2018-04-20

    IPC Classification: G06K9/62 G06F15/18

    Abstract: The disclosed embodiments provide a system for processing data. During operation, the system obtains feature configurations for a set of features and a command for inspecting a data set that is produced using the feature configurations. Next, the system obtains, from the feature configurations, one or more anchors containing metadata for accessing the set of features in an environment and a join configuration for joining a feature with one or more additional features. The system then uses the anchors to retrieve feature values of the features and zips the feature values according to the join configuration without matching entity keys associated with the feature values. Finally, the system outputs the zipped feature values in response to the command.
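
    The distinguishing detail above is that feature values are zipped positionally according to the join configuration rather than joined on entity keys, which keeps the inspection cheap. The sketch below mimics that step with in-memory anchors; in the actual framework an anchor's metadata would locate the feature data in an environment, and all names here are assumptions.

        from itertools import zip_longest

        def retrieve_feature_values(anchor: dict) -> list:
            """Stub: the anchor's metadata would normally locate the feature data."""
            return anchor["values"]

        def zip_features(anchors: dict[str, dict], join_config: list[str]) -> list[tuple]:
            """Zip feature values positionally, in join-config order, with no entity-key matching."""
            columns = [retrieve_feature_values(anchors[name]) for name in join_config]
            return list(zip_longest(*columns))

        anchors = {
            "title_embedding": {"values": [[0.1, 0.2], [0.3, 0.4]]},
            "seniority": {"values": [3, 5, 7]},
        }
        # Rows are aligned by position for quick schema inspection, not joined by key.
        for row in zip_features(anchors, join_config=["title_embedding", "seniority"]):
            print(row)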

    EMBEDDING OPTIMIZATION FOR MACHINE LEARNING MODELS

    Publication No.: US20230124258A1

    Publication Date: 2023-04-20

    Application No.: US17505519

    Filing Date: 2021-10-19

    IPC Classification: G06N3/08 G06F5/01

    Abstract: Methods, systems, and computer programs are presented for determining parameters of neural networks and selecting embedding dimensions for the feature fields. One method includes an operation for initializing parameters of a neural network and weights for embedding sizes for each feature associated with the neural network. The parameters of the neural network and the weights are iteratively optimized. Each optimization iteration comprises training the neural network with current parameters of the neural network to optimize a value of the weights, and training the neural network with current values of the weights to optimize the parameters of the neural network. Further, the method includes operations for selecting embedding sizes for the features based on the optimized values of the weights, and for training the neural network based on the selected embedding sizes for the features to obtain an estimator model. A prediction is generated utilizing the estimator model.
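
    The alternating structure of the optimization can be sketched as follows: each iteration trains the per-size weights with the network parameters fixed, then the parameters with the weights fixed, and the final embedding size for each feature field is the candidate whose weight ends up largest. The training steps below are stubs, and the candidate sizes, feature names, and argmax selection rule are assumptions rather than the patented procedure.

        CANDIDATE_SIZES = [4, 8, 16, 32]   # assumed candidate embedding dimensions

        def optimize_weights(params, weights, batch):
            """Stub: gradient step on the embedding-size weights, network params held fixed."""
            return weights

        def optimize_params(params, weights, batch):
            """Stub: gradient step on the network parameters, size weights held fixed."""
            return params

        def select_embedding_sizes(weights: dict[str, list[float]]) -> dict[str, int]:
            """Pick, for each feature field, the candidate size with the largest weight."""
            return {
                feature: CANDIDATE_SIZES[max(range(len(w)), key=w.__getitem__)]
                for feature, w in weights.items()
            }

        params = {}   # neural-network parameters (stubbed)
        weights = {"member_id": [0.1, 0.2, 0.6, 0.1], "job_title": [0.5, 0.3, 0.1, 0.1]}
        for _ in range(10):                        # iterative, alternating optimization
            weights = optimize_weights(params, weights, batch=None)
            params = optimize_params(params, weights, batch=None)
        print(select_embedding_sizes(weights))     # {'member_id': 16, 'job_title': 4}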

    CONNECTING MACHINE LEARNING METHODS THROUGH TRAINABLE TENSOR TRANSFORMERS

    Publication No.: US20200311613A1

    Publication Date: 2020-10-01

    Application No.: US16370156

    Filing Date: 2019-03-29

    IPC Classification: G06N20/20 G06N5/04

    Abstract: Herein are techniques for configuring, integrating, and operating trainable tensor transformers that each encapsulate an ensemble of trainable machine learning (ML) models. In an embodiment, a computer-implemented trainable tensor transformer uses underlying ML models and additional mechanisms to assemble and convert data tensors as needed to generate output records based on input records and inferencing. The transformer processes each input record as follows. Input tensors of the input record are converted into converted tensors. Each converted tensor represents a respective feature of many features that are capable of being processed by the underlying trainable models. The trainable models are applied to respective subsets of converted tensors to generate an inference for the input record. The inference is converted into a prediction tensor. The prediction tensor and input tensors are stored as output tensors of a respective output record for the input record.
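
    The per-record flow described above (convert input tensors, apply each underlying model to its subset of converted tensors, then attach the prediction tensor to an output record) is sketched below with plain Python lists standing in for tensors and callables standing in for trained sub-models. The converter names, record layout, and the way inferences are combined are assumptions for illustration only.

        from typing import Callable

        def transform_record(
            input_record: dict[str, list[float]],
            converters: dict[str, Callable],               # feature name -> tensor converter
            models: list[tuple[list[str], Callable]],      # (feature subset, trained sub-model)
        ) -> dict[str, list[float]]:
            """Convert input tensors, run the model ensemble, and emit an output record."""
            # 1. Convert each input tensor into the representation the sub-models expect.
            converted = {name: converters[name](tensor) for name, tensor in input_record.items()}
            # 2. Apply each trainable model to its subset of converted tensors.
            inferences = [model([converted[f] for f in subset]) for subset, model in models]
            # 3. Turn the combined inference into a prediction tensor and store it
            #    alongside the original input tensors in the output record.
            prediction_tensor = [sum(inferences) / len(inferences)]
            return {**input_record, "prediction": prediction_tensor}

        record = {"profile_emb": [0.2, 0.4], "activity": [1.0, 0.0, 3.0]}
        converters = {"profile_emb": lambda t: t, "activity": lambda t: [x / 3.0 for x in t]}
        models = [
            (["profile_emb"], lambda ts: sum(ts[0])),
            (["profile_emb", "activity"], lambda ts: sum(ts[0]) + sum(ts[1])),
        ]
        print(transform_record(record, converters, models))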

    NEXT CAREER MOVE PREDICTION WITH CONTEXTUAL LONG SHORT-TERM MEMORY NETWORKS

    Publication No.: US20190130281A1

    Publication Date: 2019-05-02

    Application No.: US15799396

    Filing Date: 2017-10-31

    IPC Classification: G06N5/02

    Abstract: Techniques for predicting a next company and next title of a user are disclosed herein. In some embodiments, an encoder is used for encoding a representation of the user's profile. The encoding includes accessing discrete entities comprising context information included in the user's profile, constructing a plurality of embedding vectors from the context information, and generating a context vector from the plurality of embedding vectors. The plurality of embedding vectors includes a skill embedding vector, a school embedding vector, and a location embedding vector. A decoder is used for decoding a career path from the context vector. The decoding includes applying a long short-term memory (LSTM) model to the context vector to perform the prediction of the user's next company and next title for presentation in a user interface.
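
    A compact PyTorch sketch of that encoder-decoder shape is given below: the skill, school, and location entities are embedded and fused into a context vector, which seeds the hidden state of an LSTM decoder run over the member's position history, with two heads predicting the next company and next title. Vocabulary sizes, dimensions, and the hidden-state seeding scheme are illustrative assumptions rather than the patented architecture.

        import torch
        import torch.nn as nn

        class CareerPathModel(nn.Module):
            def __init__(self, n_skills=1000, n_schools=500, n_locations=300,
                         n_companies=2000, n_titles=800, dim=32):
                super().__init__()
                # Encoder: embed the discrete profile entities and fuse them into a context vector.
                self.skill_emb = nn.EmbeddingBag(n_skills, dim)    # averages a member's skills
                self.school_emb = nn.Embedding(n_schools, dim)
                self.location_emb = nn.Embedding(n_locations, dim)
                self.to_context = nn.Linear(3 * dim, dim)
                # Decoder: contextual LSTM over the position history, with two output heads.
                self.position_emb = nn.Embedding(n_companies, dim)
                self.lstm = nn.LSTM(dim, dim, batch_first=True)
                self.company_head = nn.Linear(dim, n_companies)
                self.title_head = nn.Linear(dim, n_titles)

            def forward(self, skills, school, location, company_history):
                context = self.to_context(torch.cat(
                    [self.skill_emb(skills), self.school_emb(school), self.location_emb(location)],
                    dim=-1,
                ))
                h0 = context.unsqueeze(0)            # the context vector seeds the hidden state
                c0 = torch.zeros_like(h0)
                out, _ = self.lstm(self.position_emb(company_history), (h0, c0))
                last = out[:, -1]                    # state after the most recent position
                return self.company_head(last), self.title_head(last)

        model = CareerPathModel()
        skills = torch.tensor([[3, 17, 42]])         # one member with three skills
        school = torch.tensor([12])
        location = torch.tensor([7])
        history = torch.tensor([[5, 9]])             # two past companies, oldest first
        company_logits, title_logits = model(skills, school, location, history)
        print(company_logits.argmax(-1), title_logits.argmax(-1))   # predicted next company / title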