Regularization relaxation scheme
    1.
    Granted Patent

    Publication Number: US10438129B1

    Publication Date: 2019-10-08

    Application Number: US14586043

    Filing Date: 2014-12-30

    Applicant: Google LLC

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training machine learning systems. One of the methods includes receiving a plurality of training examples; and training a machine learning system on each of the plurality of training examples to determine trained values for weights of a machine learning model, wherein training the machine learning system comprises: assigning an initial value for a regularization penalty for a particular weight for a particular feature; and adjusting the initial value for the regularization penalty for the particular weight for the particular feature during the training of the machine learning system.
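
    The abstract does not spell out how the penalty is adjusted; the following sketch, in plain NumPy, illustrates one reading in which each weight's L2 penalty starts at an assigned initial value and is relaxed (decayed) for a feature whenever that feature appears in a training example. The function name, decay rule, and toy data are illustrative assumptions, not the patented method.

```python
import numpy as np

def train_with_relaxed_regularization(examples, labels, num_features,
                                      initial_penalty=1.0, decay=0.99,
                                      learning_rate=0.1):
    """Logistic-regression SGD with a per-feature L2 penalty that is
    assigned an initial value and then relaxed during training.
    (Illustrative only; the decay rule is an assumption.)"""
    weights = np.zeros(num_features)
    # One regularization penalty per weight, each starting at the same initial value.
    penalties = np.full(num_features, initial_penalty)

    for x, y in zip(examples, labels):                # x: feature vector, y: 0/1 label
        pred = 1.0 / (1.0 + np.exp(-weights @ x))
        grad = (pred - y) * x + penalties * weights   # loss gradient + per-feature L2 term
        weights -= learning_rate * grad
        # Relax (decay) the penalty only for the features present in this example.
        penalties[x != 0] *= decay
    return weights

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 8)).astype(float)   # toy sparse binary features
y = (X[:, 0] > 0).astype(float)                       # toy target
print(train_with_relaxed_regularization(X, y, num_features=8))
```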

    Wide and deep machine learning models

    Publication Number: US10762422B2

    Publication Date: 2020-09-01

    Application Number: US15394668

    Filing Date: 2016-12-29

    Applicant: Google LLC

    Abstract: A system includes one or more computers and one or more storage devices storing instructions that when executed by the one or more computers cause the computers to implement a combined machine learning model for processing an input including multiple features to generate a predicted output for the machine learning input. The combined model includes: a deep machine learning model configured to process the features to generate a deep model output; a wide machine learning model configured to process the features to generate a wide model output; and a combining layer configured to process the deep model output generated by the deep machine learning model and the wide model output generated by the wide machine learning model to generate the predicted output, in which the deep model and the wide model have been trained jointly on training data to generate the deep model output and the wide model output.
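
    A minimal sketch of the described structure, written in PyTorch (a framework choice not specified by the patent): a deep tower and a linear wide part each process the input features, a combining layer merges their outputs, and a single optimizer trains both parts jointly. Layer sizes and the toy training loop are illustrative assumptions.

```python
import torch
import torch.nn as nn

class WideAndDeep(nn.Module):
    """Illustrative combined model: deep tower + wide (linear) part + combining layer.
    The layer sizes are arbitrary choices, not taken from the patent."""
    def __init__(self, num_features, hidden=32):
        super().__init__()
        self.deep = nn.Sequential(                # deep machine learning model
            nn.Linear(num_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.wide = nn.Linear(num_features, 1)    # wide machine learning model
        self.combine = nn.Linear(hidden + 1, 1)   # combining layer

    def forward(self, x):
        deep_out = self.deep(x)                   # deep model output
        wide_out = self.wide(x)                   # wide model output
        return self.combine(torch.cat([deep_out, wide_out], dim=-1))

model = WideAndDeep(num_features=16)
opt = torch.optim.SGD(model.parameters(), lr=0.1)  # one optimizer: joint training
x = torch.randn(64, 16)
y = (x[:, 0] > 0).float().unsqueeze(1)
for _ in range(10):
    loss = nn.functional.binary_cross_entropy_with_logits(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
print(float(loss))
```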

    Using variable length representations for machine learning statistics

    Publication Number: US10062035B1

    Publication Date: 2018-08-28

    Application Number: US14104004

    Filing Date: 2013-12-12

    Applicant: Google LLC

    CPC classification number: G06N20/00

    Abstract: The present disclosure provides methods and systems for using variable length representations of machine learning statistics. A method may include storing an n-bit representation of a first statistic at a first n-bit storage cell. A first update to the first statistic may be received, and it may be determined that the first update causes a first loss of precision of the first statistic as stored in the first n-bit storage cell. Accordingly, an m-bit representation of the first statistic may be stored at a first m-bit storage cell based on the determination. The first m-bit storage cell may be associated with the first n-bit storage cell. As a result, upon receiving an instruction to use the first statistic in a calculation, a combination of the n-bit representation and the m-bit representation may be used to perform the calculation.
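
    The sketch below illustrates one plausible reading in Python: a statistic lives in a small primary cell, a wider extension cell is allocated only when an update would overflow the primary cell, and reads combine the two representations. The 8-bit/32-bit widths and the bit-split combination rule are assumptions, not details from the patent.

```python
N_BITS = 8    # width of the primary cell (hypothetical choice)
M_BITS = 32   # width of the associated extension cell (hypothetical choice)

class VariableLengthStat:
    """A statistic kept in an n-bit cell, with an m-bit extension cell allocated
    only once an update would overflow (lose precision in) the n-bit cell.
    The split-by-bit-position combination rule is an illustrative assumption."""
    def __init__(self):
        self.low = 0          # n-bit storage cell (low-order bits)
        self.high = None      # m-bit extension cell, allocated lazily

    def update(self, delta):
        total = self.low + delta
        if total >= (1 << N_BITS):                     # update would not fit in n bits
            if self.high is None:
                self.high = 0                          # associate an m-bit cell
            self.high = (self.high + (total >> N_BITS)) & ((1 << M_BITS) - 1)
        self.low = total & ((1 << N_BITS) - 1)

    def value(self):
        # Combine the two representations when the statistic is needed.
        return self.low if self.high is None else (self.high << N_BITS) | self.low

stat = VariableLengthStat()
for _ in range(1000):
    stat.update(1)
print(stat.value())   # 1000, even though the primary cell holds only 8 bits
```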

    Regularization relaxation scheme
    6.
    Granted Patent

    Publication Number: US11663520B1

    Publication Date: 2023-05-30

    Application Number: US16551610

    Filing Date: 2019-08-26

    Applicant: Google LLC

    CPC classification number: G06N20/00

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training machine learning systems. One of the methods includes receiving a plurality of training examples; and training a machine learning system on each of the plurality of training examples to determine trained values for weights of a machine learning model, wherein training the machine learning system comprises: assigning an initial value for a regularization penalty for a particular weight for a particular feature; and adjusting the initial value for the regularization penalty for the particular weight for the particular feature during the training of the machine learning system.

    Efficient locking of large data collections

    Publication Number: US10509772B1

    Publication Date: 2019-12-17

    Application Number: US15393071

    Filing Date: 2016-12-28

    Applicant: Google LLC

    Abstract: The present disclosure provides systems and techniques for efficient locking of datasets in a database when updates to a dataset may be delayed. A method may include accumulating a plurality of updates to a first set of one or more values associated with one or more features. The first set of one or more values may be stored within a first database column. Next, it may be determined that a first database column update aggregation rule is satisfied. A lock assigned to at least a portion of at least a first database column may be acquired. Accordingly, one or more values in the first set within the first database column may be updated based on the plurality of updates. In an implementation, the first set of one or more values may be associated with the first lock.
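
    A small Python sketch of the batching idea: updates to column values are accumulated without locking, and the column lock is acquired only once an aggregation rule is satisfied, so one lock acquisition covers many updates. The count-based rule and the in-memory stand-in for a database column are illustrative assumptions.

```python
import threading
from collections import defaultdict

class BatchedColumnUpdater:
    """Accumulates updates to per-feature values in a 'column' and acquires the
    column lock only once an aggregation rule (here: a pending-update count,
    an illustrative assumption) is satisfied."""
    def __init__(self, flush_threshold=100):
        self.column = defaultdict(float)      # feature -> value (stands in for a DB column)
        self.column_lock = threading.Lock()   # lock covering the column
        self.pending = defaultdict(float)     # accumulated, not-yet-applied updates
        self.flush_threshold = flush_threshold

    def add_update(self, feature, delta):
        self.pending[feature] += delta        # no lock taken yet
        if len(self.pending) >= self.flush_threshold:   # aggregation rule satisfied
            self.flush()

    def flush(self):
        with self.column_lock:                # acquire the lock once per batch
            for feature, delta in self.pending.items():
                self.column[feature] += delta
        self.pending.clear()

updater = BatchedColumnUpdater(flush_threshold=3)
for f, d in [("a", 1.0), ("b", 2.0), ("a", 0.5), ("c", 4.0)]:
    updater.add_update(f, d)
updater.flush()
print(dict(updater.column))   # {'a': 1.5, 'b': 2.0, 'c': 4.0}
```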

    Using template exploration for large-scale machine learning

    Publication Number: US10713585B2

    Publication Date: 2020-07-14

    Application Number: US14106900

    Filing Date: 2013-12-16

    Applicant: Google LLC

    Abstract: Systems and techniques are provided for template exploration in a large-scale machine learning system. A method may include obtaining multiple base templates, each base template comprising multiple features. A template performance score may be obtained for each base template and a first base template may be selected from among the multiple base templates based on the template performance score of the first base template. Multiple cross-templates may be constructed by generating a cross-template of the selected first base template and each of the multiple base templates. Performance of a machine learning model may be tested based on each cross-template to generate a cross-template performance score for each of the cross-templates. A first cross-template may be selected from among the multiple cross-templates based on the cross-template performance score of the cross-template. Accordingly, the first cross-template may be added to the machine learning model.
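
    A compact Python sketch of one exploration round as described: score the base templates, cross the best one with each of the others, score the cross-templates, and return the best cross-template to add to the model. The scoring function and feature names are hypothetical stand-ins for training and evaluating the actual machine learning model.

```python
from itertools import chain

def explore_templates(base_templates, score_template):
    """One round of template exploration (illustrative; score_template stands in
    for testing the machine learning model with the candidate template added)."""
    # 1. Score every base template and select the best one.
    best_base = max(base_templates, key=score_template)
    # 2. Cross the selected base template with each of the other base templates.
    cross_templates = [tuple(sorted(set(chain(best_base, other))))
                       for other in base_templates if other != best_base]
    # 3. Score the cross-templates and pick the best one to add to the model.
    return max(cross_templates, key=score_template)

# Hypothetical scorer: prefer templates containing more of the "useful" features.
useful = {"user_country", "query_language", "ad_keyword"}
def score_template(template):
    return len(useful.intersection(template))

bases = [("user_country",), ("query_language",), ("time_of_day",)]
print(explore_templates(bases, score_template))
```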
