ROBUST TEST-TIME ADAPTATION WITHOUT ERROR ACCUMULATION

    Publication Number: US20240303497A1

    Publication Date: 2024-09-12

    Application Number: US18360712

    Filing Date: 2023-07-27

    CPC classification number: G06N3/091 G06N3/045

    Abstract: A processor-implemented method for adapting an artificial neural network (ANN) at test time includes receiving, by a first ANN model and a second ANN model, a test data set. The test data set includes unlabeled data samples. The first ANN model is pretrained using a training data set and the test data set. The first ANN model generates first estimated labels for the test data set. The second ANN model generates second estimated labels for the test data set. Samples of the test data set are selected based on a confidence difference between the first estimated labels and the second estimated labels. The second ANN model is retrained based on the selected samples.
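The selection step above can be pictured with a minimal sketch: two models score the same unlabeled batch, and samples are kept based on the difference between their peak softmax confidences. The agreement criterion, the threshold value, and the random stand-in logits are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(logits):
    """Row-wise softmax over class logits."""
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def select_by_confidence_difference(logits_a, logits_b, threshold=0.2):
    """Keep samples whose peak confidences under the two models differ
    by less than `threshold` (hypothetical criterion and value)."""
    conf_a = softmax(logits_a).max(axis=1)
    conf_b = softmax(logits_b).max(axis=1)
    return np.flatnonzero(np.abs(conf_a - conf_b) < threshold)

# Random logits stand in for the two models scoring 8 unlabeled samples.
logits_a = rng.normal(size=(8, 3))
logits_b = rng.normal(size=(8, 3))
selected = select_by_confidence_difference(logits_a, logits_b)
print(selected)  # indices of the samples used to retrain the second model
```

In a full pipeline the retraining step would then fit the second model on `selected` using the first model's labels as pseudo-labels.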

    CLIENT-AGNOSTIC LEARNING AND ZERO-SHOT ADAPTATION FOR FEDERATED DOMAIN GENERALIZATION

    Publication Number: US20240112039A1

    Publication Date: 2024-04-04

    Application Number: US18238998

    Filing Date: 2023-08-28

    CPC classification number: G06N3/098 H04L67/10

    Abstract: Example implementations include methods, apparatuses, and computer-readable mediums of federated learning by a federated client device, comprising identifying client invariant information of a neural network for performing a machine learning (ML) task in a first domain known to a federated server. The implementations further comprise transmitting the client invariant information to the federated server, the federated server configured to generate an ML model for performing the ML task in a domain unknown to the federated server, based on the client invariant information and other client invariant information of another neural network for performing the ML task in a second domain known to the federated server.
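The server-side step, combining invariant information from clients in different known domains into a model for an unseen domain, can be sketched as follows. Representing each client's invariant information as a parameter vector and aggregating by a simple mean are illustrative assumptions; the patent does not fix the aggregation rule.

```python
import numpy as np

def aggregate_invariant_info(client_params):
    """Server side: combine client-invariant parameters reported from
    different known domains into one model for an unseen domain.
    Mean aggregation is an illustrative choice, not the patented rule."""
    return np.stack(client_params).mean(axis=0)

# Two clients, each trained in a different known domain,
# share only their domain-invariant parameters.
client_a = np.array([1.0, 2.0, 3.0])
client_b = np.array([3.0, 2.0, 1.0])
global_weights = aggregate_invariant_info([client_a, client_b])
print(global_weights)  # [2. 2. 2.]
```

Because only invariant information crosses the network, the resulting model can be applied zero-shot to the unknown domain without any adaptation data from it.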

    LOW POWER ALWAYS-ON LISTENING ARTIFICIAL INTELLIGENCE (AI) SYSTEM

    Publication Number: US20250095643A1

    Publication Date: 2025-03-20

    Application Number: US18468964

    Filing Date: 2023-09-18

    Abstract: Various embodiments include systems and methods for continuous speech monitoring artificial intelligence solutions. A low-power always-on listening module (LPALM) may maintain continuous auditory awareness or alertness without consuming an excessive amount of the processing, memory, or battery resources of the user computing system or device. As such, the LPALM may operate on the computing device for an extended period of time without depleting the device's battery resources, rendering the user device non-responsive, or otherwise having a negative or user-perceivable impact on the performance, functionality, or power consumption characteristics of the user device.

    TEST-TIME ADAPTATION VIA SELF-DISTILLED REGULARIZATION

    Publication Number: US20240160926A1

    Publication Date: 2024-05-16

    Application Number: US18479723

    Filing Date: 2023-10-02

    CPC classification number: G06N3/08 G06N3/045

    Abstract: A computer-implemented method includes adding an auxiliary network of a group of auxiliary networks to a respective partition of a group of partitions associated with a main network. The method also includes training each of the group of auxiliary networks with training data to adapt to a test distribution. The method further includes adapting each of the group of auxiliary networks with test data to adapt to the test distribution. The method still further includes classifying an input received at a model based on adapting each of the group of auxiliary networks. The model may include the group of partitions and the group of auxiliary networks.
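A minimal sketch of the structure described above: one auxiliary head per partition of the main network, with classification based on the heads' combined outputs. The linear form of the heads, the logit averaging, and all dimensions are assumptions for illustration; the abstract leaves these unspecified.

```python
import numpy as np

rng = np.random.default_rng(0)

class AuxiliaryHead:
    """A small linear head attached to one partition of the main network
    (the linear form is assumed; the patent leaves the head unspecified)."""
    def __init__(self, in_dim, n_classes):
        self.w = rng.normal(scale=0.1, size=(in_dim, n_classes))

    def __call__(self, features):
        return features @ self.w

def classify(partition_features, heads):
    """Average the logits from every auxiliary head, then take the argmax."""
    logits = np.mean([h(f) for f, h in zip(partition_features, heads)], axis=0)
    return int(np.argmax(logits))

# Three partitions, each emitting a 4-dim feature for one input; 5 classes.
features = [rng.normal(size=4) for _ in range(3)]
heads = [AuxiliaryHead(4, 5) for _ in range(3)]
print(classify(features, heads))
```

At test time, only the lightweight auxiliary heads would be updated toward the test distribution, leaving the partitions of the main network frozen.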

    ADAPTING MACHINE LEARNING MODELS FOR DOMAIN-SHIFTED DATA

    Publication Number: US20240119360A1

    Publication Date: 2024-04-11

    Application Number: US18338174

    Filing Date: 2023-06-20

    CPC classification number: G06N20/00

    Abstract: Certain aspects of the present disclosure provide techniques and apparatuses for adapting a machine learning model for inferencing against a target data set in a shifted domain from a source data set used to train the machine learning model. An example method generally includes identifying one or more domain-sensitive layers in a machine learning model based on differences between outputs generated by one or more layers in the machine learning model for inputs in a source domain and inputs in a shifted domain. Normalizing values are updated for each respective domain-sensitive layer of the one or more domain-sensitive layers based on a mixing factor, fixed normalizing values for data in the source domain, and calculated normalizing values for data in the shifted domain. The updated normalizing values are applied to each respective domain-sensitive layer of the one or more domain-sensitive layers in the machine learning model.
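The update of normalizing values reads as a linear blend of fixed source-domain statistics and statistics computed on shifted-domain data. The sketch below assumes that linear rule with a scalar mixing factor and mean/variance statistics (as in batch normalization); the exact statistics and blending form are assumptions.

```python
import numpy as np

def update_normalizing_values(source_stats, target_stats, mixing_factor):
    """Blend fixed source-domain statistics with statistics computed on
    shifted-domain data (linear mix; the exact rule is an assumption)."""
    mean = mixing_factor * source_stats["mean"] + (1 - mixing_factor) * target_stats["mean"]
    var = mixing_factor * source_stats["var"] + (1 - mixing_factor) * target_stats["var"]
    return {"mean": mean, "var": var}

source = {"mean": np.array([0.0, 0.0]), "var": np.array([1.0, 1.0])}
target = {"mean": np.array([2.0, 4.0]), "var": np.array([3.0, 5.0])}
updated = update_normalizing_values(source, target, mixing_factor=0.5)
print(updated["mean"])  # [1. 2.]
print(updated["var"])   # [2. 3.]
```

Only the layers identified as domain-sensitive would receive these updated values; the remaining layers keep their fixed source-domain statistics.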

    SEMANTIC-AWARE RANDOM STYLE AGGREGATION FOR SINGLE DOMAIN GENERALIZATION

    Publication Number: US20230376753A1

    Publication Date: 2023-11-23

    Application Number: US18157723

    Filing Date: 2023-01-20

    CPC classification number: G06N3/08

    Abstract: Systems and techniques are provided for training a neural network model or machine learning model. For example, a method of augmenting training data can include augmenting, based on a randomly initialized neural network, training data to generate augmented training data and aggregating data with a plurality of styles from the augmented training data to generate aggregated training data. The method can further include applying semantic-aware style fusion to the aggregated training data to generate fused training data and adding the fused training data as fictitious samples to the training data to generate updated training data for training the neural network model or machine learning model.
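The style aggregation step can be sketched with an AdaIN-like statistic swap standing in for the random-network style augmentation, followed by a weighted average as the fusion rule. Both choices, and the toy shapes, are illustrative assumptions rather than the patented method.

```python
import numpy as np

rng = np.random.default_rng(0)

def stylize(content, style, eps=1e-6):
    """AdaIN-like statistic swap: normalize `content`, then give it the
    mean and standard deviation of `style` (a stand-in for the
    random-network style augmentation)."""
    normalized = (content - content.mean()) / (content.std() + eps)
    return normalized * style.std() + style.mean()

def aggregate_styles(content, styles, weights=None):
    """Fuse several stylized views of the same sample by a weighted
    average; the fusion rule here is an illustrative assumption."""
    views = np.stack([stylize(content, s) for s in styles])
    return np.average(views, axis=0, weights=weights)

content = rng.normal(size=(4, 4))
styles = [rng.normal(loc=2.0, size=(4, 4)), rng.normal(loc=-1.0, size=(4, 4))]
fused = aggregate_styles(content, styles)
print(fused.shape)  # (4, 4)
```

The fused samples would then be added to the training set as fictitious examples, broadening the single source domain the model sees.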

    TEST-TIME ADAPTATION WITH UNLABELED ONLINE DATA

    Publication Number: US20230281509A1

    Publication Date: 2023-09-07

    Application Number: US18086586

    Filing Date: 2022-12-21

    CPC classification number: G06N20/00

    Abstract: A processor-implemented method includes training a machine learning model on a source domain. The method also includes testing the machine learning model on a target domain, after training. The method further includes training the machine learning model on the target domain by regularizing weights of the machine learning model such that shift-agnostic weights are subjected to a higher penalty than shift-biased weights.
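The asymmetric regularization above can be sketched as an L2 penalty that anchors adapted weights to their source-domain values, with a larger coefficient on shift-agnostic weights than on shift-biased ones. The L2 form, the coefficient values, and the boolean mask are assumptions for illustration.

```python
import numpy as np

def regularization_penalty(weights, source_weights, shift_agnostic_mask,
                           high=1.0, low=0.1):
    """Anchored L2 penalty with a larger coefficient on shift-agnostic
    weights; the L2 form and coefficient values are assumptions."""
    coeff = np.where(shift_agnostic_mask, high, low)
    return float(np.sum(coeff * (weights - source_weights) ** 2))

w = np.array([1.0, 2.0, 3.0])         # weights after target-domain updates
w_src = np.array([0.0, 0.0, 0.0])     # weights learned on the source domain
mask = np.array([True, False, True])  # True marks shift-agnostic weights
penalty = regularization_penalty(w, w_src, mask)
print(penalty)  # 1.0*1 + 0.1*4 + 1.0*9 ≈ 10.4
```

During target-domain training this penalty would be added to the task loss, letting shift-biased weights move freely while holding shift-agnostic weights near their source values.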
