MULTI-SOURCE DOMAIN ADAPTATION VIA PROMPT-BASED META-LEARNING

    Publication Number: US20250148293A1

    Publication Date: 2025-05-08

    Application Number: US18934676

    Application Date: 2024-11-01

    Abstract: Methods and systems include adapting an initial prompt to a target domain corresponding to an input time series to generate an adapted prompt. The adapted prompt is combined with the input time series, which is then processed using a modular transformer encoder that has a plurality of sub-encoders, with a policy network selecting the subset of sub-encoders that is applied to the input time series and the adapted prompt.
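
    To make the claim concrete, below is a minimal PyTorch sketch of a policy-gated modular encoder in the spirit of the abstract. The hard top-k gating rule, layer count, and all dimensions are illustrative assumptions, not the patented design.

        # A minimal sketch of a modular encoder whose policy network selects
        # a subset of sub-encoders per input; shapes are assumptions.
        import torch
        import torch.nn as nn

        class ModularEncoder(nn.Module):
            def __init__(self, dim=64, n_sub_encoders=4, n_heads=4, k=2):
                super().__init__()
                # Each sub-encoder is a standard transformer encoder layer.
                self.sub_encoders = nn.ModuleList(
                    [nn.TransformerEncoderLayer(dim, n_heads, batch_first=True)
                     for _ in range(n_sub_encoders)])
                # The policy network scores sub-encoders from a summary of the input.
                self.policy = nn.Linear(dim, n_sub_encoders)
                self.k = k

            def forward(self, series, prompt):
                # Combine the adapted prompt with the input time series by
                # prepending it along the sequence dimension.
                x = torch.cat([prompt, series], dim=1)   # (batch, prompt+seq, dim)
                scores = self.policy(x.mean(dim=1))      # (batch, n_sub_encoders)
                keep = scores.topk(self.k, dim=-1).indices
                for i, layer in enumerate(self.sub_encoders):
                    sel = (keep == i).any(dim=-1).float().view(-1, 1, 1)
                    x = sel * layer(x) + (1 - sel) * x   # apply only selected layers
                return x

        # Usage: a batch of 8 series, 32 steps each, with a 4-token adapted prompt.
        enc = ModularEncoder()
        out = enc(torch.randn(8, 32, 64), torch.randn(8, 4, 64))
        print(out.shape)  # torch.Size([8, 36, 64])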

    PROMPT-BASED MODULAR NETWORK FOR TIME SERIES FEW SHOT TRANSFER

    Publication Number: US20250005373A1

    Publication Date: 2025-01-02

    Application Number: US18749887

    Application Date: 2024-06-21

    Abstract: Systems and methods are provided for adapting a model trained on multiple source time-series domains to a target time-series domain. Input data from the source domains is integrated to pretrain a model with a set of domain-invariant representations, the model is fine-tuned by learning a prompt specific to each source domain using that domain's data, and instance normalization is applied to the target-domain time-series data, which is segmented into subseries-level normalized patches. The normalized patches are fed into a transformer encoder to generate high-dimensional representations, and a limited number of samples from the target domain are used to learn a prompt specific to the target domain. Cosine similarity between the target-domain prompt and the source-domain prompts is calculated to identify a nearest-neighbor prompt, which is used for model prediction in the target time-series domain.
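
    The target-domain pipeline above can be sketched in a few lines of PyTorch: instance normalization with subseries patching, then a cosine-similarity nearest-neighbor lookup over the learned source prompts. The function names, patch length, and prompt shapes below are illustrative assumptions.

        # A minimal sketch, assuming patch length 8 and prompts of
        # shape (4 tokens, 16 dims).
        import torch
        import torch.nn.functional as F

        def patchify(series, patch_len=8, stride=8):
            """Instance-normalize a (batch, seq_len) series and split it into patches."""
            mean = series.mean(dim=1, keepdim=True)
            std = series.std(dim=1, keepdim=True) + 1e-5
            normed = (series - mean) / std               # instance normalization
            return normed.unfold(1, patch_len, stride)   # (batch, n_patches, patch_len)

        def nearest_source_prompt(target_prompt, source_prompts):
            """Pick the source prompt with highest cosine similarity to the target's."""
            sims = F.cosine_similarity(
                target_prompt.flatten().unsqueeze(0),    # (1, prompt_dim)
                source_prompts.flatten(start_dim=1))     # (n_sources, prompt_dim)
            return source_prompts[sims.argmax()]

        source_prompts = torch.randn(3, 4, 16)           # learned per source domain
        target_prompt = torch.randn(4, 16)               # learned from few target samples
        print(patchify(torch.randn(2, 64)).shape)        # torch.Size([2, 8, 8])
        print(nearest_source_prompt(target_prompt, source_prompts).shape)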

    CROSS-LINGUAL ZERO-SHOT TRANSFER VIA SEMANTIC AND SYNTHETIC REPRESENTATION LEARNING

    Publication Number: US20220075945A1

    Publication Date: 2022-03-10

    Application Number: US17464005

    Application Date: 2021-09-01

    Abstract: A computer-implemented method is provided for cross-lingual transfer. The method includes randomly masking a source corpus and a target corpus to obtain a masked source corpus and a masked target corpus. The method further includes tokenizing, by pretrained Natural Language Processing (NLP) models, the masked source corpus and the masked target corpus to obtain source tokens and target tokens. The method also includes transforming the source tokens and the target tokens into a source dependency parsing tree and a target dependency parsing tree. The method additionally includes inputting the source dependency parsing tree and the target dependency parsing tree into a graph encoder pretrained on a translation language modeling task to extract common language information for transfer. The method further includes fine-tuning the graph encoder and a downstream network for a specific NLP downstream task.
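
    The first step, randomly masking both corpora, can be illustrated with a short Python sketch; the word-level masking, 15% mask rate, and [MASK] token are conventional assumptions rather than details taken from the patent.

        # A minimal sketch of random corpus masking under assumed conventions.
        import random

        MASK_TOKEN = "[MASK]"

        def random_mask(corpus, mask_rate=0.15, seed=0):
            """Replace a random fraction of words in each sentence with a mask token."""
            rng = random.Random(seed)
            return [" ".join(MASK_TOKEN if rng.random() < mask_rate else w
                             for w in sentence.split())
                    for sentence in corpus]

        source_corpus = ["the cat sat on the mat", "graphs encode syntax"]
        print(random_mask(source_corpus))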

    Cross-lingual zero-shot transfer via semantic and synthetic representation learning

    Publication Number: US12050870B2

    Publication Date: 2024-07-30

    Application Number: US17464005

    Application Date: 2021-09-01

    CPC classification numbers: G06F40/284; G06F40/205; G06F40/295; G06N3/04

    Abstract: A computer-implemented method is provided for cross-lingual transfer. The method includes randomly masking a source corpus and a target corpus to obtain a masked source corpus and a masked target corpus. The method further includes tokenizing, by pretrained Natural Language Processing (NLP) models, the masked source corpus and the masked target corpus to obtain source tokens and target tokens. The method also includes transforming the source tokens and the target tokens into a source dependency parsing tree and a target dependency parsing tree. The method additionally includes inputting the source dependency parsing tree and the target dependency parsing tree into a graph encoder pretrained on a translation language modeling task to extract common language information for transfer. The method further includes fine-tuning the graph encoder and a downstream network for a specific NLP downstream task.
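
    To complement the masking sketch above, the following PyTorch sketch illustrates the final step: fine-tuning a pretrained graph encoder together with a downstream network. The one-layer mean-aggregation encoder over a dependency-tree adjacency matrix is an illustrative stand-in, not the patented architecture.

        # A minimal sketch of joint fine-tuning; all shapes are assumptions.
        import torch
        import torch.nn as nn

        class GraphEncoder(nn.Module):
            """Aggregate each token's neighbors in the dependency parsing tree."""
            def __init__(self, dim=32):
                super().__init__()
                self.lin = nn.Linear(dim, dim)

            def forward(self, x, adj):
                # adj: (n_tokens, n_tokens) adjacency of the dependency tree.
                deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
                return torch.relu(self.lin(adj @ x / deg))  # mean over tree neighbors

        encoder = GraphEncoder()   # stands in for the TLM-pretrained graph encoder
        head = nn.Linear(32, 2)    # downstream network, e.g. binary classification

        # Fine-tune the graph encoder and the downstream head jointly.
        opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()),
                               lr=1e-4)
        tokens, adj, label = torch.randn(5, 32), torch.eye(5), torch.tensor([1])
        logits = head(encoder(tokens, adj).mean(dim=0, keepdim=True))
        loss = nn.functional.cross_entropy(logits, label)
        loss.backward()
        opt.step()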
