SYSTEMS AND METHODS FOR TIME SERIES FORECASTING

    Publication No.: US20230244947A1

    Publication Date: 2023-08-03

    Application No.: US17843775

    Application Date: 2022-06-17

    CPC classification number: G06N3/088 G06Q10/04

    Abstract: Embodiments described herein provide a method of forecasting time series data at future timestamps in a dynamic system. The method of forecasting time series data also includes receiving, via a data interface, a time series dataset. The method also includes determining, via a frequency attention layer, a seasonal representation based on a frequency domain analysis of the time series data. The method also includes determining, via an exponential attention layer, a growth representation based on the seasonal representation. The method also includes generating, via a decoder, a time series forecast based on the seasonal representation and the growth representation.
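
    As a rough illustration of the decomposition this abstract describes, the sketch below separates a series into a seasonal part (a frequency-domain analysis that keeps the dominant Fourier modes) and a smoothed growth part, then extrapolates both to form the forecast. The top-K heuristic, the smoothing constant, and the function names are illustrative assumptions, not details taken from the patent.

```python
# Minimal NumPy sketch of the seasonal/growth decomposition described above.
# The layer names and the top-K Fourier heuristic are assumptions for
# illustration; they are not taken from the patent itself.
import numpy as np

def seasonal_representation(x, k=3):
    """Frequency-domain analysis: keep the k largest non-DC Fourier modes."""
    spec = np.fft.rfft(x)
    mag = np.abs(spec)
    mag[0] = 0.0                      # ignore the DC (mean) component
    keep = np.argsort(mag)[-k:]       # indices of dominant frequencies
    filtered = np.zeros_like(spec)
    filtered[keep] = spec[keep]
    return np.fft.irfft(filtered, n=len(x))

def growth_representation(residual, alpha=0.3):
    """Exponentially weighted smoothing of the de-seasonalized series."""
    level = np.zeros_like(residual)
    level[0] = residual[0]
    for t in range(1, len(residual)):
        level[t] = alpha * residual[t] + (1 - alpha) * level[t - 1]
    return level

def forecast(x, horizon=8):
    seasonal = seasonal_representation(x)
    growth = growth_representation(x - seasonal)
    trend_slope = growth[-1] - growth[-2]
    # "Decoder": extrapolate the seasonal pattern and the smoothed trend.
    future_season = np.tile(seasonal, 2)[len(x):len(x) + horizon]
    future_trend = growth[-1] + trend_slope * np.arange(1, horizon + 1)
    return future_season + future_trend

t = np.arange(128)
series = 0.05 * t + np.sin(2 * np.pi * t / 16) + 0.1 * np.random.randn(128)
print(forecast(series, horizon=8))
```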

    SYSTEMS AND METHODS FOR ITERATIVE CODE GENERATION WITH LARGE LANGUAGE MODELS AND REPRESENTATIVE SUB-MODULES

    Publication No.: US20250103300A1

    Publication Date: 2025-03-27

    Application No.: US18424372

    Application Date: 2024-01-26

    Abstract: The embodiments are directed to generating source code for a program from a problem description. One or more pre-trained code large language models (LLMs) generate sub-modules from a problem description in a natural language. The sub-modules are filtered based on testing criteria and encoded into sub-module encodings in an embedding space. The sub-module encodings are clustered into multiple clusters. A subset of sub-module encodings that are close to the centroids of the clusters is selected. The subset of sub-module encodings is decoded into representative sub-modules. The problem description is augmented with the representative sub-modules and fed into the one or more pre-trained code LLMs, and new sub-modules are generated. The iterations continue until a program is generated from the representative sub-modules.
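
    The cluster-and-select step described in this abstract can be sketched as follows: sub-module encodings are grouped with k-means and the encoding nearest each centroid is kept as a representative. The LLM generation and decoding stages are not shown because the abstract names no specific model API; the function names and parameters below are assumptions.

```python
# Sketch of the cluster-and-select step only; the LLM generation/decoding
# calls are omitted because the abstract does not name a specific model API.
# All function names and parameters here are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

def select_representative_indices(encodings: np.ndarray, n_clusters: int = 4):
    """Cluster sub-module encodings and pick the one nearest each centroid."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(encodings)
    representatives = []
    for c, centroid in enumerate(km.cluster_centers_):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(encodings[members] - centroid, axis=1)
        representatives.append(int(members[np.argmin(dists)]))
    return representatives

# Toy usage: 20 fake 8-dimensional sub-module encodings.
rng = np.random.default_rng(0)
fake_encodings = rng.normal(size=(20, 8))
print(select_representative_indices(fake_encodings, n_clusters=4))
```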

    SYSTEMS AND METHODS FOR TIME-SERIES DATA PROCESSING IN MACHINE LEARNING SYSTEMS

    Publication No.: US20230367271A1

    Publication Date: 2023-11-16

    Application No.: US17947605

    Application Date: 2022-09-19

    CPC classification number: G05B13/027 G16H50/70

    Abstract: Embodiments described herein provide a measure of distance between time-series data sequences referred to as optimal transport warping (OTW). Measuring the OTW distance between unbalanced sequences (sequences with different sums of their values) may be accomplished by including an unbalanced mass cost. The OTW computation may be performed using cumulative sums over local windows. Further, embodiments herein describe methods for dealing with time-series data with negative values: sequences may be split into positive and negative components before determining the OTW distance. A smoothing function may also be applied to the OTW measurement, allowing for a gradient to be calculated. The OTW distance may be used in machine learning tasks such as clustering and classification. An OTW measurement may also be used as an input layer to a neural network.
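
    A much-simplified sketch of the idea: in one dimension, optimal transport between non-negative sequences reduces to an L1 distance between cumulative sums, an extra term can charge for mismatched total mass (the unbalanced case), and series with negative values can be split into positive and negative parts before comparison. The local-window and smoothing refinements from the abstract are omitted here, and the mass-cost weight is an illustrative parameter, not a value from the patent.

```python
# Simplified sketch, not the exact OTW algorithm: cumulative-sum transport
# cost plus an unbalanced-mass penalty, with negative values handled by a
# positive/negative split.  The weight `m` is an illustrative assumption.
import numpy as np

def ot_1d_unbalanced(a, b, m=1.0):
    """Cumulative-sum transport cost plus an unbalanced-mass penalty."""
    transport = np.abs(np.cumsum(a) - np.cumsum(b)).sum()
    mass_gap = m * abs(a.sum() - b.sum())
    return transport + mass_gap

def otw_distance(x, y, m=1.0):
    """Handle negative values by splitting into positive/negative parts."""
    x_pos, x_neg = np.maximum(x, 0), np.maximum(-x, 0)
    y_pos, y_neg = np.maximum(y, 0), np.maximum(-y, 0)
    return ot_1d_unbalanced(x_pos, y_pos, m) + ot_1d_unbalanced(x_neg, y_neg, m)

x = np.array([0.0, 1.0, 2.0, 1.0, 0.0, -1.0])
y = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0])
print(otw_distance(x, y))
```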

    SYSTEMS AND METHODS FOR ONLINE TIME SERIES FORECASTING

    Publication No.: US20230244943A1

    Publication Date: 2023-08-03

    Application No.: US17871819

    Application Date: 2022-07-22

    CPC classification number: G06N3/084 G06N3/0472

    Abstract: Embodiments provide a framework combining fast and slow learning networks (referred to as "FSNet") to train deep neural forecasters on the fly for online time-series forecasting. FSNet is built on a deep neural network backbone (slow learner) with two complementary components to facilitate fast adaptation to both new and recurrent concepts. To this end, FSNet employs a per-layer adapter to monitor each layer's contribution to the forecasting loss via its partial derivative. The adapter transforms each layer's weight and feature at each step based on its recent gradient, allowing fine-grained per-layer fast adaptation to optimize the current loss. In addition, FSNet employs a second, complementary associative memory component to store important, recurring patterns observed during training. The adapter interacts with the memory to store, update, and retrieve the previous transformations, facilitating fast learning of such patterns.
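
    A minimal sketch of the per-layer adapter idea, assuming a single linear layer: an exponential moving average of the layer's recent gradient is kept and mapped to a per-output-channel scale applied to the weights on the next step. The adapter architecture, the EMA rate, and the calibration mapping are assumptions for illustration, and the associative-memory component is omitted.

```python
# Minimal PyTorch sketch of a "per-layer adapter": track an EMA of the
# layer's recent weight gradient and use it to rescale the weights.
# The adapter design and EMA rate are illustrative assumptions; the
# associative-memory component from the abstract is not shown.
import torch
import torch.nn as nn

class AdaptedLinear(nn.Module):
    def __init__(self, d_in, d_out, ema=0.9):
        super().__init__()
        self.layer = nn.Linear(d_in, d_out)
        self.ema = ema
        self.register_buffer("grad_ema", torch.zeros(d_out, d_in))
        # Maps the summarized recent gradient to a per-output-channel scale.
        self.adapter = nn.Linear(d_in, 1)

    def update_gradient_memory(self):
        """Call after loss.backward(): track an EMA of the weight gradient."""
        if self.layer.weight.grad is not None:
            self.grad_ema.mul_(self.ema).add_(self.layer.weight.grad,
                                              alpha=1 - self.ema)

    def forward(self, x):
        scale = 1.0 + torch.tanh(self.adapter(self.grad_ema)).squeeze(-1)
        weight = self.layer.weight * scale.unsqueeze(1)
        return nn.functional.linear(x, weight, self.layer.bias)

# One online step on a toy batch.
model, loss_fn = AdaptedLinear(8, 1), nn.MSELoss()
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x, y = torch.randn(16, 8), torch.randn(16, 1)
loss = loss_fn(model(x), y)
opt.zero_grad()
loss.backward()
model.update_gradient_memory()
opt.step()
```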
