-
Publication No.: US20200050676A1
Publication Date: 2020-02-13
Application No.: US16058724
Filing Date: 2018-08-08
Applicant: Peijie Li, Yu Gu, Hongqin Song
Inventors: Peijie Li, Yu Gu, Hongqin Song
IPC Classes: G06F17/30, G06F12/126, G06F12/0875
Abstract: A data system may dynamically prioritize and ingest data so that, regardless of the memory size of the hosted dataset, the system can process and analyze that dataset in constant time. The system and method may implement a first space-efficient probabilistic data structure on the dataset, wherein the dataset includes a plurality of profile data. It may then receive update data corresponding to some of the plurality of profile data and implement a second space-efficient probabilistic data structure on the dataset including the update data. The system and method may then determine a set of non-shared profile data of the second space-efficient probabilistic data structure and prioritize that set over other profile data of the dataset for caching.
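The abstract does not name the space-efficient probabilistic data structure; a Bloom filter is one common choice. The sketch below assumes Bloom filters and string-serialized profile records; `BloomFilter`, `changed_profiles`, and the sample data are illustrative, not the patent's actual implementation. The key property used is that Bloom filters have no false negatives, so any updated record that tests absent from the baseline filter is definitely new or modified and can be prioritized for caching:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter using k salted SHA-256 hashes over a bit array."""

    def __init__(self, size=1024, hashes=3):
        self.size = size
        self.hashes = hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive k bit positions from salted hashes of the item.
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))


def changed_profiles(baseline_filter, updated_profiles):
    # Records absent from the baseline filter are definitely new or
    # modified (no false negatives), so prioritize them for caching.
    return [p for p in updated_profiles if p not in baseline_filter]


# First structure: built over the original dataset.
original = ["alice|v1", "bob|v1", "carol|v1"]
baseline = BloomFilter()
for record in original:
    baseline.add(record)

# After an update arrives, diff the updated records against the baseline.
updated = ["alice|v2", "bob|v1", "carol|v1", "dave|v1"]
priority = changed_profiles(baseline, updated)
```

Because membership tests can yield false positives (a changed record might spuriously appear "already present"), a real system would size the filter to keep that rate acceptably low; unchanged records, however, are never misreported as changed.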
-
Publication No.: US20200151726A1
Publication Date: 2020-05-14
Application No.: US16189565
Filing Date: 2018-11-13
Applicant: Hongqin Song, Yu Gu, Dan Wang, Peter Walker
Inventors: Hongqin Song, Yu Gu, Dan Wang, Peter Walker
IPC Classes: G06Q20/40, G06N99/00, G06F12/0875, G06N7/00
Abstract: Embodiments of the invention are directed to systems and methods for utilizing a cache to store historical transaction data. A predictive model may be trained to identify particular identifiers associated with historical data that is likely to be utilized on a particular date and/or within a particular time period. The historical data corresponding to these identifiers may be stored in a cache of the processing computer. Subsequently, an authorization request message may be received that includes an identifier. The processing computer may utilize the identifier to retrieve historical transaction data from the cache. The retrieved data may be utilized to perform any suitable operation. By predicting the data that will be needed to perform these operations, and preemptively storing such data in a cache, the latency associated with subsequent processing may be reduced and the performance of the system as a whole improved.
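The abstract leaves the predictive model unspecified; as a stand-in, the sketch below ranks identifiers by how often they transacted on the target date's day of week, then preloads their history into an in-memory cache keyed by identifier. `predict_hot_identifiers`, `preload_cache`, `lookup`, and the sample log are all hypothetical names for illustration:

```python
from collections import Counter
from datetime import date

def predict_hot_identifiers(transaction_log, target_date, top_n=2):
    # Hypothetical stand-in for the trained model: rank identifiers by
    # historical transaction frequency on the same day of week.
    weekday = target_date.weekday()
    counts = Counter(
        tx["id"] for tx in transaction_log
        if tx["date"].weekday() == weekday
    )
    return [ident for ident, _ in counts.most_common(top_n)]

def preload_cache(transaction_log, hot_ids):
    # Cache maps identifier -> that identifier's historical transactions.
    return {
        ident: [tx for tx in transaction_log if tx["id"] == ident]
        for ident in hot_ids
    }

log = [
    {"id": "A", "date": date(2020, 5, 1)},   # Friday
    {"id": "A", "date": date(2020, 5, 8)},   # Friday
    {"id": "B", "date": date(2020, 5, 8)},   # Friday
    {"id": "C", "date": date(2020, 5, 6)},   # Wednesday
]

# Before 2020-05-15 (a Friday) arrives, preload the predicted-hot data.
hot = predict_hot_identifiers(log, date(2020, 5, 15))
cache = preload_cache(log, hot)

def lookup(identifier):
    # An incoming authorization request hits the cache instead of the
    # backing store when its identifier was predicted; None means a miss.
    return cache.get(identifier)
```

The latency benefit comes from doing the expensive history retrieval ahead of the authorization request, so the request path reduces to a dictionary lookup for predicted identifiers.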
-