-
Publication No.: US11609787B2
Publication Date: 2023-03-21
Application No.: US16947055
Application Date: 2020-07-16
Inventor: Xiaofei Liao , Yicheng Chen , Yu Zhang , Hai Jin , Jin Zhao , Xiang Zhao , Beibei Si
Abstract: The present disclosure relates to an FPGA-based dynamic graph processing method, comprising: where graph mirrors of a dynamic graph with successive timestamps define an increment between them, a pre-processing module dividing the graph mirror having the later timestamp into at least one path unit such that incremental computing for any vertex depends only on its preorder vertex; an FPGA processing module storing at least two of said path units into an on-chip memory directly linked to thread units so that every thread unit can process a path unit independently; each thread unit determining the increment value of the preorder vertex between the successive timestamps while updating the state value of the preorder vertex, and transferring the increment value to the succeeding vertex adjacent to the preorder vertex along the transfer direction determined by the path unit, so as to update the state value of the succeeding vertex.
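As a rough illustration of the incremental step described in the abstract, the Python sketch below propagates an increment value along a single path unit; the function name process_path_unit and the state/delta dictionaries are assumptions made for illustration, and in the described system this loop would run inside an FPGA thread unit over a path unit held in on-chip memory rather than as host code.

def process_path_unit(path, state, delta):
    # path  : ordered vertex ids of one path unit; each vertex's update depends
    #         only on its preorder (preceding) vertex within the path
    # state : per-vertex state values carried over from the earlier graph mirror
    # delta : per-vertex local change between the two graph mirrors
    increment = 0.0
    for v in path:
        # fold this vertex's own change into the increment handed over
        # by its preorder vertex
        increment += delta.get(v, 0.0)
        # update the state value of the current vertex
        state[v] = state.get(v, 0.0) + increment
        # on the next iteration the increment is transferred to the succeeding
        # vertex along the transfer direction determined by the path unit
    return state

Each thread unit would run one such loop independently over its own path unit, which is why at least two path units are staged in the on-chip memory at a time.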
-
Publication No.: US11461239B2
Publication Date: 2022-10-04
Application No.: US17038680
Application Date: 2020-09-30
Applicant: HUAZHONG UNIVERSITY OF SCIENCE AND TECHNOLOGY , TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED
Inventor: Ke Zhou , Yu Zhang , Hua Wang , Yong Guang Ji , Bin Cheng
IPC: G06F12/121 , G06F12/0891
Abstract: A method and apparatus for caching a data block are provided. The method includes: obtaining, from a terminal, an access request for access to a first data block; determining that the first data block is missed in a cache space of a storage system; detecting whether a second data block satisfies a lazy condition, the second data block being a candidate elimination block in the cache space and the lazy condition being a condition for determining, according to a re-access probability, whether to delay replacing the second data block in the cache space; determining that the second data block satisfies the lazy condition; and accessing the first data block from a storage space of the storage system while skipping replacement of the second data block in the cache space.
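A minimal Python sketch of the lazy replacement decision is given below; the class name LazyCache, the reaccess_prob callable, and the lazy_threshold value are illustrative assumptions rather than the patented estimator, and the backing storage object is assumed to expose a read(block_id) method.

from collections import OrderedDict

class LazyCache:
    def __init__(self, capacity, storage, reaccess_prob, lazy_threshold=0.5):
        self.capacity = capacity
        self.storage = storage                # storage space of the storage system
        self.reaccess_prob = reaccess_prob    # callable: block_id -> estimated re-access probability
        self.lazy_threshold = lazy_threshold
        self.blocks = OrderedDict()           # cache space in LRU order, oldest first

    def access(self, block_id):
        if block_id in self.blocks:           # hit: refresh recency and return
            self.blocks.move_to_end(block_id)
            return self.blocks[block_id]
        data = self.storage.read(block_id)    # miss: fetch from the storage space
        if len(self.blocks) >= self.capacity:
            victim = next(iter(self.blocks))  # candidate elimination block (LRU)
            if self.reaccess_prob(victim) >= self.lazy_threshold:
                return data                   # lazy condition met: skip replacement
            self.blocks.popitem(last=False)   # otherwise evict the candidate block
        self.blocks[block_id] = data
        return data

When the candidate elimination block still looks likely to be re-accessed, the missed block is served straight from the storage space and the cache contents are left untouched, which corresponds to the skipping step in the abstract.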
-
Publication No.: US20210191763A1
Publication Date: 2021-06-24
Application No.: US16947055
Application Date: 2020-07-16
Inventor: Xiaofei Liao , Yicheng Chen , Yu Zhang , Hai Jin , Jin Zhao , Xiang Zhao , Beibei Si
IPC: G06F9/48 , G06F30/331 , G06F9/30 , G06F9/38 , G06F9/50
Abstract: The present disclosure relates to an FPGA-based dynamic graph processing method, comprising: where graph mirrors of a dynamic graph with successive timestamps define an increment between them, a pre-processing module dividing the graph mirror having the later timestamp into at least one path unit such that incremental computing for any vertex depends only on its preorder vertex; an FPGA processing module storing at least two of said path units into an on-chip memory directly linked to thread units so that every thread unit can process a path unit independently; each thread unit determining the increment value of the preorder vertex between the successive timestamps while updating the state value of the preorder vertex, and transferring the increment value to the succeeding vertex adjacent to the preorder vertex along the transfer direction determined by the path unit, so as to update the state value of the succeeding vertex.
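To complement the propagation sketch shown with the granted patent above, the Python fragment below shows one naive way a graph mirror could be divided into path units so that every vertex on a path has a single preorder vertex; this greedy decomposition is an assumption made for illustration, not the pre-processing module's actual partitioning scheme.

def split_into_path_units(graph):
    # graph : adjacency dict {vertex: [successor, ...]} for the graph mirror
    #         having the later timestamp
    visited = set()
    path_units = []
    for start in graph:
        if start in visited:
            continue
        path = [start]
        visited.add(start)
        v = start
        while True:
            # extend the path along any successor not yet assigned to a path,
            # so its only in-path dependency is its preorder vertex v
            nxt = next((u for u in graph.get(v, []) if u not in visited), None)
            if nxt is None:
                break
            path.append(nxt)
            visited.add(nxt)
            v = nxt
        path_units.append(path)
    return path_units

At least two of the resulting path units would then be staged in the on-chip memory so that separate thread units can process them independently.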
-
Publication No.: US12197330B2
Publication Date: 2025-01-14
Application No.: US18470346
Application Date: 2023-09-19
Inventor: Zhan Zhang , Yu Zhang , Jin Zhao , Haifei Wu
IPC: G06F12/0804
Abstract: The present disclosure provides a data storage system including a data cache module, a data processing module, and a persistent memory. The data cache module includes an on-chip mapping data cache and an on-chip counter cache, where the mapping data cache is configured to cache mapping data; when the free space of the mapping data cache falls below a preset threshold, the least recently used mapping data cache line is evicted from the cache and written back to the persistent memory. The data processing module encrypts/decrypts persistent memory data by using the corresponding counters, and accesses the persistent memory blocks indicated by the corresponding mapping data. The persistent memory comprises a first storage region for the latest checkpoint data and a second storage region for the working data modified in the current checkpoint interval.
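The eviction rule for the on-chip mapping data cache can be sketched in Python as follows; the names MappingDataCache, write_back, and free_threshold are assumptions made for illustration, and the counter-based encryption is reduced to a toy keystream transform only to show where the per-block counter enters.

from collections import OrderedDict
import hashlib

class MappingDataCache:
    # caches mapping data lines; when the free space drops below a preset threshold,
    # the least recently used line is evicted and written back to persistent memory
    def __init__(self, capacity, persistent_memory, free_threshold=1):
        self.capacity = capacity
        self.pm = persistent_memory          # assumed to expose write_back(key, line)
        self.free_threshold = free_threshold
        self.lines = OrderedDict()           # LRU order, oldest entry first

    def put(self, key, line):
        if key in self.lines:
            self.lines.move_to_end(key)
        self.lines[key] = line
        while self.capacity - len(self.lines) < self.free_threshold:
            victim_key, victim_line = self.lines.popitem(last=False)
            self.pm.write_back(victim_key, victim_line)

def counter_keystream_xor(data: bytes, key: bytes, counter: int) -> bytes:
    # toy counter-driven transform standing in for the data processing module's
    # counter-based encryption/decryption; XOR makes it its own inverse
    pad = hashlib.sha256(key + counter.to_bytes(8, "little")).digest()
    return bytes(b ^ pad[i % len(pad)] for i, b in enumerate(data))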
-
Publication No.: US10962504B2
Publication Date: 2021-03-30
Application No.: US15829586
Application Date: 2017-12-01
Applicant: Hubei University of Technology , Tsinghua University , China Special Equipment Inspection and Research Institute , Eddysun (Xiamen) Electronic Co., Ltd. , Huazhong University of Science and Technology
Inventor: Songling Huang , Xiaochun Song , Gongtian Shen , Wei Zhao , Junming Lin , Yihua Kang , Yu Zhang , Shen Wang
Abstract: Disclosed are a method and device for compressing and reconstructing data. The method includes: disposing a transmitting EMAT array and a receiving EMAT array; exciting a Lamb wave, receiving the Lamb wave, and subjecting the Lamb wave to narrowband filtering at the narrowband frequency to form detecting data x(n); analyzing the detecting data with a DFT; reconstructing the original detecting data and calculating a reconstruction error from the measurement vector and the recovery matrix by using a TLBO algorithm; optimizing the measurement vector and the recovery matrix; and transmitting the measurement vector to a supervisory device.
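The interplay of the measurement vector and the recovery matrix can be sketched numerically as follows; the array sizes, the pseudo-inverse recovery matrix, and the function name reconstruction_error are assumptions made for illustration, and the TLBO search over the measurement vector and recovery matrix is replaced here by a single evaluation of the error such a search would minimize.

import numpy as np

def reconstruction_error(x, phi, psi):
    # x   : detecting data x(n) of length N
    # phi : measurement matrix (M x N); y = phi @ x is the measurement vector
    #       transmitted to the supervisory device
    # psi : recovery matrix (N x M) used to reconstruct the original detecting data
    y = phi @ x
    x_hat = psi @ y
    return np.linalg.norm(x - x_hat) / np.linalg.norm(x)

# illustrative usage with random data (all sizes are assumptions)
rng = np.random.default_rng(0)
N, M = 256, 64
x = rng.standard_normal(N)
phi = rng.standard_normal((M, N))
psi = np.linalg.pinv(phi)   # one simple recovery matrix; the method instead optimizes phi and psi
print(reconstruction_error(x, phi, psi))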
-
Publication No.: US20210011857A1
Publication Date: 2021-01-14
Application No.: US17038680
Application Date: 2020-09-30
Applicant: HUAZHONG UNIVERSITY OF SCIENCE AND TECHNOLOGY , TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED
Inventor: Ke Zhou , Yu Zhang , Hua Wang , Yong Guang Ji , Bin Cheng
IPC: G06F12/121 , G06F12/0891
Abstract: A method and apparatus for caching a data block are provided. The method includes: obtaining, from a terminal, an access request for access to a first data block; determining that the first data block is missed in a cache space of a storage system; detecting whether a second data block satisfies a lazy condition, the second data block being a candidate elimination block in the cache space and the lazy condition being a condition for determining, according to a re-access probability, whether to delay replacing the second data block in the cache space; determining that the second data block satisfies the lazy condition; and accessing the first data block from a storage space of the storage system while skipping replacement of the second data block in the cache space.
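As a companion to the cache sketch shown with the granted patent above, the fragment below gives one deliberately simple way the re-access probability used by the lazy condition could be estimated; this frequency count is purely an assumption made for illustration and is not the estimator claimed by the application.

from collections import defaultdict

class ReaccessEstimator:
    # crude popularity signal: fraction of all recorded accesses that hit a given block
    def __init__(self):
        self.counts = defaultdict(int)
        self.total = 0

    def record(self, block_id):
        self.counts[block_id] += 1
        self.total += 1

    def probability(self, block_id):
        return self.counts[block_id] / self.total if self.total else 0.0

An instance of this estimator could be passed as the reaccess_prob callable in the LazyCache sketch above.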