Data processing method and a computer using distribution service module

    Publication (Announcement) No.: US10241830B2

    Publication (Announcement) Date: 2019-03-26

    Application No.: US14936118

    Application Date: 2015-11-09

    Abstract: A data processing apparatus and a data processing method are provided. The apparatus includes M protocol stacks and at least one distribution service module. The M protocol stacks run separately on different logic cores of a processor and are configured to independently perform protocol processing on a data packet to be processed. The distribution service module receives an input data packet from a network interface and sends it to one of the M protocol stacks for protocol processing, and also receives data packets processed by the M protocol stacks and sends them out through the network interface. By exploiting the parallel processing capability of a multi-core system, the present disclosure enables parallel protocol processing by multiple processes in user space of the operating system, thereby reducing the resource consumption caused by data packet copying.
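
    As a rough illustration of the dispatching described above (not the patented implementation), the following Python sketch assumes a hypothetical dispatch() that hashes each incoming packet to one of M worker processes, each standing in for a user-space protocol stack pinned to its own logic core:

```python
import multiprocessing as mp
import os

M = 4  # number of user-space protocol stacks (assumption for illustration)

def protocol_stack(core_id: int, queue: mp.Queue) -> None:
    """One user-space protocol stack, pinned to its own logic core."""
    os.sched_setaffinity(0, {core_id})           # Linux-only core pinning
    while True:
        packet = queue.get()
        if packet is None:                       # shutdown signal
            break
        # ... independent protocol processing of the packet would go here ...
        print(f"stack on core {core_id} processed {packet!r}")

def dispatch(packet: bytes, queues: list) -> None:
    """Distribution service: hash the packet (e.g., by flow) to one stack."""
    queues[hash(packet) % len(queues)].put(packet)

if __name__ == "__main__":
    queues = [mp.Queue() for _ in range(M)]
    workers = [mp.Process(target=protocol_stack, args=(i, q))
               for i, q in enumerate(queues)]
    for w in workers:
        w.start()
    for pkt in (b"pkt-a", b"pkt-b", b"pkt-c", b"pkt-d"):
        dispatch(pkt, queues)                    # stands in for the NIC input path
    for q in queues:
        q.put(None)                              # stop each stack
    for w in workers:
        w.join()
```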

    12.
    Invention grant
    Cooperative caching method and apparatus (in force)

    Publication (Announcement) No.: US09532083B2

    Publication (Announcement) Date: 2016-12-27

    Application No.: US13907503

    Application Date: 2013-05-31

    Abstract: Embodiments of the present invention disclose a cooperative caching method and apparatus in the field of network technologies, intended to improve the local hit ratio without increasing local server costs. The technical solution includes: obtaining, according to cache information, the end-to-end delay between a local server and a neighbor server, and popularity in a cache list, a consolidated gain value of a cached video segment and a consolidated gain value of a candidate video segment on the local server; and replacing the cached video segment with the candidate video segment when the two consolidated gain values meet a replacement condition.
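
    The abstract does not give the exact gain formula, so the sketch below only illustrates the replacement logic; the assumed gain (popularity weighted by the end-to-end delay saved by caching locally) and all names are placeholders, not the patented definition:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    name: str
    popularity: float         # popularity of the segment in the cache list
    neighbor_delay_ms: float  # end-to-end delay to the neighbor server holding it

def consolidated_gain(seg: Segment) -> float:
    """Assumed consolidated gain: delay saved weighted by popularity.
    (Placeholder formula; the patent defines its own gain value.)"""
    return seg.popularity * seg.neighbor_delay_ms

def maybe_replace(cached: Segment, candidate: Segment) -> Segment:
    """Replace the cached segment when the candidate's gain is higher."""
    if consolidated_gain(candidate) > consolidated_gain(cached):
        return candidate      # candidate is now cached locally
    return cached

cached = Segment("video-seg-17", popularity=0.2, neighbor_delay_ms=40.0)
candidate = Segment("video-seg-42", popularity=0.6, neighbor_delay_ms=35.0)
print(maybe_replace(cached, candidate).name)   # -> video-seg-42
```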

    13.
    Invention application
    Parameter Inference Method, Calculation Apparatus, and System Based on Latent Dirichlet Allocation Model (in force)

    Publication (Announcement) No.: US20140129510A1

    Publication (Announcement) Date: 2014-05-08

    Application No.: US14153257

    Application Date: 2014-01-13

    CPC classification number: G06N5/048 G06F17/30011 G06K9/6218

    Abstract: A parameter inference method is provided to address the poor precision of a Latent Dirichlet Allocation model. The method includes: calculating a Latent Dirichlet Allocation model according to a preset initial first hyperparameter, a preset initial second hyperparameter, a preset initial number of topics, a preset initial count matrix of documents and topics, and a preset initial count matrix of topics and words, to obtain probability distributions; obtaining the number of topics, a first hyperparameter, and a second hyperparameter that maximize the log likelihood functions of the probability distributions; and determining whether the number of topics, the first hyperparameter, and the second hyperparameter converge, and if not, feeding them back into the Latent Dirichlet Allocation model until an optimal number of topics, an optimal first hyperparameter, and an optimal second hyperparameter that maximize the log likelihood functions of the probability distributions are obtained.
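
    A minimal sketch of the outer search loop described in the abstract, assuming a placeholder log_likelihood(num_topics, alpha, beta) that stands in for fitting the LDA model and evaluating the log likelihood of the resulting distributions; the local grid search over candidates is an illustrative choice, not the patented inference procedure:

```python
import itertools

def log_likelihood(num_topics: int, alpha: float, beta: float) -> float:
    """Placeholder: fit the LDA model with these settings and return the
    log likelihood of the resulting probability distributions.  A real
    implementation would run Gibbs sampling or variational inference here."""
    # Toy surrogate so the sketch runs: peaks near K=20, alpha=0.5, beta=0.1.
    return -((num_topics - 20) ** 2) - 10 * (alpha - 0.5) ** 2 - 10 * (beta - 0.1) ** 2

def infer_parameters(k0=10, alpha0=1.0, beta0=1.0, tol=1e-6, max_iter=50):
    """Iterate until the topic number and hyperparameters stop improving."""
    k, alpha, beta = k0, alpha0, beta0
    best = log_likelihood(k, alpha, beta)
    for _ in range(max_iter):
        # Search a local candidate grid for settings that raise the likelihood.
        candidates = itertools.product(
            [max(1, k - 5), k, k + 5],
            [alpha / 2, alpha, alpha * 2],
            [beta / 2, beta, beta * 2],
        )
        (k_new, alpha_new, beta_new), ll = max(
            ((c, log_likelihood(*c)) for c in candidates), key=lambda t: t[1]
        )
        if ll - best < tol:          # converged: no further improvement
            break
        k, alpha, beta, best = k_new, alpha_new, beta_new, ll
    return k, alpha, beta, best

print(infer_parameters())
```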
