METHOD AND APPARATUS FOR ALLOCATING COMPUTING TASK OF NEURAL NETWORK IN HETEROGENEOUS RESOURCES, AND DEVICE

    Publication No.: US20240311193A1

    Publication Date: 2024-09-19

    Application No.: US18571650

    Filing Date: 2022-04-28

    IPC Class: G06F9/50

    CPC Class: G06F9/5027 G06F2209/5017

    Abstract: A method and apparatus for allocating a computing task of a neural network in heterogeneous resources, a computer device, and a storage medium. The method includes: acquiring task information of the computing task of the neural network and resource information of the heterogeneous resources; determining, according to the task information and the resource information, allocation modes for allocating each subtask to the heterogeneous resources for execution and the task processing cost corresponding to each allocation mode; constructing a directed acyclic graph from the allocation modes and task processing costs; obtaining, for each allocation path of the directed acyclic graph, a value of a loss function from the task processing costs of the subtasks on that path; and selecting a target allocation path according to the loss-function values.
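    The path-selection step in this abstract can be sketched in Python. This is a minimal illustration only: the subtask and resource names, the cost table, and the use of exhaustive path enumeration (rather than a graph shortest-path algorithm over the directed acyclic graph) are assumptions, not the patented method. The loss of a path is taken here to be the sum of the per-subtask processing costs.

    ```python
    # Hypothetical sketch: pick the allocation path with the minimum loss.
    # Each "path" assigns one resource to each subtask, in subtask order.
    import itertools


    def select_target_allocation_path(subtasks, resources, cost):
        """Enumerate candidate allocation paths, score each by the sum of
        its subtask processing costs (the loss-function value), and return
        the path with the minimum loss together with that loss."""
        best_path, best_loss = None, float("inf")
        for path in itertools.product(resources, repeat=len(subtasks)):
            loss = sum(cost[(task, res)] for task, res in zip(subtasks, path))
            if loss < best_loss:
                best_path, best_loss = path, loss
        return best_path, best_loss
    ```

    With two subtasks and two resource types, a cost table such as `{("conv1", "GPU"): 2.0, ...}` yields the cheapest assignment per subtask; a real implementation would instead run a shortest-path search over the constructed directed acyclic graph.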

    NATURAL LANGUAGE PROCESSING METHOD AND APPARATUS, DEVICE, AND READABLE STORAGE MEDIUM

    Publication No.: US20240330711A1

    Publication Date: 2024-10-03

    Application No.: US18699231

    Filing Date: 2022-06-30

    IPC Class: G06N5/02

    CPC Class: G06N5/02

    Abstract: Embodiments of the present application disclose a natural language processing method and apparatus, a device, and a readable storage medium. The method includes: obtaining a target sentence to be processed, and determining each first entity in the target sentence; for each first entity, in response to the first entity being present in a preset entity set, determining, in the preset entity set, a second entity with the maximum correlation to the first entity, generating extended information based on the determined second entity, and adding the extended information after the location of the first entity in the target sentence, to obtain an updated target sentence, where the second entity is any entity in the preset entity set other than the first entity; and inputting the updated target sentence to a Bidirectional Encoder Representations from Transformers (BERT) model, such that the BERT model performs a natural language processing task.
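    The entity-extension step before the BERT model can be sketched as follows. Everything here is an illustrative assumption: whitespace tokenization as entity detection, a dictionary of pairwise correlation scores, and a parenthetical form for the extended information; the actual method of generating extended information is defined by the patent, not by this sketch.

    ```python
    # Hypothetical sketch: insert extended information after each entity
    # that appears in the preset entity set, using the other preset entity
    # with the maximum correlation score as the "second entity".
    def extend_sentence(target_sentence, entity_set, correlation):
        """Return an updated sentence in which each recognized first entity
        is followed by extended information derived from the second entity
        with the maximum correlation to it."""
        out = []
        for token in target_sentence.split():
            out.append(token)
            if token in entity_set:
                # Second entity: any other entity in the preset set,
                # chosen by maximum correlation with the first entity.
                second = max(
                    (e for e in entity_set if e != token),
                    key=lambda e: correlation.get((token, e), 0.0),
                )
                out.append(f"({second})")  # assumed parenthetical form
        return " ".join(out)
    ```

    The updated sentence would then be fed to a BERT model for the downstream natural language processing task; that step is omitted here since the abstract does not specify the task.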