-
11.
Publication No.: US20250094739A1
Publication Date: 2025-03-20
Application No.: US18968830
Filing Date: 2024-12-04
Inventor: Zhongjun He, Hua Wu, Haifeng Wang
Abstract: An information processing method. The method includes obtaining a first bilingual sentence pair, in which the first bilingual sentence pair comprises a source language sentence and a target language sentence; and obtaining a distilled second bilingual sentence pair by distilling a first language sentence in the first bilingual sentence pair based on a large language model (LLM), in which the first language sentence is the source language sentence or the target language sentence.
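The distillation step described above can be sketched as a small helper. This is a minimal illustration, not the patent's implementation: the function name, the `(source, target)` tuple layout, and the toy callable standing in for the large language model are all assumptions.

```python
def distill_pair(pair, llm):
    """Replace one side of a bilingual pair with its LLM-distilled rewrite.

    `pair` is (source_sentence, target_sentence); `llm` is any callable
    mapping a sentence to a distilled version of it.
    """
    source, target = pair
    # Distill the target-language side here; the method allows either side.
    return (source, llm(target))

# Toy stand-in for an LLM: merely normalizes whitespace in the sentence.
toy_llm = lambda s: " ".join(s.split())
distilled = distill_pair(("Hello world", "Bonjour   le  monde"), toy_llm)
print(distilled)
```

A real pipeline would prompt an actual model to rewrite or simplify the sentence; the structure of the call stays the same.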
-
12.
Publication No.: US12210982B2
Publication Date: 2025-01-28
Application No.: US17872318
Filing Date: 2022-07-25
Inventor: Wenbin Jiang, Yajuan Lyu, Yong Zhu, Hua Wu, Haifeng Wang
Abstract: The present disclosure provides a method for processing intelligent question-answering, an intelligent question-answering system, an electronic device and a storage medium, and relates to the field of artificial intelligence technologies, such as machine learning technologies, natural language processing technologies, or the like. An implementation includes: acquiring an input question and input data information; and based on the question, the data information and a plurality of knowledge bases, deciding an answer to the question by multilayer appreciation using a plurality of understanding module layers.
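The layered decision process can be sketched as a chain of understanding modules that each refine a shared state. All function names and the two toy layers below are illustrative assumptions, not the patent's modules.

```python
def decide_answer(question, data, knowledge_bases, layers):
    """Pass question state through stacked understanding layers to decide an answer."""
    state = {"question": question, "data": data, "candidates": []}
    for layer in layers:                      # each layer refines the state
        state = layer(state, knowledge_bases)
    return state["candidates"][0] if state["candidates"] else None

def retrieve(state, kbs):
    # First layer: pull candidate answers whose key appears in the question.
    q = state["question"]
    state["candidates"] = [v for kb in kbs for k, v in kb.items() if k in q]
    return state

def rank(state, kbs):
    # Second layer: order candidates (a trivial stand-in for deeper understanding).
    state["candidates"].sort(key=len)
    return state

kbs = [{"capital of France": "Paris", "capital of Japan": "Tokyo"}]
print(decide_answer("What is the capital of France?", None, kbs, [retrieve, rank]))
```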
-
13.
Publication No.: US12210956B2
Publication Date: 2025-01-28
Application No.: US18074853
Filing Date: 2022-12-05
Inventor: Ruiqing Zhang, Hui Liu, Zhongjun He, Zhi Li, Hua Wu
Abstract: The present disclosure provides a translation method and apparatus, an electronic device, and a non-transitory storage medium. An implementation includes: determining an encoded feature of a sentence to be translated by an encoding module; determining, by a graph network module, a knowledge fusion feature of the sentence to be translated based on a preset graph network, wherein the preset graph network is constructed based on a polysemous word in a source language corresponding to the sentence to be translated and a plurality of translated words corresponding to the polysemous word in a target language; determining, by a decoding network, a translated sentence corresponding to the sentence to be translated based on the encoded feature and the knowledge fusion feature.
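The knowledge fusion idea can be illustrated with a dict standing in for the preset graph network: each polysemous source word maps to its candidate target-language translations, and the "feature" collected for a sentence is the set of candidates for the polysemous words it contains. Names and data are assumptions for illustration only.

```python
def knowledge_fusion_feature(sentence, polysemy_graph):
    """Collect target-language candidates for each polysemous word in the sentence.

    `polysemy_graph` maps a source-language polysemous word to its candidate
    translations; a real system would encode this graph with a graph network.
    """
    return {w: polysemy_graph[w] for w in sentence.split() if w in polysemy_graph}

graph = {"bank": ["banque", "rive"], "bat": ["chauve-souris", "batte"]}
print(knowledge_fusion_feature("the bank of the river", graph))
```

The decoder would then combine this knowledge feature with the encoder's output to pick the sense-appropriate translation.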
-
14.
Publication No.: US12197882B2
Publication Date: 2025-01-14
Application No.: US17885152
Filing Date: 2022-08-10
Inventor: Ruiqing Zhang, Xiyang Wang, Zhongjun He, Zhi Li, Hua Wu
IPC: G06F40/58
Abstract: A translation method, an electronic device and a storage medium are disclosed, which relate to the field of artificial intelligence technologies, such as machine learning technologies and information processing technologies. An implementation includes: acquiring an intermediate translation result generated by each of multiple pre-trained translation models for a to-be-translated specified sentence in a same iteration of a translation process, so as to obtain multiple intermediate translation results; acquiring a co-occurrence word based on the multiple intermediate translation results; and acquiring a target translation result of the specified sentence based on the co-occurrence word.
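The co-occurrence step can be sketched as counting which words appear across the intermediate results of the different models. The function name, the default "present in every result" threshold, and the toy data are assumptions, not the patent's exact procedure.

```python
from collections import Counter

def cooccurrence_words(intermediate_results, min_models=None):
    """Words shared across the intermediate translations of multiple models."""
    if min_models is None:
        min_models = len(intermediate_results)  # default: present in every result
    counts = Counter(w for r in intermediate_results for w in set(r.split()))
    return {w for w, c in counts.items() if c >= min_models}

results = ["the cat sat", "a cat sat down", "the cat has sat"]
print(sorted(cooccurrence_words(results)))
```

Words that all models agree on are high-confidence anchors; the final translation is then built around them.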
-
15.
Publication No.: US20250094722A1
Publication Date: 2025-03-20
Application No.: US18968920
Filing Date: 2024-12-04
Inventor: Dai DAI, Hua Wu, Gangqiang Hu
IPC: G06F40/30
Abstract: An annotation method for a large language model, an electronic device, and a medium are provided. The method may include: obtaining a plurality of response texts that are generated by a large language model for a request text and that meet a difference requirement; obtaining a plurality of scores corresponding to the plurality of response texts, where each of the plurality of scores indicates a degree to which a corresponding response text in the plurality of response texts matches the request text; and obtaining an annotated text for at least one of the plurality of response texts based on the plurality of scores, where the annotated text is used to adjust a parameter of the large language model.
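One way to read the scoring-then-annotation flow is as building a preference pair from the best- and worst-matching responses. The function name, the `(chosen, rejected)` pairing scheme, and the length-based scorer below are all illustrative assumptions.

```python
def annotate_responses(responses, scorer):
    """Score diverse responses and keep the best and worst as an annotation pair.

    `scorer(response)` returns how well a response matches the request; the
    resulting pair could drive preference-style parameter tuning of the model.
    """
    scored = sorted(responses, key=scorer, reverse=True)
    return {"chosen": scored[0], "rejected": scored[-1]}

responses = ["Paris is the capital of France.", "I don't know.", "Paris."]
print(annotate_responses(responses, scorer=len))
```

In practice the scorer would be a reward model or human rating rather than string length.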
-
16.
Publication No.: US12131728B2
Publication Date: 2024-10-29
Application No.: US17828773
Filing Date: 2022-05-31
Inventor: Siyu Ding, Chao Pang, Shuohuan Wang, Yanbin Zhao, Junyuan Shang, Yu Sun, Shikun Feng, Hao Tian, Hua Wu, Haifeng Wang
CPC classification number: G10L15/063, G10L15/02, G10L15/18
Abstract: The present application provides a method of training a natural language processing model, which relates to the field of artificial intelligence, and in particular to the field of natural language processing. A specific implementation scheme includes: performing semantic learning for multi-tasks on an input text, so as to obtain a semantic feature for the multi-tasks, wherein the multi-tasks include a plurality of branch tasks; performing feature learning for each branch task based on the semantic feature, so as to obtain a first output result for each branch task; calculating a loss for each branch task according to the first output result for the branch task; and adjusting a parameter of the natural language processing model according to the loss for each branch task. The present application further provides a method of processing a natural language, an electronic device, and a storage medium.
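The shared-feature, per-branch-loss scheme can be shown with a deliberately tiny scalar model. Every name and the single-float "parameters" are assumptions standing in for the shared encoder and branch heads the abstract describes.

```python
def multitask_step(x, shared, branches, targets, lr=0.1):
    """One training step: shared semantic feature, then one loss per branch task."""
    h = shared["w"] * x                           # shared semantic feature
    total_loss = 0.0
    for name, branch in branches.items():
        y = branch["w"] * h                       # branch-specific feature learning
        err = y - targets[name]
        total_loss += err ** 2                    # per-branch squared loss
        branch["w"] -= lr * 2 * err * h           # adjust branch parameter by its loss
    return total_loss

shared = {"w": 1.0}
branches = {"task_a": {"w": 0.5}, "task_b": {"w": 1.5}}
targets = {"task_a": 2.0, "task_b": 1.0}
losses = [multitask_step(1.0, shared, branches, targets) for _ in range(20)]
print(losses[0] > losses[-1])  # total loss shrinks as branch parameters adjust
```

A real implementation would backpropagate through the shared encoder as well; the sketch only updates the branch parameters to keep the loop readable.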
-
17.
Publication No.: US12019990B2
Publication Date: 2024-06-25
Application No.: US17124030
Filing Date: 2020-12-16
Inventor: Haifeng Wang, Wenbin Jiang, Yajuan Lv, Yong Zhu, Hua Wu
IPC: G06N20/00, G06F18/214, G06F18/2413, G06F40/279, G06F40/30, G06N5/022
CPC classification number: G06F40/30, G06F18/214, G06F18/24147, G06F40/279, G06N5/022
Abstract: The present application discloses a text processing method and device based on natural language processing and a knowledge graph, and relates to the deep learning field of artificial intelligence technology. A specific implementation is: an electronic device uses a joint learning model to obtain a semantic representation, where the semantic representation is obtained by the joint learning model by combining knowledge graph representation learning with natural language representation learning. Compared to using only knowledge graph representation learning or only natural language representation learning to learn the semantic representation of a prediction object, the joint learning model considers more factors and considers them more comprehensively, so the accuracy of the semantic representation can be improved, and thus the accuracy of text processing can be improved.
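The simplest form of combining the two representations is concatenation, sketched below. The dict lookup tables are toy stand-ins for the learned knowledge graph and natural language representation models; nothing here is the patent's actual architecture.

```python
def joint_representation(entity, kg_embedding, text_embedding):
    """Joint semantic representation: KG vector concatenated with text vector."""
    return kg_embedding[entity] + text_embedding[entity]  # list concatenation

kg = {"Paris": [0.1, 0.2]}    # toy knowledge graph embeddings
text = {"Paris": [0.9]}       # toy natural language embeddings
print(joint_representation("Paris", kg, text))  # → [0.1, 0.2, 0.9]
```

Because the joint vector carries signal from both sources, a downstream classifier sees more factors than either representation alone would provide.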