METHOD FOR LARGE-SCALE DISTRIBUTED MACHINE LEARNING USING FORMAL KNOWLEDGE AND TRAINING DATA

    Publication Number: US20190385087A1

    Publication Date: 2019-12-19

    Application Number: US16480625

    Filing Date: 2019-01-17

    IPC Classification: G06N20/20 G06N5/04

    Abstract: A method for large-scale distributed machine learning using input data comprising formal knowledge and/or training data. The method consists of independently calculating discrete algebraic models of the input data on one or many computing devices, and of sharing indecomposable components of those algebraic models among the computing devices without constraints on when, or how many times, the sharing takes place. The method uses asynchronous communication among machines or computing threads, each working on the same or related learning tasks. Each computing device improves its algebraic model whenever it receives new input data or components shared by other computing devices, thereby providing a solution to the scaling-up problem of machine learning systems.

    Method for large-scale distributed machine learning using formal knowledge and training data

    Publication Number: US11521133B2

    Publication Date: 2022-12-06

    Application Number: US16480625

    Filing Date: 2019-01-17

    IPC Classification: G06N5/04 G06N20/20

    Abstract: A method for large-scale distributed machine learning using input data comprising formal knowledge and/or training data. The method consists of independently calculating discrete algebraic models of the input data on one or many computing devices, and of sharing indecomposable components of those algebraic models among the computing devices without constraints on when, or how many times, the sharing takes place. The method uses asynchronous communication among machines or computing threads, each working on the same or related learning tasks. Each computing device improves its algebraic model whenever it receives new input data or components shared by other computing devices, thereby providing a solution to the scaling-up problem of machine learning systems.
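
    Both records describe the same asynchronous share-and-merge scheme: each device improves a local algebraic model from its own input data and exchanges indecomposable components with peers on no fixed schedule. The Python sketch below illustrates only that communication pattern, not the patented algorithm itself; the model representation (a set of frozensets standing in for algebraic components), the helper derive_components, the inboxes queues, and the sharing probability are all illustrative assumptions.

        import queue
        import random
        import threading

        NUM_WORKERS = 3

        # One inbox per worker; peers drop shared components into it at any time.
        inboxes = [queue.Queue() for _ in range(NUM_WORKERS)]


        def derive_components(example):
            """Hypothetical stand-in for computing the indecomposable components
            of a discrete algebraic model from one piece of input data."""
            return {frozenset(example)}


        def worker(wid, data):
            model = set()  # local algebraic model, abstracted as a set of components
            for example in data:
                # Improve the local model whenever new input data arrives.
                model |= derive_components(example)

                # Merge any components shared by peers; arrival is asynchronous,
                # so the inbox is simply drained whenever the worker checks it.
                try:
                    while True:
                        model |= inboxes[wid].get_nowait()
                except queue.Empty:
                    pass

                # Share a few components with a random peer; the abstract places
                # no constraint on when or how many times this happens.
                if model and random.random() < 0.5:
                    peer = random.choice([i for i in range(NUM_WORKERS) if i != wid])
                    sample = random.sample(list(model), k=min(2, len(model)))
                    inboxes[peer].put(set(sample))
            print(f"worker {wid} finished with {len(model)} components")


        # Toy input: each worker sees a different slice of the learning task.
        datasets = [[(wid, step) for step in range(20)] for wid in range(NUM_WORKERS)]
        threads = [threading.Thread(target=worker, args=(wid, datasets[wid]))
                   for wid in range(NUM_WORKERS)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

    Because each worker merges whatever has arrived and shares whenever convenient, no global barrier or fixed communication schedule is needed, which is the property the abstract credits for solving the scaling-up problem.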