BIT WIDTH SELECTION FOR FIXED POINT NEURAL NETWORKS
    2.
    Invention application (under examination, published)

    Publication number: WO2016182659A1

    Publication date: 2016-11-17

    Application number: PCT/US2016/026944

    Filing date: 2016-04-11

    CPC classification number: G06N3/08 G06F17/11 G06N3/063 G06N3/10

    Abstract: A method for selecting bit widths for a fixed point machine learning model includes evaluating the sensitivity of model accuracy to bit widths at each computational stage of the model. The method also includes selecting a bit width for parameters and/or intermediate calculations in the computational stages of the model. The bit width for the parameters and the bit width for the intermediate calculations may be different. The selected bit width may be determined based on the sensitivity evaluation.

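The selection loop described in the abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions: a toy one-stage "model" (a dot product scored by sign agreement) and a uniform symmetric fixed-point quantizer. The names `quantize`, `accuracy`, and `select_bit_width` are hypothetical, not the claimed method.

```python
def quantize(values, bit_width, scale=1.0):
    """Round values onto a signed fixed-point grid with the given bit width."""
    levels = 2 ** (bit_width - 1)
    step = scale / levels
    return [max(-scale, min(scale - step, round(v / step) * step)) for v in values]

def accuracy(weights, inputs, targets):
    """Toy accuracy: fraction of inputs whose dot-product sign matches the target."""
    correct = 0
    for x, t in zip(inputs, targets):
        y = sum(w * xi for w, xi in zip(weights, x))
        correct += ((y >= 0) == t)
    return correct / len(targets)

def select_bit_width(weights, inputs, targets, candidate_widths, tolerance=0.02):
    """Pick the smallest bit width whose accuracy drop stays within tolerance."""
    baseline = accuracy(weights, inputs, targets)
    for bw in sorted(candidate_widths):
        acc = accuracy(quantize(weights, bw), inputs, targets)
        if baseline - acc <= tolerance:  # sensitivity at this width is acceptable
            return bw
    return max(candidate_widths)
```

In a multi-stage model the same sweep would run once per computational stage, and, as the abstract notes, parameters and intermediate calculations could end up with different widths.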

    TRANSFER LEARNING IN NEURAL NETWORKS
    5.
    Invention application (under examination, published)

    Publication number: WO2017052709A2

    Publication date: 2017-03-30

    Application number: PCT/US2016/039661

    Filing date: 2016-06-27

    CPC classification number: G06N3/08 G06K9/4628 G06N3/0454

    Abstract: A method of transfer learning includes receiving second data and generating, via a first network, second labels for the second data. In one configuration, the first network has been previously trained on first labels for first data. Additionally, the second labels are generated for training a second network.

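As a rough illustration of this flow (a sketch, not the claimed method): a previously trained first network generates labels for the second data, and those generated labels train a second network. Both "networks" below are one-parameter threshold classifiers, purely illustrative stand-ins for real models.

```python
def teacher_predict(x, threshold=0.5):
    """First network, assumed already trained: labels inputs by a fixed threshold."""
    return 1 if x >= threshold else 0

def generate_labels(second_data):
    """Use the first network to produce second labels for the second data."""
    return [teacher_predict(x) for x in second_data]

def train_student(second_data, labels):
    """Fit a one-parameter second network: midpoint between the two classes."""
    pos = [x for x, y in zip(second_data, labels) if y == 1]
    neg = [x for x, y in zip(second_data, labels) if y == 0]
    return (min(pos) + max(neg)) / 2  # learned decision threshold

second_data = [0.1, 0.2, 0.4, 0.6, 0.8, 0.9]
labels = generate_labels(second_data)              # generated, not ground truth
student_threshold = train_student(second_data, labels)
```

The point of the configuration is that the second network never needs the first data or its original labels; it learns only from labels the first network generates.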

    EVALUATION OF A SYSTEM INCLUDING SEPARABLE SUB-SYSTEMS OVER A MULTIDIMENSIONAL RANGE
    6.
    Invention application (under examination, published)

    Publication number: WO2015065738A2

    Publication date: 2015-05-07

    Application number: PCT/US2014/061220

    Filing date: 2014-10-17

    Abstract: An artificial neural network may be configured to test the impact of certain input parameters. To improve testing efficiency and to avoid test runs that may not alter system performance, the effect of input parameters on neurons or groups of neurons may be determined to classify the neurons into groups based on the impact of certain parameters on those groups. Groups may be ordered serially and/or in parallel based on the interconnected nature of the groups and whether the output of neurons in one group may affect the operation of another. Parameters not affecting group performance may be pruned as inputs to that particular group prior to running system tests, thereby conserving processing resources during testing.

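A minimal sketch of the pruning idea above, under illustrative assumptions: each neuron group is modeled as a plain function of the input parameters, and a parameter is pruned for a group if sweeping it never changes that group's output. The function and parameter ranges are hypothetical examples, not the claimed system.

```python
def affects_group(group_fn, param_index, param_values, baseline_params):
    """Check whether varying one parameter ever changes the group's output."""
    base_out = group_fn(baseline_params)
    for v in param_values:
        trial = list(baseline_params)
        trial[param_index] = v
        if group_fn(trial) != base_out:
            return True
    return False

def prune_parameters(group_fn, param_ranges, baseline_params):
    """Return indices of parameters that actually influence this group."""
    return [i for i, values in enumerate(param_ranges)
            if affects_group(group_fn, i, values, baseline_params)]

# Example group that depends only on parameters 0 and 2; parameter 1 is pruned,
# so test runs sweeping it can be skipped for this group.
group = lambda p: p[0] + 2 * p[2]
relevant = prune_parameters(group, [[0, 1], [0, 1], [0, 1]], [0, 0, 0])
```

Running this screen per group before the full multidimensional sweep shrinks the test space, which is the resource saving the abstract describes.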

    DISTRIBUTED MODEL LEARNING
    9.
    Invention application (under examination, published)

    Publication number: WO2015175155A1

    Publication date: 2015-11-19

    Application number: PCT/US2015/026617

    Filing date: 2015-04-20

    CPC classification number: G06N99/005 G06N3/08

    Abstract: A method of learning a model includes receiving model updates from one or more users. The method also includes computing an updated model based on a previous model and the model updates. The method further includes transmitting data related to a subset of the updated model to the user(s) based on the updated model.

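A hedged sketch of the loop in this abstract: a server receives model updates from users, folds them into the previous model, and transmits only a subset of the updated model back. The averaging rule and function names are illustrative assumptions, not the claimed method.

```python
def aggregate(previous_model, user_updates):
    """Compute the updated model: previous weights plus the mean user update."""
    n = len(user_updates)
    return [w + sum(u[i] for u in user_updates) / n
            for i, w in enumerate(previous_model)]

def subset_for_user(updated_model, indices):
    """Transmit only the requested slice of the updated model to a user."""
    return {i: updated_model[i] for i in indices}

previous = [0, 10, 20]
updates = [[2, -4, 0], [0, 4, 2]]      # one update vector per user
updated = aggregate(previous, updates)  # [1.0, 10.0, 21.0]
payload = subset_for_user(updated, [0, 2])
```

Sending each user only the model subset it needs (rather than the full updated model) is what keeps the per-user transmission cost down in this scheme.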
