Decompression and compression of neural network data using different compression schemes

    Publication Number: US11537853B1

    Publication Date: 2022-12-27

    Application Number: US16455258

    Filing Date: 2019-06-27

    Abstract: Described herein is a neural network accelerator (NNA) with a decompression unit that can be configured to perform multiple types of decompression. The decompression unit may include a separate subunit for each decompression type. The subunits can be coupled to form a pipeline in which partially decompressed results generated by one subunit are input for further decompression by another subunit. Depending on which types of compression were applied to the incoming data, any number of the subunits may be used to produce a decompressed output. In some embodiments, the decompression unit is configured to decompress data that has been compressed using a zero value compression scheme, a shared value compression scheme, or both. The NNA can also include a compression unit implemented in a manner similar to that of the decompression unit.
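
    The pipeline described in this abstract lends itself to a short sketch. Below is a minimal Python model of a two-stage decompression pipeline, assuming a bitmask-based zero value scheme and a codebook-based shared value scheme; the encodings, field names, and data layout are illustrative assumptions, not details taken from the patent claims.

        # Minimal model of a decompression pipeline with one subunit per scheme.
        # Bitmask and codebook encodings are assumptions for illustration only.
        from dataclasses import dataclass
        from typing import List, Optional

        @dataclass
        class CompressedBlock:
            payload: List[float]                       # compressed words (codebook indices if shared value compressed)
            codebook: Optional[List[float]] = None     # present if shared value compression was applied
            zero_bitmask: Optional[List[bool]] = None  # one flag per output element; present if zero value compressed

        def decode_shared_values(indices, codebook):
            # Shared value subunit: replace each index with the value it refers to.
            return [codebook[int(i)] for i in indices]

        def expand_zeros(nonzeros, bitmask):
            # Zero value subunit: re-insert zeros where the bitmask marks absent elements.
            it = iter(nonzeros)
            return [next(it) if present else 0.0 for present in bitmask]

        def decompress(block: CompressedBlock) -> List[float]:
            # Each subunit runs only if its scheme was applied to the incoming data;
            # the partially decompressed output of one stage feeds the next.
            data = block.payload
            if block.codebook is not None:
                data = decode_shared_values(data, block.codebook)
            if block.zero_bitmask is not None:
                data = expand_zeros(data, block.zero_bitmask)
            return list(data)

    For example, decompress(CompressedBlock(payload=[2, 0, 1], codebook=[0.5, 1.0, -1.0], zero_bitmask=[True, False, False, True, True])) returns [-1.0, 0.0, 0.0, 0.5, 1.0], exercising both subunits in sequence.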

    Architectures and topologies for vehicle-based, voice-controlled devices

    Publication Number: US10540970B2

    Publication Date: 2020-01-21

    Application Number: US15838878

    Filing Date: 2017-12-12

    Abstract: This disclosure describes, in part, techniques for implementing voice-enabled devices in vehicle environments to facilitate voice interaction with vehicle computing devices. Because existing vehicle computing devices have differing communication capabilities, the techniques described herein cover different communication topologies for facilitating voice interaction with them. In some examples, the voice-enabled device may be communicatively coupled to a user device, which in turn may communicate with a remote speech-processing system to determine and perform operations responsive to voice commands, such as conducting phone calls over the loudspeakers of the vehicle computing device, streaming music to the vehicle computing device, and so forth. In this way, the communication topologies between the voice-enabled device, the vehicle computing device, and the user device provide voice control of vehicle computing devices that might otherwise not be controllable by voice.
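
    As a hedged illustration of the topology-selection idea in this abstract, the sketch below chooses a communication topology from a head unit's capabilities. The capability fields, topology names, and selection rules are hypothetical, invented for illustration; the patent does not specify them.

        # Hypothetical model of picking a communication topology based on a
        # vehicle head unit's capabilities; names and rules are illustrative only.
        from dataclasses import dataclass
        from enum import Enum, auto

        class Topology(Enum):
            DIRECT_BLUETOOTH = auto()  # voice-enabled device paired directly with the head unit
            WIRED_AUX = auto()         # audio routed over an analog AUX connection
            PHONE_RELAY = auto()       # the user's phone bridges audio and the remote speech service

        @dataclass
        class HeadUnitCapabilities:
            bluetooth_a2dp: bool  # can stream stereo audio over Bluetooth
            bluetooth_hfp: bool   # supports the hands-free calling profile
            aux_input: bool       # has an analog line-in

        def select_topology(caps: HeadUnitCapabilities) -> Topology:
            # Prefer the richest channel the vehicle device supports, falling back
            # to relaying everything through the user's phone.
            if caps.bluetooth_a2dp and caps.bluetooth_hfp:
                return Topology.DIRECT_BLUETOOTH
            if caps.aux_input:
                return Topology.WIRED_AUX
            return Topology.PHONE_RELAY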

    Neural network accelerator with reconfigurable memory

    Publication Number: US12169786B1

    Publication Date: 2024-12-17

    Application Number: US16455334

    Filing Date: 2019-06-27

    Abstract: Described herein is a neural network accelerator (NNA) with reconfigurable memory resources for forming a set of local memory buffers comprising at least one activation buffer, at least one weight buffer, and at least one output buffer. The NNA supports a plurality of predefined memory configurations that are optimized to maximize throughput and reduce overall power consumption across different types of neural networks. The memory configurations differ with respect to at least one of a total amount of activation, weight, or output buffer memory, or a total number of activation, weight, or output buffers. A memory configuration can then be selected based on which type of neural network is being executed and the memory behavior of that specific network.
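
    A minimal sketch of the configuration-selection idea follows, assuming three predefined configurations carved from one shared SRAM budget; the sizes, buffer counts, and selection rule are illustrative assumptions, not values from the patent.

        # Illustrative model of predefined local-buffer memory configurations;
        # all sizes, counts, and the selection rule are assumptions.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class MemoryConfig:
            activation_kb: int   # total activation buffer memory
            weight_kb: int       # total weight buffer memory
            output_kb: int       # total output buffer memory
            num_activation: int  # number of activation buffers
            num_weight: int      # number of weight buffers
            num_output: int      # number of output buffers

        CONFIGS = {
            # Convolutional layers reuse weights heavily, so favor activation memory.
            "conv_heavy": MemoryConfig(256, 128, 128, 2, 1, 1),
            # Fully connected and recurrent layers stream large weight matrices.
            "weight_heavy": MemoryConfig(128, 256, 128, 1, 2, 1),
            # Balanced default when the network's memory behavior is unknown.
            "balanced": MemoryConfig(176, 176, 160, 1, 1, 1),
        }

        def select_config(network_type: str) -> MemoryConfig:
            # Map the type of network being executed to a predefined configuration.
            if network_type == "cnn":
                return CONFIGS["conv_heavy"]
            if network_type in ("mlp", "rnn", "lstm"):
                return CONFIGS["weight_heavy"]
            return CONFIGS["balanced"]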

    Decompression and compression of neural network data using different compression schemes

    Publication Number: US11868867B1

    Publication Date: 2024-01-09

    Application Number: US17989340

    Filing Date: 2022-11-17

    CPC classification number: G06N3/048 G06N5/04 H03M7/42 H03M7/702

    Abstract: Described herein is a neural network accelerator (NNA) with a decompression unit that can be configured to perform multiple types of decompression. The decompression unit may include a separate subunit for each decompression type. The subunits can be coupled to form a pipeline in which partially decompressed results generated by one subunit are input for further decompression by another subunit. Depending on which types of compression were applied to the incoming data, any number of the subunits may be used to produce a decompressed output. In some embodiments, the decompression unit is configured to decompress data that has been compressed using a zero value compression scheme, a shared value compression scheme, or both. The NNA can also include a compression unit implemented in a manner similar to that of the decompression unit.
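
    This continuation shares its abstract with US11537853B1 above, so rather than repeat the decompression sketch, here is the complementary compression direction: a hedged Python sketch that applies zero value compression and then, optionally, shared value compression. The bitmask-plus-codebook encoding is again an assumption the patent text does not spell out.

        # Companion to the decompression sketch above: the compression path.
        # Bitmask and codebook encodings remain illustrative assumptions.
        from typing import List, Optional, Tuple

        def compress_zeros(values: List[float]) -> Tuple[List[float], List[bool]]:
            # Zero value scheme: keep only the nonzeros plus a presence bitmask.
            bitmask = [v != 0.0 for v in values]
            nonzeros = [v for v in values if v != 0.0]
            return nonzeros, bitmask

        def compress_shared(values: List[float], codebook: List[float]) -> List[int]:
            # Shared value scheme: map each value to its nearest codebook entry.
            return [min(range(len(codebook)), key=lambda i: abs(codebook[i] - v))
                    for v in values]

        def compress(values: List[float], codebook: Optional[List[float]] = None):
            # Apply the schemes in the order a matching decompressor would invert:
            # zero value compression first, then optional shared value compression.
            nonzeros, bitmask = compress_zeros(values)
            payload = compress_shared(nonzeros, codebook) if codebook else nonzeros
            return payload, bitmask, codebook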
