-
Publication No.: US20230084603A1
Publication Date: 2023-03-16
Application No.: US17474568
Filing Date: 2021-09-14
Applicant: Arm Limited , Apical Limited
Inventor: Eric KUNZE , Jared Corey SMOLENS , Aaron DEBATTISTA , Elliot Maurice Simon ROSEMARINE
Abstract: Aspects of the present disclosure relate to apparatus comprising execution circuitry comprising at least one execution unit to execute program instructions, and control circuitry. The control circuitry receives a stream of processing instructions, and issues each received instruction to one of said at least one execution unit. Responsive to determining that a first type of context switch is to be performed from an initial context to a new context, issuing continues until a pre-emption point in the stream of processing instructions is reached. Responsive to reaching the pre-emption point, state information is stored, and the new context is switched to. Responsive to determining that a context switch is to be performed to return from the new context to the initial context, the processing status is restored from the state information, and the stream of processing instructions is continued.
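To make the described mechanism concrete, the following is a minimal Python sketch of pre-emption-point based context switching as outlined in the abstract. All names (ControlCircuitry, Instruction, is_preemption_point, etc.) are illustrative assumptions, not the claimed circuitry.

```python
# Minimal sketch of pre-emption-point context switching, following the abstract.
# All class and field names here are illustrative assumptions, not the claimed
# hardware design.

from dataclasses import dataclass, field


@dataclass
class Instruction:
    op: str
    is_preemption_point: bool = False   # a safe place to pause the stream


@dataclass
class ControlCircuitry:
    stream: list                        # incoming stream of processing instructions
    pc: int = 0                         # index of the next instruction to issue
    saved_state: dict = field(default_factory=dict)

    def issue(self, instr):
        print(f"issuing {instr.op}")    # stand-in for dispatch to an execution unit

    def run(self, switch_requested_at=None):
        """Issue instructions; on a first-type switch request, keep issuing until
        the next pre-emption point, then store state and switch context."""
        switch_pending = False
        while self.pc < len(self.stream):
            if self.pc == switch_requested_at:
                switch_pending = True
            instr = self.stream[self.pc]
            self.issue(instr)
            self.pc += 1
            if switch_pending and instr.is_preemption_point:
                self.saved_state = {"pc": self.pc}   # store state information
                return "switched to new context"
        return "stream complete"

    def resume(self):
        """Return from the new context: restore the processing status from the
        stored state information and continue the original stream."""
        self.pc = self.saved_state["pc"]
        return self.run()


if __name__ == "__main__":
    stream = [Instruction("mul"), Instruction("add", is_preemption_point=True),
              Instruction("store"), Instruction("add")]
    ctrl = ControlCircuitry(stream)
    print(ctrl.run(switch_requested_at=0))  # switch requested immediately
    print(ctrl.resume())                    # return to the initial context
```

In this sketch the switch request does not interrupt issuing immediately; it is honoured only at the next pre-emption point, which is the behaviour the abstract attributes to the first type of context switch.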
-
Publication No.: US20210027148A1
Publication Date: 2021-01-28
Application No.: US16518444
Filing Date: 2019-07-22
Applicant: Arm Limited
Inventor: Lingchuan MENG , John Wakefield BROTHERS, III , Jens OLSON , Jared Corey SMOLENS , Eric KUNZE , Ian Rudolf BRATT
Abstract: A processor arranged to compress neural network activation data, comprising an input module for obtaining neural network activation data. The processor also comprises a block creation module arranged to split the neural network activation data into a plurality of blocks, and a metadata generation module for generating metadata associated with at least one of the plurality of blocks. Based on the generated metadata, a selection module selects a compression scheme for each of the plurality of blocks, and a compression module applies the selected compression scheme to the corresponding block to produce compressed neural network activation data. An output module is also provided for outputting the compressed neural network activation data.
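As a rough illustration of the pipeline the abstract outlines (input, block creation, metadata generation, scheme selection, compression, output), here is a minimal Python sketch. The block size, the metadata used (fraction of zeros), and the two example schemes are assumptions for illustration and are not taken from the patent.

```python
# Minimal sketch of metadata-driven, per-block activation compression as
# outlined in the abstract. The block size, metadata (zero fraction) and the
# two example schemes are illustrative assumptions, not the patented method.

import numpy as np


def split_into_blocks(activations, block_size=64):
    """Block creation: split flattened activation data into fixed-size blocks."""
    flat = activations.ravel()
    pad = (-len(flat)) % block_size
    flat = np.pad(flat, (0, pad))
    return flat.reshape(-1, block_size)


def generate_metadata(block):
    """Metadata generation: record how sparse the block is."""
    return {"zero_fraction": float(np.mean(block == 0))}


def select_scheme(metadata, sparsity_threshold=0.5):
    """Selection: choose a compression scheme based on the block's metadata."""
    return "sparse_rle" if metadata["zero_fraction"] >= sparsity_threshold else "int8_quant"


def compress_block(block, scheme):
    """Compression: apply the selected scheme to the block."""
    if scheme == "sparse_rle":
        idx = np.nonzero(block)[0]           # keep only non-zero positions and values
        return {"scheme": scheme, "indices": idx.astype(np.uint8),
                "values": block[idx].astype(np.float16)}
    scale = float(np.max(np.abs(block))) or 1.0
    q = np.round(block / scale * 127).astype(np.int8)  # simple 8-bit quantisation
    return {"scheme": scheme, "scale": scale, "values": q}


def compress_activations(activations):
    """End-to-end: input -> blocks -> metadata -> scheme selection -> output."""
    compressed = []
    for block in split_into_blocks(activations):
        meta = generate_metadata(block)
        scheme = select_scheme(meta)
        compressed.append(compress_block(block, scheme))
    return compressed


if __name__ == "__main__":
    acts = np.random.rand(4, 128).astype(np.float32)
    acts[acts < 0.6] = 0.0                   # ReLU-like sparsity in the activations
    print({b["scheme"] for b in compress_activations(acts)})
```

The point of the per-block decision is that sparse blocks and dense blocks compress best under different schemes; the metadata lets the selection step make that choice cheaply before any compression is applied.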
-