Low latency memory notification
    271.
    Invention Grant

    Publication Number: US12056072B1

    Publication Date: 2024-08-06

    Application Number: US17457603

    Filing Date: 2021-12-03

    Abstract: Techniques to reduce the latency of data transfer notifications in a computing system are disclosed. The techniques can include receiving, at a memory, a first access request of a set of access requests associated with a data transfer. The first access request has a token and an access count indicating the number of access requests in the set of access requests. A counter is initiated to count the number of received access requests having the token. When additional access requests belonging to the set of access requests are received, the counter is incremented for each of the additional access requests being received. A notification is transmitted to an integrated circuit component in response to receiving the last access request of the set of access requests having the token to notify the integrated circuit component that the memory is ready for access.
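    The token-and-counter scheme described in the abstract can be sketched as follows. This is an illustrative model only; the class and method names (`NotificationTracker`, `receive`) are hypothetical and not from the patent.

```python
class NotificationTracker:
    """Counts access requests sharing a token and records a notification
    when the last request in the set arrives (memory ready for access)."""

    def __init__(self):
        self._counters = {}    # token -> (received_so_far, expected_total)
        self.notifications = []

    def receive(self, token, access_count):
        # The first request for a token carries the expected access count,
        # which initializes the counter; later requests increment it.
        received, expected = self._counters.get(token, (0, access_count))
        received += 1
        if received == expected:
            # Last access request of the set: notify the waiting component.
            self.notifications.append(token)
            del self._counters[token]
        else:
            self._counters[token] = (received, expected)


tracker = NotificationTracker()
for _ in range(3):
    tracker.receive(token="xfer-1", access_count=3)
```

    After the third (final) request with the token, the tracker emits a single notification, avoiding the latency of polling or of notifying on every individual access.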

    CROSS-ASSISTANT COMMAND PROCESSING
    274.
    Patent Application Publication

    Publication Number: US20240257808A1

    Publication Date: 2024-08-01

    Application Number: US18435024

    Filing Date: 2024-02-07

    Inventor: Robert John Mars

    Abstract: A speech-processing system may provide access to one or more virtual assistants via a voice-controlled device. A user may leverage a first virtual assistant to translate a natural language command from a first language into a second language, which the device can forward to a second virtual assistant for processing. The device may receive a command from a user and send input data representing the command to a first speech-processing system representing the first virtual assistant. The device may receive a response in the form of a first natural language output from the first speech-processing system along with an indication that the first natural language output should be directed to a second speech-processing system representing the second virtual assistant. For example, the command may be in the first language, and the first natural language output may be in the second language, which is understandable by the second speech-processing system.
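    The forwarding flow in the abstract can be sketched as below: the first system translates and flags its output for a second system, and the device routes accordingly. All function names, the translation table, and the forwarding indicator are illustrative assumptions.

```python
def assistant_a(command):
    """First speech-processing system: translates the command and
    indicates that the output should be directed to a second system."""
    translations = {"turn on the lights": "allume les lumières"}
    return {
        "output": translations.get(command, command),
        "forward_to": "assistant_b",  # indication to direct output onward
    }


def assistant_b(command):
    """Second speech-processing system: processes the translated command."""
    return f"processed: {command}"


def device_handle(command):
    """Voice-controlled device: sends input to the first system and
    forwards its output to the second system when so indicated."""
    response = assistant_a(command)
    if response.get("forward_to") == "assistant_b":
        return assistant_b(response["output"])
    return response["output"]


result = device_handle("turn on the lights")
```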

    Dynamically moving transcoding of content between servers

    Publication Number: US12052447B1

    Publication Date: 2024-07-30

    Application Number: US17850493

    Filing Date: 2022-06-27

    CPC classification number: H04N21/234309 H04N21/2187

    Abstract: Dynamically re-locating transcoding processes of live content data is described herein. In an example, a computer system causes a first server to execute a first transcode process on a first portion of live stream content. A first output of executing the first transcode process includes first transcoded content. The computer system determines a transcode capacity of one or more servers. The computer system determines that transcoding the live stream content is to be moved to a second server based at least in part on the transcode capacity and a transcode optimization parameter. The computer system causes the second server to execute a second transcode process on a second portion of the live stream content. The second transcode process is equivalent to the first transcode process. A second output of executing the second transcode process includes second transcoded content.
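    The relocation decision in the abstract can be sketched as a capacity comparison against an optimization parameter. The server names, the capacity scale, and the margin threshold are assumptions for illustration, not details from the patent.

```python
def pick_transcode_server(current, capacities, optimization_margin=0.2):
    """Return the server that should execute the next transcode segment.

    Moves the transcode process off `current` only when another server's
    spare capacity exceeds the current server's by at least
    `optimization_margin` (a stand-in for the transcode optimization
    parameter in the abstract).
    """
    best = max(capacities, key=capacities.get)
    if best != current and capacities[best] >= capacities[current] + optimization_margin:
        return best
    return current


capacities = {"server-a": 0.1, "server-b": 0.7}
target = pick_transcode_server("server-a", capacities)
```

    The margin prevents the live stream from ping-ponging between servers with near-equal capacity; each relocated segment is then handled by an equivalent transcode process on the chosen server.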

    Multi-tenant caching service in a hosted computing environment

    Publication Number: US12050534B1

    Publication Date: 2024-07-30

    Application Number: US17657557

    Filing Date: 2022-03-31

    CPC classification number: G06F12/0806 G06F2212/62

    Abstract: Systems and methods are described for implementing a multi-tenant caching service. The multi-tenant caching service provides a scalable infrastructure with dedicated per-tenant cache widths for tenants of a hosted computing environment, and allows tenants to implement a caching layer between cloud-based services that would otherwise need to scale up in response to load. Tenants may also use the service as a public facing endpoint that caches content provided by backend servers. Content provided by the tenants may be distributed and cached across a cell-based architecture, each cell of which may include a set of storage volumes that are partitioned into caches for individual tenants and configured to store a portion of the content provided by that tenant. Eviction policies may be implemented based on tenant cache usage across multiple cells, and geolocation policies may be implemented to ensure that cached content remains within a particular geographic region.
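    Per-tenant cache partitions with a per-tenant eviction policy, as described in the abstract, can be sketched with an LRU partition per tenant. This is a loose illustration, not the actual service; the class name, capacity model, and LRU policy are assumptions.

```python
from collections import OrderedDict


class TenantCache:
    """Dedicated per-tenant cache partitions with per-tenant LRU eviction."""

    def __init__(self, per_tenant_capacity):
        self.capacity = per_tenant_capacity
        self._partitions = {}  # tenant -> OrderedDict in LRU order

    def put(self, tenant, key, value):
        part = self._partitions.setdefault(tenant, OrderedDict())
        if key in part:
            part.move_to_end(key)
        part[key] = value
        if len(part) > self.capacity:
            part.popitem(last=False)  # evict this tenant's LRU entry only

    def get(self, tenant, key):
        part = self._partitions.get(tenant)
        if part is not None and key in part:
            part.move_to_end(key)
            return part[key]
        return None


cache = TenantCache(per_tenant_capacity=2)
cache.put("tenant-1", "a", 1)
cache.put("tenant-1", "b", 2)
cache.put("tenant-1", "c", 3)  # exceeds tenant-1's capacity; evicts "a"
```

    Because each tenant has its own partition, one tenant's workload cannot evict another tenant's entries; a cell-based deployment would shard these partitions across storage volumes.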

    SPEECH INTERFACE DEVICE WITH CACHING COMPONENT
    279.
    Patent Application Publication

    Publication Number: US20240249725A1

    Publication Date: 2024-07-25

    Application Number: US18425465

    Filing Date: 2024-01-29

    CPC classification number: G10L15/30 G10L15/18 H04L67/5683

    Abstract: A speech interface device is configured to receive response data from a remote speech processing system for responding to user speech. This response data may be enhanced with information such as a remote ASR result(s) and a remote NLU result(s). The response data from the remote speech processing system may include one or more cacheable status indicators associated with the NLU result(s) and/or remote directive data, which indicate whether the remote NLU result(s) and/or the remote directive data are individually cacheable. A caching component of the speech interface device allows for caching at least some of this cacheable remote speech processing information, and using the cached information locally on the speech interface device when responding to user speech in the future. This allows for responding to user speech, even when the speech interface device is unable to communicate with a remote speech processing system over a wide area network.
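    The caching behavior described in the abstract can be sketched as follows: the device stores only remote results marked cacheable, then answers matching future utterances locally. The class and dictionary shapes here are illustrative assumptions, not the patent's actual data formats.

```python
class CachingSpeechDevice:
    """Caches cacheable remote speech-processing results and serves
    repeated utterances locally without contacting the remote system."""

    def __init__(self):
        self._cache = {}  # utterance -> cached directive data

    def handle(self, utterance, remote):
        if utterance in self._cache:
            # Respond locally; works even without wide-area connectivity.
            return self._cache[utterance]
        response = remote(utterance)
        # Honor the cacheable status indicator on the remote result.
        if response.get("cacheable"):
            self._cache[utterance] = response["directive"]
        return response["directive"]


calls = []


def remote_system(utterance):
    """Stand-in for the remote speech-processing system."""
    calls.append(utterance)
    return {"directive": "turn_on_lights", "cacheable": True}


device = CachingSpeechDevice()
first = device.handle("lights on", remote_system)
second = device.handle("lights on", remote_system)  # served from cache
```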

    Electronic device
    280.
    Design Patent

    Publication Number: USD1036420S1

    Publication Date: 2024-07-23

    Application Number: US29911249

    Filing Date: 2023-08-30

    Abstract: FIG. 1 is a top, front, right-side perspective view of an electronic device;
    FIG. 2 is a bottom perspective view thereof;
    FIG. 3 is a front view thereof;
    FIG. 4 is a back view thereof;
    FIG. 5 is a left-side view thereof;
    FIG. 6 is a right-side view thereof;
    FIG. 7 is a top view thereof; and,
    FIG. 8 is a bottom view thereof.
    The dashed broken lines depict portions of the electronic device that form no part of the claimed design.
