Dynamically rendering 360-degree videos using view-specific-filter parameters

    Publication No.: US11178374B2

    Publication Date: 2021-11-16

    Application No.: US16428201

    Application Date: 2019-05-31

    Applicant: Adobe Inc.

    Abstract: This disclosure relates to methods, non-transitory computer readable media, and systems that generate and dynamically change filter parameters for a frame of a 360-degree video based on detecting a field of view from a computing device. As a computing device rotates or otherwise changes orientation, for instance, the disclosed systems can detect a field of view and interpolate one or more filter parameters corresponding to nearby spatial keyframes of the 360-degree video to generate view-specific-filter parameters. By generating and storing filter parameters for spatial keyframes corresponding to different times and different view directions, the disclosed systems can dynamically adjust color grading or other visual effects using interpolated, view-specific-filter parameters to render a filtered version of the 360-degree video.
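
    As a rough illustration of the interpolation the abstract describes, the Python sketch below blends the filter parameters of stored spatial keyframes according to how close each keyframe's view direction is to the current field of view (inverse angular distance weighting). The names SpatialKeyframe and interpolate_filter_params, and the specific weighting scheme, are illustrative assumptions rather than details taken from the patent.

        import math
        from dataclasses import dataclass

        @dataclass
        class SpatialKeyframe:
            """A stored view direction (unit vector) paired with its filter parameters."""
            direction: tuple[float, float, float]
            params: dict[str, float]  # e.g. {"exposure": 0.3, "saturation": 1.1}

        def _angular_distance(a, b):
            """Angle in radians between two unit vectors."""
            dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
            return math.acos(dot)

        def interpolate_filter_params(view_direction, keyframes, eps=1e-6):
            """Blend keyframe parameters, weighting each keyframe by the inverse of
            its angular distance to the current view direction."""
            weights = []
            for kf in keyframes:
                d = _angular_distance(view_direction, kf.direction)
                if d < eps:  # viewer is looking straight at a keyframe
                    return dict(kf.params)
                weights.append(1.0 / d)
            total = sum(weights)
            blended = {}
            for w, kf in zip(weights, keyframes):
                for name, value in kf.params.items():
                    blended[name] = blended.get(name, 0.0) + (w / total) * value
            return blended

    Repeating this blending each frame as the headset orientation changes gives the dynamic, view-specific behavior the abstract refers to.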

    DYNAMICALLY GENERATING AND CHANGING VIEW-SPECIFIC-FILTER PARAMETERS FOR 360-DEGREE VIDEOS

    Publication No.: US20220060671A1

    Publication Date: 2022-02-24

    Application No.: US17519332

    Application Date: 2021-11-04

    Applicant: Adobe Inc.

    Abstract: This disclosure relates to methods, non-transitory computer readable media, and systems that generate and dynamically change filter parameters for a frame of a 360-degree video based on detecting a field of view from a computing device. As a computing device rotates or otherwise changes orientation, for instance, the disclosed systems can detect a field of view and interpolate one or more filter parameters corresponding to nearby spatial keyframes of the 360-degree video to generate view-specific-filter parameters. By generating and storing filter parameters for spatial keyframes corresponding to different times and different view directions, the disclosed systems can dynamically adjust color grading or other visual effects using interpolated, view-specific-filter parameters to render a filtered version of the 360-degree video.

    SELECTING OBJECTS WITHIN A THREE-DIMENSIONAL POINT CLOUD ENVIRONMENT

    Publication No.: US20210149543A1

    Publication Date: 2021-05-20

    Application No.: US16685581

    Application Date: 2019-11-15

    Applicant: Adobe Inc.

    Abstract: Techniques for interacting with virtual environments. For example, a virtual reality application outputs a three-dimensional virtual reality scene. The application receives a creation of a slicing volume that is positioned within the three-dimensional virtual space. The slicing volume includes virtual elements of an object within the scene. The application projects the slicing volume onto a two-dimensional view. The application displays the two-dimensional view within the three-dimensional virtual reality scene. The application associates a surface of a physical object with the two-dimensional view. The application receives an interaction with the surface of the physical object, and based on the interaction, selects one or more virtual elements.
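
    As a rough sketch of the selection flow described above, the following Python snippet keeps the points that fall inside an axis-aligned slicing box, projects them orthographically onto one face of the box, and selects the points whose 2D projection lands near a touch location on the mapped physical surface. The axis-aligned box, the orthographic projection, and the names used here (select_points_via_slice, touch_uv) are simplifying assumptions for illustration, not the patent's actual method.

        import numpy as np

        def select_points_via_slice(points, box_min, box_max, touch_uv, radius=0.02):
            """Select point-cloud points through a slicing volume mapped to a 2D surface.

            points   : (N, 3) array of point positions
            box_min  : (3,) lower corner of the slicing volume
            box_max  : (3,) upper corner of the slicing volume
            touch_uv : (2,) touch position on the mapped surface, in [0, 1] x [0, 1]
            radius   : selection radius in the normalized 2D view
            """
            points = np.asarray(points, dtype=float)
            box_min = np.asarray(box_min, dtype=float)
            box_max = np.asarray(box_max, dtype=float)

            # 1. Keep only the points inside the slicing volume.
            inside = np.all((points >= box_min) & (points <= box_max), axis=1)
            idx = np.nonzero(inside)[0]
            if idx.size == 0:
                return idx

            # 2. Project the kept points onto the box's XY face, normalized to [0, 1]^2.
            span = box_max[:2] - box_min[:2]
            uv = (points[idx, :2] - box_min[:2]) / span

            # 3. Select every projected point within `radius` of the touch location.
            hit = np.linalg.norm(uv - np.asarray(touch_uv, dtype=float), axis=1) <= radius
            return idx[hit]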

    DYNAMICALLY RENDERING 360-DEGREE VIDEOS USING VIEW-SPECIFIC-FILTER PARAMETERS

    Publication No.: US20200382755A1

    Publication Date: 2020-12-03

    Application No.: US16428201

    Application Date: 2019-05-31

    Applicant: Adobe Inc.

    Abstract: This disclosure relates to methods, non-transitory computer readable media, and systems that generate and dynamically change filter parameters for a frame of a 360-degree video based on detecting a field of view from a computing device. As a computing device rotates or otherwise changes orientation, for instance, the disclosed systems can detect a field of view and interpolate one or more filter parameters corresponding to nearby spatial keyframes of the 360-degree video to generate view-specific-filter parameters. By generating and storing filter parameters for spatial keyframes corresponding to different times and different view directions, the disclosed systems can dynamically adjust color grading or other visual effects using interpolated, view-specific-filter parameters to render a filtered version of the 360-degree video.

    3D simulation of a 3D drawing in virtual reality

    Publication No.: US11783534B2

    Publication Date: 2023-10-10

    Application No.: US17321726

    Application Date: 2021-05-17

    Applicant: Adobe Inc.

    CPC classification numbers: G06T15/205; G06F3/011; G06T19/003; G06T19/20

    Abstract: Embodiments of the present invention provide systems, methods, and computer storage media which retarget 2D screencast video tutorials into an active VR host application. VR-embedded widgets can render on top of a VR host application environment while the VR host application is active. Thus, VR-embedded widgets can provide various interactive tutorial interfaces directly inside the environment of the VR host application. For example, VR-embedded widgets can present external video content, related information, and corresponding interfaces directly in a VR painting environment, so a user can simultaneously access external video (e.g., screencast video tutorials) and a VR painting. Possible VR-embedded widgets include a VR-embedded video player overlay widget, a perspective thumbnail overlay widget (e.g., a user-view thumbnail overlay, an instructor-view thumbnail overlay, etc.), an awareness overlay widget, a tutorial steps overlay widget, and/or a controller overlay widget, among others.
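
    The widget types named in the abstract suggest a simple overlay architecture in which each widget draws itself after the host application has rendered its frame. The minimal Python sketch below assumes that structure; the class names and the WidgetManager are hypothetical scaffolding, not Adobe's implementation.

        from dataclasses import dataclass

        @dataclass
        class OverlayWidget:
            """Base class for a widget drawn on top of the active VR host application."""
            name: str
            visible: bool = True

            def draw(self, dt: float) -> None:
                raise NotImplementedError

        @dataclass
        class VideoPlayerOverlay(OverlayWidget):
            """Plays an external screencast video inside the VR environment."""
            video_url: str = ""
            playback_position_s: float = 0.0

            def draw(self, dt: float) -> None:
                self.playback_position_s += dt  # advance playback; actual rendering omitted

        class WidgetManager:
            """Draws every visible overlay once the host application's frame is complete."""

            def __init__(self) -> None:
                self.widgets: list[OverlayWidget] = []

            def render_overlays(self, dt: float) -> None:
                for widget in self.widgets:
                    if widget.visible:
                        widget.draw(dt)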

    Dynamically generating and changing view-specific-filter parameters for 360-degree videos

    Publication No.: US11539932B2

    Publication Date: 2022-12-27

    Application No.: US17519332

    Application Date: 2021-11-04

    Applicant: Adobe Inc.

    Abstract: This disclosure relates to methods, non-transitory computer readable media, and systems that generate and dynamically change filter parameters for a frame of a 360-degree video based on detecting a field of view from a computing device. As a computing device rotates or otherwise changes orientation, for instance, the disclosed systems can detect a field of view and interpolate one or more filter parameters corresponding to nearby spatial keyframes of the 360-degree video to generate view-specific-filter parameters. By generating and storing filter parameters for spatial keyframes corresponding to different times and different view directions, the disclosed systems can dynamically adjust color grading or other visual effects using interpolated, view-specific-filter parameters to render a filtered version of the 360-degree video.

    Interfaces and techniques to retarget 2D screencast videos into 3D tutorials in virtual reality

    Publication No.: US11030796B2

    Publication Date: 2021-06-08

    Application No.: US16163428

    Application Date: 2018-10-17

    Applicant: Adobe Inc.

    Abstract: Embodiments of the present invention provide systems, methods, and computer storage media which retarget 2D screencast video tutorials into an active VR host application. VR-embedded widgets can render on top of a VR host application environment while the VR host application is active. Thus, VR-embedded widgets can provide various interactive tutorial interfaces directly inside the environment of the VR host application. For example, VR-embedded widgets can present external video content, related information, and corresponding interfaces directly in a VR painting environment, so a user can simultaneously access external video (e.g., screencast video tutorials) and a VR painting. Possible VR-embedded widgets include a VR-embedded video player overlay widget, a perspective thumbnail overlay widget (e.g., a user-view thumbnail overlay, an instructor-view thumbnail overlay, etc.), an awareness overlay widget, a tutorial steps overlay widget, and/or a controller overlay widget, among others.
