-
Publication No.: EP4327294A1
Publication Date: 2024-02-28
Application No.: EP22828875.9
Filing Date: 2022-05-10
Applicant: Lemon Inc.
Inventor(s): LI, Yunzhu , CHENG, Haiying , SUN, Chen
-
Publication No.: EP4321994A1
Publication Date: 2024-02-14
Application No.: EP22814852.4
Filing Date: 2022-04-06
Inventor(s): WANG, Min , ZHOU, Shuai , CHEN, Xu , XIE, Bing
IPC Classification: G06F9/445 , G06F9/451 , G06F3/0481 , G06T13/80
Abstract: This application provides a display method and an electronic device, and relates to the field of terminal technologies. In this application, an icon animation and starting window image decoding may be performed in parallel: after a user's launch operation is detected, the application launch animation may start to be displayed, which reduces waiting latency and improves user experience. The method includes: in response to an application launch operation, starting to draw a starting window and, before drawing of the starting window is completed, starting to display an application icon launch animation; after drawing of the starting window is completed, displaying a starting window animation to complete display of the application launch animation.
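Below is a minimal Python sketch of the parallel launch flow described in this abstract, assuming a simple thread-based model. The function names (draw_starting_window, play_icon_launch_animation, play_starting_window_animation) are illustrative placeholders, not APIs from the patent or any particular framework.

```python
# Hypothetical sketch of the parallel launch flow: icon animation and
# starting-window drawing run concurrently, and the starting-window
# animation is shown only once drawing has completed.
import threading


def draw_starting_window(done: threading.Event) -> None:
    # Decode the starting-window image and draw the starting window.
    # ... decoding and drawing work would happen here ...
    done.set()  # signal that drawing is complete


def play_icon_launch_animation() -> None:
    # Animate the application icon while the starting window is still drawing.
    pass


def play_starting_window_animation() -> None:
    # Transition from the icon animation to the drawn starting window.
    pass


def on_application_launch() -> None:
    drawing_done = threading.Event()

    # Start drawing the starting window and the icon animation in parallel,
    # so the user sees feedback immediately instead of waiting for decoding.
    threading.Thread(target=draw_starting_window, args=(drawing_done,)).start()
    play_icon_launch_animation()

    # Only after the starting window has been drawn, display its animation.
    drawing_done.wait()
    play_starting_window_animation()
```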
-
Publication No.: EP4315263A1
Publication Date: 2024-02-07
Application No.: EP22779206.6
Filing Date: 2022-02-11
IPC Classification: G06T17/00 , H04N21/44 , G06T19/00 , H04N21/854 , G06T19/20 , H04N21/472 , H04N21/218 , H04N21/6587 , G06F3/01 , G06T13/40
-
Publication No.: EP4285329A1
Publication Date: 2023-12-06
Application No.: EP22746447.6
Filing Date: 2022-01-24
Applicant: Spree3D Corporation
Inventor(s): SPENCER, Gil , PINSKIY, Dmitriy , SMYTH, Evan
IPC Classification: G06T13/00
-
Publication No.: EP4281935A1
Publication Date: 2023-11-29
Application No.: EP22858613.7
Filing Date: 2022-07-13
Inventor(s): SACHDEVA, Kapil , AGARWAL, Neha
-
Publication No.: EP3915108B1
Publication Date: 2023-11-29
Application No.: EP20744394.6
Filing Date: 2020-01-27
-
Publication No.: EP4275167A1
Publication Date: 2023-11-15
Application No.: EP22788848.4
Filing Date: 2022-04-13
Applicant: Spree3D Corporation
Inventor(s): DAVIDSON, Robert , SPENCER, Gil , PINSKIY, Dmitriy , SMYTH, Evan
-
Publication No.: EP4270391A1
Publication Date: 2023-11-01
Application No.: EP23154135.0
Filing Date: 2023-01-31
Abstract: This disclosure relates generally to methods and systems for emotion-controllable, generalized talking face generation from an arbitrary face image. Most conventional techniques for realistic talking face generation cannot efficiently control the emotion on the face and have limited generalization to an arbitrary unknown target face. The present disclosure proposes a graph convolutional network that uses speech content features along with an independent emotion input to generate emotion- and speech-induced motion on a facial geometry-aware landmark representation. The facial geometry-aware landmark representation is further used by an optical flow-guided texture generation network to produce the texture. A two-branch optical flow-guided texture generation network, with motion and texture branches, is designed to handle the motion and texture content independently. The optical flow-guided texture generation network then renders an emotional talking face animation from a single image of any arbitrary target face.
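Below is a minimal Python sketch of the two-stage pipeline described in this abstract: a landmark-motion stage followed by an optical flow-guided texture stage. All function names and the placeholder returns are assumptions for illustration, not the implementation disclosed in the patent.

```python
# Hypothetical data-flow skeleton for emotion-controllable talking face
# generation from a single target image.
import numpy as np


def landmark_motion_gcn(speech_features: np.ndarray,
                        emotion_label: int,
                        neutral_landmarks: np.ndarray) -> np.ndarray:
    """Stage 1: a graph convolutional network would map speech content
    features plus an independent emotion input to emotion- and
    speech-induced motion on geometry-aware facial landmarks."""
    # Placeholder: return the landmarks unchanged.
    return neutral_landmarks


def optical_flow_texture_network(target_image: np.ndarray,
                                 driven_landmarks: np.ndarray) -> np.ndarray:
    """Stage 2: a two-branch network (motion branch and texture branch)
    would estimate optical flow from the landmark motion and warp and
    refine the texture of the single target image into an output frame."""
    # Placeholder: return the input frame unchanged.
    return target_image


def generate_talking_face_frame(speech_features: np.ndarray,
                                emotion_label: int,
                                target_image: np.ndarray,
                                neutral_landmarks: np.ndarray) -> np.ndarray:
    # Chain the two stages to render one emotional talking-face frame.
    landmarks = landmark_motion_gcn(speech_features, emotion_label,
                                    neutral_landmarks)
    return optical_flow_texture_network(target_image, landmarks)
```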
-
Publication No.: EP4235366A3
Publication Date: 2023-11-01
Application No.: EP23181771.9
Filing Date: 2017-07-24
Applicant: Magic Leap, Inc.
Inventor(s): MACNAMARA, John Graham , SAMEC, Nicole Elizabeth , ROBAINA, Nastasja U. , BAERENRODT, Eric , HARRISES, Christopher M.
IPC Classification: G02B7/28 , G06F1/14 , G06T13/40 , A61B3/13 , A61B3/00 , A61B3/14 , A61B3/10 , A61B3/028 , G02B27/01 , G06F1/16 , G06F3/01 , G02C11/00 , G16H30/40 , G16H40/63 , G16H50/20
Abstract: A wearable ophthalmic device is disclosed. The device may include an outward-facing head-mounted light field camera (e.g., 16) to receive light from a user's surroundings and to generate numerical light field image data. The device may also include a light field processor (e.g., 76) to access the numerical light field image data and to computationally introduce an amount of optical power into the numerical light field image data, based on the viewing distance from the user to an object, to generate modified numerical light field image data. The device may also include a head-mounted light field display (e.g., 62) to generate a physical light field corresponding to the modified numerical light field image data.
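Below is a minimal Python sketch of the processing step described in this abstract, assuming a simple vergence-based rule (added optical power approximately 1 / viewing distance, capped by a prescribed addition). The formula and all function names are illustrative assumptions, not the device's actual algorithm.

```python
# Hypothetical sketch: compute an amount of optical power from the viewing
# distance and apply it to the numerical light field data before display.

def optical_power_for_distance(viewing_distance_m: float,
                               prescribed_add_diopters: float) -> float:
    """Approximate the optical power (in diopters) to introduce for an object
    at the given viewing distance; 1/d is the vergence of the object, which a
    near-vision correction would partly or fully offset."""
    object_vergence = 1.0 / max(viewing_distance_m, 0.01)  # avoid divide-by-zero
    return min(object_vergence, prescribed_add_diopters)


def process_light_field(light_field_image,
                        viewing_distance_m: float,
                        prescribed_add_diopters: float):
    """Computationally introduce optical power into the numerical light field
    image data, producing modified data for the light field display."""
    power = optical_power_for_distance(viewing_distance_m,
                                       prescribed_add_diopters)
    # Placeholder: a real implementation would refocus / re-render the light
    # field according to `power`; here the data is returned unchanged.
    modified_light_field = light_field_image
    return modified_light_field, power
```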