Abstract:
An analysis apparatus comprises at least one memory storing instructions and at least one processor configured to execute the instructions to: acquire emotion data including time data from an emotion data generation apparatus that generates the emotion data from face image data of participants in an online meeting; acquire meeting data regarding the meeting, the meeting data including the time data; generate chapters for the meeting based on the meeting data; generate analysis data regarding the meeting based on the emotion data for each of the chapters; and output the generated analysis data.
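As a rough illustration of the flow described above (acquire timestamped emotion data and meeting data, split the meeting into chapters, and aggregate emotion per chapter), a minimal Python sketch follows; the data shapes, the chapter rule, and all identifiers are assumptions for illustration only, not terms from the disclosure.

```python
# Minimal sketch of the per-chapter analysis flow; all names are hypothetical.
from dataclasses import dataclass
from statistics import mean

@dataclass
class EmotionSample:
    time: float      # seconds from meeting start (the "time data")
    score: float     # e.g. a degree of emotion derived from a face image

@dataclass
class MeetingEvent:
    time: float      # timestamped meeting data, e.g. a shared-screen change
    label: str

def analyze_meeting(emotions, events, end_time):
    # Chapter boundaries are taken from the meeting data (here: event times).
    bounds = [0.0] + sorted(e.time for e in events) + [end_time]
    report = []
    for start, stop in zip(bounds, bounds[1:]):
        chunk = [s.score for s in emotions if start <= s.time < stop]
        report.append({"chapter": (start, stop),
                       "mean_emotion": mean(chunk) if chunk else None})
    return report   # analysis data to be output, one entry per chapter

if __name__ == "__main__":
    emotions = [EmotionSample(t, 0.5 + 0.01 * t) for t in range(0, 60, 5)]
    events = [MeetingEvent(30.0, "slide change")]
    print(analyze_meeting(emotions, events, end_time=60.0))
```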
Abstract:
An information processing apparatus (2000) displays an operation image (20) on a display screen (50) associated with a head mounted display (100). First, the information processing apparatus (2000) acquires work information. The work information includes information for determining a work target, which is a target of work performed by a user of the head mounted display (100). The information processing apparatus (2000) determines a work target satisfying a provision condition by using the work information, displays the operation image (20) for selecting the determined work target on the display screen (50) in an operable manner, and detects an input operation on the operation image (20).
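The following is a minimal sketch of the selection step, assuming the provision condition can be expressed as a simple predicate over work-target attributes; WorkTarget, the distance attribute, and the threshold are illustrative assumptions rather than elements of the disclosure.

```python
# Sketch: decide which work targets get an operable operation image.
from dataclasses import dataclass

@dataclass
class WorkTarget:
    name: str
    distance_m: float   # assumed attribute used by the provision condition

def targets_to_display(work_info, max_distance_m=1.5):
    # The "provision condition" is illustrated here as a distance threshold.
    return [t for t in work_info if t.distance_m <= max_distance_m]

def render_operation_images(targets):
    # Stand-in for drawing operation images (20) on the display screen (50).
    for t in targets:
        print(f"[operation image] select target: {t.name}")

work_info = [WorkTarget("valve A", 0.8), WorkTarget("panel B", 3.2)]
render_operation_images(targets_to_display(work_info))
```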
Abstract:
A marker (3020) is a part of a user's body or a mark attached to the user. A sensor (3040) is attached to the user. An operation region calculation unit (2020) calculates an operation region included in a captured image generated by a camera, on the basis of a position of the marker (3020) included in the captured image. A recognition unit (2040) calculates a position or motion of an operation body captured in the operation region, and recognizes an input operation on the basis of the calculated position or motion. Specifically, the recognition unit (2040) calculates the position of the operation body captured in the operation region at a timing based on a result of detection by the sensor (3040), and calculates the motion of the operation body captured in the operation region in a period including such a timing.
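A small sketch of this pipeline follows, under the assumptions that the operation region is a rectangle placed relative to the detected marker position and that the sensor result is available as a boolean trigger; all identifiers are hypothetical.

```python
# Sketch: marker position -> operation region -> recognize an input from the
# operation body's position at a sensor-driven timing.
from dataclasses import dataclass

@dataclass
class Region:
    x: int; y: int; w: int; h: int
    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def operation_region(marker_xy, width=200, height=120):
    # The region is placed relative to the detected marker position.
    mx, my = marker_xy
    return Region(mx, my - height // 2, width, height)

def recognize_input(marker_xy, finger_xy, sensor_triggered):
    # The position is evaluated only at a timing based on the sensor result.
    if not sensor_triggered:
        return None
    region = operation_region(marker_xy)
    return "tap" if region.contains(*finger_xy) else None

print(recognize_input(marker_xy=(100, 300), finger_xy=(180, 310),
                      sensor_triggered=True))   # -> "tap"
```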
Abstract:
One purpose of the present disclosure is to provide a virtual space providing device and the like that are capable of inferring a feeling of a user of a virtual space toward a specific target while suppressing calculation load. An information processing device according to one aspect of the present disclosure comprises: an output control means for performing control to output, to a user who operates an avatar in a virtual space, an output image corresponding to the avatar; a line-of-sight inference means for inferring the line of sight of the user on the basis of a predetermined range in the output image; and a feeling inference means for inferring a feeling of the user on the basis of a captured image of the user generated by an imaging device.
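The sketch below illustrates the two inference steps under simplifying assumptions: gaze is attributed to whatever avatar lies within a fixed central range of the output image (avoiding per-frame eye tracking), and classify_feeling is a hypothetical stand-in for any face-emotion model.

```python
# Sketch: gaze from a predetermined range of the output image, feeling from
# a camera image of the user. All names are illustrative assumptions.
def infer_gaze_target(output_image_size, avatar_positions, center_ratio=0.3):
    # The user is assumed to look at whatever avatar falls inside a fixed
    # central range of the output image.
    w, h = output_image_size
    cx0, cy0 = w * (0.5 - center_ratio / 2), h * (0.5 - center_ratio / 2)
    cx1, cy1 = w * (0.5 + center_ratio / 2), h * (0.5 + center_ratio / 2)
    for name, (x, y) in avatar_positions.items():
        if cx0 <= x <= cx1 and cy0 <= y <= cy1:
            return name
    return None

def classify_feeling(captured_image):
    return "positive"          # placeholder for a real emotion classifier

def feeling_toward_target(output_image_size, avatar_positions, captured_image):
    target = infer_gaze_target(output_image_size, avatar_positions)
    return (target, classify_feeling(captured_image)) if target else (None, None)

print(feeling_toward_target((1920, 1080), {"avatar_B": (960, 540)}, b"jpeg-bytes"))
```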
Abstract:
The server device includes an acquisition unit and an information provision unit. The acquisition unit is capable of acquiring a plurality of attribute values for the same item when acquiring the respective profiles of a plurality of users using a conference assistance system. When an item exists for which a plurality of attribute values have been set in the profile of a first participant, among the plurality of users, who is participating in the conference, the information provision unit selects at least one of the plurality of attribute values. The information provision unit provides, to a second participant participating in the same conference as the first participant, the profile of the first participant including the attribute value or values thus selected.
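A minimal sketch of the provision step, assuming profiles are maps from item names to lists of attribute values; the selection rule shown (take only the first value when several are set) is just one possible choice for illustration.

```python
# Sketch: when an item has several attribute values, provide at least one
# of them to the second participant. Names and the rule are hypothetical.
def provide_profile(profile: dict[str, list[str]]) -> dict[str, list[str]]:
    provided = {}
    for item, values in profile.items():
        provided[item] = values[:1] if len(values) > 1 else values
    return provided

first_participant = {"department": ["Sales", "Planning"], "hobby": ["running"]}
print(provide_profile(first_participant))
# {'department': ['Sales'], 'hobby': ['running']}
```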
Abstract:
Provided is an inspection assistance device which includes: a data acquisition unit that acquires data generated as a result of an operation for an inspection; a time acquisition unit that acquires, from the data, a first time required by the operation for the inspection; and a recording control unit that, when a prescribed relation is satisfied between the first time and a second time determined in advance as a time required by the operation for the inspection, records operation-related information recognized from the data in association with information related to the inspection.
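A short sketch of the recording condition follows, assuming the prescribed relation is a simple comparison of the measured first time against the predetermined second time; the relation, field names, and data shape are illustrative assumptions.

```python
# Sketch: record the recognized operation together with the inspection info
# only when the (assumed) prescribed relation between the two times holds.
def maybe_record(operation_data, second_time_s, log):
    first_time_s = operation_data["duration_s"]     # first time, taken from the data
    if first_time_s <= second_time_s:               # prescribed relation (assumed)
        log.append({"operation": operation_data["recognized_operation"],
                    "inspection": operation_data["inspection_id"]})
    return log

records = []
maybe_record({"duration_s": 42.0, "recognized_operation": "torque check",
              "inspection_id": "INSP-001"}, second_time_s=60.0, log=records)
print(records)
```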
Abstract:
Provided are a tactile force presentation device, an information terminal, a tactile force presentation method, and a program that make it possible to present a three-dimensional tactile force across a wide range while reducing device size. The information terminal is provided with: a display device; a position detection unit that detects the position of a part of a user to which a tactile force is to be presented; an information processing unit that alters content in accordance with the detected position and calculates the amount of tactile force to be presented on the basis of the altered content; and a tactile force presentation device. The tactile force presentation device is provided with: a transmission member that extends to the part of the user to which the tactile force is to be presented and transmits a tensile force; a drive unit that generates the tensile force and provides it to the transmission member; and a control unit that causes the drive unit to generate a tensile force in advance as an initial force and, when the calculated tactile force is presented, increases or decreases the initial force so that the tactile force is provided to the user via the transmission member.
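The control idea can be sketched as follows, assuming the drive unit accepts a simple force command: a pre-generated initial tension is held, and the presented tactile force is produced by raising or lowering that tension rather than starting from zero each time. The class, units, and numbers are illustrative assumptions.

```python
# Sketch: hold an initial tension and modulate it to present tactile force.
class TactileController:
    def __init__(self, initial_force_n=2.0):
        self.initial_force_n = initial_force_n       # pre-generated tension
        self.commanded_force_n = initial_force_n

    def present(self, tactile_force_n):
        # Increase or decrease the tension around the initial force.
        self.commanded_force_n = self.initial_force_n + tactile_force_n
        return self.commanded_force_n                # command sent to the drive unit

ctrl = TactileController(initial_force_n=2.0)
print(ctrl.present(+0.8))   # stronger pull -> 2.8 N on the transmission member
print(ctrl.present(-0.5))   # weaker pull   -> 1.5 N
```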
Abstract:
An analysis apparatus (100) includes: emotion data acquisition means (111) for acquiring emotion data that includes time data, the emotion data being generated based on face image data of participants in an online meeting; meeting data acquisition means (112) for acquiring meeting data including image data of the meeting that includes time data; analysis data generation means (113) for generating analysis data regarding the meeting based on the emotion data; screen generation means (114) for generating a playback screen including a seek bar for specifying a time, the analysis data corresponding to the specified time, and the image data; and output means (115) for outputting the playback screen.
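A minimal sketch of how a seek-bar position might be resolved to the analysis data and image frame that share the same time data; the timestamp arrays and the "last record not later than the seek time" rule are assumptions made for illustration.

```python
# Sketch: map a seek-bar time to time-aligned analysis data and image data.
import bisect

def at_time(seek_time_s, timestamps, values):
    # Pick the last record whose timestamp does not exceed the seek time.
    i = bisect.bisect_right(timestamps, seek_time_s) - 1
    return values[max(i, 0)]

timestamps = [0.0, 10.0, 20.0, 30.0]
analysis   = ["calm", "engaged", "engaged", "tense"]
frames     = ["frame0.jpg", "frame1.jpg", "frame2.jpg", "frame3.jpg"]

seek = 22.5
print(at_time(seek, timestamps, analysis), at_time(seek, timestamps, frames))
```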
Abstract:
An analysis apparatus acquires, during an online meeting, individual emotion data for each participant generated on the basis of face image data of the participants in the meeting. The analysis apparatus generates, for each participant, analysis data indicating a degree of emotion in the online meeting on the basis of the individual emotion data, and stores each piece of the analysis data in association with corresponding color tone information. As a display image indicating a state of the online meeting, the analysis apparatus generates an image in which element figures represented by the color tone information associated with the analysis data are arranged for each of the participants who have participated in the online meeting. The analysis apparatus outputs the generated display image.
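A small sketch of assembling the display image, assuming each participant's analysis data reduces to a single score and that the score-to-color-tone mapping is a fixed threshold rule chosen only for illustration.

```python
# Sketch: one element figure per participant, colored by its analysis data.
def tone_for(score: float) -> str:
    # Hypothetical mapping from degree of emotion to a color tone.
    return "#2e7d32" if score >= 0.66 else "#f9a825" if score >= 0.33 else "#c62828"

def display_elements(analysis_per_participant: dict[str, float]):
    return [{"participant": name, "color": tone_for(score)}
            for name, score in analysis_per_participant.items()]

print(display_elements({"Ann": 0.8, "Bo": 0.4, "Cy": 0.1}))
```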
Abstract:
The server device includes an acquisition unit, a user database, an attendee management database, and a processing unit. The acquisition unit acquires biometric information for each of a plurality of users using an entry/exit management system. The user database stores the biometric information in association with user IDs that respectively identify the users. The attendee management database stores the user ID corresponding to an attendee in a restricted area, where entry and exit are restricted, in association with position information for the attendee. The processing unit transmits the position information of a meeting candidate, whom a visitor wants to meet, to an authentication terminal. The authentication terminal is a terminal that, if authentication using the biometric information recorded in the user database is successful, permits the successfully authenticated user to enter the restricted area.
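A minimal sketch of the lookup chain, assuming in-memory dictionaries stand in for the user database and attendee management database; the terminal interface and all identifiers are hypothetical illustrations of the described flow.

```python
# Sketch: user DB (biometric template -> user ID), attendee DB (user ID ->
# position), and the step that hands the candidate's position to a terminal
# that admits the visitor only after successful authentication.
user_db = {"template-123": "user-A"}                 # biometric template -> user ID
attendee_db = {"user-A": "Room 3, Floor 2"}          # user ID -> position in restricted area

def authenticate(biometric_template):
    return user_db.get(biometric_template)           # None means authentication failed

def position_for_candidate(candidate_user_id):
    return attendee_db.get(candidate_user_id)

class Terminal:
    def show(self, position):
        print(f"Meeting candidate is at: {position}")

def handle_visitor(visitor_template, candidate_user_id, terminal):
    if authenticate(visitor_template) is not None:   # entry permitted on success
        terminal.show(position_for_candidate(candidate_user_id))

handle_visitor("template-123", "user-A", Terminal())
```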