Abstract:
An interaction system is provided. The interaction system has a mobile device and a server. The mobile device has a location detection unit configured to detect a geographical location of the mobile device, and the server is configured to retrieve the geographical location of the mobile device. The server has a database configured to store at least one interaction object and location information associated with the interaction object, and the server determines whether the location information of the interaction object corresponds to the geographical location of the mobile device. When the location information of the interaction object corresponds to the geographical location of the mobile device, the server transmits the interaction object to the mobile device, so that the mobile device executes the interaction object.
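A minimal sketch of the server-side matching step described above, assuming that "corresponds to" means the object's stored location lies within a proximity radius of the device's reported location; the names (InteractionObject, objects_for_device) and the radius are illustrative assumptions, not details from the abstract.

from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class InteractionObject:
    # Hypothetical record for one stored interaction object and its
    # associated location information.
    object_id: str
    payload: bytes
    latitude: float
    longitude: float

def _distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

def objects_for_device(database, device_lat, device_lon, radius_km=0.1):
    """Return stored interaction objects whose location information
    corresponds to (lies within radius_km of) the device's location;
    these are the objects the server would transmit to the mobile device."""
    return [
        obj for obj in database
        if _distance_km(obj.latitude, obj.longitude, device_lat, device_lon) <= radius_km
    ]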
Abstract:
An interaction system is provided. The interaction system has a first mobile device, configured to capture images of a scene, and a server, configured to recognize a first electronic device from the images captured by the first mobile device, so that the first electronic device and the first mobile device interact with each other.
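A sketch of how the server-side recognition and pairing step might look, with a toy marker-matching stub standing in for the unspecified image-recognition method; all identifiers and the session structure are assumptions made for illustration.

from typing import Optional

def recognize_device(frame: bytes, markers: dict) -> Optional[str]:
    """Toy recognition step: return the ID of the first registered electronic
    device whose visual-marker byte pattern appears in the captured frame.
    A real system would run an image-recognition model here instead."""
    for device_id, marker in markers.items():
        if marker in frame:
            return device_id
    return None

def handle_frames(frames, markers, sessions, mobile_device_id):
    """For each image captured by the first mobile device, try to recognize a
    registered first electronic device; when one is found, record a session
    linking the two devices so that they can interact with each other."""
    for frame in frames:
        device_id = recognize_device(frame, markers)
        if device_id is not None:
            sessions[mobile_device_id] = device_id
            return device_id
    return None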
Abstract:
A system for establishing a virtual social network is provided, which includes at least one emotion detector and a processing module. The emotion detector detects emotional reactions of a plurality of users while they are watching a video to generate a plurality of detection signals. The processing module analyzes the detection signals to obtain emotion data corresponding to a plurality of time indices in the video, and analyzes content of the video to obtain metadata corresponding to the time indices in the video. Also, the processing module classifies the users into social groups according to the emotion data and the metadata.
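A minimal sketch of the grouping step, assuming per-user emotion labels keyed by time index and content tags as the metadata; the grouping rule and every name below are illustrative assumptions rather than details given in the abstract.

from collections import defaultdict

def classify_into_groups(emotion_data, metadata):
    """emotion_data: {user_id: {time_index: emotion_label}}
    metadata:     {time_index: content_tag}
    Returns {(content_tag, emotion_label): [user_ids]}, so users who reacted
    the same way to the same kind of content share a social group."""
    groups = defaultdict(list)
    for user_id, reactions in emotion_data.items():
        for t, emotion in reactions.items():
            tag = metadata.get(t)
            if tag is not None:
                groups[(tag, emotion)].append(user_id)
    return dict(groups)

# Example: two users who both laughed at the same comedy scene end up grouped.
example_emotions = {
    "alice": {10: "laugh", 42: "sad"},
    "bob": {10: "laugh"},
}
example_metadata = {10: "comedy", 42: "drama"}
print(classify_into_groups(example_emotions, example_metadata))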