Abstract:
A system for using body motion capture for musical performances. A motion detection camera captures a series of body movements, which are assigned to begin one or more songs, to activate musical filters, and to activate sound effects. Once the movements are captured and assigned, the user begins the performance.
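As a rough illustration of this capture-then-assign workflow, the Python sketch below maps movement labels to performance actions; the movement names, the action callbacks, and the `on_movement_detected` entry point are all hypothetical, not taken from the abstract.

```python
# Sketch of the capture/assignment workflow; all names are illustrative.

def start_song(name):
    print(f"starting song: {name}")

def activate_filter(name):
    print(f"activating filter: {name}")

def trigger_effect(name):
    print(f"triggering effect: {name}")

# During setup, each captured movement is assigned to one action.
assignments = {
    "raise_left_arm": lambda: start_song("intro"),
    "spin": lambda: activate_filter("reverb"),
    "clap": lambda: trigger_effect("cymbal_crash"),
}

def on_movement_detected(movement):
    """Called by the motion-capture camera for each recognized movement."""
    action = assignments.get(movement)
    if action:
        action()

# Simulated performance: the camera reports movements in sequence.
for m in ["raise_left_arm", "spin", "clap", "unknown"]:
    on_movement_detected(m)
```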
Abstract:
An interactive music method for controlling a media player device is provided. The interactive music method comprises the steps of receiving one or more gestures, interpreting each gesture in accordance with a plurality of predefined gestures, and executing at least one process corresponding to the gesture. The process comprises controlling audio for a specific amount of time.
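A minimal sketch of this interpret-then-execute loop, assuming gestures arrive as simple 2-D feature vectors; the gesture names, the matching threshold, and the audio commands with their durations are invented for illustration.

```python
import math

# Predefined gestures as 2-D feature vectors (dx, dy); illustrative only.
PREDEFINED = {
    "swipe_right": (1.0, 0.0),
    "swipe_up": (0.0, 1.0),
}

# Each recognized gesture maps to (audio command, duration in seconds).
PROCESSES = {
    "swipe_right": ("fade_out", 2.0),
    "swipe_up": ("boost_volume", 1.5),
}

def interpret(gesture, threshold=0.5):
    """Return the closest predefined gesture, or None if nothing is close."""
    best, best_dist = None, threshold
    for name, template in PREDEFINED.items():
        d = math.dist(gesture, template)
        if d < best_dist:
            best, best_dist = name, d
    return best

def execute(name):
    command, duration = PROCESSES[name]
    print(f"{command} for {duration:.1f} s")

received = (0.9, 0.1)          # a gesture reported by the input device
match = interpret(received)
if match:
    execute(match)             # -> fade_out for 2.0 s
```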
Abstract:
A user can move the position of a pan flute horizontally on a screen by sliding a touch position in the horizontal direction while touching a touch panel with a stylus. Moreover, when the user blows on a microphone hole, a musical sound is output via a sound hole at the pitch corresponding to the pipe of the pan flute that is displayed overlapping a valid-position display image at that time.
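The pitch-selection logic might look like the following sketch, which assumes a fixed valid-position marker at mid-screen and invents the screen width, pipe count, pitches, and blow threshold.

```python
# Illustrative constants: an 8-pipe flute on a 256-pixel-wide touch screen.
SCREEN_WIDTH = 256
PIPE_PITCHES = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI note numbers
BLOW_THRESHOLD = 0.3                             # normalized mic amplitude

flute_offset = 0  # horizontal position of the flute, moved by sliding

def slide_to(touch_x, previous_x):
    """Sliding the stylus shifts the flute horizontally on screen."""
    global flute_offset
    flute_offset += touch_x - previous_x

def pipe_at_valid_position():
    """Pipe currently overlapping the (fixed) valid-position marker."""
    marker_x = SCREEN_WIDTH // 2
    pipe_width = SCREEN_WIDTH // len(PIPE_PITCHES)
    index = (marker_x - flute_offset) // pipe_width
    return index % len(PIPE_PITCHES)

def on_mic_sample(amplitude):
    """Blowing on the microphone sounds the pipe at the valid position."""
    if amplitude > BLOW_THRESHOLD:
        print(f"play MIDI note {PIPE_PITCHES[pipe_at_valid_position()]}")

slide_to(80, 100)   # slide left by 20 pixels
on_mic_sample(0.5)  # user blows: the aligned pipe sounds
```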
Abstract:
A sensor obtains an angular rate of a stick member. A sound source map includes plural areas disposed in a virtual space. While a user is operating the stick member, a CPU estimates the direction of the turning axis of the stick member based on the angular rate obtained by the sensor, and calculates an angular rate of the top of the stick member from the sensor output with the component about the stick member's longitudinal direction removed. Based on the most recent direction of the turning axis and the most recent angular rate of the top, the CPU calculates the position of the stick member in the virtual space after a predetermined time, and sends a sound generating unit a note-on event of a musical tone assigned to the area, among the plural areas, that corresponds to the calculated position.
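A sketch of this tip-prediction pipeline under stated assumptions: a three-area map along one axis, a fixed stick length, and a short prediction horizon, all of which are invented rather than taken from the abstract.

```python
import numpy as np

# Sound-source map: x-ranges in virtual space and their assigned tones.
AREAS = [((-1.0, -0.3), "tom"), ((-0.3, 0.3), "snare"), ((0.3, 1.0), "cymbal")]

STICK_LENGTH = 0.4   # meters; illustrative
DT = 0.05            # prediction horizon in seconds

def tip_angular_rate(gyro, stick_axis):
    """Remove the angular-rate component about the stick's longitudinal axis."""
    axis = stick_axis / np.linalg.norm(stick_axis)
    return gyro - np.dot(gyro, axis) * axis

def predict_tip_position(tip_pos, gyro, stick_axis):
    """Advance the tip along its tangential velocity over DT."""
    omega = tip_angular_rate(gyro, stick_axis)
    velocity = np.cross(omega, stick_axis * STICK_LENGTH)
    return tip_pos + velocity * DT

def tone_for(x):
    for (lo, hi), tone in AREAS:
        if lo <= x < hi:
            return tone
    return None

axis = np.array([0.0, 1.0, 0.0])           # stick currently points along y
tip = np.array([0.35, 0.4, 0.0])           # tip sits just inside the cymbal area
gyro = np.array([0.0, 0.0, 6.0])           # rad/s, turning about the z axis
predicted = predict_tip_position(tip, gyro, axis)
print("note-on:", tone_for(predicted[0]))  # tip is heading into the snare area
```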
Abstract:
A performance apparatus 11 extends in its longitudinal direction so as to be held by a player's hand, and is provided with an acceleration sensor 23 for detecting an acceleration value and an angular rate sensor 22 for detecting an angular rate of rotation of the apparatus 11 about its longitudinal axis. A CPU 21 detects a sound-generation timing based on the acceleration value. Using the angular rate, the CPU 21 calculates the rotation angle of the performance apparatus 11 about its longitudinal axis over the period from a first timing to a second timing, where the first and second timings correspond to the start and finish of the player's swinging motion, respectively. The CPU 21 then increases or decreases a sound volume level in accordance with the direction and amount of the calculated rotation angle, thereby adjusting the volume of the musical tone.
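One plausible reading of this two-timing volume scheme, with the swing threshold, sample period, and degrees-per-step scale chosen arbitrarily for illustration.

```python
SWING_THRESHOLD = 15.0   # m/s^2; accel magnitude that marks a swing (illustrative)
DEGREES_PER_STEP = 10.0  # rotation needed for one volume step (illustrative)

def process_swing(samples, dt=0.01, volume=64):
    """samples: list of (accel_magnitude, roll_rate_deg_per_s) tuples."""
    swinging = False
    angle = 0.0
    for accel, roll_rate in samples:
        if not swinging and accel > SWING_THRESHOLD:
            swinging = True            # first timing: swing starts
        elif swinging:
            angle += roll_rate * dt    # accumulate rotation about the long axis
            if accel < SWING_THRESHOLD:
                # second timing: swing ends; sound the note, adjust volume
                volume += int(angle / DEGREES_PER_STEP)
                print(f"note on, volume {max(0, min(127, volume))}")
                swinging = False
                angle = 0.0
    return volume

# One swing with a twist of roughly +40 degrees about the long axis.
data = [(5, 0)] * 3 + [(20, 100)] * 40 + [(5, 0)] * 3
process_swing(data)   # -> note on, volume 67
```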
Abstract:
A system may be provided that includes computing equipment and an optical input accessory. The computing equipment may use an imaging system to track the relative locations of light sources on the optical input accessory and to continuously capture images of optical markers on the accessory and of user input objects. The computing equipment may be used to operate the system in operational modes that allow a user to record and play back musical sounds based on user input gathered with the optical input accessory, to generate musical sounds based on such input combined with musical data received from a remote location, to provide musical instruction to the user, to generate a musical score, or to generate user instrument acoustic profiles.
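The mode selection could be organized as a simple dispatch table, as in this sketch; the mode names and handler bodies are placeholders standing in for the behaviors the abstract lists.

```python
from enum import Enum, auto

class Mode(Enum):
    RECORD_PLAYBACK = auto()
    NETWORKED_PLAY = auto()     # uses musical data from a remote location
    INSTRUCTION = auto()
    SCORE_GENERATION = auto()
    ACOUSTIC_PROFILING = auto()

def handle_input(mode, tracked_positions):
    """Route tracked marker positions to the handler for the current mode."""
    handlers = {
        Mode.RECORD_PLAYBACK: lambda p: print("recording", p),
        Mode.NETWORKED_PLAY: lambda p: print("mixing with remote data", p),
        Mode.INSTRUCTION: lambda p: print("comparing to lesson", p),
        Mode.SCORE_GENERATION: lambda p: print("adding to score", p),
        Mode.ACOUSTIC_PROFILING: lambda p: print("profiling instrument", p),
    }
    handlers[mode](tracked_positions)

handle_input(Mode.RECORD_PLAYBACK, [(0.1, 0.2), (0.4, 0.5)])
```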
Abstract:
A musical instrument (1) includes a stick (10) to be held by a player and provided with a marker (15) at its leading end that can emit light and switch off; a camera unit (20) that captures an image of the player holding the stick (10); and a center unit (30) that generates a percussion instrument sound based on the position of the marker (15) while it emits light within the image-capture space captured by the camera unit (20). The stick (10) causes the marker (15) to emit light upon detecting the start of a downswing movement by the player, and causes the marker (15) to switch off upon detecting the end of this movement.
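A sketch of the split between stick-side and center-unit logic, with invented acceleration thresholds and a two-region image map; none of these values come from the abstract.

```python
DOWNSWING_START = -8.0   # vertical accel (m/s^2) marking a downswing start
DOWNSWING_END = 0.0      # crossing back marks the end (both illustrative)

# Center-unit side: image-space regions mapped to percussion sounds.
REGIONS = {"left": "snare", "right": "hi_hat"}

led_on = False

def stick_update(vertical_accel):
    """Stick-side logic: light the marker only during the downswing."""
    global led_on
    if not led_on and vertical_accel < DOWNSWING_START:
        led_on = True
    elif led_on and vertical_accel >= DOWNSWING_END:
        led_on = False
        return "strike"      # end of downswing: the hit lands
    return None

def camera_update(marker_x):
    """Center-unit logic: sound depends on where the lit marker was seen."""
    region = "left" if marker_x < 0.5 else "right"
    print("percussion sound:", REGIONS[region])

# Simulated downswing ending on the right side of the frame.
for accel, x in [(-2, 0.7), (-12, 0.7), (-12, 0.72), (1, 0.74)]:
    if stick_update(accel) == "strike":
        camera_update(x)
```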
Abstract:
An electric instrument music control device is provided having at least two multi-axis position sensors. One sensor is a reference multi-axis position sensor having at least one axis held in a fixed position. Another sensor is a moveable multi-axis position sensor rotatable about at least one axis corresponding to the at least one axis of the reference sensor. The device also includes a processor in communication with both sensors. The processor calculates an angular difference upon receiving the angular position of the at least one axis of the reference sensor and the angular position of the at least one axis of the moveable sensor. The angular difference correlates to a music effect of an electric instrument.
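The core computation reduces to a wrapped angular difference mapped onto an effect amount; the sketch below assumes single-axis readings in degrees and a MIDI-style 0..127 output, both illustrative choices.

```python
def angular_difference(reference_deg, moveable_deg):
    """Signed difference about the shared axis, wrapped to (-180, 180]."""
    diff = (moveable_deg - reference_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff

def effect_amount(reference_deg, moveable_deg, full_scale_deg=90.0):
    """Map the angular difference to a 0..127 controller value (e.g. wah)."""
    diff = abs(angular_difference(reference_deg, moveable_deg))
    return min(127, int(diff / full_scale_deg * 127))

# Reference sensor fixed at 10 degrees; moveable sensor tilted to 55 degrees.
print(effect_amount(10.0, 55.0))   # -> 63
```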
Abstract:
The present invention relates to an interactive sound-and-light art device with wireless transmission and sensing functions, primarily composed of a plurality of acoustic sensor nodes in artistic shapes, wherein each of the acoustic sensor nodes is designed to interact with people through multi-track music playing or voice-based exhibition of Twitter conversations. Substantially, each acoustic sensor node is an artistically-shaped frame having a plurality of sensors embedded therein, including sensors for detecting environmental information and sensors for detecting human motion. Moreover, each frame can further be embedded with interactive components, by which each acoustic sensor node interacts with people through multi-track music playing or the exhibition of LED light variations, according to the readings of its environment sensors and human motion sensors.
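A toy sketch of one sensor node's reaction loop; the node shapes, track names, and the dim-light-brightens-LED rule are all invented to show the motion/environment split.

```python
class SensorNode:
    """One artistically-shaped node reacting to motion and environment."""

    def __init__(self, name, track):
        self.name = name
        self.track = track       # the multi-track part this node plays

    def update(self, motion_detected, light_level):
        if motion_detected:
            print(f"{self.name}: playing track '{self.track}'")
        # Dim ambient light brightens the node's LEDs, and vice versa.
        led = int((1.0 - light_level) * 255)
        print(f"{self.name}: LED brightness {led}")

nodes = [SensorNode("flower", "melody"), SensorNode("bird", "percussion")]
for node in nodes:
    node.update(motion_detected=True, light_level=0.25)
```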
Abstract:
A sound generation timing is set when the position of a playing device main body (11) falls within a main region and, at the same time, within a sub region; a CPU (21) then generates a Note-On-Event with a tone stored in a main region/tone table and associated with the main region, and with a pitch stored in a sub region/pitch table and associated with the sub region. The Note-On-Event is transmitted from the playing device main body (11) to an electronic instrument unit (10), and a sound source unit (31) of the electronic instrument unit generates and outputs a musical sound with the tone and pitch in accordance with the Note-On-Event.
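A sketch of this two-table lookup, assuming positions normalized to a unit square and invented region boundaries, tones, and pitches.

```python
# Main regions select the tone; sub regions select the pitch (illustrative).
MAIN_REGION_TONE = {"upper": "marimba", "lower": "vibraphone"}
SUB_REGION_PITCH = {"left": 60, "center": 64, "right": 67}

def locate(x, y):
    """Map a device position to its (main region, sub region) pair."""
    main = "upper" if y > 0.5 else "lower"
    sub = "left" if x < 0.33 else ("center" if x < 0.66 else "right")
    return main, sub

def note_on_event(x, y):
    main, sub = locate(x, y)
    return {"tone": MAIN_REGION_TONE[main], "pitch": SUB_REGION_PITCH[sub]}

# The playing device reports its position; the sound source unit renders it.
event = note_on_event(0.7, 0.8)
print(event)   # {'tone': 'marimba', 'pitch': 67}
```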