Abstract:
Disclosed is a platform enabling users to generate animations synchronized to a musical track in an app-based environment that facilitates collaboration. A method of generating such animations involves first receiving, by a host device executing an app, input data from one or more input devices. The app utilizes a layers module to assign the input data to one or more layers, and allows modification of the inputs and of the timestamps of the input data. The app further allows application of one or more media assets to the inputs. Users of the platform may each execute the app, or execute a companion app that allows communication with the host device. The app may also facilitate compiling the one or more layers into a musical animation file, which may store the one or more layers and metadata identifying the musical track.
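The layers model described in this abstract can be illustrated with a minimal sketch. All names here (`Layer`, `AnimationInput`, `compile_animation`) and the output format are hypothetical, not taken from the disclosure; the sketch only shows timestamped inputs grouped into layers and compiled with track metadata.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AnimationInput:
    timestamp_ms: int                     # position within the musical track
    value: str                            # the raw input (e.g. a gesture or tap id)
    media_asset: Optional[str] = None     # optional media asset applied to the input

@dataclass
class Layer:
    name: str
    inputs: list = field(default_factory=list)

def compile_animation(layers, track_id):
    """Bundle the layers plus metadata identifying the musical track."""
    return {
        "track_metadata": {"track_id": track_id},
        "layers": [
            {
                "name": layer.name,
                # inputs sorted by timestamp so playback order is explicit
                "inputs": [vars(i) for i in sorted(layer.inputs,
                                                   key=lambda i: i.timestamp_ms)],
            }
            for layer in layers
        ],
    }
```

Storing the track identifier as metadata, rather than embedding audio, matches the abstract's description of a file that identifies the musical track.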
Abstract:
An electronic device is provided. The electronic device includes a touch screen display, at least one of a speaker and a sound interface, a processor configured to electrically connect to the touch screen display, the speaker, and the sound interface, and a memory configured to electrically connect to the processor. The memory stores instructions for, when executed, causing the processor to display at least one item comprising a musical instrument shape on the touch screen display, receive a touch input through the touch screen display, load sound data corresponding to the at least one item based on the touch input, process the sound data based at least in part on information associated with the touch input, and output the processed sound data through the speaker or the sound interface.
Abstract:
An electric instrument music control device is provided having a foot pedal comprising a base portion and a treadle, wherein the treadle moves with respect to the base portion. The device further has a magnetic displacement sensor coupled to the base portion and a magnet coupled to the treadle. The magnet is located adjacent the magnetic displacement sensor to place the sensor in a field-saturated mode, wherein the magnet moves with respect to the magnetic displacement sensor in response to movement of the treadle with respect to the base portion. A sound characteristic of the electric instrument is modified in response to moving the magnet with respect to the magnetic displacement sensor.
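The treadle-to-sound mapping this abstract describes can be sketched as a normalization of the raw sensor reading into a control level; the calibration endpoints below are assumed values, not figures from the disclosure.

```python
# Assumed calibration endpoints for the magnetic displacement sensor reading.
SENSOR_HEEL = 120   # raw reading with the treadle fully back (assumption)
SENSOR_TOE = 880    # raw reading with the treadle fully forward (assumption)

def sensor_to_level(raw):
    """Normalize a raw sensor reading to a 0.0-1.0 sound-control level."""
    span = SENSOR_TOE - SENSOR_HEEL
    level = (raw - SENSOR_HEEL) / span
    # Clamp readings outside the calibrated range.
    return min(1.0, max(0.0, level))
```

The resulting level could then scale whatever sound characteristic the pedal modifies (volume, filter sweep, and so on).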
Abstract:
A wireless sensor network for musical instruments is provided that will allow a musician to communicate natural performance gestures (orientation, pressure, tilt, etc.) to a computer. User interfaces and computing modules are also provided that enable a user to utilize the data communicated by the wireless sensor network to supplement and/or augment artistic expression.
Abstract:
Methods and apparatus for choreographing movement of individuals for a performance event are disclosed. In an embodiment, a method includes providing a performance event configuration having a plurality of location indicators to assist in the placing or movement of individuals conducting a performance event. The method also includes providing each individual with a wireless audio unit and transmitting body movement instruction signals to the audio units of each individual. In this embodiment, the wireless audio unit may be a wireless, cellular, or mobile telephone. The audio units are configured to receive the signals and to play audio directions for each individual that correspond to choreographed and coordinated body movements directing the individuals at, towards, or away from the location indicators to carry out the performance event.
Abstract:
A computerized musical percussion instrument is disclosed. Markers carried by the musician are observed by an imager to produce a series of two-dimensional images over the time of the performance. A processor receives the images and distinguishes between markers (e.g. left hand, right hand) by comparing the position and size of unidentified markers in the current image to the position and size of identified markers in preceding images. The processor analyzes each marker's movements and detects a drum hit when a marker undergoes a sharp reversal of its motion direction after reaching sufficient speed. The processor determines which drum the musician intends to hit by comparing the position and size of the marker at the instant of the hit to the position and size attributes of each drum. The processor outputs an audio signal for each hit, corresponding to the drum hit, with a volume determined by marker speed.
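The hit-detection rule in this abstract (a sharp reversal of motion direction after sufficient speed) can be sketched in one dimension. The speed threshold and the per-frame vertical-position simplification are assumptions for illustration only.

```python
MIN_SPEED = 5.0  # assumed minimum downward speed (units/frame) before a hit counts

def detect_hits(y_positions):
    """Return frame indices where a fast downward stroke sharply reverses."""
    hits = []
    for i in range(2, len(y_positions)):
        prev_v = y_positions[i - 1] - y_positions[i - 2]  # velocity before frame i-1
        curr_v = y_positions[i] - y_positions[i - 1]      # velocity after frame i-1
        # Marker was moving down fast, then its direction reversed: hit at i-1.
        if prev_v <= -MIN_SPEED and curr_v > 0:
            hits.append(i - 1)
    return hits
```

A full implementation would work on 2-D marker tracks and then map each hit's position and size to a drum, as the abstract describes.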
Abstract:
An electronic device executing a music playing application is provided. The electronic device includes a communication module configured to receive control information from a second electronic device connected through wireless communication when the music playing application is executed, a processor configured to process the received control information to be applied to music data requested to be reproduced according to the execution of the music playing application, and an audio module configured to output the music data processed by the processor.
Abstract:
An audio/visual system (e.g., such as an entertainment console or other computing device) plays a base audio track, such as a portion of a pre-recorded song or notes from one or more instruments. Using a depth camera or other sensor, the system automatically detects that a user (or a portion of the user) enters a first collision volume of a plurality of collision volumes. Each collision volume of the plurality of collision volumes is associated with a different audio stem. In one example, an audio stem is a sound from a subset of instruments playing a song, a portion of a vocal track for a song, or notes from one or more instruments. In response to automatically detecting that the user (or a portion of the user) entered the first collision volume, the appropriate audio stem associated with the first collision volume is added to the base audio track or removed from the base audio track.
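The collision-volume mechanism in this abstract can be sketched with axis-aligned boxes, each tied to an audio stem; a stem toggles when the user enters its volume. The box geometry and stem names below are illustrative assumptions.

```python
def point_in_box(p, box):
    """box = ((xmin, ymin, zmin), (xmax, ymax, zmax)); p = (x, y, z)."""
    lo, hi = box
    return all(lo[i] <= p[i] <= hi[i] for i in range(3))

def stems_to_toggle(prev_pos, curr_pos, volumes):
    """Return the stems whose collision volume the user has just entered.

    Each entered volume's stem is then added to (or removed from) the
    base audio track by the caller.
    """
    return {
        stem
        for stem, box in volumes.items()
        if point_in_box(curr_pos, box) and not point_in_box(prev_pos, box)
    }
```

Comparing the previous and current positions ensures a stem toggles once on entry, rather than flapping on every sensor frame while the user stands inside the volume.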
Abstract:
A computer-implemented method including generating a user interface implemented on a touch-sensitive display configured to generate a virtual dual flywheel system for modulating a lifecycle of a musical note or chord. The dual flywheel system (DFS) includes a first virtual flywheel system (VFS) and a second VFS, where the first VFS is connected in series to the second VFS such that an output of the first VFS is coupled to an input of the second VFS. Upon receiving a user input on the user interface, the DFS determines a virtual momentum for the first VFS based on the user input and a predetermined mass coefficient of the first VFS, and determines a virtual momentum for the second VFS based on the virtual momentum of the first VFS and a predetermined mass coefficient of the second VFS.
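The series coupling described here reduces to a simple momentum chain: the first flywheel's momentum is derived from the user input and its mass coefficient, and the second's from the first's momentum and its own mass coefficient. The linear scaling below is an assumed stand-in for whatever dynamics the actual system uses.

```python
def dual_flywheel_momenta(user_input, mass_coeff_1, mass_coeff_2):
    """Compute the chained virtual momenta (p1, p2) of the two flywheels."""
    p1 = mass_coeff_1 * user_input  # first flywheel driven by the user input
    p2 = mass_coeff_2 * p1          # second flywheel driven by the first's output
    return p1, p2
```

The chaining is the essential point: the second flywheel never sees the raw user input, only the first flywheel's momentum.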
Abstract:
A musical instrument (1) includes a stick (10) to be held by a player and provided with a marker (15) at its leading end that emits light and switches off; a camera unit (20) that captures an image of the player holding the stick (10); and a center unit (30) that generates a percussion instrument sound based on the position of the marker (15) while it emits light in the image-capture space captured by the camera unit (20). The stick (10) causes the marker (15) to emit light on detecting the start of a down-swing movement by the player, and causes the marker (15) to switch off on detecting the end of this movement.
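The marker's on/off behavior is a small state machine: light on when a down swing starts, off when it ends. The angular-velocity threshold below is an assumed proxy for the stick's actual motion detection, which the abstract does not specify.

```python
DOWNSWING_THRESHOLD = -1.0  # assumed angular velocity marking a down swing

def marker_states(angular_velocities):
    """Return the marker's lit state for each motion sample."""
    lit = False
    states = []
    for w in angular_velocities:
        if not lit and w <= DOWNSWING_THRESHOLD:
            lit = True       # down swing detected: marker emits light
        elif lit and w > DOWNSWING_THRESHOLD:
            lit = False      # down swing ended: marker switches off
        states.append(lit)
    return states
```

Lighting the marker only during the down swing lets the camera unit treat any visible marker as an active stroke, simplifying the center unit's sound triggering.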