Abstract:
A footswitch controller for an electric stringed musical instrument is provided. In one embodiment, the footswitch controller comprises a foot pedal assembly, base assembly, bottom plate assembly, battery pocket assembly, and compound assembly. The footswitch controller does not directly alter the input sound, but upon activation by a user sends a signal to the digital signal processor within the electric stringed musical instrument to alter the sound.
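A minimal Python sketch of the described control flow is given below; the names (FootswitchController, DSPInterface, set_effect) are illustrative assumptions and not part of the abstract. It shows only the key point: the footswitch never touches the audio path and instead signals the instrument's DSP.

# Hypothetical sketch of the described control flow: the footswitch does not
# alter the audio itself; it only notifies the instrument's DSP to change the sound.

class DSPInterface:
    """Stand-in for the digital signal processor inside the instrument."""
    def __init__(self):
        self.active_effect = "clean"

    def set_effect(self, effect_name: str) -> None:
        self.active_effect = effect_name
        print(f"DSP now applying: {effect_name}")


class FootswitchController:
    """Sends a control signal on activation; never processes the input sound."""
    def __init__(self, dsp: DSPInterface, effect_name: str):
        self.dsp = dsp
        self.effect_name = effect_name

    def on_press(self) -> None:
        # Activation by the user -> signal to the DSP to alter the sound.
        self.dsp.set_effect(self.effect_name)


dsp = DSPInterface()
overdrive_switch = FootswitchController(dsp, "overdrive")
overdrive_switch.on_press()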
Abstract:
A user interface implemented on a touch-sensitive display for a virtual musical instrument comprising a plurality of chord touch regions configured in a predetermined sequence, each chord touch region corresponding to a chord in a musical key and being divided into a plurality of separate touch zones, the plurality of chord touch regions defining a predetermined set of chords, where each of the plurality of separate touch zones in each region is associated with one or more preselected MIDI files stored in a computer-readable medium. In some embodiments, the touch zones are configured to provide different harmonic configurations of a base chord associated with the chord touch region. Some harmonic configurations provide progressively wider harmonic ranges across each adjacent touch zone. Other harmonic configurations can provide chords with a progressively higher relative pitch across each adjacent touch zone.
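A minimal Python sketch of the described data layout follows; the key, chord sequence, zone count, and file paths are illustrative assumptions. It shows chord touch regions in a predetermined sequence, each divided into touch zones that map to preselected MIDI files.

# Hypothetical sketch: chord touch regions in a key, each split into touch
# zones, each zone mapped to a preselected MIDI file. The zone index can model
# a progressively wider harmonic range or higher relative pitch per zone.

KEY_OF_C_CHORDS = ["C", "Dm", "Em", "F", "G", "Am", "Bdim"]  # predetermined sequence
ZONES_PER_REGION = 4

chord_regions = {
    chord: {
        zone: f"midi/{chord}_zone{zone}.mid"   # preselected MIDI file per zone
        for zone in range(ZONES_PER_REGION)
    }
    for chord in KEY_OF_C_CHORDS
}

def midi_for_touch(chord: str, zone: int) -> str:
    """Return the MIDI file associated with a touch in a given chord region and zone."""
    return chord_regions[chord][zone]

print(midi_for_touch("G", 2))  # -> midi/G_zone2.mid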
Abstract:
A music-based video game additionally evaluates a game player's physical performance. In some embodiments, the game player's physical performance is evaluated by determining whether an indication of controller movement matches predefined patterns while the game player responds to instructive cues to operate a representation of a musical instrument. A representation of a musician may also be provided, with the game player expected to mimic the movements of that representation. The game player may be awarded additional points for properly responding to instructive cues while moving in predefined manners.
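A minimal Python sketch of the scoring idea follows; the feature representation (sampled accelerations), tolerance, and point values are illustrative assumptions. It awards bonus points when controller motion roughly matches a predefined pattern while the player hits a cue.

# Hypothetical sketch: award bonus points when sampled controller motion
# roughly matches a predefined movement pattern while the player responds to a cue.

def matches_pattern(motion: list[float], pattern: list[float], tol: float = 0.2) -> bool:
    """Compare sampled controller readings against a predefined pattern."""
    if len(motion) != len(pattern):
        return False
    return all(abs(m - p) <= tol for m, p in zip(motion, pattern))

def score_cue(hit_cue: bool, motion: list[float], pattern: list[float]) -> int:
    base = 100 if hit_cue else 0
    bonus = 50 if hit_cue and matches_pattern(motion, pattern) else 0
    return base + bonus

print(score_cue(True, [0.1, 0.9, 0.2], [0.0, 1.0, 0.0]))  # 150: cue hit plus motion match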
Abstract:
A wireless sensor network for musical instruments is provided that allows a musician to communicate natural performance gestures (orientation, pressure, tilt, etc.) to a computer. User interfaces and computing modules are also provided that enable a user to utilize the data communicated by the wireless sensor network to supplement and/or augment the artistic expression.
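A minimal Python sketch of such a data path follows; the packet fields, JSON encoding, and the vibrato mapping are illustrative assumptions, not the patented protocol. It shows a sensor node packaging gesture data and a computer-side handler using it for expression.

# Hypothetical sketch: a sensor node packages gesture data (orientation,
# pressure, tilt) and a receiver on the computer maps it to an expressive parameter.

import json
from dataclasses import dataclass, asdict

@dataclass
class GesturePacket:
    node_id: int
    orientation: float  # degrees
    pressure: float     # normalized 0..1
    tilt: float         # degrees

def encode(packet: GesturePacket) -> bytes:
    """Serialize a gesture reading for wireless transmission."""
    return json.dumps(asdict(packet)).encode()

def handle(raw: bytes) -> None:
    """Computer-side handler that maps gesture data to an expressive parameter."""
    data = json.loads(raw)
    vibrato_depth = data["pressure"] * 100
    print(f"node {data['node_id']}: vibrato depth {vibrato_depth:.0f}%")

handle(encode(GesturePacket(node_id=3, orientation=45.0, pressure=0.6, tilt=12.0)))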
Abstract:
A user interface implemented on a touch-sensitive display for a virtual musical instrument comprising a plurality of chord touch regions configured in a predetermined sequence, each chord touch region corresponding to a chord in a musical key and being divided into a plurality of separate touch zones, the plurality of chord touch regions defining a predetermined set of chords, where each of the plurality of separate touch zones in each region is associated with one or more preselected MIDI files stored in a computer-readable medium. Each of the plurality of touch zones is configured to detect one or more of a plurality of touch gesture articulations including at least one of a legato articulation, a pizzicato articulation, or a staccato articulation. The one or more of the plurality of touch gesture articulations determines the preselected MIDI file associated with each of the plurality of separate touch zones.
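A minimal Python sketch of articulation-based file selection follows; the classification features (contact duration, finger movement), thresholds, and file names are illustrative assumptions. It picks the preselected MIDI file for the articulation detected in a touch zone.

# Hypothetical sketch: classify a touch gesture as legato, pizzicato, or
# staccato, then pick the MIDI file preselected for that articulation in the zone.

def classify_articulation(contact_ms: float, moved: bool) -> str:
    if moved:
        return "legato"            # sustained, sliding touch across the zone
    return "staccato" if contact_ms < 120 else "pizzicato"

zone_files = {
    "legato": "midi/C_zone1_legato.mid",
    "pizzicato": "midi/C_zone1_pizz.mid",
    "staccato": "midi/C_zone1_stacc.mid",
}

articulation = classify_articulation(contact_ms=80, moved=False)
print(zone_files[articulation])   # -> midi/C_zone1_stacc.mid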
Abstract:
Systems, apparatus, methods, and articles of manufacture provide for determining one or more chords and/or music notes to output (e.g., via a mobile device) based on a direction of movement and/or a speed of movement (e.g., of a mobile device). In some embodiments, determining a music note for output may comprise determining whether a speed of a mobile device has increased, decreased, or remained constant.
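A minimal Python sketch of the speed-based note selection follows; the thresholds, intervals, and base note are illustrative assumptions. It chooses a note depending on whether the device's speed has increased, decreased, or remained roughly constant.

# Hypothetical sketch: pick a MIDI note based on whether device speed has
# increased, decreased, or stayed roughly constant since the last sample.

def next_note(prev_speed: float, speed: float, base_note: int = 60) -> int:
    """Return a MIDI note number; 60 is middle C."""
    if speed > prev_speed * 1.1:
        return base_note + 4   # speed increased -> step up
    if speed < prev_speed * 0.9:
        return base_note - 3   # speed decreased -> step down
    return base_note           # roughly constant -> hold the base note

print(next_note(prev_speed=1.0, speed=1.5))  # -> 64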
Abstract:
The free-space gesture MIDI controller technique described herein combines free-space gesture controller technology with MIDI controller technology, allowing a user to control a wide variety of electronic musical instruments through body gesture and pose. One embodiment of the technique uses the human body gesture recognition capability of a free-space gesture control system and translates human gestures into musical actions. Rather than directly connecting a specific musical instrument to the free-space gesture controller, the technique generalizes its capability and instead outputs standard MIDI signals, thereby allowing the free-space gesture control system to control any MIDI-capable instrument.
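A minimal Python sketch of the gesture-to-MIDI translation follows; the pose names and note assignments are illustrative assumptions, and the pose recognizer itself is stubbed out. Only the MIDI Note On message format is standard.

# Hypothetical sketch: map recognized body poses to standard MIDI messages so
# that any MIDI-capable instrument can respond.

NOTE_ON = 0x90  # standard MIDI Note On status byte, channel 1

POSE_TO_NOTE = {
    "arms_raised": 72,   # C5
    "arm_left": 60,      # C4
    "arm_right": 67,     # G4
}

def pose_to_midi(pose: str, velocity: int = 100) -> bytes:
    """Translate a recognized gesture/pose into a 3-byte MIDI Note On message."""
    note = POSE_TO_NOTE[pose]
    return bytes([NOTE_ON, note, velocity])

print(pose_to_midi("arms_raised").hex())  # -> '904864'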
Abstract:
A new type of Markovian sequence generator and generation method generates a Markovian sequence having controllable properties, notably properties that satisfy at least one control criterion, which is a computable requirement on items in the sequence. The Markovian sequence is generated chunkwise, each chunk containing a plurality of items in the sequence. During generation of each chunk, a search is performed in the space of Markovian sequences to find a chunk-sized series of items that enables the control criterion to be satisfied. The search can be performed using a generate-and-test approach in which chunk-sized Markovian sequences are generated and then tested for compliance with the requirement(s) of the control criteria. Alternatively, the search can be performed by formulating the sequence-generation task as a constraint satisfaction problem, with one or more constraints ensuring that the generated sequence is Markovian and one or more constraints enforcing the requirement(s) of the control criteria. The sequence generator can be used in an interactive system where a user specifies the control criterion via an input device.
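A minimal Python sketch of the generate-and-test variant follows; the chord transition table, chunk size, retry limit, and example criterion are illustrative assumptions. Each chunk is resampled from a first-order Markov model until the control criterion holds on the sequence generated so far.

# Hypothetical sketch: emit the sequence chunkwise, resampling each chunk from
# a Markov model until a user-supplied control criterion is satisfied.

import random

TRANSITIONS = {"C": ["F", "G", "Am"], "F": ["C", "G"], "G": ["C", "Am"], "Am": ["F", "G"]}

def generate_chunk(start: str, size: int) -> list[str]:
    chunk, current = [], start
    for _ in range(size):
        current = random.choice(TRANSITIONS[current])
        chunk.append(current)
    return chunk

def generate(start: str, chunks: int, size: int, criterion, max_tries: int = 1000):
    sequence = [start]
    for _ in range(chunks):
        for _ in range(max_tries):          # generate and test
            candidate = generate_chunk(sequence[-1], size)
            if criterion(sequence + candidate):
                sequence += candidate
                break
        else:
            raise RuntimeError("no chunk satisfied the control criterion")
    return sequence

# Example control criterion: each accepted chunk must end on the tonic "C".
print(generate("C", chunks=2, size=4, criterion=lambda seq: seq[-1] == "C"))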
Abstract:
A method, apparatus, and computer product for: receiving a first user input indicative of a movement of a device; receiving a second user input indicative of a touch gesture entered on the device; determining that a combination of the first and second user inputs is associated with a function having at least first and second parameters; determining the first parameter based upon at least the first user input; determining the second parameter based upon at least the second user input; and causing the function to be performed according to the determined first and second parameters.
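A minimal Python sketch of this input combination follows; the specific movement ("shake"), gesture ("swipe"), selected function (vibrato), and parameter mappings are illustrative assumptions. The movement input determines the first parameter and the touch gesture determines the second.

# Hypothetical sketch: a device movement combined with a touch gesture selects
# a function; each input determines one of the function's parameters.

def apply_vibrato(depth: float, rate: float) -> str:
    return f"vibrato depth={depth:.2f}, rate={rate:.2f}"

FUNCTIONS = {("shake", "swipe"): apply_vibrato}   # input combination -> function

def handle_inputs(movement: str, intensity: float, gesture: str, length: float) -> str:
    func = FUNCTIONS[(movement, gesture)]
    first_param = intensity          # determined from the movement input
    second_param = length * 2.0      # determined from the touch gesture input
    return func(first_param, second_param)

print(handle_inputs("shake", 0.7, "swipe", 1.5))  # -> vibrato depth=0.70, rate=3.00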