Abstract:
A performance-information processing apparatus processes performance information entered into it. When performance information is entered in a time interval between (i) a starting time point at which performance information of one note starts being entered and (ii) a first timing that is a certain time after performance information of another note has been entered following the performance information of said one note, a tempo determining unit determines a tempo of the performance information based on the performance information entered in the time interval, and a meter determining unit determines a meter of the performance information based on the tempo determined by the tempo determining unit.
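A minimal Python sketch of how such tempo and meter determination could be realised. The NoteEvent representation, the median inter-onset-interval heuristic, and the bar-boundary scoring are illustrative assumptions, not details taken from the abstract.

```python
from dataclasses import dataclass

@dataclass
class NoteEvent:
    onset_sec: float   # time at which the note starts entering
    pitch: int         # MIDI note number (unused for tempo estimation)

def estimate_tempo(events: list[NoteEvent]) -> float:
    """Estimate a tempo (BPM) from the inter-onset intervals of the
    notes entered inside the observation window."""
    onsets = sorted(e.onset_sec for e in events)
    intervals = [b - a for a, b in zip(onsets, onsets[1:]) if b > a]
    if not intervals:
        return 120.0                                   # fallback default
    beat = sorted(intervals)[len(intervals) // 2]      # median interval as the beat
    return 60.0 / beat

def estimate_meter(tempo_bpm: float, events: list[NoteEvent]) -> int:
    """Pick beats-per-bar (3 or 4) by scoring how many onsets fall near
    a bar boundary for each candidate bar length."""
    beat = 60.0 / tempo_bpm
    onsets = [e.onset_sec for e in events]
    t0 = min(onsets)
    best, best_score = 4, -1.0
    for beats_per_bar in (3, 4):
        bar = beats_per_bar * beat
        score = sum(1 for t in onsets
                    if min((t - t0) % bar, bar - (t - t0) % bar) < 0.05 * bar)
        if score > best_score:
            best, best_score = beats_per_bar, score
    return best
```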
Abstract:
A method implemented by a processor includes receiving performance data including pitch data; determining, based on the pitch data that is included in the received performance data, a key among a plurality of keys; selecting, based on the determined key and the pitch data, a first-type image from among a plurality of first-type images; and displaying the selected first-type image.
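A hedged Python sketch of the key determination and image selection described above. The Krumhansl-style profile, the histogram-correlation method, and the image file-name scheme are assumptions for illustration only; the abstract does not specify how the key is determined or how images are organised.

```python
# Simplified major-key profile weights (Krumhansl-style), one per scale degree.
MAJOR_PROFILE = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
KEY_NAMES = ["C", "C#", "D", "D#", "E", "F",
             "F#", "G", "G#", "A", "A#", "B"]

def determine_key(pitches: list[int]) -> str:
    """Determine a key from pitch data by correlating a pitch-class
    histogram with a rotated major-key profile."""
    hist = [0.0] * 12
    for p in pitches:
        hist[p % 12] += 1.0
    def score(tonic: int) -> float:
        return sum(hist[(tonic + i) % 12] * MAJOR_PROFILE[i] for i in range(12))
    return KEY_NAMES[max(range(12), key=score)]

def select_first_type_image(key: str, pitch: int) -> str:
    """Select a first-type image based on the determined key and the
    pitch; the path layout here is purely hypothetical."""
    degree = (pitch - KEY_NAMES.index(key)) % 12
    return f"images/first_type/{key}_degree{degree}.png"
```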
Abstract:
An automatic key adjusting apparatus is provided, which determines keys of an input melody in real time and adjusts the determined keys in non-real time to obtain accurate keys, thereby enhancing the accuracy of chord placement. The automatic key adjusting apparatus is provided with a keyboard for playing a melody of a musical piece. A CPU judges keys of the melody in real time based on a history of pitches of the played melody, and adjusts the result of the key judgment in non-real time after the melody of the musical piece has been played.
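A minimal sketch, in Python, of the two-stage idea: a per-note real-time judgment from the pitch history so far, followed by a non-real-time revision once the whole piece is available. The most-frequent-pitch-class heuristic and the window size are deliberately crude placeholders, not the apparatus's actual method.

```python
from collections import Counter

class AutomaticKeyAdjuster:
    """Judge a key per note in real time from the pitch history so far,
    then revise the whole judgment after the performance ends."""

    def __init__(self):
        self.pitch_history: list[int] = []
        self.realtime_keys: list[int] = []   # tonic pitch class per note

    def on_note(self, pitch: int) -> int:
        """Real-time judgment: take the most frequent pitch class so far
        as a rough tonic (a placeholder heuristic)."""
        self.pitch_history.append(pitch % 12)
        tonic = Counter(self.pitch_history).most_common(1)[0][0]
        self.realtime_keys.append(tonic)
        return tonic

    def adjust(self, window: int = 8) -> list[int]:
        """Non-real-time adjustment: re-estimate each note's key using
        pitches both before and after it, which real time cannot do."""
        adjusted = []
        for i in range(len(self.pitch_history)):
            lo = max(0, i - window)
            hi = min(len(self.pitch_history), i + window + 1)
            tonic = Counter(self.pitch_history[lo:hi]).most_common(1)[0][0]
            adjusted.append(tonic)
        return adjusted
```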
Abstract:
A method performed by one or more processors in an information processing device for an electronic musical instrument includes, via the one or more processors: receiving performance data generated by a user performance of the electronic musical instrument; extracting time-series characteristics of a sequence of notes from the performance data; detecting a performance technique from the extracted characteristics; and generating image data reflecting the detected performance technique and outputting the generated image data.
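A hedged Python sketch of the extract-characteristics, detect-technique, generate-image pipeline. The feature set, the technique labels, and all thresholds are illustrative guesses; the abstract does not name any particular characteristics or techniques.

```python
from dataclasses import dataclass

@dataclass
class Note:
    onset: float      # seconds
    duration: float   # seconds
    pitch: int        # MIDI note number

def extract_characteristics(notes: list[Note]) -> dict:
    """Extract simple time-series characteristics from a note sequence."""
    gaps = [b.onset - (a.onset + a.duration) for a, b in zip(notes, notes[1:])]
    steps = [abs(b.pitch - a.pitch) for a, b in zip(notes, notes[1:])]
    return {
        "mean_gap": sum(gaps) / len(gaps) if gaps else 0.0,
        "mean_step": sum(steps) / len(steps) if steps else 0.0,
        "alternating": len(notes) >= 6 and all(s <= 2 for s in steps),
    }

def detect_technique(chars: dict) -> str:
    """Map the extracted characteristics to a technique label
    (thresholds are illustrative, not from the abstract)."""
    if chars["alternating"] and chars["mean_gap"] < 0.02:
        return "trill"
    if chars["mean_gap"] < 0.0:          # successive notes overlap
        return "legato"
    if chars["mean_gap"] > 0.15:
        return "staccato"
    return "normal"

def generate_image_data(technique: str) -> bytes:
    """Produce image data reflecting the technique; a placeholder for
    whatever rendering the actual device performs."""
    return f"IMAGE[{technique}]".encode("utf-8")
```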
Abstract:
For example, provided is an information processing device with which an entire performance can be perceived visually. The information processing device TB according to the present invention includes a processor performing a reception process of receiving input of performance information including pitch information, a first image output process of outputting a first image according to the received performance information, a performance determination process of determining at least one of a tonality, a chord type, and a pitch name on the basis of the received performance information, and a second image output process of outputting a second image according to a result determined in the performance determination process.
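A small Python sketch of the performance determination and second image output processes, under the assumption that a chord type and pitch name are read off a pitch-class set with the lowest note taken as the root; the chord table and the overlay file layout are hypothetical.

```python
# Interval sets relative to an assumed root, mapped to chord types.
CHORD_TYPES = {
    frozenset({0, 4, 7}): "major",
    frozenset({0, 3, 7}): "minor",
    frozenset({0, 4, 7, 10}): "dominant7",
    frozenset({0, 3, 6}): "diminished",
}
PITCH_NAMES = ["C", "C#", "D", "D#", "E", "F",
               "F#", "G", "G#", "A", "A#", "B"]

def determine_performance(pitches: list[int]) -> dict:
    """Determine a pitch name and a chord type from received pitch
    information; tonality estimation is omitted for brevity."""
    if not pitches:
        return {}
    root = min(pitches)                      # assume the lowest note is the root
    classes = frozenset((p - root) % 12 for p in pitches)
    return {
        "pitch_name": PITCH_NAMES[root % 12],
        "chord_type": CHORD_TYPES.get(classes, "unknown"),
    }

def output_second_image(result: dict) -> str:
    """Output (here: name) a second image according to the determination
    result; the file layout is a hypothetical example."""
    return f"overlays/{result.get('pitch_name', 'NA')}_{result.get('chord_type', 'none')}.png"
```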
Abstract:
An electronic wind instrument according to one aspect of the present invention includes a plurality of performance keys for specifying pitches, a breath sensor which detects at least a breath input operation, and a controller (CPU), wherein the controller (CPU) selectively switches between a first mode of outputting first sound waveform data generated on the basis of the breath input operation and operation of at least one performance key from among the plurality of performance keys, and a second mode of, when the breath input operation is detected, outputting second sound waveform data based on musical piece data regardless of whether or not operation of the at least one performance key is detected.
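A hedged Python sketch of the mode-switching behaviour: in the first mode the output follows breath plus key operation; in the second mode a detected breath advances stored musical piece data regardless of key operation. The class, its method names, and the pitch-list return value are stand-ins for a real waveform engine.

```python
from enum import Enum, auto

class Mode(Enum):
    FIRST = auto()   # sound follows breath input plus key operation
    SECOND = auto()  # sound follows stored musical piece data

class WindController:
    """Illustrative controller; waveform generation is abstracted away."""

    def __init__(self, piece_data: list[int]):
        self.mode = Mode.FIRST
        self.piece_data = piece_data   # pitches of the stored musical piece
        self.position = 0

    def set_mode(self, mode: Mode) -> None:
        self.mode = mode

    def on_breath(self, breath_detected: bool, pressed_keys: list[int]) -> list[int] | None:
        """Return the pitches to sound for this breath event, or None."""
        if not breath_detected:
            return None
        if self.mode is Mode.FIRST:
            # First mode: output is based on the breath and the keys pressed.
            return pressed_keys or None
        # Second mode: breath alone advances the stored piece, regardless
        # of whether any performance key is operated.
        if self.position < len(self.piece_data):
            pitch = self.piece_data[self.position]
            self.position += 1
            return [pitch]
        return None
```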
Abstract:
An automatic accompaniment apparatus is provided. The apparatus is provided with a music database having music data of plural musical pieces recorded therein, the music data including melody information and chords corresponding to the melody information, a performance recording unit for recording performance information for instructing generation of a musical tone in response to a performance operation, a music searching unit for searching the music database for music data including melody information corresponding to the performance information in the performance recording unit, a chord judging unit for judging chords from the performance information in the performance recording unit, a chord selecting unit for selecting among the chords included in the music data found by the music searching unit and the chords judged by the chord judging unit, and an automatic accompaniment unit for instructing generation of an accompaniment in accordance with the selected chords.
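A minimal Python sketch of the recording, searching, chord judging, and chord selecting units. The naive subsequence match, the every-fourth-note chord heuristic, and the "prefer database chords, fall back to judged chords" policy are assumptions made for illustration; the abstract does not define how the selection is made.

```python
from dataclasses import dataclass, field

@dataclass
class MusicData:
    title: str
    melody: list[int]          # pitch sequence
    chords: list[str]          # chord names corresponding to the melody

@dataclass
class AutoAccompaniment:
    database: list[MusicData]
    recorded: list[int] = field(default_factory=list)

    def record(self, pitch: int) -> None:
        """Performance recording unit: keep the played pitches."""
        self.recorded.append(pitch)

    def search(self) -> MusicData | None:
        """Music searching unit: find a piece whose melody contains the
        recorded pitch sequence (a deliberately naive match)."""
        for piece in self.database:
            m, r = piece.melody, self.recorded
            if any(m[i:i + len(r)] == r for i in range(len(m) - len(r) + 1)):
                return piece
        return None

    def judge_chords(self) -> list[str]:
        """Chord judging unit: crude heuristic that names a chord on the
        pitch class of every fourth recorded note."""
        names = ["C", "C#", "D", "D#", "E", "F",
                 "F#", "G", "G#", "A", "A#", "B"]
        return [names[p % 12] for p in self.recorded[::4]]

    def select_chords(self) -> list[str]:
        """Chord selecting unit: prefer chords from the found music data,
        otherwise fall back to the judged chords."""
        found = self.search()
        return found.chords if found else self.judge_chords()
```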