Abstract:
A graphical user interface (GUI) for an audio editing application enables a user to easily and conveniently shift the timing and/or pitch of a sequence of note events within a musical piece, e.g., via a touch-sensitive display. The GUI displays a set of note events on a matrix grid and a subset of the note events (e.g., selected by the user) on a note events grid that overlaps the matrix grid. The note events grid is moveable with respect to the matrix grid such that the subset of note events is shifted relative to the remaining note events while the note events within the subset maintain their spatial relationship to each other. Further, the user can shift the note events grid (and the note events therein) to any location within the matrix grid without unintentionally snapping the note events to the nearest grid location on the matrix grid.
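A minimal sketch of the shifting behavior described above, assuming note events are represented as beat/pitch coordinates; the names (NoteEvent, shift_selection) and the optional quantization parameter are hypothetical and not part of the abstract:

```python
from dataclasses import dataclass
from typing import List, Set

@dataclass
class NoteEvent:
    start_beat: float   # horizontal position on the matrix grid (time)
    pitch: int          # vertical position on the matrix grid (MIDI note number)
    duration: float

def shift_selection(notes: List[NoteEvent],
                    selected: Set[int],
                    delta_beats: float,
                    delta_pitch: int,
                    snap_to_grid: bool = False,
                    grid_size: float = 0.25) -> List[NoteEvent]:
    """Shift the selected subset by a common offset so the selected notes
    keep their spatial relationship to one another; other notes stay put."""
    shifted = []
    for i, note in enumerate(notes):
        if i in selected:
            new_start = note.start_beat + delta_beats
            if snap_to_grid:
                # Optional quantization; the GUI described above avoids this
                # so the subset can land at any position on the matrix grid.
                new_start = round(new_start / grid_size) * grid_size
            shifted.append(NoteEvent(new_start, note.pitch + delta_pitch, note.duration))
        else:
            shifted.append(note)
    return shifted
```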
Abstract:
Embodiments of the invention relate to a computer-implemented method that includes receiving musical data, identifying a succession of accentuated events in the musical data, determining a pattern in the succession of accentuated events, comparing the pattern to a plurality of reference patterns, and determining a match for the pattern using the plurality of reference patterns. The method further includes selecting one of the matching reference patterns and generating a rhythmic musical accompaniment for the musical data based on the selected matching reference pattern. In some cases, the musical data is MIDI data or analog audio data; for analog audio data, identifying the succession of accentuated events includes detecting transients in the audio. Identifying the succession of accentuated events may also include identifying a plurality of events in the musical data and determining whether each of the plurality of events is an accent.
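A rough illustration, under assumed representations, of the accent-classification and pattern-matching steps described above; the reference patterns, threshold value, and function names are hypothetical:

```python
from typing import List, Sequence, Tuple

# Hypothetical reference patterns: each is a binary accent sequence
# (1 = accented beat, 0 = unaccented) paired with an accompaniment name.
REFERENCE_PATTERNS: List[Tuple[Sequence[int], str]] = [
    ((1, 0, 0, 1, 0, 0, 1, 0), "bossa_nova_kit"),
    ((1, 0, 1, 0, 1, 0, 1, 0), "straight_rock_kit"),
    ((1, 0, 0, 0, 1, 0, 0, 0), "half_time_kit"),
]

def accent_pattern(event_strengths: Sequence[float], threshold: float = 0.6) -> Tuple[int, ...]:
    """Classify each detected event as an accent (1) or not (0)."""
    return tuple(1 if s >= threshold else 0 for s in event_strengths)

def best_match(pattern: Sequence[int]) -> str:
    """Score each reference pattern by positional agreement and return
    the accompaniment associated with the best-scoring pattern."""
    def score(ref: Sequence[int]) -> int:
        return sum(p == r for p, r in zip(pattern, ref))
    _, name = max(REFERENCE_PATTERNS, key=lambda pr: score(pr[0]))
    return name

# Example: strengths might come from MIDI velocities or audio transient energy.
strengths = [0.9, 0.2, 0.1, 0.8, 0.3, 0.2, 0.7, 0.1]
print(best_match(accent_pattern(strengths)))  # -> "bossa_nova_kit"
```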
Abstract:
Embodiments of the invention include storing musical elements in a database and processing performance data, where the musical elements include a plurality of reference accent pattern data and a plurality of reference system pattern data. Processing performance data can include receiving input data corresponding to a musical performance, determining an accent pattern for the musical performance, matching the accent pattern to one or more reference accent pattern data in the database, and selecting one of the matching reference accent patterns. Processing performance data further includes receiving input corresponding to a selection of a musical style and one or more musical performance parameters, and generating a musical accompaniment based on the processed performance data, the selected musical style, and the selected one or more musical performance parameters.
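A simplified sketch of the database-backed flow described above, assuming an in-memory store keyed by style; the class and function names (PatternDatabase, generate_accompaniment) and the parameter keys are hypothetical:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Sequence

@dataclass
class PatternDatabase:
    """Hypothetical in-memory 'database' of reference accent patterns, keyed by style."""
    accent_patterns: Dict[str, List[Sequence[int]]] = field(default_factory=dict)

    def add(self, style: str, pattern: Sequence[int]) -> None:
        self.accent_patterns.setdefault(style, []).append(tuple(pattern))

    def match(self, style: str, performed: Sequence[int]) -> Sequence[int]:
        """Return the stored reference accent pattern closest to the performance."""
        candidates = self.accent_patterns.get(style, [])
        return max(candidates, key=lambda ref: sum(p == r for p, r in zip(performed, ref)))

def generate_accompaniment(db: PatternDatabase,
                           performed_accents: Sequence[int],
                           style: str,
                           params: Dict[str, float]) -> dict:
    """Combine the matched reference pattern with the chosen style and
    performance parameters (e.g., tempo) into an accompaniment spec."""
    reference = db.match(style, performed_accents)
    return {"style": style, "pattern": reference, **params}

db = PatternDatabase()
db.add("swing", (1, 0, 0, 1, 0, 0, 1, 0))
db.add("swing", (1, 0, 1, 0, 1, 0, 1, 0))
spec = generate_accompaniment(db, (1, 0, 0, 1, 0, 1, 1, 0), "swing", {"tempo": 120.0})
print(spec)
```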
Abstract:
Some embodiments provide a music editing application that enables a user to compose and edit note characteristics, e.g., via a touch-sensitive display. The application's graphical user interface (GUI) can display a portion of a music track including note events. In response to receiving a user selection of a note event and a user indication for editing the note event, the GUI can display a menu providing a list of characteristics. The characteristics can include an option for associating one of several virtual instruments or one of several articulations with the note event. Upon receiving a user input indicating a characteristic, the matrix editor can associate the note event with that characteristic. Because each note event can carry an extended amount of associated data, the music editing application allows the user to edit additional note characteristics (e.g., an instrument, an articulation).
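A brief sketch of how a note event might carry the extended characteristics described above; the field names, the Articulation values, and the menu/apply helpers are hypothetical illustrations, not the application's actual data model:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Dict, List, Optional

class Articulation(Enum):
    LEGATO = "legato"
    STACCATO = "staccato"
    PIZZICATO = "pizzicato"

@dataclass
class NoteEvent:
    pitch: int
    start_beat: float
    duration: float
    velocity: int = 96
    # Extended per-note data beyond basic pitch/timing:
    instrument: Optional[str] = None
    articulation: Optional[Articulation] = None

def characteristic_menu() -> Dict[str, List[str]]:
    """Options a GUI menu might present for the selected note event."""
    return {
        "instrument": ["violin", "cello", "flute"],
        "articulation": [a.value for a in Articulation],
    }

def apply_characteristic(note: NoteEvent, kind: str, value: str) -> NoteEvent:
    """Associate the chosen characteristic with the note event."""
    if kind == "instrument":
        note.instrument = value
    elif kind == "articulation":
        note.articulation = Articulation(value)
    return note

note = NoteEvent(pitch=60, start_beat=4.0, duration=1.0)
apply_characteristic(note, "articulation", "staccato")
print(note)
```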