Help on module musicpy.musicpy in musicpy:

NAME
    musicpy.musicpy

CLASSES
    mido_fix.midifiles.meta.MetaSpec_key_signature(mido_fix.midifiles.meta.MetaSpec)
        MetaSpec_key_signature

    class MetaSpec_key_signature(mido_fix.midifiles.meta.MetaSpec_key_signature)
     |  Method resolution order:
     |      MetaSpec_key_signature
     |      mido_fix.midifiles.meta.MetaSpec_key_signature
     |      mido_fix.midifiles.meta.MetaSpec
     |      builtins.object
     |
     |  Methods defined here:
     |
     |  check(self, name, value)
     |
     |  decode(self, message, data)
     |
     |  ----------------------------------------------------------------------
     |  Methods inherited from mido_fix.midifiles.meta.MetaSpec_key_signature:
     |
     |  encode(self, message)
     |
     |  ----------------------------------------------------------------------
     |  Data and other attributes inherited from mido_fix.midifiles.meta.MetaSpec_key_signature:
     |
     |  attributes = ['key']
     |
     |  defaults = ['C']
     |
     |  type_byte = 89
     |
     |  ----------------------------------------------------------------------
     |  Data descriptors inherited from mido_fix.midifiles.meta.MetaSpec:
     |
     |  __dict__
     |      dictionary for instance variables (if defined)
     |
     |  __weakref__
     |      list of weak references to the object (if defined)

FUNCTIONS
    C = trans(obj, pitch=4, duration=0.25, interval=None, custom_mapping=None, pitch_interval=True)

    N = to_note(notename, duration=0.25, volume=100, pitch=4, channel=None)

    S = to_scale(obj, pitch=None)

    adjust_to_scale(current_chord, current_scale)

    analyze_rhythm(current_chord, include_continue=True, total_length=None, remove_empty_beats=False, unit=None, find_unit_ignore_duration=False, merge_continue=True)

    arp = arpeggio(chord_type, start=3, stop=7, durations=0.25, intervals=0.03125, first_half=True, second_half=False)

    arpeggio(chord_type, start=3, stop=7, durations=0.25, intervals=0.03125, first_half=True, second_half=False)

    bar_to_real_time(bar, bpm, mode=0)

    build(*tracks_list, **kwargs)

    chord_progression(chords, durations=0.25, intervals=0, volumes=None, chords_interval=None, merge=True, scale=None, separator=',')

    chord_to_piece(current_chord, bpm=120, start_time=0, has_track_num=False)

    closest_note(note1, note2, get_distance=False)

    closest_note_from_chord(note1, chord1, mode=0, get_distance=False)

    concat(chordlist, mode='+', extra=None, start=None)

    dataclass_repr(s, keywords=None)

    degree_to_note(degree, duration=0.25, volume=100, channel=None)

    degrees_to_chord(ls, *args, **kwargs)

    distribute(current_chord, length=0.25, start=0, stop=None, method=, mode=0)

    dotted(duration, num=1)

    find_first_tempo(file, is_file=False)

    freq_to_note(freq, to_str=False, standard=440)

    get_accidental(current_note)

    get_chord(start, current_chord_type=None, duration=0.25, intervals=None, interval=None, cumulative=True, pitch=4, start_time=0, custom_mapping=None, pitch_interval=True)

    get_chord_by_interval(start, interval1, duration=0.25, interval=0, cumulative=True, start_time=0)

    get_chords_from_rhythm(chords, current_rhythm, set_duration=True)

    get_freq(y, standard=440)

    get_note_name(current_note)

    get_note_num(current_note)

    get_pitch_interval(note1, note2)

    get_ticks_per_beat(file, is_file=False)

    intervalof(current_chord, cumulative=True, translate=False)

    inversion(current_chord, num=1)

    is_valid_note(current_note)

    load_data(name)

    method_wrapper(cls)

    modulation(current_chord, old_scale, new_scale, **args)
        Change the notes (both melody and chords) of the given piece of music
        from one given scale to another, and return the new piece of music
        (see the example sketch after this listing).
    multi_voice(*current_chord, method=, start_times=None)

    note_range(note1, note2)

    note_to_degree(obj)

    parse_dotted(text, get_fraction=False)

    parse_num(duration, get_fraction=False)

    play(current_chord, bpm=120, channel=0, start_time=None, name='temp.mid', instrument=None, i=None, save_as_file=True, msg=None, nomsg=False, ticks_per_beat=None, ignore_instrument=False, ignore_bpm=False, ignore_track_names=False, wait=False, **midi_args)

    read(name, is_file=False, get_off_drums=False, clear_empty_notes=False, clear_other_channel_msg=False, split_channels=None)

    read_json(file)

    read_musicxml(file, load_musicxml_args={}, save_midi_args={})

    read_yaml(file)

    real_time_to_bar(time, bpm)

    relative_note(a, b)
        Return the notation of note a spelled from note b with accidentals
        (i.e. which accidentals note b needs to match the same pitch as note a).
        Works for the accidentals sharp, flat, natural, double sharp and double
        flat; a and b are strings that represent a note, optionally with
        accidentals (see the example sketch after this listing).

    reset(self, **kwargs)

    riff_to_midi(riff_name, name='temp.mid', output_file=False)

    secondary_dom(root, current_scale='major')

    secondary_dom7(root, current_scale='major')

    standardize_note(current_note)

    stopall()

    sums(*chordls)

    to_dict(current_chord, bpm=120, channel=0, start_time=None, instrument=None, i=None)

    to_note(notename, duration=0.25, volume=100, pitch=4, channel=None)

    to_scale(obj, pitch=None)

    to_tuple(obj)

    trans(obj, pitch=4, duration=0.25, interval=None, custom_mapping=None, pitch_interval=True)

    trans_note(notename, duration=0.25, volume=100, pitch=4, channel=None)

    translate(pattern, default_duration=0.125, default_interval=0, default_volume=100, start_time=None)

    write(current_chord, bpm=120, channel=0, start_time=None, name='temp.mid', instrument=None, i=None, save_as_file=True, msg=None, nomsg=False, ticks_per_beat=None, ignore_instrument=False, ignore_bpm=False, ignore_track_names=False, **midi_args)

    write_data(obj, name='untitled.mpb')

    write_json(current_chord, bpm=120, channel=0, start_time=None, filename='untitled.json', instrument=None, i=None)

    write_musicxml(current_chord, filename, save_musicxml_args={})

    write_yaml(current_chord, bpm=120, channel=0, start_time=None, filename='untitled.yaml', instrument=None, i=None)

DATA
    has_audio_interface = True

FILE
    c:\users\christian\appdata\local\packages\pythonsoftwarefoundation.python.3.9_qbz5n2kfra8p0\localcache\local-packages\python39\site-packages\musicpy\musicpy.py
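Example sketch for modulation (a minimal illustration only; the chord and
scale names below are assumptions, while the shorthand constructors C and S,
the concat helper and modulation itself are taken from the FUNCTIONS listing
above):

    from musicpy.musicpy import C, S, concat, modulation

    # Build a small progression in C major (illustrative chord names).
    progression = concat([C('C'), C('F'), C('G')], mode='+')

    # Remap every note of the progression from C major into A minor
    # and keep the result as a new chord object.
    shifted = modulation(progression, S('C major'), S('A minor'))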
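Example sketch for relative_note (the note names below are illustrative, and
the exact spelling of each result follows the library's accidental rules):

    from musicpy.musicpy import relative_note

    # Spell the pitch of C using the letter B (expected to add a sharp to B).
    print(relative_note('C', 'B'))

    # Spell the pitch of E using the letter F (expected to add a flat to F).
    print(relative_note('E', 'F'))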