midi

Simple embeddable MidiSynth for iOS?

岁酱吖の submitted on 2019-12-29 03:49:25
Question: I have a guitar diagram app for Android that I am porting to iOS. Android has an embedded MIDI synthesizer (SoniVox), so I can generate MIDI files and let Android handle the playback. Is there a way to do this on iOS? Or are there very lightweight embeddable synths for iOS?

Answer 1: Update: my answer is out of date; @lukebuehler's answer is more appropriate. If you don't mind a non-open-source solution, try FMOD. As a commercial audio engine for games, FMOD includes a simple MIDI synth. I've tried…
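
For reference, iOS does ship with a built-in way to render a standard MIDI file through a SoundFont or DLS bank: AVFoundation's AVMIDIPlayer. A minimal sketch of that route, assuming the app bundles its own MIDI file and sound bank (the URLs are the caller's responsibility; this is an illustration, not the FMOD approach from the answer):

import AVFoundation

// Plays a standard MIDI file through a bundled SoundFont/DLS bank using AVMIDIPlayer.
final class MidiFilePlayer {
    private var player: AVMIDIPlayer?   // keep a strong reference while playing

    func play(midiFileAt midiURL: URL, soundBankAt bankURL: URL) throws {
        let player = try AVMIDIPlayer(contentsOf: midiURL, soundBankURL: bankURL)
        player.prepareToPlay()
        player.play(nil)                // completion handler is optional
        self.player = player
    }
}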

How do I use the Swift sampler to play a tone then pause before playing the next?

烂漫一生 submitted on 2019-12-25 07:25:43
Question: I have code that takes a sequence of letters in a string and interprets them as notes. The code then plays the notes. The problem is that they all play at the same time. How do I play each one as a quarter note, essentially playing a note, waiting for it to end, and then playing the next note?

@IBAction func playButton(sender: AnyObject) {
    fractalEngine.output = "adgadefe"
    var notes = Array(fractalEngine.output.characters)
    var counter = 0
    while counter < notes.count {
        var note = notes[counter]
        if …
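
The usual fix is to stop triggering every note immediately and instead offset each note's start and stop time by its position in the sequence. A minimal sketch of that idea, assuming an AVAudioUnitSampler already attached to a running AVAudioEngine; the letter-to-note table is a placeholder:

import AVFoundation

// Example letter-to-MIDI-note table (placeholder values; adapt to your own mapping).
let noteTable: [Character: UInt8] = ["a": 57, "d": 62, "e": 64, "f": 65, "g": 67]

// Plays each letter as a quarter note, one after another.
func playSequence(_ letters: [Character], on sampler: AVAudioUnitSampler, bpm: Double = 120) {
    let quarterNote = 60.0 / bpm                              // seconds per quarter note
    for (index, letter) in letters.enumerated() {
        guard let note = noteTable[letter] else { continue }
        let startDelay = Double(index) * quarterNote
        DispatchQueue.main.asyncAfter(deadline: .now() + startDelay) {
            sampler.startNote(note, withVelocity: 80, onChannel: 0)
        }
        DispatchQueue.main.asyncAfter(deadline: .now() + startDelay + quarterNote) {
            sampler.stopNote(note, onChannel: 0)
        }
    }
}

Dispatch timing is only approximate; if the timing has to be tight, scheduling against the audio clock (for example with an AVAudioSequencer) is the safer route.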

Receiving sysex messages with AudioKit

橙三吉。 submitted on 2019-12-24 17:07:24
Question: I have an app which sends controller settings to a hardware synthesizer using sysex. In other words, each sysex message selects a parameter on the synth and sets its value. With AudioKit this is pretty simple. Such a message looks like this:

[240, 00, 32, 51, 1, 16, 112, 00, 40, 95, 247]

which sets parameter 40 (in parameter group 112) to 95; the bytes 00, 32, 51, 1 identify the synth model, the others the part number and channel. Now I am trying to build the opposite: the synth sends its parameters and…
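
For the receiving direction, the incoming sysex arrives as the same kind of byte array and just has to be picked apart again. A minimal sketch of that decoding, assuming the byte layout of the example message above (the field names and index positions are illustrative, not taken from the synth's actual spec):

struct SynthParameterChange {
    let group: UInt8
    let parameter: UInt8
    let value: UInt8
}

// Decodes a sysex message shaped like [240, 0, 32, 51, 1, 16, group, paramHi, paramLo, value, 247].
// Index assumptions follow the example in the question; adjust them to the real synth documentation.
func decode(sysex bytes: [UInt8]) -> SynthParameterChange? {
    guard bytes.count == 11, bytes.first == 0xF0, bytes.last == 0xF7 else { return nil }
    return SynthParameterChange(group: bytes[6], parameter: bytes[8], value: bytes[9])
}

// Example: the message from the question decodes to group 112, parameter 40, value 95.
let change = decode(sysex: [240, 0, 32, 51, 1, 16, 112, 0, 40, 95, 247])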

Accurate delays between notes when synthesising a song

泄露秘密 submitted on 2019-12-24 16:04:32
Question: I'm writing C++ code which plays both digital audio (synthesised music) and MIDI music at the same time (using the RtMidi library). The digitised music will play out of the computer's audio device, but the MIDI music could play out of an external synthesiser. I want to play a song that uses both digitised instruments and MIDI instruments, and I am not sure of the best way to synchronise these two audio streams. It is not possible to use a function like Sleep(), as the delay time is both uneven…
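
One common approach is to stop sleeping between events and instead treat the audio stream itself as the master clock: count the frames the render callback has produced, convert that to a song position, and send every MIDI event whose timestamp has come due. A minimal sketch of that idea (written in Swift for consistency with the other examples here; the same structure applies to a C++ render callback driving RtMidiOut):

struct MidiEvent {
    let time: Double        // song position in seconds
    let bytes: [UInt8]      // raw MIDI message to send
}

final class MidiScheduler {
    private var framesRendered: Int64 = 0
    private var pending: [MidiEvent]                 // kept sorted by time
    private let sampleRate: Double
    private let send: ([UInt8]) -> Void              // e.g. wraps the MIDI output call

    init(events: [MidiEvent], sampleRate: Double, send: @escaping ([UInt8]) -> Void) {
        self.pending = events.sorted { $0.time < $1.time }
        self.sampleRate = sampleRate
        self.send = send
    }

    // Call from the audio render path after producing `frameCount` frames of digital audio.
    func audioDidRender(frameCount: Int) {
        framesRendered += Int64(frameCount)
        let songPosition = Double(framesRendered) / sampleRate
        while let next = pending.first, next.time <= songPosition {
            send(next.bytes)
            pending.removeFirst()
        }
    }
}

In practice you would not call a blocking MIDI send from inside a real-time audio callback; hand the due events to a lightweight worker thread instead. The key point is only that the song position is derived from frames rendered, never from wall-clock sleeps.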

MIDI device connection and latency

蹲街弑〆低调 submitted on 2019-12-24 14:23:03
Question: I am currently programming a simple application that takes input from a MIDI device and outputs the corresponding sound. I have got everything working now, but there are two problems. First, when I plug in a MIDI device AFTER the program has started, or disconnect it while the program is running, it will not get recognized. Second, the latency is very high on OS X, Linux (Raspbian) and Windows alike. Does anyone know the solution to either of these issues? Here is what I have so…
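
The hot-plug half of this is platform specific, since most cross-platform MIDI libraries do not rescan devices on their own. As one example only (macOS/iOS with CoreMIDI; the rescanPorts hook is hypothetical and stands in for whatever re-enumeration the app does), a sketch of listening for devices appearing and disappearing:

import CoreMIDI

// Hypothetical hook: re-enumerate MIDI sources/destinations and reconnect input ports here.
func rescanPorts() {
    // ...
}

var midiClient: MIDIClientRef = 0

// The client's notification block fires whenever the MIDI setup changes,
// including devices being plugged in or unplugged while the app is running.
let status = MIDIClientCreateWithBlock("HotplugClient" as CFString, &midiClient) { notification in
    switch notification.pointee.messageID {
    case .msgObjectAdded, .msgObjectRemoved, .msgSetupChanged:
        rescanPorts()
    default:
        break
    }
}

The latency half usually has less to do with MIDI than with the audio output buffer size on each platform.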

How to get integer value from byte array returned by MetaMessage.getData()?

时间秒杀一切 submitted on 2019-12-24 08:38:00
Question: I need to get the tempo value from a MIDI file. I found out that the set-tempo meta event has type 0x51, so I have this piece of code:

for (int i = 0; i < tracks[0].size(); i++) {
    MidiEvent event = tracks[0].get(i);
    MidiMessage message = event.getMessage();
    if (message instanceof MetaMessage) {
        MetaMessage mm = (MetaMessage) message;
        if (mm.getType() == SET_TEMPO) {
            // now what?
            mm.getData();
        }
    }
}

But the method getData() returns an array of bytes! How can I convert it to normal human form, a.k.a.…
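
The three data bytes of a set-tempo meta event form a big-endian 24-bit integer giving microseconds per quarter note. A minimal sketch of the byte math (written in Swift to match the other examples; the same shifts work on the byte[] returned by Java's MetaMessage.getData()):

// data is the 3-byte payload of a set-tempo (0x51) meta event.
func tempo(fromMetaData data: [UInt8]) -> (microsecondsPerQuarterNote: Int, bpm: Double)? {
    guard data.count == 3 else { return nil }
    // Big-endian 24-bit value: byte 0 is the most significant.
    let usPerQuarter = (Int(data[0]) << 16) | (Int(data[1]) << 8) | Int(data[2])
    return (usPerQuarter, 60_000_000.0 / Double(usPerQuarter))
}

// Example: [0x07, 0xA1, 0x20] -> 500_000 microseconds per quarter note -> 120 BPM.

In Java, mask each byte with & 0xFF before shifting, because byte is a signed type there.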

Preventing a BPM ticker from slowly drifting out of sync with a real metronome

两盒软妹~` submitted on 2019-12-24 08:25:33
Question: I'm working on a music generator that takes a BPM value as input, after which it starts generating chords and bass notes and triggering a drum VSTi using MIDI signals. In order to keep everything running at the correct number of beats per minute, I'm using a wall-clock timer that starts at 0 when you hit play and then counts 1/128th notes as "ticks" at a regular interval. Every time the function ticks over, I check how many ticks into the future we are by simply…
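
The standard way to keep such a ticker from drifting is to never schedule the next tick relative to "now"; instead, derive every tick's deadline from the original start time and the tick index, so per-tick jitter can never accumulate. A minimal sketch of that idea, assuming a hypothetical fireTick callback into the generator:

import Foundation

final class BeatClock {
    private let secondsPerTick: Double              // duration of one 1/128th note
    private let start = DispatchTime.now()
    private var tickIndex = 0
    private let queue = DispatchQueue(label: "beat.clock")
    var fireTick: (Int) -> Void = { _ in }          // hypothetical callback into the generator

    init(bpm: Double) {
        // A quarter note lasts 60/bpm seconds; a 1/128th note is 1/32 of that.
        secondsPerTick = (60.0 / bpm) / 32.0
    }

    func scheduleNextTick() {
        tickIndex += 1
        // The deadline is computed from the fixed start time, not from "now",
        // so scheduling jitter on one tick does not push all later ticks back.
        let deadline = start + .nanoseconds(Int(Double(tickIndex) * secondsPerTick * 1_000_000_000))
        queue.asyncAfter(deadline: deadline) { [weak self] in
            guard let self = self else { return }
            self.fireTick(self.tickIndex)
            self.scheduleNextTick()
        }
    }
}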

Java MIDI note to string mapping via octave of a note

余生长醉 submitted on 2019-12-24 07:58:18
Question: In my project I want to be able to at least inform the user which string the note they need to play is on. I can get the note and its octave, but as I've discovered, that note and octave can appear in multiple places on a guitar fretboard. So my question is: is there any way to map a MIDI note to a guitar string?

Answer 1: Here's code that takes the MIDI note value and returns the position on the guitar fretboard closest to the end of the instrument. Fret zero is an open string.

static class…
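
For illustration, here is a small sketch of the underlying mapping (in Swift rather than the question's Java): in standard tuning the open strings correspond to MIDI notes 64, 59, 55, 50, 45 and 40, and every playable position is just the note minus an open-string note, kept when it falls within the fret range. Picking the "best" of these positions is then a policy choice, such as the answer's lowest-fret rule.

struct FretPosition {
    let string: Int   // 1 = high E, 6 = low E
    let fret: Int     // 0 = open string
}

// MIDI note numbers of the open strings in standard tuning, from high E down to low E.
let openStrings: [UInt8] = [64, 59, 55, 50, 45, 40]

// Returns every position where `note` can be played, assuming a neck with `fretCount` frets.
func positions(for note: UInt8, fretCount: Int = 22) -> [FretPosition] {
    openStrings.enumerated().compactMap { index, open in
        let fret = Int(note) - Int(open)
        guard fret >= 0, fret <= fretCount else { return nil }
        return FretPosition(string: index + 1, fret: fret)
    }
}

// Example: middle C (MIDI 60) -> string 2 fret 1, string 3 fret 5, string 4 fret 10, ...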

Access self from a C-style pointer [duplicate]

雨燕双飞 submitted on 2019-12-24 07:03:22
Question: This question already has an answer here: Swift: Pass data to a closure that captures context (1 answer). Closed 2 years ago.

I'm working on an application that utilises MIDI equipment. After some fooling around in a playground with CoreMIDI, I found out how to get the MIDI input signal, so I implemented this:

func makeInputSource() {
    var midiClient: MIDIClientRef = 0
    var inPort: MIDIPortRef = 0
    MIDIClientCreate("WobClient" as CFString, nil, nil, &midiClient)
    MIDIInputPortCreate(midiClient,…
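
The usual pattern for reaching self from a C-style MIDIReadProc is to pass it through the refCon pointer with Unmanaged, since the callback itself cannot capture context. A minimal sketch of that round trip (error handling and packet parsing omitted; the class and port names are illustrative):

import CoreMIDI

final class MidiInput {
    private var midiClient: MIDIClientRef = 0
    private var inPort: MIDIPortRef = 0

    func makeInputSource() {
        MIDIClientCreate("WobClient" as CFString, nil, nil, &midiClient)

        // The read proc is a plain C function pointer, so `self` has to travel through refCon.
        let readProc: MIDIReadProc = { packetList, refCon, _ in
            guard let refCon = refCon else { return }
            let me = Unmanaged<MidiInput>.fromOpaque(refCon).takeUnretainedValue()
            me.handle(packetList: packetList)
        }

        let refCon = Unmanaged.passUnretained(self).toOpaque()
        MIDIInputPortCreate(midiClient, "WobClient_InPort" as CFString, readProc, refCon, &inPort)
    }

    private func handle(packetList: UnsafePointer<MIDIPacketList>) {
        // Walk the packets here (MIDIPacketNext) and dispatch to the rest of the app.
    }
}

On newer SDKs, MIDIInputPortCreateWithBlock avoids the dance entirely, because the block can simply capture self.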

QuickBasic 4.5 Gravis Ultrasound Library

社会主义新天地 submitted on 2019-12-24 06:45:36
Question: I am currently working on a small project in QuickBasic that requires the use of MIDI files. As the DOS environment I'm using DOSBox 0.74, which provides emulation of the Gravis Ultrasound card. So far, I've been able to access the GUS only by using the PLAYMIDI.EXE file in the C:\ULTRASND directory. However, it is impossible to launch it from inside QuickBasic: the SHELL statement creates a child COMMAND.COM process, so when PLAYMIDI.EXE starts playing in the background the child process…