synthesizer

Converting MIDI file to raw audio using a software synth

岁酱吖の submitted on 2019-12-13 11:41:41
Question: I'm trying to dynamically generate a small MP4 audio+video file directly from my Android app. My original plan of attack:

1. The user enters some basic song data (a chord progression, etc.) and the app builds a MIDI file.
2. The system builds chord diagrams for each chord, and using a MIDI reader it generates the animation frames array, timed to the MIDI.
3. Convert the MIDI into raw PCM audio data <-- this S.O. question is specific to this point
4. Apply the raw audio to the animation frames and encode the audio and video frames into an MP4.
5. Provide the resulting MP4 video to the user with
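Step 3 is usually handled by a software synth such as FluidSynth rendering against a soundfont, but the core idea can be sketched in pure Python. This is a minimal illustration, assuming note events (start time, duration, MIDI note number) have already been extracted from the MIDI file; the sine-per-note rendering is a stand-in for a real soundfont synth:

```python
# Hypothetical sketch of step 3 (MIDI -> raw PCM). Assumes note events were
# already parsed out of the MIDI file as (start_sec, dur_sec, midi_note).
import math
import struct

SAMPLE_RATE = 44100

def midi_note_to_hz(note):
    # Standard equal-temperament mapping: A4 (MIDI note 69) = 440 Hz.
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

def render_pcm(events, total_sec):
    n = int(total_sec * SAMPLE_RATE)
    mix = [0.0] * n
    for start, dur, note in events:
        freq = midi_note_to_hz(note)
        first = int(start * SAMPLE_RATE)
        last = min(n, int((start + dur) * SAMPLE_RATE))
        for i in range(first, last):
            t = (i - first) / SAMPLE_RATE
            mix[i] += 0.2 * math.sin(2.0 * math.pi * freq * t)
    # Clamp to [-1, 1] and pack as signed 16-bit little-endian PCM.
    samples = [max(-1.0, min(1.0, s)) for s in mix]
    return struct.pack("<%dh" % n, *(int(s * 32767) for s in samples))

# One second of audio: middle C then E.
pcm = render_pcm([(0.0, 0.5, 60), (0.5, 0.5, 64)], 1.0)
```

The resulting byte string is exactly the kind of raw PCM buffer an MP4 encoder (e.g. Android's MediaCodec) expects as audio input.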

iOS sine wave generation - audible clicking

喜夏-厌秋 submitted on 2019-12-10 21:47:05
Question: I am in the process of creating a synthesiser for iOS. After playing around and attempting to learn Core Audio, I have encountered a problem that I cannot get my head around. My sine wave makes a clicking noise at regular intervals, which I'm guessing is related to the phase. I have looked at several guides and books on the subject, and all suggest that I am doing it correctly. If anybody would be so kind as to look at my code for me it would be greatly appreciated. static OSStatus renderInput
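Periodic clicks in a buffer-based synth typically come from resetting the oscillator phase at the start of every render callback, so each buffer begins at phase zero regardless of where the previous one ended. A minimal sketch of the fix (in Python for clarity; the same idea applies inside a Core Audio render callback): carry the phase across buffers and wrap it.

```python
# Phase-continuous sine oscillator: buffer N+1 starts exactly where
# buffer N ended, so there is no discontinuity (click) at the boundary.
import math

class SineOsc:
    def __init__(self, freq, sample_rate=44100.0):
        self.phase = 0.0
        self.increment = 2.0 * math.pi * freq / sample_rate

    def render(self, n_frames):
        out = []
        for _ in range(n_frames):
            out.append(math.sin(self.phase))
            self.phase += self.increment
            if self.phase >= 2.0 * math.pi:  # wrap to limit float drift
                self.phase -= 2.0 * math.pi
        return out

osc = SineOsc(440.0)
buf1 = osc.render(512)
buf2 = osc.render(512)
# The jump across the buffer boundary is no larger than the normal
# per-sample step inside a buffer.
boundary_jump = abs(buf2[0] - buf1[-1])
```

If the phase were reset to zero per buffer, `boundary_jump` could be as large as 1.0, which is the audible click.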

Sending pitch bend to MIDI sequencer in Java

雨燕双飞 submitted on 2019-12-10 17:09:06
Question: I understand the basics of getting a MIDI sequencer up and running, and I would like to be able to increase/decrease the pitch of the sequence during playback, but pitch bend is a message that gets sent to the synthesizer, not the sequencer. I tried setting the sequencer's receiver to be the synthesizer's transmitter, and when I sent pitch-bend short messages, the sequencer stayed at the same pitch but the synthesizer played a second track at the new pitch-bend value, creating some pretty
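The doubled-voice symptom usually means the default `Sequencer` was already wired to the default `Synthesizer`, so adding a second connection routes every event twice; obtaining an unconnected sequencer with `MidiSystem.getSequencer(false)` and wiring its transmitter to the synth's receiver yourself avoids that. Separately, the pitch-bend message itself is a 14-bit value (0–16383, centre 8192) split LSB-first across two 7-bit data bytes. A sketch of that byte layout (in Python purely to show the encoding; in Java these would be the arguments to `ShortMessage(ShortMessage.PITCH_BEND, channel, lsb, msb)`):

```python
# Encode a MIDI pitch-bend value as (status, lsb, msb) bytes.
def pitch_bend_bytes(bend, channel=0):
    assert 0 <= bend <= 16383          # 14-bit range, 8192 = no bend
    status = 0xE0 | channel            # 0xE0 is the pitch-bend status nibble
    lsb = bend & 0x7F                  # low 7 bits, sent first
    msb = (bend >> 7) & 0x7F           # high 7 bits
    return status, lsb, msb

center = pitch_bend_bytes(8192)        # (0xE0, 0x00, 0x40): centred wheel
```

Sending this to the synthesizer's receiver bends all notes currently sounding on that channel, which is why per-track bends require putting tracks on separate channels.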

Web Audio synthesis: how to handle changing the filter cutoff during the attack or release phase?

☆樱花仙子☆ submitted on 2019-12-04 02:30:59
I'm building an emulation of the Roland Juno-106 synthesizer using WebAudio. The live WIP version is here. I'm hung up on how to deal with updating the filter if the cutoff frequency or envelope modulation amount is changed during the attack or release while the filter is simultaneously being modulated by the envelope. That code is located around here. The current implementation doesn't respond the way an analog synth would, but I can't quite figure out how to calculate it. On a real synth the filter changes immediately as determined by the frequency cutoff, envelope modulation amount, and
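One common approach: when the cutoff or modulation amount changes mid-segment, read the envelope's current value, cancel the pending ramp, and restart a ramp from that value toward the new target over the remaining segment time (in Web Audio terms, `cancelScheduledValues` followed by `setValueAtTime` and a new `linearRampToValueAtTime` on the filter's `frequency` AudioParam). A linear-segment sketch of the retargeting idea in Python (a real Juno envelope is exponential; this only shows the recalculation, with all names hypothetical):

```python
# Restart an envelope segment from its current value toward a new target.
def retarget(current_value, new_target, remaining_time):
    """Return value(t): a ramp starting where the envelope is *now*."""
    if remaining_time <= 0:
        return lambda t: new_target
    slope = (new_target - current_value) / remaining_time
    return lambda t: current_value + slope * min(t, remaining_time)

# Mid-attack at value 0.4 the user raises the cutoff; 0.2 s of attack remains,
# and the new modulation target is 1.0.
env = retarget(current_value=0.4, new_target=1.0, remaining_time=0.2)
```

The key point is that the restarted ramp begins at the envelope's instantaneous value rather than at the segment's original start value, which is what makes the change sound immediate, as on the analog original.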


@property/@synthesize question

别说谁变了你拦得住时间么 submitted on 2019-12-02 18:16:35
I'm going through all of my documentation regarding memory management and I'm a bit confused about something. When you use @property, it creates getters/setters for the object: .h: @property (retain, nonatomic) NSString *myString; .m: @synthesize myString; I understand that, but where I get confused is the use of self. I see different syntax in different blogs and books. I've seen: myString = [[NSString alloc] initWithString:@"Hi there"]; or self.myString = [[NSString alloc] initWithString:@"Hi there"]; Then in dealloc I see: self.myString = nil; or [myString release];
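The difference is that `self.myString = ...` goes through the synthesized setter, which releases the old value and retains the new one, while a bare `myString = ...` assigns the ivar directly with no retain/release at all (and assigning an alloc'd object through a retain setter leaves it with a count of 2, a classic leak). A loose Python analogy of what a synthesized `(retain)` setter does under manual reference counting; the counters stand in for Objective-C retain counts and none of this is real Objective-C:

```python
# Illustrative stand-in for an Objective-C object with a retain count.
class Retained:
    def __init__(self):
        self.retain_count = 1        # alloc/init starts at 1
    def retain(self):
        self.retain_count += 1
    def release(self):
        self.retain_count -= 1

class Owner:
    def __init__(self):
        self._my_string = None
    @property
    def my_string(self):             # the synthesized getter
        return self._my_string
    @my_string.setter
    def my_string(self, new_value):  # the synthesized (retain) setter
        if new_value is not None:
            new_value.retain()       # retain the incoming object first
        if self._my_string is not None:
            self._my_string.release()  # then release the old one
        self._my_string = new_value

a, b = Retained(), Retained()
owner = Owner()
owner.my_string = a      # a retained by the property
owner.my_string = b      # b retained, a released
owner.my_string = None   # dealloc-style cleanup: b released
```

This also shows why `self.myString = nil;` in dealloc works: the setter releases the old value for you, whereas `[myString release];` does the release by hand without touching the ivar.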

python synthesize midi with fluidsynth

瘦欲@ submitted on 2019-12-01 01:14:25
Question: I can't import fluidsynth. [Maybe there's a better module?] I'm trying to synthesize MIDI from Python or pygame. I can send MIDI events from pygame. I'm using mingus, and it seemed pyfluidsynth would be good / easiest. I think this means pyfluidsynth is installed, but a separate fluidsynth was not. I don't know if it requires a 'fluidsynth' installer to work? test.py: import fluidsynth print ":(" error: Traceback (most recent call last): File "test.py", line 1, in <module> import fluidsynth
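That suspicion is right: pyfluidsynth is only a ctypes wrapper, so importing it fails unless the native FluidSynth shared library is also installed on the system (via an OS package manager such as apt or brew), separately from the pip package. A guarded import makes that failure mode explicit rather than crashing at startup:

```python
# pyfluidsynth wraps the native libfluidsynth with ctypes; pip-installing
# the Python package alone is not enough for the import to succeed.
try:
    import fluidsynth            # needs the system FluidSynth library
    HAVE_FLUIDSYNTH = True
except ImportError:
    HAVE_FLUIDSYNTH = False

if not HAVE_FLUIDSYNTH:
    print("FluidSynth not available; install the native library "
          "(e.g. 'apt-get install fluidsynth' or 'brew install fluidsynth')")
```

With the native library present, the usual flow is to create a `fluidsynth.Synth()`, load a SoundFont, and route note events to it.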

What is common case for @dynamic usage?

岁酱吖の submitted on 2019-11-29 11:20:23
There is a previous post about the difference between @synthesize and @dynamic. I want to know more about @dynamic from the perspective of how it is usually used. Usually we use @dynamic together with NSManagedObject: // Movie.h @interface Movie : NSManagedObject { } @property (retain) NSString* title; @end // Movie.m @implementation Movie @dynamic title; @end Actually there are no generated getters/setters at compile time, according to my understanding of @dynamic, so it is necessary to implement your own getter/setter. My question is: in this NSManagedObject case, what is the rough
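In the NSManagedObject case you typically do not write the accessors yourself: @dynamic is a promise that they will exist at runtime, and Core Data supplies them, backed by the managed object's attribute store rather than ivars. A loose Python analogy (nothing here is real Core Data; `__getattr__` plays the role of the runtime-provided accessor):

```python
# Attribute access resolved at runtime against a backing store,
# analogous to @dynamic properties on an NSManagedObject.
class ManagedRecord:
    def __init__(self, backing_store):
        # values live in a store, not in "compiled-in" attributes
        object.__setattr__(self, "_store", backing_store)

    def __getattr__(self, name):
        # called only when normal lookup fails -- the "dynamic" getter
        try:
            return self._store[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        # the "dynamic" setter writes into the store
        self._store[name] = value

movie = ManagedRecord({"title": "Alien"})
```

The parallel: the class declares no `title` accessor anywhere, yet `movie.title` works, because resolution is deferred to runtime machinery, which is exactly the contract @dynamic expresses to the compiler.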

Beginner Digital Synth

点点圈 submitted on 2019-11-29 02:29:40
I'm looking into writing an audio synthesizer in Java, and was wondering if anybody has any advice or good resources for writing such a program. I'm looking for info on generating raw sound waves, how to output them into a usable form (playing over speakers), as well as general theory on the topic. Thanks guys. This problem is basically about mapping functions to arrays of numbers. A language that supports first-class functions would come in really handy here. Check out http://www.harmony-central.com/Computer/Programming and http://www.developer.com/java/other/article.php/3071021 for some
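A minimal "first synth" in the spirit of that advice: map a function (here a sine) over sample indices, then write the result in a playable container. This sketch uses Python's standard library for brevity; in Java the equivalent output step would go through `javax.sound.sampled`:

```python
# Generate a sine tone and write it as a 16-bit mono WAV file.
import math
import struct
import wave

SAMPLE_RATE = 44100

def sine_samples(freq, seconds, amplitude=0.5):
    # "Mapping a function to an array of numbers": sample index -> amplitude.
    n = int(seconds * SAMPLE_RATE)
    return [amplitude * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            for i in range(n)]

def write_wav(path, samples):
    with wave.open(path, "wb") as w:
        w.setnchannels(1)        # mono
        w.setsampwidth(2)        # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        frames = struct.pack("<%dh" % len(samples),
                             *(int(s * 32767) for s in samples))
        w.writeframes(frames)

tone = sine_samples(440.0, 0.25)   # a quarter second of A440
write_wav("a440.wav", tone)
```

Swapping the sine for square, saw, or noise functions, and summing several of them, is the natural next step toward additive synthesis.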

MIDI player/synthesizer library for the iPhone

核能气质少年 submitted on 2019-11-29 00:41:24
Does anyone know if there is a free/cheap MIDI player/synthesizer library that I can incorporate into my iPhone application? As I understand it, the iPhone doesn't have native support for MIDI playback. To work around this limitation I've created a bank of .caf sound samples that I play back myself, but I'd really like to improve the implementation and use MIDI if possible. Any advice would be greatly appreciated. Since October 10, 2011, that is, since iOS 5, Apple has delivered a basic API for MIDI file playback. The API is called MusicPlayer, along with MusicSequence. Check this out: