Anything is possible. But it doesn’t really have anything to do with the Midi Team.
The Haiku API has a BMidiSynthFile class that can be used by any application to play MIDI files. The application that plays the event sounds (is this the app_server?) could use BMidiSynthFile to play MIDI files as well.
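For the record, playing a MIDI file that way only takes a few calls. A minimal sketch, assuming a MIDI file at a placeholder path (error handling trimmed):

```cpp
// Minimal sketch: play a MIDI file through the software synth
// with BMidiSynthFile. The file path is just a placeholder.
#include <Entry.h>
#include <MidiSynthFile.h>
#include <OS.h>

int main()
{
	entry_ref ref;
	if (get_ref_for_path("/boot/home/sample.mid", &ref) != B_OK)
		return 1;

	BMidiSynthFile synth;
	// LoadFile() reads the file and loads the instruments it needs.
	if (synth.LoadFile(&ref) != B_OK)
		return 1;

	synth.Start();              // playback runs in its own thread
	while (!synth.IsFinished())
		snooze(100000);         // poll every 0.1 s

	synth.Stop();
	return 0;
}
```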
However, I don’t think this is a very good idea. Event sounds are usually short, and MIDI files are usually lengthy. Who plays a three-minute song when new email arrives?
If you must, play the MIDI file with MidiPlayer, record the output as a WAV file, and use that WAV file as the event sound.
Aren’t MIDI files handled by audio translators? Don’t the event sounds get played through translators? If the answer to either of those questions is no, why?
When the new Media Kit was designed (after BeOS R3), Be originally decided to put MIDI into that new Media Kit. Later they changed their minds and released a separate Midi Kit. (The reason was that MIDI data is very different from other media data.)
So while WAVs, etc., are played by the Media Kit, MIDI files are handled by the Midi Kit. Of course, when the MIDI synthesizer plays back a MIDI file, it uses the Media Kit to produce the actual sounds. But the MIDI files themselves are decoded by the Midi Kit, and nowhere else.
Theoretically speaking, an audio translator could read MIDI data and output waveforms (so that translator would be the synthesizer). But that’s not how it happens now.
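To illustrate what that would mean: such a translator would advertise B_TRANSLATOR_MIDI input and B_TRANSLATOR_SOUND output through the Translation Kit. A rough, hypothetical skeleton — only the BTranslator hooks and format constants come from the Translation Kit headers; the class name is made up and the actual synthesis is left as a comment:

```cpp
// Hypothetical skeleton of a MIDI-to-WAV translator add-on.
#include <Translator.h>
#include <TranslatorFormats.h>

static const translation_format kInputFormats[] = {
	{ B_MIDI_FORMAT, B_TRANSLATOR_MIDI, 0.5, 0.5,
	  "audio/midi", "Standard MIDI file" }
};

static const translation_format kOutputFormats[] = {
	{ B_WAV_FORMAT, B_TRANSLATOR_SOUND, 0.5, 0.5,
	  "audio/x-wav", "RIFF WAV sound" }
};

class MidiSoundTranslator : public BTranslator {
public:
	virtual const char* TranslatorName() const { return "MIDI synth translator"; }
	virtual const char* TranslatorInfo() const { return "Renders MIDI to WAV"; }
	virtual int32 TranslatorVersion() const { return 100; }

	virtual const translation_format* InputFormats(int32* count) const
		{ *count = 1; return kInputFormats; }

	virtual const translation_format* OutputFormats(int32* count) const
		{ *count = 1; return kOutputFormats; }

	virtual status_t Identify(BPositionIO* source, const translation_format* format,
		BMessage* extension, translator_info* info, uint32 outType)
	{
		// Would check for the "MThd" magic of a Standard MIDI File here.
		return B_NO_TRANSLATOR;
	}

	virtual status_t Translate(BPositionIO* source, const translator_info* info,
		BMessage* extension, uint32 outType, BPositionIO* destination)
	{
		// This is where the translator would *be* the synthesizer:
		// feed the MIDI events through a softsynth and write the
		// rendered samples to 'destination' as WAV.
		return B_NOT_SUPPORTED;
	}
};

// Translator add-ons export this hook so the Translation Kit can load them.
BTranslator*
make_nth_translator(int32 n, image_id image, uint32 flags, ...)
{
	return (n == 0) ? new MidiSoundTranslator() : NULL;
}
```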
Heh… perhaps I drank too much beer this weekend, but weren’t the event sounds AIFF files in BeOS?
I think MP3 files for events would be far better than MIDI, or Ogg to stay with open-source formats. If MIDI has to go through the Midi Kit and then the Media Kit, please let’s not use it for event sounds…