Hello, welcome, everyone, and thanks for coming. The slightly flippant title of my talk is "Better MIDI by Friday". I'm alluding to the fact that I want to talk about some of the many things that are happening in the MIDI Manufacturers Association, and specifically the ones that are imminent. So it's not actually by Friday, but we hope they'll arrive soon and become part of a spec. [Is there something we can do about that sound? Okay, you want me to keep talking? Okay. Technology. Can everybody hear me anyway? Oh, good. Right, let's be handheld.] So there are two things I want to talk about specifically. The first is MIDI Capability Enquiry, or if you're American, Inquiry; and the next-generation protocol, which I will say very little about, because, well, it'll become clear why I'm not going to say much about it. So, three quick disclaimers. As I'm speaking on behalf of the MIDI Manufacturers Association, saying these things now saves me saying them with every slide. The first thing is time: I have 20 minutes, 25 minutes I think, to explain the things that I need to explain, and that's only really time for an overview. It's time to inform a little and enthuse a little, but it's not a how-to and it's not a technical deep dive; it's simply an introduction to what's around
the corner. The second caveat is that I have a front row full of people who know much more about this subject than I do. While I led the efforts on MPE and chaired that working group, I'm much more of a passenger in the Capability Inquiry working group and the next-generation MIDI working group. So as well as being able to answer questions, I'll probably also try to question answers, and I may not be the authority in the room; I fully acknowledge that, so please feel free to interrupt and throw things. The third thing is fake news. When there is consensus-led development of any specification, and you have a lot of commercial vested interests attempting to cooperate, things can change quickly: people can get more involved, and suddenly things that you thought were unassailable become quite problematic for a few users, and things end up changing in specification committees. So everything you see now is an intention right now, but stuff might change, so don't treat it as infallible. Okay, now that I've said those three things at the beginning of my talk, I don't have to say them on every slide. MIDI Capability Inquiry: what it is,
why we need it, and what it does. So, go back to the late 1980s. MIDI was conceived in the early eighties, but over the next few years it popularized and settled down a bit, and the use cases became entrenched. This is a traditional use case of MIDI and the kind of thing it does very well. You have an Atari ST there, because the ST handled MIDI natively, which gave it an effective monopoly in recording studios for about 20 years, and a Super Jupiter, a mid-to-late-eighties example of a synthesizer module. What MIDI is very good at is this monologue, and that's how it exists at the moment. You have the Atari ST telling the synthesizer what to do: there's a tick to synchronize its sequences and arpeggiators and synchronous note-rate parameters; there are note events, so you can play C sharp 3 at velocity 100 and release it at the end; control messages; pitch messages. All very well defined, everything knows how to process them, and it's very good
at that kind of one-way conversation. The other thing that MIDI has been quite good at from the start is manufacturer-specific dialogues. The computer can say, in Roland language, "send me preset 1, all the data you have", and the device will respond in a manufacturer-specific language: starting with F0 and a manufacturer ID that says "this is a Roland packet", and ending with F7, end of packet. Everything in the middle is manufacturer-specific, so it can be documented as much as a manufacturer wants it to be, and you can transmit data and receive data that way. But everything that you do is germane to one manufacturer and one specific product. So you can have dialogues in MIDI, but they're not really part of the open specification.
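The F0 / manufacturer ID / F7 framing described above can be sketched in a few lines. The 0x41 manufacturer ID really is Roland's, but the payload bytes here are made up purely for illustration; this is not Roland's actual preset-request format.

```python
# A sketch of manufacturer-specific System Exclusive framing: start byte,
# manufacturer ID, 7-bit-clean data, end byte. The payload is invented.

SOX, EOX = 0xF0, 0xF7          # start / end of System Exclusive
ROLAND_ID = 0x41               # Roland's one-byte manufacturer ID

def make_sysex(manufacturer_id, payload):
    """Frame a payload as a SysEx message. All data bytes must be 7-bit."""
    if any(b > 0x7F for b in payload):
        raise ValueError("SysEx data bytes must be in the range 0x00-0x7F")
    return bytes([SOX, manufacturer_id, *payload, EOX])

def parse_sysex(message):
    """Return (manufacturer_id, payload) from a framed SysEx message."""
    if message[0] != SOX or message[-1] != EOX:
        raise ValueError("not a System Exclusive message")
    return message[1], bytes(message[2:-1])

msg = make_sysex(ROLAND_ID, [0x10, 0x00, 0x01])   # hypothetical "send preset 1"
assert parse_sysex(msg) == (ROLAND_ID, bytes([0x10, 0x00, 0x01]))
```

Everything between the framing bytes is opaque unless you know the manufacturer's documentation, which is exactly the limitation the talk goes on to describe.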
So if you think about the first two decades of MIDI, how it was used and why it was conceived: physical interfaces were quite an expensive thing, so a computer has one MIDI In and one MIDI Out, and that's the way it talks to everything. There is one port and several channels, and all of those channels are used to communicate with different instruments, or with different patches on the same instrument. Your MIDI studio is one cable long; equipment shares the same port through MIDI Thru. Thru was a third socket on MIDI devices, and all it does is echo what comes in on the MIDI In port out again, so you can daisy-chain things together off the same MIDI line. MIDI is a lingua franca, so there's manual communication at both ends. You set up your patch before you play it: you make sure the synthesizers are set according to what pitch bend you expect, the volumes are set nominally, and everything gets configured at the computer end manually and configured at the synth end manually. From that setup you then start playing out MIDI and recording what comes back, and that was the way a MIDI studio would have worked for quite a long time, probably up to the
early 21st century. That's why MIDI was originally conceived. You've also got this other layer of stuff which is kind of semi-open. There's a universal System Exclusive message (you don't have to read all this; it's just there to demonstrate that it exists) called Device Inquiry. That is a "hello, what are you?" message, and many devices do follow the specification: you can speak to something at the other end of a cable, not knowing what it is, and it will reply, giving you its manufacturer and device numbers. This was sort of useful, especially in the days of Windows XP, when every class-compliant MIDI device you plugged in was called "USB Audio Device" and you didn't know if you were talking to the piece of equipment you wanted to talk to; this was great for disambiguating that kind of thing. So there are great advantages to this part of the MIDI protocol. The first thing is that we can confirm that we're capable of dialogue: we're not speaking down one DIN cable, we're speaking
through two DIN cables, or a USB cable, and we can actually get a response from the device we're talking to. We can chat to our friends: if we recognize the manufacturer ID and the device ID that come back, then we can start having a conversation. We can be a patch librarian, or some kind of firmware upgrader, or some sort of sequencer downloading samples or patches or other things; we can have that dialogue. Device Inquiry is not universally supported, though: there are some things that just cannot send a response. Some MIDI devices are so basic that they don't even handle System Exclusive messages. I wrote one once, in the days when 8-bit processors were around and you had very little RAM to handle MIDI packets and very little ROM to write your firmware in; it was actually fairly standard for System Exclusive messages just to be ignored, and that's still the case for a few devices even today. And everything we haven't explicitly coded into our software, everything we don't have a lookup for in our table of manufacturer and device IDs, is still a mystery to us. So there is no way of building, on top of the open MIDI specification, something that's more
general, without going to the Manufacturers Association and writing a new recommended practice. So there we are. And then there was a thing called General MIDI, which came out around 1991 as an attempt to make the specification a bit more generalized if you were a transmitter (and actually a lot more specific if you were a receiving device, depending on which hardware you make). That was when channel 10 became the percussion channel, and when the drum map was worked out, so the bass drum is on C and the hi-hats are on A flat and B flat for some reason, and when various patches and presets were put in standard places: piano is the first preset you have, and then the very curious telephone ring and dog bark patches were put in at the end somewhere. All of that was part of General MIDI, and while General MIDI is a more elaborate specification, there are quite a few synthesizers now that just support it automatically: you turn them on and they're in General MIDI mode. So that was an attempt to standardize some deeper things. So, latter-day MIDI: we have
this wonderful situation where we can communicate from a computer to a device, and if we recognize the device we can have a manufacturer-specific conversation with it. This is a genuine MIDI stream, and I can't remember what it means; I think it's just a status dump of the initial settings of a Seaboard RISE. That's the kind of thing that happens. Unfortunately, if you're trying to write a generalized piece of software and you plug in some other device: welcome back to 1983. We know how to talk to you in the language that we invented originally, but it's quite hard to build a richer experience, and this is what we're trying to improve. (I've got a minion over there, by the way. I was told that it might be pejorative to use another manufacturer's piece of equipment, so I picked one that I designed many years ago; I'm guilty of that one. It's a very nice product.)
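The universal "hello, what are you?" Device Inquiry handshake mentioned a moment ago is concrete enough to sketch. The request bytes below follow the MIDI 1.0 universal non-real-time SysEx layout; the reply fields after the manufacturer ID (family, model, firmware version) are abbreviated here for illustration.

```python
# Identity Request / Identity Reply: the universal Device Inquiry handshake.

NON_REALTIME = 0x7E
ALL_DEVICES = 0x7F   # "to whom it may concern"
GENERAL_INFO = 0x06
IDENTITY_REQUEST, IDENTITY_REPLY = 0x01, 0x02

def identity_request(device_id=ALL_DEVICES):
    """The six-byte message a host sends to ask 'what are you?'"""
    return bytes([0xF0, NON_REALTIME, device_id, GENERAL_INFO,
                  IDENTITY_REQUEST, 0xF7])

def parse_identity_reply(message):
    """Pull the one-byte manufacturer ID out of an Identity Reply.
    (Real replies also carry family, model and version bytes, omitted here.)"""
    assert message[1] == NON_REALTIME and message[3] == GENERAL_INFO
    assert message[4] == IDENTITY_REPLY
    return message[5]

req = identity_request()
assert req == bytes([0xF0, 0x7E, 0x7F, 0x06, 0x01, 0xF7])

# A hypothetical, truncated reply from a Roland device (manufacturer ID 0x41):
reply = bytes([0xF0, 0x7E, 0x10, 0x06, 0x02, 0x41, 0xF7])
assert parse_identity_reply(reply) == 0x41
```

This is the "semi-open" layer Ben describes: it tells you who you're talking to, but once you know, the rest of the conversation is still manufacturer-specific.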
So these days we have this kind of explosion of devices that do one thing well. We've moved beyond General MIDI, so it's "attack of the drawbar organs", for example. If you have software and hardware drawbar organs that emulate B3s, and other drawbar things, it would be nice if you could say: well, I've had this piece of equipment for 15 years; I wonder what it sounds like if I load up the latest wacky software plug-in with its B3 emulation, and whether it sounds any better. You can kind of do that, but there's no way of setting up that new software plug-in exactly as you had your old one unless all the manufacturers have colluded, and there's no way, necessarily, of having the drawbar changes map to the same control parameters. So when things change, MIDI has to change very slowly, because the General MIDI spec is huge: there are billions of dollars of R&D already invested in it, and these things necessarily change at a
geological pace. So we have things like drawbar organs, and we have new emergent devices such as expressive controllers: the Eigenharp, the Continuum, the LinnStrument, the Seaboard, all things that allow per-note expression in a much richer way than was originally envisaged by MIDI's creators. And we also have things like grid controllers, where, okay, there may be some variations on a theme (BLOCKS is rather different from a Novation Launchpad or a Monome, and the lights might do different things), but in the end you have an 8-by-something pad, and being able to use some kind of lingua franca to communicate with some other manufacturer's pad is great for the manufacturer and great for the community. So there are things that MIDI
could do better, and this is where Capability Inquiry comes in. There were three things that we set out to standardize in some way. The first thing is protocol negotiation, which is a future-proofing measure. The second thing is profile configuration (I hope I'm getting the verbs right), so that we can have these general devices that don't necessarily have everything General MIDI requires them to have, but nevertheless can talk in a kind of class-compliant way with other MIDI devices. And we have property exchange (I'm going to use "exchange"), where control messages and mappings and patch names and so on can again be exchanged in a standard way. So we'll start with protocol negotiation. (I got the verb right: brownie point.) All devices still
speak MIDI 1.0, the specification that we all know and love today. If both devices support another protocol, and they can say that they both support it, then they can agree between them to switch. One device will be dominant and the other device will be submissive, and there's a kind of arbitration process whereby one is allowed to switch the other. A test message after the protocol switch has happened ensures everything's gone all right, and if it hasn't gone all right then it can fall back to MIDI 1.0 again, so customers don't get too aggrieved if something doesn't quite work as advertised. The thing that isn't part of this (I've had to change this from my original slide presentation), but is still
worth mentioning, is that with USB MIDI you have a theoretically very fast data transfer rate. Nevertheless, computers still restrict you to the DIN rate of 31.25 kilobits per second, which is exceptionally slow by modern standards. The reason for that is that if you try sending fast data to a device such as a MIDI DIN interface, which converts you to the old format, and its buffer gets full, it can possibly crash (because it was never tested with a full buffer), or it can start dropping packets and behave sub-optimally. So it's much safer for the driver writers at Microsoft to say: right, we're throttling everything that says it's a MIDI device, just in case this happens, because we don't want support calls and we don't want negative publicity. So there is this speed limit even on USB MIDI, and it would be great if we could lift it, because things like the Seaboard, which sends very fast rates of data as a matter of course, suffer: you have to coarsen the temporal resolution of the data you're sending in order not to fall foul of this limit.
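To put that speed limit in numbers: a serial MIDI byte is 10 bits on the wire (8 data bits plus start and stop bits), so at the classic DIN rate the arithmetic works out as below. The expressive-controller workload in the second half is an invented illustration, not a measurement of any real device.

```python
# Back-of-envelope numbers for the DIN-rate throttle described above.

BAUD = 31250                    # classic 5-pin DIN MIDI rate, bits per second
BITS_PER_BYTE = 10              # 8 data bits + start bit + stop bit

def seconds_to_send(n_bytes, baud=BAUD):
    return n_bytes * BITS_PER_BYTE / baud

# A note-on is 3 bytes, so the wire carries at most ~1042 of them per second:
note_on_time = seconds_to_send(3)
assert abs(note_on_time - 0.00096) < 1e-9          # 960 microseconds each
assert int(1 / note_on_time) == 1041

# A made-up expressive workload: 10 held notes, each sending three 3-byte
# expression messages every 10 ms, needs roughly triple the available rate.
needed_bps = 10 * 3 * 3 * BITS_PER_BYTE * 100      # messages -> bits, 100 Hz
assert needed_bps / BAUD > 2.8
```

That gap between what a multidimensional controller produces and what the throttled link carries is exactly why data has to be thinned today.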
That's something we're going to be doing in future, but probably as part of the USB MIDI specification, and not as a layer on top of it. The next thing is that manufacturer-specific protocols are supported, so a third party can come along and write a new protocol, attach their own manufacturer ID to it, and anybody can use it; it doesn't have to be that manufacturer. So protocols can have manufacturers, just as System Exclusive messages do, and it means there can be an explosion of ideas in the marketplace, if that's what we want. The next thing is profile configuration. This is the most exciting part, I think.
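Before moving on, the negotiate-test-fall-back dance just described can be sketched as a little state machine. All class and message names here are invented for illustration; the real MIDI-CI message formats were still in draft at the time of this talk.

```python
# A sketch of protocol negotiation: both ends start in MIDI 1.0, the
# initiator proposes a protocol both sides claim to support, a test message
# confirms the switch worked, and any failure falls back to MIDI 1.0.

MIDI_1_0 = "MIDI 1.0"

class Device:
    def __init__(self, supported):
        self.supported = set(supported) | {MIDI_1_0}
        self.protocol = MIDI_1_0          # everyone starts speaking MIDI 1.0

    def test_ok(self):
        return True                        # pretend the test message echoed back

def negotiate(initiator, responder, preferred):
    """Try to switch both devices to `preferred`; fall back to MIDI 1.0."""
    common = initiator.supported & responder.supported
    if preferred in common:
        initiator.protocol = responder.protocol = preferred
        if not (initiator.test_ok() and responder.test_ok()):
            initiator.protocol = responder.protocol = MIDI_1_0   # fall back
    return initiator.protocol

a = Device({"Next-Gen"})
b = Device({"Next-Gen"})
assert negotiate(a, b, "Next-Gen") == "Next-Gen"

old = Device(set())                        # a MIDI 1.0-only device
assert negotiate(Device({"Next-Gen"}), old, "Next-Gen") == MIDI_1_0
```

The important property, as the talk stresses, is the last line: a device that knows nothing new is never left worse off than plain MIDI 1.0.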
Profiles specify... it's the second most exciting part; I'm building up to it. Profiles specify how a receiver responds to certain MIDI messages, and the drawbar organ is an example of that: the profile will be followed by the hardware or the software you are making that makes sound and receives the messages. It's not a property of the transmitter. Receiving devices have profiles; transmitters don't, though they might support them. This is an early draft (hence the shouty caps there: if anybody watching this on the video is making a drawbar organ, don't just copy this MIDI implementation chart; it's an early piece of draft work), and it's being used by a couple of manufacturers at the moment to try to get profiles tested as a proof of concept. But as you can see, the drawbar organ profile is very much based on a Hammond prototype: there are perhaps eight to ten sliders, a percussion control, vibrato, rotary speaker simulation. Every drawbar organ simulator in the world, as far as I can make out, will support all of these, and having a profile will give them a standard controller map, and it needn't be a controller map that
interferes with other profiles that might be doing other things. (I can turn the shouty caps off, but I think I've finished talking about this, so I can move on to the next slide.) Profile configuration: what are we talking about? A device can support many profiles; profiles can be channel-wide and can be independently turned off; and they can either be device-wide, so a whole device can switch profile, or they can be on one MIDI channel only. So it's a little bit like taking General MIDI and splitting it up into a much more explicit way of doing things. Specific devices can implement part of the General MIDI specification, or part of a new MIDI specification, support control mappings on certain channels, and have patches and program change numbers in certain places, and you can make very specific devices that still talk in a standard way to other kinds of devices in that class, and they can be turned on or off. It's kind of... if you look at the way General MIDI works and splits into the keyboard categories:
piano, organ, synthesizers, stringed instruments (of the plucked and bowed variety), and the various things that happen there, it's quite easy to envisage a future where there are specialist working groups: people making woodwind controllers, people making pianos, or, say, subtractive synthesizers, which all have very similar conceptual models (oscillators and LFOs and envelope controllers). To support General MIDI you have to do all of this now; you'd only need to do some of it if we split these up into different profiles. Different expert groups can move things in different ways, and it just means that we can move a lot more quickly as a group, without being shackled to everything in General MIDI and the whole gamut of stuff that General MIDI makes you support. So you can see a future where that happens, and where percussion could be on any channel you like, because you've configured a profile to be on a certain channel.
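The per-channel and device-wide switching just described can be sketched as a tiny profile registry. The API here is invented for illustration; it is not the MIDI-CI message format, just the shape of the behaviour.

```python
# A sketch of profile configuration: a device advertises the profiles it
# supports, and each can be enabled device-wide or for one MIDI channel.

DEVICE_WIDE = "device"

class ProfileHost:
    def __init__(self, supported):
        self.supported = set(supported)
        self.enabled = {}                      # (profile, scope) -> bool

    def set_profile(self, profile, scope, on):
        """scope is DEVICE_WIDE or a channel number 1-16."""
        if profile not in self.supported:
            return False                        # politely refuse
        self.enabled[(profile, scope)] = on
        return True

    def is_enabled(self, profile, channel):
        return (self.enabled.get((profile, DEVICE_WIDE), False)
                or self.enabled.get((profile, channel), False))

organ = ProfileHost({"Drawbar Organ"})
assert organ.set_profile("Drawbar Organ", scope=10, on=True)
assert organ.is_enabled("Drawbar Organ", channel=10)
assert not organ.is_enabled("Drawbar Organ", channel=1)
assert not organ.set_profile("Piano", scope=DEVICE_WIDE, on=True)  # unsupported
```

This is the "General MIDI split into explicit pieces" idea: percussion, or anything else, lives wherever a profile says it does, rather than being hard-wired to channel 10.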
The next thing is property exchange. This is very exciting, because it completely changes the conceptual model of MIDI. We've always set property values by sending controllers, or non-registered parameter numbers, or other kinds of control change messages, to another device. What we could never do was easily find a way of reading those controllers back from the device. The read-only stuff normally comes in a manufacturer-specific parameter dump; or some people put a button saying "send all" on their devices, which sends a massive great burst of control data from the device. But there is otherwise no way of reading back standard control messages. So we'll soon
have a standard way of doing that, which should blow your mind in a very small way, because there are a lot of companies making controllers, and this has been a big limitation of controller technology for a long time. The next thing should blow your mind even more. Well, remembering the three disclaimers I made at the beginning, we expect to be able to do things such as finding the minimum and maximum expected values of a control; whether it's discrete or continuous; whether it's something you page through or something better attached to a knob; all those kinds of useful things; and what the control mappings of a device are, so what you control with a particular CC number or an RPN. All that stuff can be read back in a status dump, and suddenly it makes things like the Novation or Native Instruments general
control devices, and other general control devices, far more capable: it suddenly means that the experience can be much more automatic, richer, and almost instantaneous. And the most mind-blowing thing is that we're exchanging property names, and we can start sending standard messages, nominally in UTF-8 at the moment (though that's subject to final settlement), using JSON, which is this strange web-development format for exchanging hierarchical information. So we can do complete recalls and patch dumps of any device in a way that is readable by other devices. You can do that wonderful thing of deciding you don't like your 15-year-old Hammond organ simulator any more and you want to buy the latest one, and your DAW (it's a little more complicated than that, because this stuff requires a little more of the transmitting device) will be able just to unplug the settings from that old device and transfer them to the new one. So they should sound the same, they should respond in the same way, the
patches should be very similar. So all of this stuff is coming in Capability Inquiry (terms and conditions apply), but it is quite transformative in terms of what MIDI can do. Here is an example, and it's a very concocted example, of the kind of stuff being worked on at the moment. There's a universal System Exclusive message that wraps up this JSON, and somewhere in the middle there is this ASCII text (well, UTF-8, but they all look the same if you have the top bit clear) where you can transmit very standard bank and program numbers, text, and you can even stick tags in there that are readable. All this stuff suddenly becomes a standard way of organizing patches and storing status information in your digital audio workstation, or between any two devices. Oh yes, there's an asterisk there.
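A hedged sketch of the idea, before the asterisk is explained: a JSON description of a patch, wrapped inside a universal SysEx message. The header bytes and the JSON field names below are placeholders invented for illustration; as the talk is about to caveat, the real payload format was still being settled by the working group.

```python
import json

# A sketch of property exchange: JSON wrapped in SysEx, kept 7-bit clean.

SOX, EOX = 0xF0, 0xF7
FAKE_HEADER = [0x7E, 0x7F, 0x0D]     # placeholder universal-SysEx prefix

def wrap_property(payload_dict):
    text = json.dumps(payload_dict, separators=(",", ":"))
    data = text.encode("utf-8")
    if any(b > 0x7F for b in data):
        raise ValueError("payload must stay 7-bit clean inside SysEx")
    return bytes([SOX, *FAKE_HEADER, *data, EOX])

def unwrap_property(message):
    body = message[1 + len(FAKE_HEADER):-1]
    return json.loads(body.decode("utf-8"))

patch = {"bank": 2, "program": 17, "name": "Growly B3", "tags": ["organ"]}
msg = wrap_property(patch)
assert msg[0] == SOX and msg[-1] == EOX
assert unwrap_property(msg) == patch
```

The point of the top-bit-clear remark in the talk is visible here: plain ASCII JSON already satisfies SysEx's 7-bit data rule, so text rides inside MIDI with no extra encoding.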
The asterisk is on "JSON payload": I need to point out that, at the moment, this might not necessarily end up being a JSON payload, and it might not look the way that just looked. There's a separate working group working on this part of the Capability Inquiry spec, called property exchange, and the two might not appear at exactly the same time, but the wrapper will be ready for the payload. I just wanted to make that clear, because part of my job is to set expectations,
and the engineer's job is to set expectations first and then meet them later, so I want to make sure the expectations are nice and easy to surmount. A summary for Capability Inquiry: protocol negotiation, for future-proofing, so we can start with MIDI 1.0 and go faster or go richer when we want to; profile configuration, to assist compatibility between devices; and property exchange, which, if you're making controller hardware, is quite revolutionary and changes an awful lot. Initially we're starting out with no protocols (there may be some appearing soon, but they're not part of the Capability Inquiry spec) and a few profiles, for test purposes, just to make sure the infrastructure works and is fit for purpose. But more can be defined, and if you want to design your own profiles, I recommend you join the MIDI Manufacturers Association, because it's very inexpensive and very good. I've got my plug in now. And remember, all of this
is draft; nothing is certain, so everything you've seen here may be a well-intentioned lie. Let's get on to the other exciting thing: the next-generation protocol. I can say very little about this, because of the churning and the vested interests, and the fact that if I start trying to nail something to the wall, the wall may move. So I'm going to say everything I can about where we are at the moment and where we intend to get to. You may remember last year (that video maybe has a thousand views, but I've only got a one percent approval rating; I want to point that out, I'm not boasting) I started introducing HD and where it was, and I want to make it clear that things have moved on quite a lot since then. HD itself (and it's not called HD any more) has split into a long-term vision, and what we need for the immediate future, which is a less disruptive, faster-moving, easier-to-implement version of the MIDI specification that fixes the frustrations people are facing on a day-to-day basis; the long-term strategy is about what you might be using in some years' time. That's how the long-term strategy works: people want the whole chocolate shop, but if they're screaming, you just want to give them a chocolate bar for now. The chocolate bar represents the things that people are finding very frustrating about MIDI now, and those are the things we really need to deliver, while not losing sight of the fact that it would be nice to build a massive chocolate factory. So, what you might look forward to. Okay:
it'll be negotiated using MIDI Capability Inquiry, we know that for certain, and it'll use the protocol negotiation there. It'll be a familiar approach: while the note model for MIDI, where you have note-ons and note-offs, is constraining if you're building strange instruments where notes don't necessarily start at a point in time, or don't have a pitch associated with them, the conceptual model will be fairly similar to MIDI now. Notes will have numbers associated with them that may or may not pertain to pitch, but there will be starts of events and ends of events, and the whole semantic model can't change too much, because if it does, and you're trying to build an instrument that supports both protocols, you'll have to build the same instrument effectively twice in software, and we don't want to do that. Per-note parameters will be new: possibly evolving MPE, which was a kind of note-per-channel version of MIDI to make MIDI expressive on a per-note basis, and which, as I made quite clear last year, is a kind of sticking plaster for now, and as good as we're going to get using the current model. To actually have parameters that are attached to notes, that can be defined at note start, changed during the note, and finished at
note end, is one of the intentions of this protocol. More of what you like: people ask for more channels. I think 16 channels is 15 too many, but other people need a lot of MIDI channels because they use MIDI in a different way. Controller numbers, so that we don't have the limitation of 127 controllers and then this big RPN and NRPN space; it looks like it needs tidying up, and it probably will be, in some way. More bits of resolution, because people complain about 7-bit controllers being far too granular, and if you're making something like a mixing desk with a nice 100-millimetre fader then it is far too granular; 14-bit controllers are available, but not over the entire range of controls. Bandwidth, of course; I've already talked about that.
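The 14-bit controller mechanism just mentioned, for the controller numbers that do support it, pairs a coarse controller (0-31) with a fine controller 32 numbers higher, giving a combined range of 0-16383 instead of 0-127. A minimal sketch:

```python
# Splitting and recombining 14-bit controller values. CC numbers 0-31 carry
# the coarse (MSB) value; CC n+32 carries the matching fine (LSB) value.

def fourteen_bit_pair(cc_number, value):
    """Split a 14-bit value into the (coarse CC, fine CC) message pair."""
    assert 0 <= cc_number <= 31 and 0 <= value <= 16383
    msb, lsb = value >> 7, value & 0x7F
    return (cc_number, msb), (cc_number + 32, lsb)

def combine(msb, lsb):
    return (msb << 7) | lsb

# CC 7 is channel volume; its fine counterpart is CC 39.
coarse, fine = fourteen_bit_pair(7, 12345)
assert coarse == (7, 96) and fine == (39, 57)
assert combine(96, 57) == 12345
```

Note that this doubles the message count and is itself a non-atomic pairing, which is a taste of the complaint the talk turns to next.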
Less of what you don't like: the thing that people don't like about MIDI is non-atomic messages, where one piece of information is conveyed by several separate messages that are loosely but meaningfully coupled together, yet not necessarily packaged in an official way. Setting a non-registered parameter number involves sending four separate control messages that a device has to aggregate, keep together, and understand semantically, in order not to mess things up when you're editing a MIDI stream. Program change messages are often preceded by a bank change message: another piece of information where a control message and a program change message meaningfully belong together.
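The NRPN case is the canonical example: one logical parameter write is four control change messages that the receiver must stitch back together. The CC numbers below are the standard ones (99/98 select the parameter, 6/38 carry the value).

```python
# One NRPN write = four separate control change packets.

NRPN_MSB, NRPN_LSB = 99, 98          # select which parameter
DATA_MSB, DATA_LSB = 6, 38           # data entry coarse / fine

def nrpn_messages(channel, parameter, value):
    """Return the four (status, cc, data) control changes for one NRPN write."""
    status = 0xB0 | (channel & 0x0F)             # control change on channel
    return [
        (status, NRPN_MSB, (parameter >> 7) & 0x7F),
        (status, NRPN_LSB, parameter & 0x7F),
        (status, DATA_MSB, (value >> 7) & 0x7F),
        (status, DATA_LSB, value & 0x7F),
    ]

msgs = nrpn_messages(channel=0, parameter=260, value=8192)
assert len(msgs) == 4                            # one logical edit, four packets
assert msgs[0] == (0xB0, 99, 2) and msgs[1] == (0xB0, 98, 4)
assert msgs[2] == (0xB0, 6, 64) and msgs[3] == (0xB0, 38, 0)
```

If anything else is spliced in between those four packets, or an editor reorders them, the receiver's state can be corrupted, which is exactly the fragility the next-generation protocol intends to remove by making such writes atomic.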
There are all kinds of examples of this in MIDI, because of the way it has evolved from its very first implementations: you have many different messages that semantically mean one thing and aren't the same message. So we'll tidy up that kind of stuff as well, and with any luck all pieces of information will be atomic. That's what you can look forward to: a tidy-up of the MIDI spec rather
than anything revolutionary: more of what you like, and less of what you don't. That's what we're for. And finally, I was asked to include this diagram because it's useful, but it's not really an introduction to Capability Inquiry; it's more of a summary slide, and if I'd started with this picture I'd probably have spent half an hour on the same slide. This is how it all breaks down, and this is part of the specification: Capability Inquiry in the middle, and you can talk to hardware and software devices bidirectionally and arbitrate between them in case anything fails. If you're trying to negotiate faster protocols, it will always fall back to MIDI 1.0, so you're not losing anything by adopting this specification. You've got protocol negotiation on the left, profile configuration in the middle, and property exchange on the right. This is my last slide, because it's the whole of everything I've talked about in one handy rectangle. Thank you very much. Now, are there any questions that I can forward to somebody else? Okay, that's basically what I'll be doing anyway. Oh well, there we are.
Excellent, okay. Hello, everyone. Hi, I'm Amos Gaynes from Moog Music and the MIDI Manufacturers Association Technical Standards Board, and I have a cohort of colleagues here from the MMA. Tom White is MMA President; we have Phil Burk and Florian Bömers and JB, who are members, respectively and collectively, of the working groups that are working on everything Ben was just talking about, and of the Executive Board of the MMA. Between us we probably know a thing or two about all of this, and we're interested both in answering questions
and in gathering requirements and ideas and inspiration from you, based on what you've heard here, and also in not at all subtly encouraging any of you who have ideas, or your own agenda that you would like to inform this work as it develops, to please join the MMA. The best way to get your ideas heard, and to become a part of the final standard, is to join and contribute to the work. We're a member-based organization whose purpose is to facilitate the work of our member companies, so if you're doing related work, please join us and pitch in and help craft the final standard. So essentially I think the best thing to do is to open the floor for questions based on Ben's talk, and we can riff on that and take it from there. If anyone has any questions about what is coming in the world of MIDI and what you just heard, please, by all means. We have one microphone for the entire room, so I think we may have to pass it back and forth, or just speak up. [We should use the microphone, because we have about 25 people watching online as well.] All right, maybe we can repeat the questions; we should hear the question as well, but I can keep the mic here. Well, should we start with questions, or should we have everyone up here introduce themselves? The latter sounds fine, sure; we can become less anonymous to those here and abroad.
[Aside: "I liked being anonymous." "That's better, actually."] Hi, I'm Tom White, and I'm the President of the MIDI Manufacturers Association, and the only significant thing about that is that I do not represent a member company: I am the one neutral person, and I'm the mediator. My name is Phil Burk, with Google. Before Google, a lot of my background was in experimental music and experimental intonation, so one of the things I was particularly interested in, and one of the reasons I got started with the new protocols, was opening up the pitch space and having fractional pitch and greater pitch control; that was sort of my original agenda. Now, working with mobile devices, they are particularly useful for multi-finger gestures: with a mouse you've got one cursor moving, but with mobile devices with touch screens you can have multiple gestures in action at once, each with multiple dimensions, and I want to make sure that all of that can be expressed appropriately through MIDI.
Hi, I'm Jean-Baptiste Thiebaut; I work at ROLI. I got involved with the MMA when we wanted the Seaboard, which you may have seen before, and which sends such rich expression data, to communicate seamlessly with every piece of software out there, and it just wasn't possible using one MIDI channel; almost all software at the time was just listening to one channel. So we needed to find a way to encourage adoption, and we agreed with a few manufacturers, Roger Linn and Keith McMillen and a few others, to have the exact same way of sending data. Then we needed to convince the software world, and we did manage to work with software developers to get there; but to get it standardized we brought that discussion to the MMA, to have this idea evolve into a bigger and more established protocol. So now I've been working with the MMA for two years, and what I'm very keen to see happening is more software engineers and software developers coming to the MMA to represent that side, because hardware is very well represented there, but software is a bit missing. And, you know, software rules the world now, so we need software plus hardware to work together. My name is Florian
My name is Florian Bömers; I work at Bome Software, creating MIDI processing software and hardware. My main interest in MIDI is to create expressive controllers, or to support expressive controllers, to make more expressive music, and for that I would like MIDI to be faster and to have more resolution. I'm Ben Supper,
and I'm at ROLI; before that I was at Novation for six and a half years, so I've been involved in MIDI for quite some time. I chair the working group on MPE, MIDI Polyphonic Expression, which I think is now coming to the end of the development cycle in terms of getting the specification finalized. I've been enjoying the tension between trying to get a specification right for as many parties as possible and trying to get it out quickly, while it's still relevant and while there's still an urgent need for it. So yes, the tension between engineering and marketing is a very interesting one, and one that nobody ever finally resolves.
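The per-note-channel idea at the heart of MPE can be sketched in a few lines. This is a simplified illustration only, not the finalized MPE specification: the zone layout is reduced to a bare round-robin allocator (an invented helper class for this example), with channel 0 treated as a master channel, and the byte layouts are plain MIDI 1.0 channel voice messages.

```python
# Simplified sketch of MPE-style per-note expression: each new note is
# assigned its own MIDI channel, so pitch bend applies to that note alone.
# Illustrative only; channel/zone handling here is not the final MPE spec.

NOTE_ON, PITCH_BEND = 0x90, 0xE0

def note_on(channel, note, velocity):
    return bytes([NOTE_ON | channel, note, velocity])

def pitch_bend(channel, value):
    # value: 14-bit, 0..16383, centre (no bend) is 8192; LSB first
    return bytes([PITCH_BEND | channel, value & 0x7F, (value >> 7) & 0x7F])

class MemberChannelAllocator:
    """Round-robin allocation of member channels 1..15; channel 0 is kept
    free as a master channel, roughly as in MPE's lower zone."""
    def __init__(self):
        self.free = list(range(1, 16))
        self.held = {}                     # note number -> member channel
    def start(self, note, velocity):
        ch = self.free.pop(0)
        self.held[note] = ch
        return note_on(ch, note, velocity)
    def bend(self, note, value):
        # per-note pitch bend: only this note's channel is affected
        return pitch_bend(self.held[note], value)
```

Because each sounding note owns a channel, bending one note leaves the others untouched, which is exactly what a single shared pitch-bend channel cannot do.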
It's a very interesting place to be when you're chairing a working group. All right, now I'd like to request any questions, thoughts, opinions, desires, motivations, or criticisms surrounding all these concepts from you, the audience. All right, let's start here.
Ah, hi guys. That all sounds very great; I'm very excited to hear about it. Just a quick one: having more channels, for example, does place a bit more burden on the receiving device to maintain state; it takes up more RAM, for example. I think more channels is great, but how about keys? Are there going to be more key numbers? Because I'm currently tied to 128, and it works really well even for alternative controllers like guitars and things like that. So will there be more keys I'll need to support? Well, that may depend on whom you ask, and on which protocol you're talking about. I would say there will certainly always be MIDI 1.0, which will always have its same number of keys, and that's always the default or fallback.
Right, so, go ahead. One answer is that there are manufacturers in the respective working groups who would like more note numbers, or key numbers, to do more interesting things with. But as you said, backwards compatibility is always a problem, and so are memory constraints, so I don't think we can say that something like that will come, but it's definitely being considered. And even with MIDI 1.0, even though MIDI has 16 channels, there are some devices that only support one channel, and there's no reason that, if a protocol supports hundreds of channels, a device need necessarily support all of them. We're looking at multiple different protocols, but generally there are ways for devices to exchange information and be able to say, well, how many channels do you support? That could be part of these capability inquiries: a device might just say, I'm going to be a really good single-channel device, I'm going to support a 16-note range, and that's all I do. So even though the protocol may support many controllers and many channels and very high resolution, that doesn't mean that every device has to support all of those; it could choose a small subset of capabilities. Right, there would definitely be no hard requirement that one must support more notes than one desires, and there would be a language for negotiating: I support this many notes; OK, I'll only send you that many notes. That's almost the entire principle behind the capability inquiries: we can issue a specification that has much more than anybody needs, and then they can pick what they need. You can just question a device and say, do you support this? Great, thank you, that's all I need.
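The principle just described, a spec that defines more than any one device needs, with peers intersecting their capabilities before talking, can be sketched abstractly. To be clear, everything here is invented for illustration: the field names and the dictionary shape have nothing to do with the actual (then-unpublished) MIDI-CI message formats.

```python
# Hypothetical capability negotiation, illustrating the principle only:
# each side declares what it supports, and the sender clamps everything
# it emits to the agreed intersection. Field names are invented, not MIDI-CI.

def negotiate(sender_caps: dict, receiver_caps: dict) -> dict:
    """Agree on the largest feature set both ends support."""
    return {
        "channels": min(sender_caps["channels"], receiver_caps["channels"]),
        "note_range": (max(sender_caps["note_range"][0], receiver_caps["note_range"][0]),
                       min(sender_caps["note_range"][1], receiver_caps["note_range"][1])),
        "resolution_bits": min(sender_caps["resolution_bits"],
                               receiver_caps["resolution_bits"]),
    }

workstation = {"channels": 256, "note_range": (0, 511), "resolution_bits": 32}
tiny_box    = {"channels": 1,   "note_range": (48, 63), "resolution_bits": 7}
agreed = negotiate(workstation, tiny_box)
# the workstation now sends only what the single-channel, 16-note box declared
```

The point of the sketch is the shape of the conversation, not the wire format: a rich spec stays usable by minimal devices because nobody is forced past what they declare.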
Well, why don't you skip the old protocol and create a completely new protocol? Something modern; I'm thinking about Ethernet connections. Blowing up that old protocol is maybe something that could have happened ten years ago; now we are in a completely different time. Throw everything away, make a completely new protocol, called maybe MIDI Next Generation, and have a fallback line, but code it separately; a small side path, like Intel has always done with its processors, just to keep it compatible. I think we may want to transfer even samples between these devices, and make a stepping stone that everyone can accept, or not; the big manufacturers might say, OK, that's cool, transferring samples between workstations and so on. Maybe even a protocol that is capable of discovering capabilities, and then you have a very good start, with JSON for example; new protocols that are very easy to read, to handle, to parse, and all these things. I think this, for me as a software engineer, would be the best approach. Almost exactly ten years ago we set out to do exactly that, and here we are ten years later, and that effort did not succeed, precisely because it was very
hard to get industry buy-in for blowing up the old protocol; that was the sticking point. Where we have arrived now is the viable path forward, because, as has been mentioned, there are billions of dollars invested in MIDI at this point, and billions of devices which support it. Rather than blowing up MIDI and saying, here's a completely discrete new protocol which is not backwards compatible, the idea is to negotiate a path forward, so that MIDI is essentially the language in which you determine how you get to the next generation: you negotiate your way to it via MIDI. Once you establish that you can perform this negotiation, the field is open for entirely new protocols which, once both devices agree upon them, can have all of these advanced capabilities and richness, transferring huge amounts of information at higher speeds. It was just that blowing up the old protocol was deemed non-viable: enough of the industry was unwilling to do it that it wasn't a workable path forward. But all of the work over that past ten years developing these feature capabilities, we should be able to deliver via this less contentious method. That would be my initial response; if anyone cares to elaborate, please. Anyone else want to add something? We have a few more questions to go, but I would
just like to reinforce the concept of how the MMA works, which is consensus-based. He was implying it; I'm trying to be explicit about it. If, for example, even in this room that sounds like a great idea, I'd challenge you to get over half the room to agree with you; it's going to take some work. That's the challenge inside the MMA too: we had a bunch of people say, let's just do a completely new thing, and over half the people said no. OK, so: who here wants to blow up MIDI and do something completely new? That's fairly
reflective. All right, for those watching online, that was about 15 percent; nowhere near consensus. All right, next question, over here, and then we'll go back to Ben. On your slide about the different profile configurations, and having different working groups for, say, subtractive synthesizer makers or organ makers: has there been any thought about also doing that on the hardware side? We've seen a big rise in the trend of melodic grid controllers, from GeoShred on the iPad to Push. I would love to see a working group based around defining scales, keys, and octaves for these, so that when I set a scale in my DAW or my plug-in, whatever controller I have will lay that scale out for me. Has there been any thought on that at all? Well, that would be transformative. Specifically regarding that, there isn't
a working group at the moment, but I would encourage people to set one up. As we found with MPE, if you're working on a fairly open standard specification for a way of doing things, it isn't too hard to get 20 manufacturers interested in what you're doing and speaking in the same way, and to build that consensus up. So there's definitely room for that to happen, and I think if it did happen it would very quickly become adopted, because there's a real hunger for it. If hardware manufacturers can talk to third-party software, and vice versa, then suddenly you go from having a few products that you're trying to sell to having a whole ecosystem you can cooperate with, and everybody benefits from that, both commercially and technically. So that's my advert for making it happen. I totally think we should; it would be absolutely brilliant if there were a common specification, a common language, for mapping scales to these grid devices, which are also self-similar, except that there's currently no way to set that up automatically. Please join the MMA and help us make that happen; I think that would be transformative. And I think you could get a lot of buy-in from people as soon as they recognized that there were multiple people wanting to work towards that same goal. How do you fancy
chairing a working group, JB? Do you have the time? I don't know. Yeah, thanks; mine is quite simple. I really just wanted to thank you guys. I know you're in the unenviable position of building consensus, and that's hard; every year I see these debates online and so on, so thank you very much. Ben, I always love your presentations; I'm going to get on YouTube and give you a thumbs up so we can get you above one percent. But my questions are very simple: I got here late, so I wanted to know if there is a timeline for any of this. Like, a year from now, is this going to be out, that kind of thing? So, back in 2005 we said we were going to do
a new protocol; does that answer your question? Unfortunately, we can't predict when we're going to be done, because it's about building consensus. I can also tell you that it's policy not to try to mislead people, and Ben did say at the very beginning that anything we say could change. It's not our fault if some new developer comes in and suddenly says, why are you doing that, you should be doing this; we have to listen, we have to allow that conversation to happen, so things could change at any time. Having said that, I think it's safe to say, and I have executive board and technical standards board members here who can correct me, that we do not think it's going to be another ten years. We didn't waste those ten years; we spent a lot of time talking about it, and we're here now showing this, although we're not promising anything, because this is consensus-based. Something like this is coming; that's the best I can tell you. I just don't want to put a date on it, but I would say it's a reality: something is going to happen. Right; there are working implementations of some of these things, with multiple manufacturers talking to each other presently, and there's a lot of momentum and will to finalize these things in the indefinite but near term. Real soon; like, maybe this Friday. Yeah, thanks for all the exciting news; I've really been waiting for
it. All the points that Ben mentioned were mostly related to point-to-point communication. Is there any news about networks of devices? For example, today you mostly divide devices by saying, OK, one device is on that channel, the other device is on the other channel. Is there going to be something like devices negotiating which features to implement within a network of multiple devices?
Who wants to take this? It's a very good question. So, we have a faction of engineers in the MMA who would love to do that, but as you can imagine, it complicates the whole matter a lot, so at this point it's not really being considered as a feature. But I think we should make sure that it cannot be excluded in the future, while not delaying the standardization process even more. It's definitely very useful, so I think it should come eventually; I would want to have it. I'm creating network MIDI hardware, so that would be very cool
to have. My recollection is that the original HD Protocol working group devoted some time and thought to articulating those problems and trying to define a sort of solution space; that work is not currently embodied in the items Ben was talking about. Purely hypothetically, perhaps there could be something like a network controller profile that defines the capabilities such an entity needs to have, and then a language for implementing those capabilities, but that represents future work at this time. I think the path to enable that work is under construction, though. This is another case where I'll ask the technical standards
board people to correct me, but related to that question: oftentimes those kinds of things are done through a host computer, and it is possible that the people who control that environment may solve it on their own. Is that ambiguous and vague enough? I'm just saying, adding to the previous answer, that inside the MMA we had discussed those kinds of things and thought that maybe we should solve them, and maybe that is not where the solution should be; I'm not arguing one way or the other. The host computer can solve it, but I don't know. I mean, those sorts of network protocols can be implemented without support from the operating system: as long as the operating system provides a networking stack, people can build protocols
like they built OSC, for communicating between applications or between devices on a network, and the operating system doesn't have to know about it. Whereas if we have a protocol that uses USB, the operating system has to support that protocol. So these networking protocols are actually something that could be done without the coordination of the operating system vendors and the device manufacturers. People could write things that run on Raspberry Pis, an application that runs on a Mac, another that runs on Windows or on Android, and they could all talk to each other tomorrow if they agreed on a protocol.
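The point being made, that applications can agree on an ad-hoc transport without any operating-system support beyond a socket, can be as small as sending raw MIDI bytes in UDP datagrams. This is a toy sketch for illustration, not RTP-MIDI or any MMA-defined transport; a real deployment would want timestamps and loss recovery.

```python
# Toy ad-hoc "network MIDI": raw MIDI 1.0 bytes in UDP datagrams.
# No timestamps, no loss recovery -- just demonstrating that the OS only
# needs to provide a network stack, not know anything about MIDI.
import socket

def open_receiver(port=0):
    # port=0 asks the OS for any free port on the loopback interface
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    return sock

def send_note_on(dest, channel, note, velocity):
    msg = bytes([0x90 | channel, note, velocity])   # MIDI 1.0 note-on
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(msg, dest)

rx = open_receiver()
send_note_on(rx.getsockname(), 0, 60, 100)          # middle C, velocity 100
data, _ = rx.recvfrom(64)                           # the raw note-on bytes
rx.close()
```

Any process on any platform that agrees on "MIDI bytes in a datagram" can interoperate this way, which is exactly how OSC spread without operating-system involvement.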
Hi; Phil mentioned his interest in intonation, and I also have an interest in intonation and tuning. I just wondered where things are at, in the near future, with things like maybe a note having its own frequency
information. Well, I think it's fair to say that, while we can't say anything specific about which commands or packets are in or out, support for arbitrary intonation is seen as something that supports markets beyond the twelve-tone equal-tempered cultural standard of Western music, and I think that idea is shared. I'll throw out that this is one of the least controversial issues: pretty much everyone goes, oh yeah, we want fractional pitch. So I think you can be fairly comfortable that there will be some kind of support, but I can't be specific about what it would look like. Right; there are well-defined approaches under discussion for very fine pitch resolution, and for properties that are associated with notes or with voices.
Every approach under consideration treats fine pitch and per-note properties as important considerations that should be implemented. So, I think we have time for one more question.
You mentioned that the MMA is hoping for more software engineers to participate, and I was wondering if you would host a GitHub, or something like it, with resources and libraries in various popular languages and platforms, so that people could adopt these protocols. You can read the protocol sheets, but I've been around for a while, and not everybody parses MIDI appropriately, and you run into interoperability issues. So if you had some example code, or some common libraries that people would want to contribute to, say as open source on GitHub, that would be very useful. I'm wondering if you could
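Running status, where a sender omits repeated status bytes, is a classic spot where hand-rolled MIDI parsers go wrong, as the questioner suggests. A minimal sketch of a MIDI 1.0 channel-voice parser that handles it (illustrative only; not an MMA reference implementation, and system message bodies are deliberately not handled):

```python
# Minimal MIDI 1.0 channel-voice parser illustrating "running status":
# after one status byte, senders may omit it for subsequent messages.

def parse_midi(data: bytes):
    """Return a list of (status, data1, data2); data2 is None for 2-byte messages."""
    messages = []
    status = None
    pending = []
    for byte in data:
        if byte & 0x80:                    # any byte with the MSB set is status
            if byte >= 0xF8:               # real-time: ignore; running status survives
                continue
            if byte >= 0xF0:               # system common: cancels running status
                status = None              # (system message bodies not handled here)
                pending = []
                continue
            status = byte
            pending = []
        elif status is not None:           # data byte: reuse the last status byte
            pending.append(byte)
            # program change (0xC0) and channel pressure (0xD0) carry one data byte
            needed = 1 if (status & 0xF0) in (0xC0, 0xD0) else 2
            if len(pending) == needed:
                d1 = pending[0]
                d2 = pending[1] if needed == 2 else None
                messages.append((status, d1, d2))
                pending = []               # status stays in effect (running status)
    return messages
```

A parser that resets its state after every message, instead of keeping the last status byte in effect, will silently drop every second note from a sender that uses running status, which is exactly the kind of interoperability bug common libraries would prevent.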
That's a very good question; I'll leave you to answer in a second, but I just want to point out that JUCE, obviously, is open source, and it has an MPE implementation, for example, that's there. For any new changes in MIDI we'll do our best to continue to provide open source examples, so you don't have to use our implementation, but you can see how it works in practice. We have a GitHub repository that we're using internally, open to MMA members: when we talk about how we would parse a certain kind of packet, people post code, like, this is how I would do it; oh, that looks easy, that looks hard. So we're already exchanging code right now, because software engineers tend to think in code sometimes a little more easily than they think in terms of text documents. We can't make those repositories public until the protocols are public, but once they are, I'm pretty sure there will be public repositories where we can exchange parsing libraries and
things. I thought that's something the MIDI Association wing of the MMA might do? Yes, so that's another part of the response. First of all, the MIDI Manufacturers Association is a horrible name, because "manufacturers" implies hardware, so I want to make it clear: although it says manufacturers, we mean companies that make MIDI products, whether hardware or software. But even at that, what we've realized after thirty years of doing this is that part of the success of MIDI is not just the manufacturers who build the products, but also the users who use them, because everybody who uses MIDI products knows they're using MIDI. It's not like USB, where everybody has USB but no idea what it's doing; you actually edit note-ons and note-offs, you move controllers, and you know that's what you're doing. So the success of MIDI is not just the manufacturers; it's also the researchers and the educators and the composers, even the retailers; everybody understands it, and yet there's no forum, no association, for them. In fact our organization, the MIDI Manufacturers Association, was never really concerned with them; our focus was making sure that manufacturers built interoperable products. Having done that, that's great, except that we need some way to reach all those other people who are using MIDI, and to explain the things we're doing, like what
we're doing here, meeting and talking with you. So, in order to enable even better communication and to foster education, we have now started a second organization called the MIDI Association; we removed "manufacturers". It's a separate organization from the MIDI Manufacturers Association: you still have to qualify to become a member of the MIDI Manufacturers Association, but the MIDI Association is free and open to anyone. We took all the specifications that we used to sell, all the MIDI specs, and put them up there, and all you have to do is go to the website, midi.org, and register, and you have access to all of it: the forums, the tutorials, the blogs, if you want to write an article about something you're doing, if you want to communicate with people. So that could become a developer resource as well, where you would not have to be a MIDI Manufacturers Association member; you could just be a developer. Of course we'd still want to encourage developers to join and participate in the standard-setting activities, because we need the feedback and the input, but we could pretty much do anything on the MIDI Association part of our business operations that anybody wants, including setting up GitHub repositories and distributing resources. OK, we're out of time. Thank you all very much for attending. Developers, please join the
MMA; everyone, please join the MIDI Association at midi.org. Thank you very much, and thank you to my distinguished panel of experts.
End of transcript