I've been thinking about the development goals for the software needed to support a musical touch pad, or maybe a better name for it would be midi-pad (we still need to think of a better name). I am also looking for the code and libraries that already exist in Linux, which I should gather in one location as references for anyone who wants to assist with development. I plan to set up another github.com project under my present account that at first will contain only reference code: not parts of midi-pad itself, but the code we will need to read and understand in order to proceed. I am going to collect the parts of the kernel source for at least one touch-pad driver (the Logitech T650) and the modules that link it back to the USB interface. I will also include the touch-pad diagnostic tools that already exist in the Linux archives, which should be useful for learning how the device works. Finally, I will include the code of some other projects that already have many of the features we want to control, including:
Virtual MIDI Piano Keyboard (VMPK): http://sourceforge.net/projects/vmpk/
This virtual MIDI piano keyboard even includes touch-screen support in its present state, so maybe all we need is a branch of it, as it basically does most of what we want already.
Minimal beginnings of code
The first working code attempt will be minimal: it will simply read the raw X position (horizontal position) of a touch-pad, convert that to a MIDI code, and then possibly send it out with the rtMIDI library tools. That will make it possible to link to a software synth like TiMidity and/or Qsynth, or to record in Rosegarden or other software, using the jackd and ALSA MIDI tools for linking. This is just to learn the basics of both using the data stream from the touch-pad and linking to the available libraries and tools: rtMIDI, ALSA, jackd, and later the VMPK code. Even before linking anything, I would start to play with rtMIDI and the touch-pad diagnostics by themselves, with simple print statements added, just driving a single key on and off to get an idea of how each works on its own. At this point we will also begin to test the time lag of the touch devices, to determine whether continued support for a chosen device is worth pursuing before we invest any more time in the project.
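To make that minimal first step concrete, here is a rough Python sketch of just the X-to-note conversion, the part that would run before any rtMIDI call. The 0-4095 raw range, the note count, and the function names are my own assumptions for illustration; the real range would be read from the touch-pad's driver, and the 3-byte list is what would be handed to an rtMIDI send call rather than printed.

```python
# Sketch: convert a raw touch-pad X reading into a MIDI note-on message.
# The 0..4095 raw range and the note layout are assumptions for illustration.

NOTE_ON = 0x90   # MIDI status byte: note-on, channel 1
NOTE_OFF = 0x80

def x_to_note(raw_x, raw_max=4095, base_note=48, num_notes=25):
    """Map a raw X position onto one of num_notes semitones starting at base_note."""
    raw_x = max(0, min(raw_x, raw_max))
    step = (raw_max + 1) / num_notes          # width of one "key" in raw units
    return base_note + int(raw_x / step)

def note_on_message(raw_x, velocity=100):
    """Build the 3-byte message that would be handed to an rtMIDI send call."""
    return [NOTE_ON, x_to_note(raw_x), velocity]

print(note_on_message(0))      # leftmost edge -> lowest note
print(note_on_message(4095))   # rightmost edge -> highest note
```

Even this little bit would be enough to drive a single key on and off with print statements, as described above, before any real linking is attempted.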
Dreams of adding features to VMPK to better support a midi-pad
Now I will jump ahead and use my imagination on some of the features that I think we could add over time, not necessarily in the order I present them here. The order depends mostly on how difficult each feature would be to implement, and on how many people have an interest in having and supporting it. Also note that some of these features can't be developed until we have devices that report valid pressure data in the event structure. But I will note all the features I can think of here, as this is just a wish list at this time anyway.
Vertical Y plane mapping methods
One feature I can't even fully fathom yet is how we plan to map the Y plane (the vertical position) of the touch-pad, as there are so many ways we could do this part. I would like some user-configurable method to break the Y plane up into as many columns as the user wants at the time. To start, the Y plane might be only 1 column, where the only data used from the pad is the X position controlling pitch (the note played), with the Y position having no influence on the music or MIDI events. Or, if you set Y to be active in, say, 2 columns, the lower column could be one set of notes and the upper column could be mapped 1 octave higher or lower, or to some other controllable shift in frequency. The Y plane could also be configured to just add aftertouch events, which could be used to drag and bend individual notes in frequency, or to add modulation, amplitude changes, or other effects, depending also on how the software synthesizer it drives is configured. The Y plane could be expanded to any number of columns, including 6 columns that would be more like a simulation of the six strings on a guitar. The Y plane could also be broken into two sets of columns inside columns, where the inner set of columns sends aftertouch and the outer columns are additional sets of note keys (frequency); I'm not sure I'm stating this part very clearly, so I'll explain it some other time. As I said, the possibilities here are so wide that I would want the software written so that more methods could be added for both the X and Y planes without the user having to rewrite the software to make changes, or at least written simply enough that software changes are easy, maybe with plug-ins or ???
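The 2-column idea above can be sketched in a few lines. This is just one hedged possibility, not settled design; the raw range, the column policy, and all the names here are my own assumptions.

```python
# Sketch: divide the Y axis into user-configurable columns, each shifting
# the note by a per-column offset (e.g. whole octaves). Ranges are assumed.

def y_to_column(raw_y, raw_max=4095, num_columns=2):
    """Return which Y column (0..num_columns-1) a raw Y reading falls in."""
    raw_y = max(0, min(raw_y, raw_max))
    return min(int(raw_y * num_columns / (raw_max + 1)), num_columns - 1)

def column_note_offset(column, semitones_per_column=12):
    """One possible policy: each higher column plays one octave higher."""
    return column * semitones_per_column

# With 2 columns, the lower half plays as-is, the upper half one octave up.
note = 60
for y in (0, 4095):
    col = y_to_column(y, num_columns=2)
    print(note + column_note_offset(col))
```

The point of keeping `column_note_offset` as its own small function is exactly the plug-in idea above: a guitar-string mode or an aftertouch mode could swap in a different per-column policy without rewriting the rest.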
Horizontal X plane mapping methods
OK, that's just the beginning of the Y plane. Now let's get to the easy part that I should have started with: the X plane (horizontal plane) of the touch-pad and how it will control the sounds or MIDI events generated. This should be easy, as I assume most of it has already been handled in the VMPK software we already have. Without even looking at how VMPK works, I'm going to imagine how it could be done. Simply put, the X plane would be configurable as to how many octaves or how many keys are supported across the touch-pad, or basically how many positions are sensed as different notes or sub-notes (I will explain sub-notes later as bending notes). We also need a way to shift the keyboard up and down in octaves or keys. The X axis should also allow dragging for note bending, with an option to enable or disable bending events. I'm not sure if we can bend individual notes with MIDI, but I think we can; I just did a quick look at the MIDI event codes, and even as I was reading I wasn't totally sure. I'm also not totally sure how well Qsynth, TiMidity, and the other software synths respond to aftertouch, so that will require a bit of research to see what kind of aftertouch we would want to send while sliding across the touch-pad in the X and Y axes. At some point we might also want to edit one or more of the software synthesizers to support more aftertouch events, which could now be useful; without a touch instrument there was never really any way to make use of them. The ability to use MIDI events to control filters must also be available somewhere, but I'm not sure where to look for that yet. If it doesn't already exist we will want to create it, or at least learn to link into it if it does. With filters we would want a way to adjust the center frequency, the Q (bandwidth), and also the rates at which those values change as you drag. But filter control would probably best be tied to Y-axis movement, not X. OK, and that's just the beginning of the X plane on the touch-pad, or midi-pad as we might refer to them.
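On the note-bending question: standard MIDI pitch bend is a 14-bit value (0..16383, centre 8192) sent per channel, not per note, so bending one note independently of others generally means giving each sounding note its own channel, which is the approach MPE-style controllers take. Here is a hedged sketch of building the bend message from a drag; the +/-2 semitone range is the common synth default, and the function name is my own.

```python
# Sketch: turn an X drag distance (in semitones) into a pitch-bend message.
# Pitch bend is per-channel, so per-note bending needs one channel per note.

PITCH_BEND = 0xE0

def bend_message(semitones, bend_range=2.0, channel=0):
    """Encode a bend in semitones as a 3-byte MIDI pitch-bend message."""
    semitones = max(-bend_range, min(semitones, bend_range))
    value = int(8192 + semitones / bend_range * 8191)
    value = max(0, min(value, 16383))
    lsb, msb = value & 0x7F, (value >> 7) & 0x7F   # 14 bits split into 2x7
    return [PITCH_BEND | channel, lsb, msb]

print(bend_message(0.0))    # centre: no bend
print(bend_message(2.0))    # full bend up
```

Whether a given synth honours this (and at what bend range) is exactly the research question raised above.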
Z axis pressure sensitivity mapping methods
Now let's start thinking about the plane we don't have yet, and won't until someone develops a touch-pad device with a Z axis, i.e. pressure sensitivity. The pressure-sensitive component can be used for at least two things: one is the MIDI velocity value of the note, and two, it could also drive aftertouch events. We will start here with the velocity component, which would be read at the strike of the touch-pad note position and used to set the velocity value in the MIDI event. For one thing, I would assume that no two touch-pads are the same and no two musicians would want the same velocity sensitivity; it would also depend on what instrument you had it programmed to play. To keep it simple to start, we would just do simple scaling of the pressure sensed at the strike, with limits on both the minimum and maximum velocity sent in the MIDI event. But I think we need more than that. What is needed is some custom non-linear mapping from force to MIDI velocity value, as well as the upper and lower limits. I guess this could be done in two ways: with an exponential curve, or with a curve-fit line on a graph that the user could modify in some GUI. You might also want sections of the keys to have different sensitivity than others, or mapped to different curves over different sets of notes. OK, that's a start on the velocity value sent with pressure.
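The simple-scaling-plus-exponent version could look like the sketch below. The pressure range, defaults, and names are all assumptions, since no real device gives us this data yet; the curve-fit-on-a-graph version would replace the exponent with a user-drawn lookup table.

```python
# Sketch: scale strike pressure to MIDI velocity with min/max limits and a
# user-adjustable exponent (the "expo velocity" idea). Pressure range assumed.

def pressure_to_velocity(pressure, p_max=255, v_min=20, v_max=127, expo=1.0):
    """expo < 1 makes light touches louder; expo > 1 makes them quieter."""
    norm = max(0.0, min(pressure / p_max, 1.0))   # normalise to 0..1
    curved = norm ** expo                          # simple non-linear shaping
    return round(v_min + curved * (v_max - v_min))

print(pressure_to_velocity(0))             # softest touch -> v_min
print(pressure_to_velocity(255))           # hardest touch -> v_max
print(pressure_to_velocity(128, expo=0.5)) # mid pressure, soft-favouring curve
```

Per-key or per-region sensitivity, as mentioned above, would just mean selecting different `v_min`/`v_max`/`expo` sets by note range.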
Z axis pressure and aftertouch mapping methods
Now let's look at what could be added for aftertouch using the Z-axis data. I guess the main thing to add is a threshold for how much added force after the strike will begin to trigger aftertouch MIDI events. There might also be a time delay before aftertouch begins to register, and a threshold of change over the original strike value before aftertouch takes effect. Similar to velocity, it would require maximum and minimum values and a scaling method, with much the same type of controls used for velocity above. Again, I would prefer a custom curve-fit graphic line, controlled by the user, to generate the sensitivity scale. Aftertouch should also have the option to be enabled and disabled.
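The threshold idea can be sketched like this, assuming channel aftertouch and made-up pressure numbers (the time-delay part is left out to keep it short):

```python
# Sketch: emit channel-aftertouch values only once the pressure rises a
# threshold above the original strike value. All numbers are assumptions.

CHANNEL_PRESSURE = 0xD0   # MIDI status byte for channel aftertouch

def aftertouch_value(strike, current, threshold=30, p_max=255,
                     at_min=0, at_max=127):
    """Return None below the threshold, else a scaled at_min..at_max value."""
    extra = current - strike - threshold
    if extra <= 0:
        return None                      # not pressing hard enough yet
    span = max(1, p_max - strike - threshold)
    norm = min(extra / span, 1.0)
    return round(at_min + norm * (at_max - at_min))

# Strike at 100: nothing happens until pressure passes 130, then it scales up.
print(aftertouch_value(100, 120))   # below threshold -> None
print(aftertouch_value(100, 255))   # full press -> at_max
```

Swapping the linear `norm` for an exponent or a user-drawn curve would give the same family of controls as the velocity mapping.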
Standard piano-keyboard-like mapping support
As an added note, I should mention that I had only really planned to map the touch pad into a square, checkerboard-like pattern that would be more like emulating a guitar and its string positions. But some might prefer a position mapping that more closely emulates a standard piano keyboard. I hope this has already been done in the VMPK software, as I didn't plan to support that method of mapping, but I'll add a button in the GUI to enable and disable it in hopes that someone wants to support it.
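For what it's worth, a piano-style layout could treat one strip of the pad as the white keys and the strip above it as the black keys. The semitone offsets within an octave below are the standard piano pattern; everything else (names, the two-row idea itself) is just my assumption of how it might be done.

```python
# Sketch of an optional piano-style layout: lower strip = white keys,
# upper strip = black keys, as on a real keyboard. Checkerboard stays default.

WHITE = [0, 2, 4, 5, 7, 9, 11]        # semitone offsets of C D E F G A B
BLACK = [1, 3, None, 6, 8, 10, None]  # black key above each white key, if any

def piano_note(white_index, black_row=False, base_note=60):
    """Map the Nth white-key position (and row) to a MIDI note, or None."""
    octave, pos = divmod(white_index, 7)
    offset = BLACK[pos] if black_row else WHITE[pos]
    if offset is None:
        return None                   # the E-F and B-C gaps have no black key
    return base_note + 12 * octave + offset

print(piano_note(0))                  # first white key -> middle C (60)
print(piano_note(0, black_row=True))  # black key above it -> C# (61)
print(piano_note(2, black_row=True))  # no black key between E and F -> None
```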
User interface control methods and GUI interface features
OK, now we have an idea of what features we want. Next we need to think about what the user interface should look like to control all these features, plus the other things that need added controls. Being a dreamer, I think I should create the user interface first, just to tease people into thinking it really already exists. I am most familiar with Glade, so I may make a prototype Glade GUI with a method to control at least some of the values above, just to get us started. Also note that these GUI screens or tabs would just be added to the existing VMPK GUI. So here I will start thinking about what buttons and sliders we should add, in additional tabs if they don't already exist in that interface. I'll just list them to start, in hopes that most are self-explanatory; for some, a comma after the control name separates a brief explanation of what it does:
Volume control slider, basic maximum MIDI velocity generated, or direct control over the ALSA or PulseAudio volume, or one slider for each
Enable/disable output button, since we might use the touch-pad for purposes other than music, we may want to stop it from generating MIDI events at some points
Min velocity slider
Max velocity slider
Expo velocity slider
Min aftertouch slider
Max aftertouch slider
Expo aftertouch slider
Min frequency slider (in notes or Hz??)
Max frequency slider
Note offset box, a plus or minus value that shifts the present active range of notes
X note graduation number box, the number of notes, i.e. how many sensitive positions are separated across the X axis
Y graduation number for notes, the number of columns to divide the Y axis into for multiple columns of notes
Y graduation number for aftertouch, the number to subdivide each note box into to drive aftertouch events
Y graduation mode setting, volume mode, position or drag aftertouch mode, multi-row notes mode
Y key note offset for each row when multi-row mode is active
MIDI channel, I think all the main MIDI controls are already in the VMPK software, so I won't add more here
I will continue to add to this list as more controls come to mind.
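Just to make the list above a little more concrete, here is one way the controls might be collected into a single settings object for a GUI prototype to read and write. Python is used here purely for illustration (the real thing would presumably live in VMPK's own C++/Qt code), and every name and default is my own invention, not settled design.

```python
# Sketch: a single settings object mirroring the control list above, which a
# prototype GUI (Glade/GTK, or Qt inside VMPK) could load and save.
from dataclasses import dataclass

@dataclass
class MidiPadSettings:
    output_enabled: bool = True   # the enable/disable output button
    min_velocity: int = 20
    max_velocity: int = 127
    expo_velocity: float = 1.0
    min_aftertouch: int = 0
    max_aftertouch: int = 127
    expo_aftertouch: float = 1.0
    note_offset: int = 0          # shifts the active note range up or down
    x_graduations: int = 25       # number of notes across the X axis
    y_columns: int = 1            # note columns across the Y axis
    y_mode: str = "notes"         # "notes", "aftertouch", "volume", ...

settings = MidiPadSettings(note_offset=-12, y_columns=2)
print(settings.note_offset, settings.y_columns)
```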
OK, this is just a rough draft that was created in less than 2 hours, so we can edit or trash it at any time. Mind you, I didn't even read the feature list of the VMPK software before I wrote this, to keep from being biased toward their already-written methods of control. I'm sure that after I read and play with VMPK (Virtual Midi Piano Keyboard) I will want to edit this to some degree, to make the changes better fit what they have already done. For all I know, maybe nothing needs to be done to it; from the screenshots it already looks fantastic. Later I will also think up other needed controls and other ways to give the user more control over how touch actions can be mapped to MIDI events, or other creative stuff. Mind you, it might be some time before the hardware devices needed to fully implement these ideas become available on the open market. Feel free to add your own 2 cents or feedback in any way, as any feedback, including criticism, can only be a benefit at this point in time. Continue to dream or it won't become a reality.