My journey into the world of Ableton MIDI remote scripts began with a search for a better way to set up my FCB1010 as a Live controller. It didn’t take long before I realized that in order to fully customize my setup, I’d need to explore emulation and learn something about scripting in Python. The results of my explorations are documented here, in hopes that they may be useful to others.
If you’ve found your way here, then you probably already know a thing or two about control surfaces, and you’re probably aware that Live has built-in support for many controllers. (If you’re unfamiliar with basic controller setup procedures, have a look at the MIDI and Key Remote Control section of the Live help file, or check out the Control Surface Reference Lessons in Live’s Help View.)
Live provides its “instant mapping” support for controllers through the use of MIDI Remote Scripts. MIDI remote scripts are written in the Python programming language, and essentially serve to translate MIDI data into instructions for controlling various aspects of the Live application. Each of the controllers which Live supports has a dedicated script folder and its own dedicated scripts. We’ll get into the details later – but first, a bit of history.
Emulation
It has been quite some time since ever-curious and inventive Live users discovered that it is possible to take advantage of Live’s “instant mapping” capabilities by emulating a supported controller (typically with a controller which is not natively supported). The most well-known emulation model is “Mackie emulation”.
Emulation essentially involves telling Live that you have a certain piece of MIDI hardware hooked up, when in fact you do not. The caveat is that your hardware must be capable of supplying Live with the MIDI messages it expects to see coming from the controller which you are emulating. This can be done either by reconfiguring your controller, or by filtering through an intermediate application (such as MIDI-OX). This type of emulation is basically “black box emulation”, since the MIDI remote scripts are not modified (and their inner workings do not need to be understood in order for it to work). Black box emulation is somewhat limiting - the next step was to investigate the scripts themselves.
Script Files
The remote scripts are installed with the Live application, and if your OS is Windows, you should be able to find them here (or in a similar location):
C:\Program Files\Ableton\Live 8.x.x\Resources\MIDI Remote Scripts\
A typical MIDI Remote Scripts directory will contain a series of folders with names similar to the following:
_Axiom
_Framework
_Generic
_MxDCore
_Tools
_UserScript
APC40
Axiom
AxiomPro
Axiom_25_Classic
Axiom_49_61_Classic
etc...
The first few directories, which are named with a leading underscore, will not appear in the Live MIDI preferences control surfaces drop-down list (they are mostly “private” helper scripts). The other folders contain the compiled Python (.pyc) script files for each of the supported controllers. The folder names are used to populate the control surfaces drop-down list in Live (changes to folder names will not be visible in the drop-down until Live is restarted).
Within each folder, there is generally an __init__.pyc file, a .pyc file named after the controller, and one or more additional .pyc files. As an example, for the Vestax VCM600, the following files are found in the VCM600 directory:
__init__.pyc
VCM600.pyc
ViewTogglerComponent.pyc
PYC files are not human-readable; however, it was soon discovered that by decompiling the controller script files, the source code could be analyzed - providing a convenient map of the default MIDI mappings, and insight into how MIDI remote scripts actually work.
Sources
Python PYC files are relatively easy to decompile, and the resulting PY files are quite readable - in fact, they are practically identical to the original source files.
Python files can be decompiled in a variety of ways. The Decompyle project at Sourceforge (among others) works well for python files up to version 2.3. There are also online “depython” services which work for more recent python files; however, there is no non-commercial service which can handle version 2.5 files.
The version of a PYC file can be determined by examining the first four bytes of the file in a hex editor. The “magic numbers” are as follows:
99 4e 0d 0a python 1.5
fc c4 0d 0a python 1.6
87 c6 0d 0a python 2.0
2a eb 0d 0a python 2.1
2d ed 0d 0a python 2.2
3b f2 0d 0a python 2.3
6d f2 0d 0a python 2.4
b3 f2 0d 0a python 2.5
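If you'd rather not open a hex editor, a few lines of Python will read the magic bytes for you (a quick sketch - the filename here is just an example):

import binascii

f = open("VCM600.pyc", "rb") # substitute the .pyc file you want to check
magic = f.read(4) # the first four bytes hold the version "magic number"
f.close()
print(binascii.hexlify(magic)) # prints e.g. 'b3f20d0a' for a python 2.5 file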
The Live 7.x.x scripts have generally been found to be python version 2.2 files, while the 8.x.x scripts are generally python ver. 2.5 (unfortunately). The Live 7.0.13 scripts in decompiled .PY format have been available here for some time. These files have proven to be extremely useful as a reference for understanding remote scripting.
Early explorations of the decompiled scripts focused on the commonly used consts.py file. This file is used to define constants in many early-generation scripts – including MIDI note mappings for control surfaces. There’s no longer any need to pore through MIDI implementation charts, or to manually map out MIDI note assignments – it’s all there in the files. Modifying the consts.py file was an easy way to tailor an emulation, and many went on to create new custom scripts from scratch – some very elaborate.
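To give a flavour of what these files contain, here is a hypothetical excerpt in the spirit of a consts.py (the names and note numbers below are illustrative only - they are not copied from any particular script):

# consts.py (illustrative excerpt - names and numbers are made up)
NUM_CHANNEL_STRIPS = 8   # how many mixer strips the surface exposes
PLAY_BUTTON_NOTE = 94    # MIDI note sent by the hardware's play button
STOP_BUTTON_NOTE = 93    # MIDI note sent by the stop button
RECORD_BUTTON_NOTE = 95  # MIDI note sent by the record button

Changing constants like these to match the notes your own hardware actually sends was (and still is) the simplest way to tailor an emulation.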
Many of the links at right point to sites with valuable source code, documentation, and insights into scripting – all well worth exploring. There has also been much investigation into the workings of the LiveAPI, which is equally important (the “dark side” of scripting). Until now, however, there has not been much exploration into a key part of the puzzle - the Framework Classes - recently developed by Ableton.
_Framework Scripts
In the past, scripting seems to have been an “every man for himself” affair. OEMs who wanted native “instant mapping” support presumably had to code their own python scripts, with much redundancy and not much sharing. For simple scripts, this was not a huge problem, however, advanced scripts often consist of many files and hundreds of lines of code. As the control surface market grows and matures, the need for a unified set of helper scripts seems obvious. It appears that Ableton’s solution to this issue has come in the form of the Framework scripts.
Newer controllers now make extensive (sometimes exclusive) use of the Framework classes. The list includes the Akai APC40, the Novation Launchpad, the M-Audio Axiom Pro, the Open Labs products, and the Vestax VCM600, with more sure to follow. The Framework scripts are essentially a set of utility classes - a modular library of classes and objects - which handles most of the heavy lifting, and reduces the need for direct calls to the LiveAPI.
The Framework classes represent the “other half” of the Live Object Model (LOM) – as illustrated by the Max for Live reference documents. The Max for Live documents describe the Live API half in some detail, and include indirect reference to the Framework classes (control_surfaces).
The Component and Control (element) names exposed in the Max for Live documents closely mirror the Framework module names. Compare with the script file names in the _Framework directory of the MIDI Remote Scripts folder (sorted here according to type):
Central Base Class:
ControlSurface
Control Surface Components:
ControlSurfaceComponent
TransportComponent
SessionComponent
ClipSlotComponent
ChannelStripComponent
MixerComponent
DeviceComponent
CompoundComponent
ModeSelectorComponent
SceneComponent
SessionZoomingComponent
TrackEQComponent
TrackFilterComponent
ChannelTranslationSelector
Control Elements:
ControlElement
ButtonElement
ButtonMatrixElement
ButtonSliderElement
EncoderElement
InputControlElement
NotifyingControlElement
PhysicalDisplayElement
SliderElement
Other Classes:
DisplayDataSource
LogicalDisplaySegment
And now it’s time to explore the inner workings of the Framework scripts. We’ll begin by setting up a suitable editing environment.
Editing Scripts
I’ve found that it’s generally best to use a dedicated source code editor for any non-trivial scripting work (if for no other reason than to control the use of whitespace, which Python uses for indentation). In a pinch, however, pretty much any text editor can be used to open and edit a .PY file (but be careful not to mix tabs and spaces if you edit python files in a text editor). An Integrated Development Environment (IDE) is best, and most of my scripting work has been done with Wing IDE. A free version is available here. Stani’s Python Editor (SPE) is another python IDE which is worth looking at, although there are many alternatives – suited to many different Operating Systems.
Installing Python itself is an essential part of setting up an Integrated Development Environment. Python is installed as part of the setup routine of some IDE software, or it can be installed separately. Python also comes pre-installed with some Operating Systems (but not Windows).
Strictly speaking, python does not need to be installed for basic remote scripting work, since Live has a built-in python compiler (if Live finds a .PY file in a MIDI remote scripts directory, it will attempt to compile the file on start-up). Nonetheless, the benefits of using an IDE only come with python installed (including the joys of auto code-completion).
Typical python script code (as seen in the Wing IDE editor environment) is plain, whitespace-indented text.
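For instance, here’s a minimal, hypothetical fragment in that style (the class name is made up; we’ll build a complete, working script of exactly this kind below):

from _Framework.ControlSurface import ControlSurface # Central base class for Framework-based scripts

class MyController(ControlSurface): # a made-up name, for illustration only
    def __init__(self, c_instance):
        ControlSurface.__init__(self, c_instance)
        self.log_message("MyController loaded") # note that indentation (whitespace) defines the block structure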
Although experience with coding (in any language) is a big advantage, a quick way to get started with python scripting is to just play around. Start with some sample code (a copy of a simple script in a new folder, for example), and experiment with cut and paste, and trial and error. Typically, a script with errors will simply not compile, and nothing more will happen. It might be possible to break your Live installation with a bad script, but until you know enough to be dangerous, it’s highly unlikely.
Debugging
Live provides several built-in mechanisms which can simplify debugging. I’ve found that the first key to debugging remote scripts is to make use of the Log file. The log file is a simple text file, which can be found in the Ableton Live Preferences directory:
C:\Documents and Settings\username\Application Data\Ableton\Live 8.x.x\Preferences\Log.txt
Whenever Live encounters an error, it will be written to this file. This includes python compile and execution errors. If something unexpected happens after you’ve edited a script (or if the script doesn’t run at all), have a look through the Log file – the problem can often be pinpointed in this way.
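To watch the log in real time while testing, a small helper script (run outside of Live, in a separate console) can tail the file - a minimal sketch, reusing the path shown above:

import time

LOG = r"C:\Documents and Settings\username\Application Data\Ableton\Live 8.x.x\Preferences\Log.txt" # adjust for your system

f = open(LOG, "r")
f.seek(0, 2) # jump to the end of the file, so we only see new entries
while True:
    line = f.readline()
    if line:
        print(line.rstrip()) # echo each new log line as Live writes it
    else:
        time.sleep(0.5) # wait for Live to write more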
Here’s an example of some bad code:
transport.set_foo_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 89))
And here’s what will show up in the Log file:
4812 ms. RemoteScriptError: Traceback (most recent call last):
4813 ms. RemoteScriptError:   File "C:\Program Files\Ableton\Live 8.1\Resources\MIDI Remote Scripts\ProjectX\__init__.py", line 7, in create_instance
4813 ms. RemoteScriptError:
4814 ms. RemoteScriptError:     return ProjectX(c_instance)
4814 ms. RemoteScriptError:   File "C:\Program Files\Ableton\Live 8.1\Resources\MIDI Remote Scripts\ProjectX\ProjectX.py", line 48, in __init__
4815 ms. RemoteScriptError:
4816 ms. RemoteScriptError:     self._setup_transport_control() # Run the transport setup part of the script
4817 ms. RemoteScriptError:   File "C:\Program Files\Ableton\Live 8.1\Resources\MIDI Remote Scripts\ProjectX\ProjectX.py", line 78, in _setup_transport_control
4818 ms. RemoteScriptError:
4819 ms. RemoteScriptError:     transport.set_foo_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 89))
4819 ms. RemoteScriptError: AttributeError
4820 ms. RemoteScriptError: :
4820 ms. RemoteScriptError: 'TransportComponent' object has no attribute 'set_foo_button'
4821 ms. RemoteScriptError:
The same Log file can also be used for tracing (via the Framework log_message method). Pretty much anything can be traced. Here’s an example:
self.log_message("Captain's log stardate " + str(Live.Application.get_random_int(0, 20000)))
And here’s what will show up in the log file:
255483 ms. RemoteScriptMessage: Captain's log stardate 5399
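Tracing Live API values works the same way - for example (these lines use the same self.song() handle that the scripts below rely on; selected_track and tempo are standard Live API properties):

self.log_message("Selected track: " + str(self.song().view.selected_track.name))
self.log_message("Current tempo: " + str(self.song().tempo))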
As I make modifications to a python script, I will frequently recompile to check functionality (i.e. to make sure I haven’t broken anything). I do this by setting the controller to “none” in the MIDI preferences pull-down, then immediately re-selecting my custom controller script by name. This will cause Live to recompile the modified script(s) – no need to re-load the application each time.
Some of the links at right detail other working methods, but I’ve found the above to be sufficient to my needs.
Now, let’s use the Framework classes to build a simple Transport script, as an example.
Example Script
We’ll need to create a new folder in the MIDI Remote Scripts directory, which we can name with anything we want (although be aware that if the name starts with an underscore, it won’t show up in the Preferences drop down). We’ll call ours AAA, so that it appears at the top of the drop-down list.
Next, we’ll need to create two files to put into this folder. The first will be named __init__.py. This file marks our directory as a Python package (for the compiler), and contains only a few lines:
#__init__.py

from Transport import Transport

def create_instance(c_instance):
    return Transport(c_instance)
Next, we’ll create a Transport.py file, which is our main script file (note that the file name won’t appear in the drop-down – only the folder name is important). This file contains a few more lines.
#Transport.py
#This is a stripped-down script, which uses the Framework classes to assign MIDI notes to play, stop and record.

from _Framework.ControlSurface import ControlSurface # Central base class for scripts based on the new Framework
from _Framework.TransportComponent import TransportComponent # Class encapsulating all functions in Live's transport section
from _Framework.ButtonElement import ButtonElement # Class representing a button on the controller

class Transport(ControlSurface):
    def __init__(self, c_instance):
        ControlSurface.__init__(self, c_instance)
        transport = TransportComponent() #Instantiate a Transport Component
        transport.set_play_button(ButtonElement(True, 0, 0, 61)) #ButtonElement(is_momentary, msg_type, channel, identifier)
        transport.set_stop_button(ButtonElement(True, 0, 0, 63))
        transport.set_record_button(ButtonElement(True, 0, 0, 66))
Now, if we open up Live and select AAA from the MIDI preferences pull-down, Live will compile our .PY files, create corresponding .PYC files in our script folder, and run the scripts.
MIDI notes 61, 63 and 66 on Channel 1 should now be automatically mapped to Play, Stop, and Record respectively. The python sources for this simple script can be found here. Of course, there are many other simple mappings we could make, following the same basic structure. For example, if we wanted to map Tap Tempo to a key, we’d simply add the following line to our script:
transport.set_tap_tempo_button(ButtonElement(True, 0, 0, 68))
The script above uses one of the most basic Framework modules - the TransportComponent module. In addition to the three methods used in the script, other public TransportComponent methods include the following (a short wiring sketch follows the list):
set_stop_button(button)
set_play_button(button)
set_seek_buttons(ffwd_button, rwd_button)
set_nudge_buttons(up_button, down_button)
set_record_button(button)
set_tap_tempo_button(button)
set_loop_button(button)
set_punch_buttons(in_button, out_button)
set_metronom_button(button)
set_overdub_button(button)
set_tempo_control(control, fine_control)
set_song_position_control(control)
(Note that set_metronom_button is actually mis-spelled in the 7.x.x Framework, but is corrected to set_metronome_button in 8.x.x. This means that scripts using this method will only run on one version or the other, depending on the spelling used!)
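Note also that the last two methods in the list take continuous controls rather than buttons. As a quick sketch of the wiring (these lines mirror the ProjectX script further below, where ButtonElement, SliderElement and the MIDI_*_TYPE constants are imported from the _Framework modules):

transport.set_loop_button(ButtonElement(True, MIDI_NOTE_TYPE, 0, 82)) # button methods take a ButtonElement (a note on/off)
transport.set_tempo_control(SliderElement(MIDI_CC_TYPE, 0, 26), SliderElement(MIDI_CC_TYPE, 0, 25)) # (control, fine_control) - continuous CCs
transport.set_song_position_control(SliderElement(MIDI_CC_TYPE, 0, 24)) # also a continuous CC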
When working with decompiled sources, it is worth noting that many of the scripts are “old school”, pre-dating the development of the Framework classes. While they certainly work, they tend to be much more complicated than newer scripts. The complexity is now handled by the Framework classes, which makes most scripting tasks much simpler.
On the other hand, even scripts based on the Framework can be complicated, especially when additional classes need to be developed to handle special functionality which the Framework does not provide. The APC40 scripts are a good example of complex scripting.
Now, let’s try building a set of scripts that can do some of the fancier things which new generation controllers can do, using the Framework classes (I want a “red box” too!).
ProjectX
We won’t exactly be emulating the APC40 or Launchpad here, since their functionality is so tightly tied to their hardware layouts - although admittedly APC40 emulation could be fun to explore (it probably wouldn’t be of any great use to anyone, however, except possibly an APC40 owner wanting to customize). Instead, we’ll turn a bog-standard MIDI keyboard into a two-dimensional grid controller, using the Framework classes.
A MIDI keyboard is generally one-dimensional, and we want to do some of the things that a two-dimensional grid controller can do. To get around this limitation, we’ll use two sets of keys - one for the vertical (scenes) and one for the horizontal (tracks) – a moveable X-Y grid of keys. We’ll call our “controller” ProjectX.
The ProjectX script is made up of two sets of keyboard mappings, which can be used together or independently. Part X is a vertical session component (“red box”), and Part Y is a horizontal session component (“yellow box”). We’ll keep it relatively simple (it is intended to be used with a standard MIDI keyboard, after all), but we will demonstrate the use of several of the Framework classes and methods along the way - primarily the Session, Mixer and Transport components.
Here is a keyboard map, which shows the note assignments of the mappings we’ll be making.
The APC40, Launchpad and Monome all have grids of buttons; we’ve split the two grid dimensions into two session boxes here. The “red box” will be 1 track wide by 7 scenes high, and the “yellow box” will be 7 tracks wide by 1 scene high. The red box represents a set of 7 scenes (or clip slots), and the yellow box represents a set of 7 tracks. Used together, they form a virtual grid of 7 tracks by 7 scenes, each of which is controlled by a separate set of seven “white notes”. Here’s what they look like in the session view (sorry, no video):
And here is the ProjectX script (red box), that uses Framework magic:
import Live # This allows us (and the Framework methods) to use the Live API on occasion
import time # We will be using time functions for time-stamping our log file outputs

""" All of the Framework files are listed below, but we are only using some of them in this script (the rest are commented out) """
from _Framework.ButtonElement import ButtonElement # Class representing a button on the controller
#from _Framework.ButtonMatrixElement import ButtonMatrixElement # Class representing a 2-dimensional set of buttons
#from _Framework.ButtonSliderElement import ButtonSliderElement # Class representing a set of buttons used as a slider
from _Framework.ChannelStripComponent import ChannelStripComponent # Class attaching to the mixer of a given track
#from _Framework.ChannelTranslationSelector import ChannelTranslationSelector # Class switches modes by translating the given controls' message channel
from _Framework.ClipSlotComponent import ClipSlotComponent # Class representing a ClipSlot within Live
from _Framework.CompoundComponent import CompoundComponent # Base class for classes encompassing other components to form complex components
from _Framework.ControlElement import ControlElement # Base class for all classes representing control elements on a controller
from _Framework.ControlSurface import ControlSurface # Central base class for scripts based on the new Framework
from _Framework.ControlSurfaceComponent import ControlSurfaceComponent # Base class for all classes encapsulating functions in Live
#from _Framework.DeviceComponent import DeviceComponent # Class representing a device in Live
#from _Framework.DisplayDataSource import DisplayDataSource # Data object that is fed with a specific string and notifies its observers
#from _Framework.EncoderElement import EncoderElement # Class representing a continuous control on the controller
from _Framework.InputControlElement import * # Base class for all classes representing control elements on a controller
#from _Framework.LogicalDisplaySegment import LogicalDisplaySegment # Class representing a specific segment of a display on the controller
from _Framework.MixerComponent import MixerComponent # Class encompassing several channel strips to form a mixer
#from _Framework.ModeSelectorComponent import ModeSelectorComponent # Class for switching between modes, handling several functions with few controls
#from _Framework.NotifyingControlElement import NotifyingControlElement # Class representing control elements that can send values
#from _Framework.PhysicalDisplayElement import PhysicalDisplayElement # Class representing a display on the controller
from _Framework.SceneComponent import SceneComponent # Class representing a scene in Live
from _Framework.SessionComponent import SessionComponent # Class encompassing several scenes to cover a defined section of Live's session
from _Framework.SessionZoomingComponent import SessionZoomingComponent # Class using a matrix of buttons to choose blocks of clips in the session
from _Framework.SliderElement import SliderElement # Class representing a slider on the controller
#from _Framework.TrackEQComponent import TrackEQComponent # Class representing a track's EQ, it attaches to the last EQ device in the track
#from _Framework.TrackFilterComponent import TrackFilterComponent # Class representing a track's filter, attaches to the last filter in the track
from _Framework.TransportComponent import TransportComponent # Class encapsulating all functions in Live's transport section

""" Here we define some global variables """
CHANNEL = 0 # Channels are numbered 0 through 15, this script only makes use of one MIDI Channel (Channel 1)
session = None #Global session object - global so that we can manipulate the same session object from within any of our methods
mixer = None #Global mixer object - global so that we can manipulate the same mixer object from within any of our methods

class ProjectX(ControlSurface):
    __module__ = __name__
    __doc__ = " ProjectX keyboard controller script "

    def __init__(self, c_instance):
        """everything except the '_on_selected_track_changed' override and 'disconnect' runs from here"""
        ControlSurface.__init__(self, c_instance)
        self.log_message(time.strftime("%d.%m.%Y %H:%M:%S", time.localtime()) + "--------------= ProjectX log opened =--------------") # Writes message into Live's main log file. This is a ControlSurface method.
        self.set_suppress_rebuild_requests(True) # Turn off rebuild MIDI map until after we're done setting up
        self._setup_transport_control() # Run the transport setup part of the script
        self._setup_mixer_control() # Setup the mixer object
        self._setup_session_control() # Setup the session object
        """ Here is some Live API stuff just for fun """
        app = Live.Application.get_application() # get a handle to the App
        maj = app.get_major_version() # get the major version from the App
        min = app.get_minor_version() # get the minor version from the App
        bug = app.get_bugfix_version() # get the bugfix version from the App
        self.show_message(str(maj) + "." + str(min) + "." + str(bug)) #put them together and use the ControlSurface show_message method to output version info to console
        self.set_suppress_rebuild_requests(False) #Turn rebuild back on, now that we're done setting up

    def _setup_transport_control(self):
        is_momentary = True # We'll only be using momentary buttons here
        transport = TransportComponent() #Instantiate a Transport Component
        """set up the buttons"""
        transport.set_play_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 61)) #ButtonElement(is_momentary, msg_type, channel, identifier) Note that the MIDI_NOTE_TYPE constant is defined in the InputControlElement module
        transport.set_stop_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 63))
        transport.set_record_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 66))
        transport.set_overdub_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 68))
        transport.set_nudge_buttons(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 75), ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 73)) #(up_button, down_button)
        transport.set_tap_tempo_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 78))
        transport.set_metronome_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 80)) #For some reason, in Ver 7.x.x this method's name has no trailing "e", and must be called as "set_metronom_button()"...
        transport.set_loop_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 82))
        transport.set_punch_buttons(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 85), ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 87)) #(in_button, out_button)
        transport.set_seek_buttons(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 90), ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 92)) # (ffwd_button, rwd_button)
        """set up the sliders"""
        transport.set_tempo_control(SliderElement(MIDI_CC_TYPE, CHANNEL, 26), SliderElement(MIDI_CC_TYPE, CHANNEL, 25)) #(control, fine_control)
        transport.set_song_position_control(SliderElement(MIDI_CC_TYPE, CHANNEL, 24))

    def _setup_mixer_control(self):
        is_momentary = True
        num_tracks = 7 #A mixer is one-dimensional; here we define the width in tracks - seven columns, which we will map to seven "white" notes
        """Here we set up the global mixer""" #Note that it is possible to have more than one mixer...
        global mixer #We want to instantiate the global mixer as a MixerComponent object (it was a global "None" type up until now...)
        mixer = MixerComponent(num_tracks, 2, with_eqs=True, with_filters=True) #(num_tracks, num_returns, with_eqs, with_filters)
        mixer.set_track_offset(0) #Sets start point for mixer strip (offset from left)
        self.song().view.selected_track = mixer.channel_strip(0)._track #set the selected strip to the first track, so that we don't, for example, try to assign a button to arm the master track, which would cause an assertion error
        """set up the mixer buttons"""
        mixer.set_select_buttons(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 56), ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 54)) #left, right track select
        mixer.master_strip().set_select_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 94)) #jump to the master track
        mixer.selected_strip().set_mute_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 42)) #sets the mute ("activate") button
        mixer.selected_strip().set_solo_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 44)) #sets the solo button
        mixer.selected_strip().set_arm_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 46)) #sets the record arm button
        """set up the mixer sliders"""
        mixer.selected_strip().set_volume_control(SliderElement(MIDI_CC_TYPE, CHANNEL, 14)) #sets the continuous controller for volume
        """note that we have split the mixer functions across two scripts, in order to have two session highlight boxes (one red, one yellow), so there are a few things which we are not doing here..."""

    def _setup_session_control(self):
        is_momentary = True
        num_tracks = 1 #single column
        num_scenes = 7 #seven rows, which will be mapped to seven "white" notes
        global session #We want to instantiate the global session as a SessionComponent object (it was a global "None" type up until now...)
        session = SessionComponent(num_tracks, num_scenes) #(num_tracks, num_scenes) A session highlight ("red box") will appear with any two non-zero values
        session.set_offsets(0, 0) #(track_offset, scene_offset) Sets the initial offset of the "red box" from top left
        """set up the session navigation buttons"""
        session.set_select_buttons(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 25), ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 27)) # (next_button, prev_button) Scene select buttons - up & down - we'll also use a second ControlComponent for this (yellow box)
        session.set_scene_bank_buttons(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 51), ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 49)) # (up_button, down_button) This is to move the "red box" up or down (increment track up or down, not screen up or down, so they are inversed)
        #session.set_track_bank_buttons(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 56), ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 54)) # (right_button, left_button) This moves the "red box" selection set left & right. We'll put our track selection in Part B of the script, rather than here...
        session.set_stop_all_clips_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 70))
        session.selected_scene().set_launch_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 30))
        """Here we set up the scene launch assignments for the session"""
        launch_notes = [60, 62, 64, 65, 67, 69, 71] #this is our set of seven "white" notes, starting at C4
        for index in range(num_scenes): #launch_button assignment must match number of scenes
            session.scene(index).set_launch_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, launch_notes[index])) #step through the scenes (in the session) and assign corresponding note from the launch_notes array
        """Here we set up the track stop launch assignment(s) for the session"""
        #The following code is set up for a longer array (we only have one track, so it's over-complicated, but good for future adaptation)..
        stop_track_buttons = []
        for index in range(num_tracks):
            stop_track_buttons.append(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 58 + index)) #this would need to be adjusted for a longer array (because we've already used the next note numbers elsewhere)
        session.set_stop_track_clip_buttons(tuple(stop_track_buttons)) #array size needs to match num_tracks
        """Here we set up the clip launch assignments for the session"""
        clip_launch_notes = [48, 50, 52, 53, 55, 57, 59] #this is a set of seven "white" notes, starting at C3
        for index in range(num_scenes):
            session.scene(index).clip_slot(0).set_launch_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, clip_launch_notes[index])) #step through scenes and assign a note to first slot of each
        """Here we set up a mixer and channel strip(s) which move with the session"""
        session.set_mixer(mixer) #Bind the mixer to the session so that they move together

    def _on_selected_track_changed(self):
        """This is an override, to add special functionality (we want to move the session to the selected track, when it changes)
        Note that it is sometimes necessary to reload Live (not just the script) when making changes to this function"""
        ControlSurface._on_selected_track_changed(self) # This will run component.on_selected_track_changed() for all components
        """here we set the mixer and session to the selected track, when the selected track changes"""
        selected_track = self.song().view.selected_track #this is how to get the currently selected track, using the Live API
        mixer.channel_strip(0).set_track(selected_track)
        all_tracks = ((self.song().tracks + self.song().return_tracks) + (self.song().master_track,)) #this is from the MixerComponent's _next_track_value method
        index = list(all_tracks).index(selected_track) #and so is this
        session.set_offsets(index, session._scene_offset) #(track_offset, scene_offset); we leave scene_offset unchanged, but set track_offset to the selected track. This allows us to jump the red box to the selected track.

    def disconnect(self):
        """clean things up on disconnect"""
        self.log_message(time.strftime("%d.%m.%Y %H:%M:%S", time.localtime()) + "--------------= ProjectX log closed =--------------") #Create entry in log file
        ControlSurface.disconnect(self)
        return None
And here is the counterpart ProjectY script (yellow box):
import Live # This allows us (and the Framework methods) to use the Live API on occasion
import time # We will be using time functions for time-stamping our log file outputs

""" We are only using some of the Framework classes in this script (the rest are not listed here) """
from _Framework.ButtonElement import ButtonElement # Class representing a button on the controller
from _Framework.ChannelStripComponent import ChannelStripComponent # Class attaching to the mixer of a given track
from _Framework.ClipSlotComponent import ClipSlotComponent # Class representing a ClipSlot within Live
from _Framework.CompoundComponent import CompoundComponent # Base class for classes encompassing other components to form complex components
from _Framework.ControlElement import ControlElement # Base class for all classes representing control elements on a controller
from _Framework.ControlSurface import ControlSurface # Central base class for scripts based on the new Framework
from _Framework.ControlSurfaceComponent import ControlSurfaceComponent # Base class for all classes encapsulating functions in Live
from _Framework.InputControlElement import * # Base class for all classes representing control elements on a controller
from _Framework.MixerComponent import MixerComponent # Class encompassing several channel strips to form a mixer
from _Framework.SceneComponent import SceneComponent # Class representing a scene in Live
from _Framework.SessionComponent import SessionComponent # Class encompassing several scenes to cover a defined section of Live's session
from _Framework.SliderElement import SliderElement # Class representing a slider on the controller
from _Framework.TransportComponent import TransportComponent # Class encapsulating all functions in Live's transport section

""" Here we define some global variables """
CHANNEL = 0 # Channels are numbered 0 through 15, this script only makes use of one MIDI Channel (Channel 1)
session = None #Global session object - global so that we can manipulate the same session object from within our methods
mixer = None #Global mixer object - global so that we can manipulate the same mixer object from within our methods

class ProjectY(ControlSurface):
    __module__ = __name__
    __doc__ = " ProjectY keyboard controller script "

    def __init__(self, c_instance):
        ControlSurface.__init__(self, c_instance)
        self.log_message(time.strftime("%d.%m.%Y %H:%M:%S", time.localtime()) + "--------------= ProjectY log opened =--------------") # Writes message into Live's main log file. This is a ControlSurface method.
        self.set_suppress_rebuild_requests(True) # Turn off rebuild MIDI map until after we're done setting up
        self._setup_mixer_control() # Setup the mixer object
        self._setup_session_control() # Setup the session object
        self.set_suppress_rebuild_requests(False) # Turn rebuild back on, once we're done setting up

    def _setup_mixer_control(self):
        is_momentary = True # We use non-latching buttons (keys) throughout, so we'll set this as a constant
        num_tracks = 7 # Here we define the mixer width in tracks (a mixer has only one dimension)
        global mixer # We want to instantiate the global mixer as a MixerComponent object (it was a global "None" type up until now...)
        mixer = MixerComponent(num_tracks, 0, with_eqs=False, with_filters=False) #(num_tracks, num_returns, with_eqs, with_filters)
        mixer.set_track_offset(0) #Sets start point for mixer strip (offset from left)
        """set up the mixer buttons"""
        self.song().view.selected_track = mixer.channel_strip(0)._track
        #mixer.selected_strip().set_mute_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 42))
        #mixer.selected_strip().set_solo_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 44))
        #mixer.selected_strip().set_arm_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 46))
        track_select_notes = [36, 38, 40, 41, 43, 45, 47] #more note numbers need to be added if num_tracks is increased
        for index in range(num_tracks):
            mixer.channel_strip(index).set_select_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, track_select_notes[index]))

    def _setup_session_control(self):
        is_momentary = True
        num_tracks = 7
        num_scenes = 1
        global session #We want to instantiate the global session as a SessionComponent object (it was a global "None" type up until now...)
        session = SessionComponent(num_tracks, num_scenes) #(num_tracks, num_scenes)
        session.set_offsets(0, 0) #(track_offset, scene_offset) Sets the initial offset of the yellow box from top left
        """set up the session buttons"""
        session.set_track_bank_buttons(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 39), ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, 37)) # (right_button, left_button) This moves the "yellow box" selection set left & right. We'll use the mixer track selection instead...
        session.set_mixer(mixer) #Bind the mixer to the session so that they move together
        selected_scene = self.song().view.selected_scene #this is from the Live API
        all_scenes = self.song().scenes
        index = list(all_scenes).index(selected_scene)
        session.set_offsets(0, index) #(track_offset, scene_offset)

    def _on_selected_scene_changed(self):
        """This is an override, to add special functionality (we want to move the session to the selected scene, when it changes)"""
        """When making changes to this function on the fly, it is sometimes necessary to reload Live (not just the script)..."""
        ControlSurface._on_selected_scene_changed(self) # This will run component.on_selected_scene_changed() for all components
        """Here we set the mixer and session to the selected scene, when the selected scene changes"""
        selected_scene = self.song().view.selected_scene #this is how we get the currently selected scene, using the Live API
        all_scenes = self.song().scenes #then get all of the scenes
        index = list(all_scenes).index(selected_scene) #then identify where the selected scene sits in relation to the full list
        session.set_offsets(session._track_offset, index) #(track_offset, scene_offset) Set the session's scene offset to match the selected scene (but make no change to the track offset)

    def disconnect(self):
        """clean things up on disconnect"""
        self.log_message(time.strftime("%d.%m.%Y %H:%M:%S", time.localtime()) + "--------------= ProjectY log closed =--------------") #Create entry in log file
        ControlSurface.disconnect(self)
        return None
Of course, each of these scripts also has a corresponding __init__.py, which looks something like this:
from ProjectX import ProjectX

def create_instance(c_instance):
    """ Creates and returns the ProjectX script """
    return ProjectX(c_instance)
To use these scripts, two script folders need to be saved to the MIDI Remote Scripts directory (one for X, and one for Y), and the two controllers need to be loaded using the MIDI preferences drop-down. The .PY source files are here. Unfortunately, I was not able to find a way to load two session highlight boxes from within one script, which explains the two-folder approach. Either script could be used independently, but then we would lose the X-Y interaction. In any event, the idea was to explore using the Framework classes in a novel way. Maybe not too useful in a real-world application, but hopefully these scripts show some of the Framework classes’ hidden potential.
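For clarity, the resulting layout under the MIDI Remote Scripts directory would look something like this (the folder names are our choice; each folder carries its own __init__.py):

MIDI Remote Scripts\
    ProjectX\
        __init__.py
        ProjectX.py
    ProjectY\
        __init__.py
        ProjectY.py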
Conclusion
The Framework classes may evolve with newer versions of Live, and some functions may cease to work as expected. However, since all newer controller scripts seem to be based on the Framework classes, it is likely that change will be kept to a minimum (or at least, hopefully, new methods will not break old ones). There are risks involved in working with an undocumented function library, but on the other hand, the Framework classes certainly help to make remote scripting easy.
Hopefully this exploration has been helpful to somebody out there. Go, be creative, have fun - and share your work with others!
Hanz Petrov
March 2010
52 comments:
nice :)
oh well there goes my weekend :)
Brilliant post... I've been looking for something like this
any practical field test!?
Hi Hanz,
I've just spent the last 30 minutes working through your tutorial, and everything seems to work over here.
Protip: Live is measuring Velocity when you press the MIDI keys, so it might appear that some clips aren't activated at first. But actually what (is probably) happening is that Live is interpreting Velocity as True/False depending on its value. So a value that is bigger than half the maximum Velocity is being interpreted as "True", and it activates a clip.
I'm already thinking about making a GUI that could be used to generate scripts for simple key-remappings. But I'll have to read a bunch of documentation first. I'll keep in touch.
Thanks a bunch for the tutorial!
@Andrej: The GUI idea sounds interesting. Might be worth looking at the _UserScript code, where they grab the mapping values from a UserConfiguration.txt file. A Framework-powered UserScript with GUI would definitely be cool. Do stay in touch!
Hey Hanz,
First, thank you so much for this. I was able to successfully get ProjectX/Y running using my Korg NanoKey (now I have a use for it). But after several days, I just cannot figure out how to get this thing to run from any other controller. What am I missing? I have reconfig'd the outputs on my controllers, I've tried editing the ProjectX .pyc files, and I just can't get it to go. I have a Korg nanoPad that I would like to get mapped to my red box, but it just won't work. Can ye help?
Paul Celeri
celeri_p@yahoo.com
Paul - Guess I'd need more info on your setup. Are you trying to use the NanoKey and the NanoPad together at the same time? Have you tried using MIDI-OX to monitor the MIDI messages your controller(s) are sending out? Could it be insufficient note velocity on the NanoPad hits - as Andrej mentions above?
Thanks for the reply Hanz. My setup includes a BCD2000 and the nano set from korg. I have been using the BCD2000 for fx control, the nanoKontrol for vol/pan/send on 8 tracks and the main, the nanoPad for triggering clips, and the nanoKey for bass/leads or clip triggers in CC mode. Id like to have the nanoPad (6x2) triggering clips within the red box. Using the left 4x2 for clips and the right 2x2 for movement (in the future I plan to try to attach the movement of the box the the nanoPads x/y).
Thats the idea so far. feel free to email.
Paul Celeri
celeri_p@yahoo.com
Scripting to get simple control over the "red box" with the NanoPad should not be difficult. Here are some things you could try first, using the ProjectX script to test the connection:
Make sure that the Nano pads are in note mode; try setting all pads to MIDI channel 1; make sure that the MIDI notes in the script match the NanoPad mappings (change either the script or the Nano assignments); make sure that Pad Behavior is set to Momentary; try different settings for Roll/Flam mode - I'm not sure how they work (I don't have a Nano).
If you can get the ProjectX script to respond "as is" (even though the grid size doesn't match the Nano grid), then at least you'll know that the communications are working. Once that's set up, then you can start modifying the script to suit your needs. Let me know how it goes, and I can follow up with an email and more how-to.
Hey Hanz,
Well I've been exploring the userconfiguration options, and for some reason Live doesn't want to load the configuration.txt file unless the Folder of that file has a specific name. Perhaps this is just my case.. I have the Radium61 keyboard, and it will only load the configuration.txt file if it sits in a folder named "Radium49_61" (This folder comes as a default with Live actually). Weird, since the device itself has the input port named "In USB Keystation", and I haven't found any mention of "Radium49_61" anywhere in the driver (I've looked in device manager).
But this is where it gets really weird:
I've decompiled the .pyc files that came with Live for the Radium keyboard. It has this dictionary in config.py:
CONTROLLER_DESCRIPTION = {'INPUTPORT': 'Keystation',
                          'OUTPUTPORT': 'Keystation',
                          'CHANNEL': 0}
If I change the INPUTPORT and OUTPUTPORT fields to anything at all, say 'sdfg', it will still compile the .py file and it will recognize all the sliders when I load it in Live. Even if it sits in an arbitrarily named folder, like "abcd". I'm quite confused right now.. :)
I was thinking that it would be interesting to write a whole new script, which gets constants from a TXT file (similar to the way UserConfig does) - perhaps via a GUI - then assigns controls using Framework methods. In other words, not use the built-in UserConfiguration functions at all (unless they prove to be helpful in some way).
For reference, it is actually the __init__.py(c) file in the _UserScripts folder which reads from the UserConfiguration.txt. Normally this file resides in a Documents and Settings\user directory. This CDM post gives more detail:
http://createdigitalmusic.com/2009/07/29/ableton-live-midi-remote-scripting-how-to-custom-korg-nanoseries-control/#more-6740
Also, the Radium49_61 scripts seem to be based on the _Generic scripts. The Generic scripts are pretty simple, so it wouldn't be difficult to write a new Radium61 script using the Framework classes instead.
You can email me at hanz.petrov
at gmail.com
Hi,
Has anyone got this working on OS X?
I tried installing the ProjectX and ProjectY scripts and got nothing at first.
Looking at the log I noticed that the time functions were failing to import so I've commented out all references to time for now.
I've now got two yellow boxes showing and some of the midi triggering working but I'm still seeing errors in the log.
The first error now seems to occur at ProjectX line 137.
Hi Nicke - yes, the scripts were not tested on OS X - I should have mentioned that. Which version of Live are you on? The time functions are only used to timestamp the opening and closing log entries, so no problem commenting them out. Also, by two yellow boxes, do you mean one yellow box around the tracks on the left and one over on the master track on the right? If so, your modified ProjectY script is probably working properly, but it seems your ProjectX script is not. What is the error you get at line 137? (In my python editor, line 137 is a blank line.)
Thank You, Dr. Petrov!
Hi there,
After some mods on the note numbers in the source, I made this great patch running on my MPD24.
Got a simple, very simple question: how to make a simple 4*4 matrix, in one file, which I'll move around the clips? This will save me a lot of static presets and Bome's MT shifts.
Thanks again!
Hi Hanz ,
Thanks for the quick response and the great article on the Framework classes.
I'm running Live 7.0.18 on OS X 10.6.2.
It would be interesting to know if anybody else is having similar problems on this set up.
I've made a bit of progress as it seems that some of the errors I was seeing were due to introducing tabs when I edited the files. I can now see both boxes and most if not all of the functions seem to be working.
I am still seeing the following error in the log:
6080 ms. RemoteScriptError: File "/Applications/Live 7.0.18 OS X/Live.app/Contents/App-Resources/MIDI Remote Scripts/ProjectX/ProjectX.py", line 137, in _on_selected_track_changed
6080 ms. RemoteScriptError:
6080 ms. RemoteScriptError: session.set_offsets(index, session._scene_offset) #(track_offset, scene_offset); we leave scene_offset unchanged, but set track_offset to the selected track. This allows us to jump the red box to the selected track.
6080 ms. RemoteScriptError: AttributeError
6081 ms. RemoteScriptError: :
6081 ms. RemoteScriptError: 'NoneType' object has no attribute 'set_offsets'
6081 ms. RemoteScriptError:
@nicke: try adding the following line to the ProjectX script (right before
session.set_offsets, around line 137):
if session != None:
    session.set_offsets(index, session._scene_offset)
You'll need to indent the line after the if statement.
@anonymous: nothing is simple, but I have an example NanoPad script, which may help, here (not fully tested, but should work): http://hanzoffsystems.tech.officelive.com/NanoPad.rar
Thanks Mr. Petrov for the Nano hint, the MPD24 patch is done :)
I've just discovered a curious thing: adding more than two control instances (max of 6) opens another colour box each time: orange, blue, green, purple... Don't think that this is very useful in my case, unless using shifts and a different MIDI channel for each instance...
I will complete the MPD24 patch adding the MMC controls and all the Faders and Pots and I'll post it. Shortly, I hope :)
Thanks again,
Dexter
Hey Hanz,
It works!
So if you remove the time stamping and add the check for the null session object you could make ProjectX/ProjectY a bit more portable.
Do you have any idea why the time functions don't import? (I think it might be a Python versioning issue.)
I'm off to play with ProjectX/ProjectY before I get down to designing my ultimate Ableton controller...
Cheers,
Nick
hi
I'm quite a noob at programming. I'm trying to adapt ProjectX for my config.
I need more than 7 scenes available (16 would be perfect). When I set it to 16 scenes and change the MIDI channels used to launch the scenes (e.g. 01 to 16), the script won't work anymore (I deleted the clip part for now, since scenes are more important for me; I'll put them back when the scenes are working). I probably missed a point somewhere but I can't find out what.
I also need to make jumps of 8 scenes with the red box and couldn't find a place to put a loop (probably not possible). is there a way to make this ?
I hope you can help me.
see you
Hi roms,
You should be able to assign scene launch buttons by channel number, using code something like this (sorry, but the indents get lost in the comment form):
launch_note = 52
scene_launch_buttons = [ ButtonElement(is_momentary, MIDI_NOTE_TYPE, index, launch_note) for index in range(16) ]
for scene_index in range(16):
    scene = session.scene(scene_index)
    scene.set_launch_button(scene_launch_buttons[scene_index])
The ProjectX script uses multiple notes on one channel for scene launch, so the approach is a little different. You might want to have a look at the APC40 scripts, which use a lot of multiple channels, and also include code for the "8 scene jumps", which you might be able to adapt.
Good luck!
Hanz
Thanks, but I use a Livid Block and it only uses 1 MIDI channel at a time.
Anyway, I managed to have 16 scene launches (it was not complicated after all). Now I'm trying to have the same for clips, and for some unknown reason the modified ProjectX won't work.
What do I need to do to have 16 successive MIDI notes controlling the clip launches?
I changed the lines like this and it doesn't work (probably for some obvious reason that someone who cannot write code wouldn't know):
clip_launch_notes = [76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91]
for index in range(num_scenes):
    session.scene(index).clip_slot(0).set_launch_button(ButtonElement(is_momentary, MIDI_NOTE_TYPE, CHANNEL, clip_launch_notes[index]))
(the midi notes are not used elsewhere)
I hope this is a very simple problem to solve.
anyway, what I wanted first is the access to 16 scenes with a red box. and it works.
I don't use clips much. it could be a nice addition to be able to use them but I can live without.
so thanks for your original scripts I can already do nice stuff with the customised ProjectX.
BTW, do you know a place where I could find a source for APC40 scripts, with help on making jumps/teleportations with the red box?
("teleportation" could be nice too : setting a midi note or even a keyboard key so that when you press it the red box moves to a specified location)
thanks in advance
I'm on MacOSX.
I have downloaded some .py elements for a Korg NanoKontrol.
Please could you tell me how to use the .PY elements in Live 8.
> I have put the NanoKontrolMYr on my editor.
> But I don't understand how to use the different .py elements like:
> encoders.py
> NanoKontrol.py
> SliderSection.py
> Transport.py
...
Hi mister Petrov,
Do you think it would be possible to emulate a Launchpad with a monome, if the Max patch sends the right notes and so on? And lastly, if a remote script is running, could it be possible to do "more" things than Max for Live alone (like a complement between them, to take full advantage)?
I'm involved in a great open-source control surface software project, and any help will be GREAT!
;)
...
Hi very nice tutorial thanks.
Is it possible to get the level positions for a specified track?
Hi Hanz, thanks for all the hard work - this looks really interesting. I have an M-Audio Oxygen 49 which does not seem to work with the ProjectX and Y scripts. Well, in my case they don't work at all in Ableton 8.0.1. Any thoughts? Thanks.
Is it possible with this framework to load/import and save/export MIDI and .wav files into session view slots from your hard disk, if you already know all the paths you want to use? If yes, one could dynamically change the whole session matrix using triggering events.
Other interesting examples might be:
1. Auto-coloring of midi clips based on the number of notes in them, meaning based on note density := number of notes/clip length.
2. Auto-coloring of audio clips based on sample length; this might help you differentiate between shorter one-shot samples and longer loop samples or longer real-time recordings, e.g. from a guitar or vocals.
3. The first idea above, with clip import and export, might help in the following way: You divide your whole session matrix into two parts, the first part being a STATIC part, the second part being a DYNAMIC part. Now you would keep the static part as it is, meaning no changes are allowed there. But in the dynamic part you could delete/import new clips (which amounts to replacing them) when those clips are in a muted state and a certain "change trigger" is received by the system.
Thanks Hanz for your great explanations here, the first time I see such clear instructions, but I did not test anything yet. Let us share ideas first. :)
Hi Hanz, I'm trying to set this up on an Xponent (not the best button setup), but I'm having quite a bit of trouble. First, the MIDI is sent on two channels; in my grid, how do I specify the channel the messages are coming from? (I'm using the nanoPad files as a template.)
Hi hanz,
Could you help me out with an MPD24 remote? I'm trying to control the red box movement by mapping the script like this:
redboxDOWN <<(cc 115)
redboxUP >>(cc 116)
redboxLEFT stop(cc 117)
redboxRIGHT play(cc 118)
So when I change those right_button, left_button, etc. and launch Ableton, they don't work at all, or sometimes when I press them really quickly the box starts to move. But I doubt that it's an MPD24 problem, because I also failed to map any effects in Ableton with the buttons I mentioned before (yes, it maps; yes, it sends the CC signal). For example, if you want to turn an effect on/off and map an on/off button, it works just once: it turns on and you can't turn it off anymore, or vice versa... What's wrong?
Hello,
I'm modifying your scripts to suit a BCR2000, and am looking for a way to map knobs to the mixer sends. Looking into other remote scripts I've found you can use "set_send_controls" while setting up the mixer channel strip, but can't make it work. How would you go about doing that?
This is such a huge help. Working on an Ohm64 script right now. Let's see what the weekend brings.
self.log_message("Captain's log stardate " + str(Live.Application.get_random_int(0, 20000)))
excellent! :D
a working "star trek captain's log" simulator!
Hi!
this is a really good job!
i've a simple question:
Do you think that it's possible to change the red box size (num_tracks , num_scenes ) on the fly? Maybe with midi messages?
thanks, bye!
Please can somebody help me to create an 8x8 grid?
I have tried multiple times to create one, and it never works.
Please!
FYI - we built a web app that builds basic MIDI Remote scripts for Ableton (the Python) - go here :
http://modern.dj/app/
Enjoy the free resource.
Hi!
First, thank you for very nice tutorials :)
I must say I have not been able to make your scripts work...
Not even one from this page, and I just copy/pasted the code into the right directories, formatted as Python scripts with the right names...
Let's take the ProjectX as an example (with the red box).
I do everything necessary, choose ProjectX as a control surface, open, close Ableton and nothing happened... so I checked the LOG file... and this is what's going on there:
"2019-08-02T15:07:17.686596: info: Python: INFO:_Framework.ControlSurface:686 - LOG: (ProjectX) Initializing...
2019-08-02T15:07:17.686623: info: RemoteScriptMessage: (ProjectX) Initializing...
2019-08-02T15:07:17.687309: info: Python: INFO:_Framework.ControlSurface:687 - LOG: (ProjectX) 02.08.2019 15:07:17--------------= ProjectX log opened =--------------
2019-08-02T15:07:17.687333: info: RemoteScriptMessage: (ProjectX) 02.08.2019 15:07:17--------------= ProjectX log opened =--------------
2019-08-02T15:07:17.688539: info: RemoteScriptError: Traceback (most recent call last):
2019-08-02T15:07:17.688589: info: RemoteScriptError: File "C:\ProgramData\Ableton\Live 10 Suite\Resources\MIDI Remote Scripts\ProjectX\__init__.py", line 5, in create_instance
2019-08-02T15:07:17.688665: info: RemoteScriptError:
2019-08-02T15:07:17.688708: info: RemoteScriptError: return ProjectX(c_instance)
2019-08-02T15:07:17.688747: info: RemoteScriptError:
2019-08-02T15:07:17.688803: info: RemoteScriptError: File "C:\ProgramData\Ableton\Live 10 Suite\Resources\MIDI Remote Scripts\ProjectX\ProjectX.py", line 45, in __init__
2019-08-02T15:07:17.688943: info: RemoteScriptError:
2019-08-02T15:07:17.689013: info: RemoteScriptError: self.set_suppress_rebuild_requests(True) # Turn off rebuild MIDI map until after we're done setting up
2019-08-02T15:07:17.689098: info: RemoteScriptError: AttributeError
2019-08-02T15:07:17.689140: info: RemoteScriptError: :
2019-08-02T15:07:17.689181: info: RemoteScriptError: 'ProjectX' object has no attribute 'set_suppress_rebuild_requests'
2019-08-02T15:07:17.689218: info: RemoteScriptError:
2019-08-02T15:07:17.689234: info: Exception: Script could not be loaded."
...so I disabled lines 45 and 57 in ProjectX.py to avoid this error and got this:
"2019-08-02T15:09:37.922274: info: Python: INFO:_Framework.ControlSurface:921 - LOG: (ProjectX) 02.08.2019 15:09:37--------------= ProjectX log opened =--------------
2019-08-02T15:09:37.922296: info: RemoteScriptMessage: (ProjectX) 02.08.2019 15:09:37--------------= ProjectX log opened =--------------
2019-08-02T15:09:37.926234: info: RemoteScriptError: Traceback (most recent call last):
2019-08-02T15:09:37.926288: info: RemoteScriptError: File "C:\ProgramData\Ableton\Live 10 Suite\Resources\MIDI Remote Scripts\ProjectX\__init__.py", line 5, in create_instance
2019-08-02T15:09:37.926371: info: RemoteScriptError:
2019-08-02T15:09:37.926414: info: RemoteScriptError: return ProjectX(c_instance)
2019-08-02T15:09:37.926453: info: RemoteScriptError:
2019-08-02T15:09:37.926511: info: RemoteScriptError: File "C:\ProgramData\Ableton\Live 10 Suite\Resources\MIDI Remote Scripts\ProjectX\ProjectX.py", line 46, in __init__
2019-08-02T15:09:37.926653: info: RemoteScriptError:
2019-08-02T15:09:37.926734: info: RemoteScriptError: self._setup_transport_control() # Run the transport setup part of the script
2019-08-02T15:09:37.926800: info: RemoteScriptError: File "C:\ProgramData\Ableton\Live 10 Suite\Resources\MIDI Remote Scripts\ProjectX\ProjectX.py", line 61, in _setup_transport_control
2019-08-02T15:09:37.926964: info: RemoteScriptError:
2019-08-02T15:09:37.927008: info: RemoteScriptError: transport = TransportComponent() #Instantiate a Transport Component
2019-08-02T15:09:37.927064: info: RemoteScriptError: File "c:\Jenkins\live\output\win_64_static\Release\python-bundle\MIDI Remote Scripts\_Framework\TransportComponent.py", line 29, in __init__
2019-08-02T15:09:37.927271: info: RemoteScriptError: File "c:\Jenkins\live\output\win_64_static\Release\python-bundle\MIDI Remote Scripts\_Framework\CompoundComponent.py", line 12, in __init__
2019-08-02T15:09:37.927450: info: RemoteScriptError: File "c:\Jenkins\live\output\win_64_static\Release\python-bundle\MIDI Remote Scripts\_Framework\Dependency.py", line 119, in wrapper
2019-08-02T15:09:37.927624: info: RemoteScriptError: File "c:\Jenkins\live\output\win_64_static\Release\python-bundle\MIDI Remote Scripts\_Framework\Dependency.py", line 59, in get_dependency_for
2019-08-02T15:09:37.927833: info: RemoteScriptError: _Framework.Dependency
2019-08-02T15:09:37.927875: info: RemoteScriptError: .
2019-08-02T15:09:37.927914: info: RemoteScriptError: DependencyError
2019-08-02T15:09:37.927952: info: RemoteScriptError: :
2019-08-02T15:09:37.927993: info: RemoteScriptError: Required dependency register_component not provided for <_Framework.TransportComponent.TransportComponent object at 0x00000000334D8358>
2019-08-02T15:09:37.928031: info: RemoteScriptError:
2019-08-02T15:09:37.928047: info: Exception: Script could not be loaded."
I have no idea how to make it work honestly... could someone, please, help me?
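For anyone else hitting these two errors in Live 9/10: set_suppress_rebuild_requests() was removed from the newer bundled _Framework, and the "register_component" dependency is normally provided by wrapping the component setup in component_guard(). A minimal sketch, assuming the stock _Framework that ships with Live 10, of how the __init__ might be reworked while keeping the script's own _setup_transport_control helper:

from _Framework.ControlSurface import ControlSurface
from _Framework.TransportComponent import TransportComponent

class ProjectX(ControlSurface):
    def __init__(self, c_instance):
        ControlSurface.__init__(self, c_instance)
        # component_guard() replaces the old set_suppress_rebuild_requests() calls
        # and supplies the register_component dependency named in the traceback above
        with self.component_guard():
            self._setup_transport_control()

    def _setup_transport_control(self):
        transport = TransportComponent()
        # ...assign play/stop/record buttons to the transport here...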
10 years later and still very helpful. Thank you so much!