[chimera-dev] Multi-touch and Chimera
goddard at cgl.ucsf.edu
Tue Apr 8 12:39:58 PDT 2008
Chimera is written in Python and C++. The C++ is used for OpenGL
rendering and for optimizing calculations. The event loop and user
interface are entirely in Python. The touch events and the actions they
trigger should definitely be in Python. It looks quite reasonable to use
Boost.Python to wrap the C++ Touchlib library, or parts of it, to make it
accessible to Python.
We use an in-house wrapping program (WrapPy). Wrapping C/C++ for Python
is often done with a package called SWIG.
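For illustration, here is a rough sketch of what the Python side of a wrapped touch library could look like. All names here (TouchScreen, register_handler, finger_down, and so on) are hypothetical stand-ins, not Touchlib's actual API; a pure-Python stub plays the role of the wrapped C++ class.

```python
# Hypothetical sketch of a wrapped touch interface as seen from Python.
# None of these names come from the real Touchlib API.

class TouchScreen:
    """Stub standing in for a Boost.Python-wrapped C++ touch surface."""

    def __init__(self):
        self._handlers = {}  # event name -> list of Python callables

    def register_handler(self, event_name, handler):
        """Register a Python callable for a named touch event."""
        self._handlers.setdefault(event_name, []).append(handler)

    def dispatch(self, event_name, x, y):
        # In a real wrapping, the C++ library would invoke this when
        # a touch occurs; here we call it by hand to demonstrate.
        for handler in self._handlers.get(event_name, []):
            handler(x, y)

screen = TouchScreen()
log = []
screen.register_handler("finger_down", lambda x, y: log.append(("down", x, y)))
screen.register_handler("finger_up", lambda x, y: log.append(("up", x, y)))

screen.dispatch("finger_down", 100, 200)
screen.dispatch("finger_up", 100, 200)
print(log)  # [('down', 100, 200), ('up', 100, 200)]
```

The point of the sketch is only the shape of the interface: C++ delivers events, Python-registered handlers consume them.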
We use the Tcl/Tk event loop via the Python/Tk interface module called
Tkinter. In principle you can put custom Tk event types, such as touch
events, into the event queue, but I haven't figured out how to do it.
The trouble is that you typically receive the event from a callback in
C/C++ code. I recently tried passing such an event (from a Space
Navigator 6-dof input device) to a Python routine. It resulted in
crashes because Tkinter is designed so that only it can call Python
code. Tkinter has some complicated thread-locking code that releases a
Python lock; my event handler called into Python without holding that
lock, attempted a Tk call, and crashed. The Python lock can probably
be acquired in the C/C++ callback, but if that Python code then makes
Tk calls to add an event, there may be further problems with Tkinter's
own Tk lock, which is inaccessible. So some work is needed to figure
out how to get the C/C++ callback events into the Tk event loop. Once
that is solved, everything can be done in Python, with event handlers
processing those Tk events to trigger any desired Chimera actions.
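One common workaround for this kind of locking problem (a sketch, not something Chimera currently does) is to have the C/C++ callback do nothing but deposit the event into a thread-safe queue, and let the Tk main loop drain that queue periodically, so that all Python-level handling and all Tk calls happen on Tkinter's own thread. The event tuple and function names below are illustrative:

```python
import queue
import threading

# Thread-safe queue bridging the device-callback thread and the Tk main loop.
events = queue.Queue()

def device_callback(event):
    # Called from the driver/touch-library thread: do NOT touch Tk or
    # other Python GUI state here, just enqueue (Queue.put is thread-safe).
    events.put(event)

def poll_events(handler):
    # In real code this would be rescheduled with root.after(50, ...) so it
    # runs inside the Tk event loop; here we simply drain the queue once.
    while True:
        try:
            event = events.get_nowait()
        except queue.Empty:
            break
        handler(event)

# Simulate a callback arriving from another thread.
t = threading.Thread(target=device_callback, args=(("finger_down", 120, 45),))
t.start()
t.join()

received = []
poll_events(received.append)
print(received)  # [('finger_down', 120, 45)]
```

This sidesteps the lock issue entirely, since the foreign thread never calls into Tk or into Python GUI code directly; whether it keeps latency low enough for interactive touch input would need testing.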
Gergely Nagy wrote:
> Hi Tom,
> Thanks for your reply.
> Our multi-touch interface won't work with touch-pads, since it is
> based on a different technology.
> The user will need to touch the actual object on the screen, since
> there won't be any pointer. The interaction is direct between the
> user's finger and the program's objects. So basically, when the user
> touches a menu item, for example, there is a sort of mouse-click
> event at the given coordinates on the screen.
> Talking about events, there is a library written in C++ called
> Touchlib (http://nuigroup.com/touchlib/) that passes touch events
> to any program written in C++, and this is what we would like to use.
> Of course, in this case we will need to wrap that code to be able to
> use it in Python. I thought of using Boost
> (http://www.boost.org/users/index.html) for that.
> I was asking you about the sources because in the end we should have
> something compiled and working. It doesn't matter if it's not the
> latest version; what matters is that we will be able to
> demonstrate the interface working. Would that be possible?
> Concerning the atom-atom distance: the user would zoom in on the part
> he is interested in, then choose the appropriate display
> form, switch from moving/zooming mode to selection
> mode, and touch the atoms he would like to know the distance
> between. So you are right, visual feedback would be necessary, just as
> with a mouse click.
> So to sum up the interaction events, there would be something like
> finger_down, finger_up, and dragging. These events are dispatched
> by Touchlib, so we would need to handle them in Chimera.
> What do you think about it?
> Thanks a lot for your help!
> 2008/4/8, Tom Goddard <goddard at cgl.ucsf.edu
> <mailto:goddard at cgl.ucsf.edu>>:
> Hi Greg,
> Sounds neat. Will a multi-touch interface work with a standard
> laptop touch-pad?
> The source code for Chimera is on the web, though it isn't up to date.
> All of the functions you are interested in are implemented in
> Python code which is included in the distributed Chimera. The
> Python code is in files (with ".py" suffix) in directories under
> the Chimera installation, or on Mac under Chimera.app/Contents/Resources/share.
> But it is quite difficult to find the Chimera Python routines you
> need. Our programming reference guide (automatically generated
> from code comments) is not too useful but we provide programming
> examples that might help.
> The easiest approach is to ask us by email for the routines you
> need, e.g. list all chain identifiers, select chain given id,
> color equivalent to actions menu, move models, rotate models, show
> atom-atom distance, find atom under mouse, draw label.... Eric
> Pettersen is the most familiar with that code and what simple
> routines already exist to do those things. He is on vacation now
> but expected back around Thursday.
> You are interested in atom-atom distances. Will the user choose
> an atom through the touch interface? It seems that would require
> visual feedback in the Chimera window -- equivalent of the mouse
> pointer. I suppose you could take over the mouse pointer, or you
> could create a new "cursor" implemented perhaps as a Chimera 2d label.