We need to change the way vertices are generated for windows. We currently have an addWindowGeometry screen function that takes care of this. It can be wrapped by plugins, and it can generate any vertices and texture coordinates it wants. I'd like to change it so that the region for which vertices and texture coordinates are generated is specified elsewhere and is not constrained to the dimensions of the texture. The reason for this is that when adding support for input transformations we need to be able to transform not only coordinates within the visible part of the window, but any coordinate that can be reported in events generated with respect to the window. Both points are sketched below.

Another thing we need to change is the way animations are initiated and performed. Plugins currently hijack the addWindowGeometry function whenever they see fit, without the code that initiated the animation knowing anything about it. A better model would be to have the code that can cause an animation to happen also be the code that initiates it. E.g. instead of having the wobbly plugin deform the window geometry while the move plugin is moving a window, the move plugin should create an animation object and be responsible for making the animation advance (see the last sketch below). - David
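
A minimal sketch of what the decoupled hook could look like. The type and parameter names here are hypothetical, not the actual Compiz API; the point is only that the caller passes in the region to tessellate instead of the function deriving it from the texture:

    #include <stdlib.h>

    /* Hypothetical types; the real Compiz structures differ. */
    typedef struct { int x, y, width, height; } BoxRec;
    typedef struct { BoxRec *rects; int nRects; } RegionRec;

    typedef struct {
        float *vertices;   /* 2 floats per vertex */
        float *texCoords;  /* 2 floats per vertex */
        int   nVertices;
    } GeometryRec;

    /* The caller specifies the region to tessellate; texture
       coordinates follow from window-relative positions, so a region
       larger than the texture simply yields coordinates outside
       [0, 1]. */
    static void
    addWindowGeometry (const RegionRec *region,
                       int             texWidth,
                       int             texHeight,
                       GeometryRec     *geom)
    {
        int i, v = 0;

        geom->nVertices = region->nRects * 4;
        geom->vertices  = malloc (sizeof (float) * 2 * geom->nVertices);
        geom->texCoords = malloc (sizeof (float) * 2 * geom->nVertices);

        for (i = 0; i < region->nRects; i++)
        {
            const BoxRec *r = &region->rects[i];
            int          c;

            for (c = 0; c < 4; c++, v++)  /* one quad per rectangle */
            {
                float x = (c == 1 || c == 2) ?
                    (float) (r->x + r->width)  : (float) r->x;
                float y = (c >= 2) ?
                    (float) (r->y + r->height) : (float) r->y;

                geom->vertices[v * 2 + 0]  = x;
                geom->vertices[v * 2 + 1]  = y;
                geom->texCoords[v * 2 + 0] = x / texWidth;
                geom->texCoords[v * 2 + 1] = y / texHeight;
            }
        }
    }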
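
With the region decoupled from the texture, an input transformation reduces to mapping arbitrary window-relative points. A sketch, again with hypothetical names, assuming the transform is a 2x3 affine matrix:

    /* Applies an affine transform to an arbitrary window-relative
       point; nothing restricts (x, y) to the visible texture area, so
       the same mapping works for any event coordinate. */
    typedef struct { float m[6]; } WindowTransform;

    static void
    transformPoint (const WindowTransform *t,
                    float x, float y,
                    float *xOut, float *yOut)
    {
        *xOut = t->m[0] * x + t->m[1] * y + t->m[2];
        *yOut = t->m[3] * x + t->m[4] * y + t->m[5];
    }

Coordinates reported in events with respect to the window would pass through this mapping (or its inverse) whether or not they fall inside the visible part.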
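
For the animation change, a sketch of the proposed ownership model, under the same caveat that every name is illustrative: the plugin that causes the change creates an animation object and drives it each frame, rather than another plugin hooking addWindowGeometry behind its back.

    #include <stdio.h>

    /* Hypothetical animation interface owned by the initiating code. */
    typedef struct _Animation Animation;
    struct _Animation {
        void (*advance) (Animation *anim, int msElapsed);
        int  (*done)    (Animation *anim);
    };

    /* A trivial example animation: linear progress over a fixed
       duration. A wobbly-style deformation would implement the same
       interface. */
    typedef struct {
        Animation base;
        int       elapsed, duration;
    } LinearAnim;

    static void
    linearAdvance (Animation *anim, int ms)
    {
        LinearAnim *l = (LinearAnim *) anim;

        l->elapsed += ms;
        printf ("progress: %.2f\n",
                l->elapsed < l->duration ?
                (float) l->elapsed / l->duration : 1.0f);
    }

    static int
    linearDone (Animation *anim)
    {
        LinearAnim *l = (LinearAnim *) anim;

        return l->elapsed >= l->duration;
    }

    int
    main (void)
    {
        /* The "move plugin" role: it created the animation, so it
           advances it once per frame until it completes. */
        LinearAnim a = { { linearAdvance, linearDone }, 0, 250 };

        while (!a.base.done (&a.base))
            a.base.advance (&a.base, 50);

        return 0;
    }

With this shape the move plugin knows exactly which animation is running on the window it is moving, and can cancel or finish it when the move ends.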