Tooltips are an incredibly useful interface paradigm for learning an application. They are the mapping between a visual control and the application-specific action associated with it.
To paraphrase Einstein, a touch should be as simple as it needs to be, but no simpler.
The underlying problem here is not touch, but state. What state is the object in when you touch it? Does touch change its state? How is the change in state reflected to the user?
From the design perspective, trying to encompass all actions in a single touch will work only in the simplest cases. For any more useful application, a first touch may change state, and that state can be reflected by changes in an object's image, by tooltips (even transient tooltips that go away after a set time), or by other means. Touching a selected object should have a different meaning than touching a non-selected object, and this needs to be a discoverable aspect of the interface.
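As a rough sketch of that two-stage pattern (the names, the event wiring, and the timeout below are illustrative assumptions, not anything prescribed above), a first touch could mark an object as selected and show a transient tooltip describing its action, while a second touch on the already-selected object actually performs the action:

    // Minimal sketch: first touch selects and hints, second touch acts.
    type TouchableObject = {
      element: HTMLElement;
      selected: boolean;
      action: () => void;   // the application-specific action
      hint: string;         // tooltip text describing that action
    };

    const TOOLTIP_MS = 2000; // how long the transient tooltip stays visible

    function showTransientTooltip(el: HTMLElement, text: string): void {
      const tip = document.createElement("div");
      tip.className = "tooltip";
      tip.textContent = text;
      el.appendChild(tip);
      setTimeout(() => tip.remove(), TOOLTIP_MS); // goes away after a set time
    }

    function onTouch(obj: TouchableObject): void {
      if (!obj.selected) {
        // First touch: change state and reflect the change to the user.
        obj.selected = true;
        obj.element.classList.add("selected");       // change the object's image
        showTransientTooltip(obj.element, obj.hint);  // and/or show a transient hint
      } else {
        // Touching an already-selected object means something different:
        // perform the associated action, then clear the selection.
        obj.action();
        obj.selected = false;
        obj.element.classList.remove("selected");
      }
    }

In a real interface the selected state would also need to be cleared by touching elsewhere, so that the two meanings of a touch stay discoverable rather than trapping the user in the selected state.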