Multitouch is going to be the next big thing. Natural User Interfaces (NUI) are going to change the way we use client applications soon. You think it's not true? Look at the iPhone hype. Feature-wise, it is not a very good phone; it lacks very basic features. But being able to control your phone with multiple fingers was a real killer feature.
More recently, Microsoft released the Surface computing device, which also uses multitouch: multiple users can control the device with multiple fingers at the same time. Built in Windows Presentation Foundation, these applications get input from the Surface input API, and transforms are applied to the elements on the screen according to the finger movements (or gestures). Mapping finger movements to mathematical transforms can be complex and tedious.
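To give a feel for the kind of math involved, here is a minimal sketch (my own illustration, not IdentityMine's API) of the simplest case: deriving a uniform scale and translation from a two-finger pinch, so that the content stays under the fingers as they move.

```python
import math

def pinch_transform(p1_before, p2_before, p1_after, p2_after):
    """Derive a uniform scale and translation from two fingers moving.

    Hypothetical helper for illustration: points are (x, y) tuples.
    Returns (scale, (tx, ty)) such that a screen point p maps to
    p * scale + translation.
    """
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def midpoint(a, b):
        return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

    # Scale factor: how much the distance between the fingers changed.
    scale = dist(p1_after, p2_after) / dist(p1_before, p2_before)

    mx_before, my_before = midpoint(p1_before, p2_before)
    mx_after, my_after = midpoint(p1_after, p2_after)

    # Translation keeps the pinch midpoint anchored under the fingers
    # after the scale is applied.
    tx = mx_after - mx_before * scale
    ty = my_after - my_before * scale
    return scale, (tx, ty)

# Fingers spread from 100px apart to 200px apart around (150, 100):
scale, (tx, ty) = pinch_transform((100, 100), (200, 100),
                                  (50, 100), (250, 100))
print(scale)  # 2.0
```

And this is only the easy case: once you add rotation, multiple simultaneous users, and inertia, it becomes clear why a higher-level gesture API is so welcome.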
Similarly, Windows 7 (the next operating system developed by Microsoft, currently in beta) supports multitouch input natively. This opens the door to a new generation of devices (cheaper than a Surface, for example the HP TouchSmart) that will bring multitouch computing to the masses, and not just on phones.
This just calls for a higher-level API that makes it easy to integrate complex gestures into an application. Like every technology from IdentityMine, this one was built with the developer-designer workflow in mind, and it will facilitate the development of rich, natural user interfaces.
The whole story is available here. We are all very excited about this major step, and personally, I cannot wait to develop with this package. Congratulations to the team, who did amazing work on it!