Last Saturday I went to Bologna with two colleagues (@robb_casanova as frontend artisan/dev and @emme_giii as mobile UX guru), and together we participated in a contest called HackReality. http://www.whymca.org/evento/whymca-hack-reality-bologna-04-02-2012
Our goal was to mix native and web-based technologies to turn a mobile phone into a brush, and a wall into a virtual canvas where a user could draw multi-colored strokes with their Android phone.
We used some NFC tags stuck on a paper palette:
We used my Galaxy Nexus as a "brush": by tapping the phone on an NFC tag on the palette, we changed the paint color. We then used the accelerometer to measure accelerations on the X and Y axes.
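The tag-to-color step boils down to a lookup from the NFC tag's UID (which Android hands you when a tag is tapped) to a paint color. A minimal sketch of that idea, with made-up UIDs and class names (not the actual ones from our project):

```java
import java.util.HashMap;
import java.util.Map;

public class Palette {
    // Hypothetical mapping from NFC tag UIDs (as hex strings) to ARGB colors.
    private static final Map<String, Integer> TAG_COLORS = new HashMap<>();
    static {
        TAG_COLORS.put("04a224b2c61e80", 0xFFFF0000); // red
        TAG_COLORS.put("04b511c2d72f81", 0xFF00FF00); // green
        TAG_COLORS.put("04c633d2e83a82", 0xFF0000FF); // blue
    }

    private static final int DEFAULT_COLOR = 0xFF000000; // black

    // Return the paint color for a tag, or black if the tag is unknown.
    public static int colorForTag(String tagUid) {
        Integer color = TAG_COLORS.get(tagUid);
        return color != null ? color : DEFAULT_COLOR;
    }
}
```

On Android the UID would come from the `NdefMessage`/`Tag` delivered in the NFC intent; here it is just a plain string so the lookup logic stands on its own.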
The data were sent to a server running socket.io on top of Node.js. Finally, we implemented a canvas and used the Processing language to perform the actual drawing.
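On the drawing side, each incoming accelerometer sample nudges the brush position, and the sketch draws a line segment from the previous position to the new one. A rough sketch of that update step in plain Java (the scale factor and clamping are illustrative assumptions, not our actual tuning):

```java
public class Brush {
    // Illustrative scale factor from raw acceleration units to canvas pixels.
    private static final double SCALE = 4.0;

    private double x, y;
    private final double width, height;

    public Brush(double startX, double startY, double width, double height) {
        this.x = startX;
        this.y = startY;
        this.width = width;
        this.height = height;
    }

    // Move the brush by the scaled X/Y accelerations, clamped to the canvas.
    public void move(double ax, double ay) {
        x = clamp(x + ax * SCALE, 0, width);
        y = clamp(y + ay * SCALE, 0, height);
    }

    private static double clamp(double v, double lo, double hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    public double getX() { return x; }
    public double getY() { return y; }
}
```

In a Processing sketch, each socket.io message would trigger `move(ax, ay)` followed by a `line()` call from the old position to the new one, in the color chosen on the palette.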
The source code for the client-side part of the project is available on GitHub: https://github.com/nalitzis/hackday_client
If I have time, I will explain how the NFC part works in future posts.