The SenseWorld DataNetwork makes the sharing of data between collaborators easier, faster, and more flexible.
In order to make the data from the wireless sensor nodes available to several collaborators on a project simultaneously, we developed the SenseWorld DataNetwork. It is intended to facilitate the creation, rehearsal, and performance of collaborative interactive media art works by making it easy, fast, and flexible to share data, whether from sensors or from internal processes, between collaborators. Our aim is to support multiple media practices and allow different practitioners to use the software to which they are accustomed. The framework is intended to support coordinated collaboration with real-time data and multiple media types within a live interactive performance context.
The final design criteria were to:
- Integrate tightly with the wireless sensing platform.
- Allow reception of data from any node by any client (subscription).
- Allow transmission of data to any node by any client (publication).
- Restore network and node configuration quickly.
- Be usable within heterogeneous media software environments.
- Enable collaboration between heterogeneous design practices.
- Enable efficient collaboration within the limited timeframe of rehearsals.
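The subscription and publication criteria above amount to a many-to-many routing model: any client can receive data from, or send data to, any node. A minimal sketch of that model is shown below; all names (`DataHub`, `subscribe`, `publish`) are illustrative and do not reflect the framework's actual API.

```python
from collections import defaultdict
from typing import Callable, DefaultDict, List, Tuple

class DataHub:
    """Routes data published for a node to every client subscribed to it."""

    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Callable]] = defaultdict(list)

    def subscribe(self, node: str, callback: Callable) -> None:
        # Any client may receive data from any node (subscription).
        self._subscribers[node].append(callback)

    def publish(self, node: str, values: Tuple[float, ...]) -> None:
        # Any client may send data to any node (publication).
        for callback in self._subscribers[node]:
            callback(node, values)

# A client subscribes to a (hypothetical) sensor node and receives its data.
hub = DataHub()
received = []
hub.subscribe("/accelerometer/1", lambda node, v: received.append((node, v)))
hub.publish("/accelerometer/1", (0.1, 0.2, 0.3))
```

In the actual framework this routing happens over the network between separate programs, but the data flow between subscribers and publishers follows the same pattern.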
The framework’s core is implemented in SuperCollider, and clients currently exist for SuperCollider, Max/MSP, Processing, PureData, and C++.
The framework defines an OSC namespace for exchanging data, as well as methods for managing the connection between host and client over UDP, working around the idiosyncrasies of the OSC implementations in many common programs for interactive work.