Today is the opening of the newest piece created in collaboration with Luc Courchesne, as part of the Posture Platform project. The installation is a commission for the BMO Corporate Art Collection, and will be on display in Toronto until November 2011. The piece is titled You Are Here, and provides visitors with an immersive environment for exploring downtown Toronto. A 360-degree projection system (Panoscope) is used in conjunction with an iPhone, allowing users to fly around, watch immersive videos, and discover alternate spaces and realities. Visit the project website by clicking the image below:
Marker-based augmented reality is pretty fun to play with, and starting to gain popularity. In order to let everyone see what it’s all about and play with this technology, we’re creating an application called MixAR. The aim is to allow users to lay out, create, and record augmented reality scenes using their phones. We’re currently in the process of raising some capital and proving out some of the interaction concepts. See the project description on Kickstarter:
Along with artist and game designer Paul Warne, we have launched a new design and development studio in Montreal, called Hololabs. This partnership has the synergy required to translate esoteric research into engaging interactive experiences for a wider audience. With a focus on mixed reality and interaction design, Hololabs is deploying several new works over the year, starting with augmented reality applications on the iPhone.
The first development, produced for Montreal recording artist Empire ISIS, is in fact ready to try and can be found on the App Store. Stay tuned for more.
Audio Graffiti is a multi-user music installation that explores new modes of sonic interaction, afforded by the latest in locative technologies. Several mobile users may create and explore a gradually evolving collage of audio recordings, “stuck” to real walls in urban environments.
The piece can be deployed in an outdoor environment (using GPS tracking), or in an indoor space as seen in the video below. Equipped with a wireless headset and tracking device, participants can “tag” or “spray” sound onto the wall. We provide several small musical instruments, which can be used along with one’s voice, to add sounds to the collaborative musical mix. The installation is seeded with some pre-existing sonic material, which allows participants to synchronize rhythmically, and maintains cohesion over time. All user-contributed sounds slowly fade away, resulting in an ever-evolving musical piece.
As users move about, they also experience a changing sonic perspective of the localized sounds, based on their particular location. So users not only create audio content, but also participate actively in the encounter (remixing) of sonic material. Participants who are waiting their turn in the staging area may watch a real-time 3D visualization of the installation, which shows avatars of each player walking amongst virtual sound sources.
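The actual installation software is not described here, but the two mechanisms above — sounds that fade away over time, and a listener-dependent mix based on distance — can be sketched in a few lines. Everything below is illustrative: the class, function names, and falloff curve are assumptions, not the installation's real implementation.

```python
import math
import time

class SoundTag:
    """A sound 'sprayed' onto a wall position; it fades out over its lifetime."""
    def __init__(self, x, y, lifetime_s=600.0):
        self.x, self.y = x, y
        self.born = time.time()
        self.lifetime = lifetime_s

    def fade_gain(self, now):
        # Linear fade from 1.0 down to 0.0 as the tag ages.
        age = now - self.born
        return max(0.0, 1.0 - age / self.lifetime)

def listener_gain(tag, lx, ly, rolloff=5.0):
    """Distance-based attenuation: quieter the farther the listener stands."""
    d = math.hypot(tag.x - lx, tag.y - ly)
    return 1.0 / (1.0 + d / rolloff)

def mix_gain(tag, lx, ly, now=None):
    """Combined gain for one tag as heard by one listener."""
    now = time.time() if now is None else now
    return tag.fade_gain(now) * listener_gain(tag, lx, ly)
```

Summing `mix_gain`-weighted samples over all active tags gives each participant their own position-dependent mix, and expired tags simply contribute zero.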
This installation was filmed at the 12th Biennial Arts and Technology Symposium in the lobby of the Ammerman Center for Arts and Technology (Connecticut College, March 4-6, 2010).
Breaking the Ice was successfully launched for the Cultural Olympiad of the Vancouver 2010 Olympics. The installation combines telepresence with game-like 3D interaction, in order to foster a dialogue between Olympic visitors and the distant city of Montreal. Audiences in both locations are excited to ‘break the ice’ with their compatriots, and socialize about the Olympic Games.
The project was a collaborative effort of many groups at the Society for Arts and Technology [SAT], including industrial designers, production staff, and the PropulseART research team. At the heart of the technology lies the SPIN Framework, which maintains the distributed 3D state and renders the graphical interface. The Scenic software system was used for transmission of audio & video signals between both locations over the high-bandwidth CANARIE network.
I’ve started to work on a project called New Terrain of Apparition (NTA) with Luc Courchesne. The goal is to connect several hemispheric projection environments (called Panoscopes), so that users can meet and interact in a networked virtual environment. Several cameras are used to capture each user from various angles, allowing an accurate image to be displayed for each relative viewing angle in the virtual world. This means that even in an immersive 360-degree display, users can look each other in the eyes and have a real-time video chat. Audio and video transmission is handled by Scenic, and management of the virtual environment is handled by the SPIN Framework.
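The view-dependent video idea boils down to picking, for each viewer, the camera whose capture angle best matches that viewer's bearing relative to the captured user. As a minimal sketch (the function and camera layout are assumptions, not NTA's actual code), selection with correct 360-degree wrap-around looks like this:

```python
def pick_camera(camera_angles_deg, viewer_bearing_deg):
    """Return the index of the camera whose capture angle is closest to the
    viewer's bearing relative to the captured user, wrapping at 360 degrees."""
    def ang_diff(a, b):
        # Smallest absolute difference between two angles, in [0, 180].
        return abs((a - b + 180.0) % 360.0 - 180.0)
    return min(range(len(camera_angles_deg)),
               key=lambda i: ang_diff(camera_angles_deg[i], viewer_bearing_deg))

# e.g. four cameras placed at 90-degree intervals around a user:
cams = [0.0, 90.0, 180.0, 270.0]
print(pick_camera(cams, 350.0))  # wraps around to the camera at 0 degrees
```

With more cameras the angular steps shrink, so each viewer in the shared scene sees a video texture that approximates the correct perspective of the remote user.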
Below is a video of the first prototype, which will be exhibited as part of CODE Live 1 at the Vancouver Olympics:
We recently created an installation to demonstrate our component of the PropulseART project at the Society for Arts and Technology. The goal of the project at large is to connect remote concert venues with high-quality video and multi-channel audio. Several open source software components have been released under the name Scenic, which manage real-time transmission of audiovisual data. Our Clickable Space authoring suite allows for the 3D modelling of performance spaces so that users from multiple locations can share and interact with each other’s stages.
Below is a proof-of-concept video about how Clickable Space operates:
This Monday, May 11th, I will be presenting a course on “Art, GPS and mobility” as part of the SAT[Transform] educational series. Although the topics may vary according to audience feedback, below is an overview of the topics that may be presented:
> Inspiration for interactive mobile applications:
Intro to mixed & augmented reality
Examples of locative media projects (past & present)
> Geospatial data:
Existing viewers, rendering engines, and geotagged data
Location-based content delivery and interaction
Intro to OpenStreetMap, Google Earth, etc.
> Mapping & GIS:
Projections and representations (lat/long – UTM)
Geocaching: waypoints/tracks/routes and exchange formats
Open source software solutions
> User tracking:
About GPS receivers, technologies, and conversion between formats
Robustness with other inputs: accelerometers, compasses, cameras, fiducial markers, etc.
On March 6th 2009, we held a networked event where DJs and VJs in Montreal and Vancouver performed simultaneously. Moreover, audience members in each location were able to contribute material in real time with their cell phones using Raw Materials, software developed by Mike Wozniewski and Alexandre Quessy at the SAT. Below is a documentary video created by Mo Simpson that describes the event: