Thursday, February 4, 2010


The Sample Suit

A Multi-Modal Platform for Augmented Reality Gaming

This is the work I have done for my first project in my Computational Principles for Media Arts class. I'm posting it here because I may use it as the platform for my solo graduate student project for the class.

For this unit, I chose to explore the medium of sampling by designing a multi-modal augmented reality costume named the Sample Suit. The Sample Suit enables the wearer to mediate 5 of their senses while they interact with the world. The suit employs a single computational element, a head-mounted iPhone 3GS, to facilitate mediation of sight and sound via a digital sampling app (still in development). The senses of taste and smell are mediated by means of a palette of tastes and scents stored in various compartments of the suit. In the Sample Suit, the sense of touch is considered a special case: most if not all of the body's surface is covered by the costume, damping haptic engagement. Below I will discuss the suit's mechanisms and their relationship to the wearer's sensory perception in more detail. I will also describe an example scenario that uses Sample Suits as the primary interface in a multi-player augmented reality game called "Digital Plague".

The 5 Senses

Sight: The sense of sight is mediated by the iPhone directly. The wearer sees the outside world via the screen of a head-mounted iPhone. The image on the screen is a sampled stream of data coming directly from the iPhone's camera on the other side. The app delivering the video signal is capable of altering the sample rate of the video, the resolution of the image, and the depth of the color, as well as applying other digital graphic effects. The app can also provide heads-up display graphics that are rendered over the top of the video image. These overlay graphics can provide the wearer with GPS data as well as game-state information and score.
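To make the video sampling concrete, here is a rough sketch of the kind of degradation the video path would apply (written in Python with NumPy just to illustrate the idea; the real app is being written for the iPhone, and the function name degrade_frame and its default parameters are placeholders, not final design decisions):

import numpy as np

def degrade_frame(frame, block=8, bits=3):
    """Reduce the spatial resolution and color depth of one camera frame.

    frame: H x W x 3 uint8 array from the camera.
    block: size of the square of pixels collapsed into one sample.
    bits:  bits kept per color channel.
    """
    h, w, _ = frame.shape
    # Sample-and-hold: keep one pixel per block, then stretch it back out.
    sampled = frame[::block, ::block]
    lowres = np.repeat(np.repeat(sampled, block, axis=0), block, axis=1)[:h, :w]
    # Quantize each channel to 2**bits levels to reduce the color depth.
    step = 256 // (2 ** bits)
    return (lowres // step) * step

Raising block and lowering bits pushes the image toward the crude, "plagued" look described in the example scenario below.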

Sound: The sense of sound is also directly mediated by the iPhone. The wearer will have earbud headphones that stream audio recorded by the microphone in the iPhone. As with the video signal, the app will be able to control the sampling of the audio signal: sample frequency and bit depth will be adjustable through the app. To prevent other, unmediated audio from reaching the wearer, sound-deadening earphones will cover each of the wearer's ears entirely.
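A matching sketch for the audio path, again purely illustrative rather than the app's actual code: sample-frequency reduction is approximated with sample-and-hold, and bit depth with quantization. The name degrade_audio and the defaults are my own placeholders.

import numpy as np

def degrade_audio(samples, hold=4, bits=8):
    """Crush an audio buffer by holding samples and quantizing them.

    samples: 1-D float array in [-1.0, 1.0] from the microphone.
    hold:    number of samples each kept value is held for
             (an effective sample-rate reduction by that factor).
    bits:    bit depth of the quantized output.
    """
    held = np.repeat(samples[::hold], hold)[:len(samples)]
    levels = 2 ** (bits - 1)
    return np.round(held * levels) / levels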

Taste: The sense of taste is mediated by an array of representative flavor pockets designed into a tasteful vest. The flavor pockets are arranged in a two-dimensional array across the front of the wearer's vest. Each taste pocket contains a uniquely formulated flavor. Considerations have been made to prevent spoilage of the sample flavors, and the spectrum of tastes has been concentrated in the sweet range. Jelly beans have been selected for their unique combination of properties, including their small, discrete size, large variety of flavors, ease of acquisition, and affordability. To experience a bite of flavor, the wearer simply reaches down into a specific pocket and picks up an individual jelly bean. The wearer then places the jelly bean in their mouth and chews.

Smell: The sense of smell is mediated by a string of test tubes worn around the wearer's neck. Each test tube contains spices and oils with strong and varied scents. The set of scents is a combination of kitchen spices and perfume oils. To access a particular scent, the wearer counts down from the initial tube and locates the sample to smell. The wearer then uncorks the tube, lifts it to their nose, and sniffs.

Touch: The sense of touch is mediated by the fabric of the costume itself. Most of the body is covered by the fabric of the costume, and the hands are covered with gloves. The fabric and gloves provide a layer of insulation over the skin's sensitive surface. While the wearer interacts with the world, the suit acts to dampen haptic signals, effectively filtering the touch the wearer receives.

Example Scenario:

"Digital Plague" is a prototype game scenario based on the Sample Suit Augmented Reality Gaming Platform.

"Digital Plague" is a multi-player augmented reality game designed to be played in public spaces filled with other non playing people. The object of the game is to locate other players in the crowd using GPS and vocal messaging. Finding other players is made difficult by mediation of all of the player’s senses. The visual and audio signals are seriously degraded by the digital plague so all the player sees of the world around them is low res textures and all the player hears is noisy sounds. Overlaid atop the low res visual is rendered a GPS map of other players in the game. Once you log in to the game, your GPS coordinates are broadcast to other players. Overlaid over the audio noise players can project their voices. The amplitude of the voices is scaled based on the distances between players. When two players eventually meet up, they remove their gloves and shake hands establishing authentic human contact in a world lain to waste by the digital plague. Taste and smell queues are used to express game states and send messages between players at a distance and to celebrate connection and build rapport between characters in contact.

While the entire Sample Suit is recommended for immersive game play, the "Digital Plague" app and headphones are the minimum requirements to play. The other components of the Sample Suit will be presented to players as "do it yourself" (DIY) peripherals that conform to a standard addressing scheme. A website forum will collect example implementations in the spirit of open-source development.
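The standard addressing scheme has not been specified yet. One way it could look, entirely hypothetical, with the flavor and scent positions taken from the appendices below, is a modality/bank/index triple that game messages use to point the wearer at a physical sample:

# Hypothetical addressing scheme for the DIY peripherals: a game message
# names a modality, a bank on the suit, and a numbered slot in that bank.
SAMPLE_ADDRESSES = {
    ("taste", "left", 3): "Coconut",    # 3rd pocket on the left side of the vest
    ("taste", "right", 1): "Cinnamon",  # 1st pocket on the right side of the vest
    ("smell", "tube", 3): "Coffee",     # 3rd test tube from the start of the string
}

def resolve_cue(modality, bank, index):
    """Map an in-game cue to the physical sample the wearer should reach for."""
    return SAMPLE_ADDRESSES.get((modality, bank, index), "unknown sample")

# Example: a "celebrate connection" message might send ("taste", "right", 1),
# telling both players to eat a cinnamon jelly bean.
print(resolve_cue("taste", "right", 1))  # -> Cinnamon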

Appendix 1:
Flavors in the tasteful vest
Left
1. Lemon
2. Toasted Marshmallow
3. Coconut
4. Orange Sherbet
5. Blueberry
6. Watermelon
7. Sour Pear
8. Lemon Lime
9. Green Apple
10. Pina Colada
11. Sour Blueberry

Right
1. Cinnamon
2. Orange
3. Bubblegum
4. Peach
5. Strawberry Cheesecake
6. Cherry
7. Cotton Candy
8. Sour Raspberry
9. Sour Strawberry
10. Licorice
11. Juicy Pear
12. Sour Watermelon

Appendix 2:
Scents in scent belt
1. Cumin
2. Brownie
3. Coffee
4. Apple Cider Vinegar
5. Pepto-Bismol
6. Oregano
7. Honey
8. Rubbing Alcohol
9. Black Pepper
10. Cinnamon
11. Robitussin
12. Garlic
13. Horseradish
14. Peanut Butter
15.
16.
17.
18.
19.
20.
21.
22.
23.

Appendix 3:
iPhone Development

The Sample Suit is built around a single iPhone 3GS and a customized application called Rez Goggles. The Rez Goggles app is designed to adjust the resolution, sample rate, and bit depth of the incoming video signal and display the result on the iPhone's screen. The Rez Goggles app also supplies a component that adjusts the sample rate and bit depth of the incoming audio signal before delivering it to the headphones.

Currently, the Rez Goggles app is still in development. I have acquired an iPhone developer's license, created a provisioning profile on the phone, and built a sample camera app that works with still images. Being new to iPhone development, I have had trouble accessing the camera for video, among other things.

So as of this release, the Rez Goggles app is not ready. This is a project I intend to complete, as the example game scenario is one I am working on for my independent project in my Human Computer Interface class with Win Burleson.
