Wednesday, February 10, 2010

Noobie Blogging Blues

So this is my first blog and I feel like I should be putting more down beyond the classwork and my projects... More like incidental ideas, experiences and epiphanies... Most of that stuff makes it into my paper journals so here I'll attempt to render some of that work.

Make games to play while crying or sad... catharsis games

HCI T-shirt ideas:
"How many passwords does it take you to get thru 1 day?"
"How many passwords do you have?"
"What's your password?"

Slow Computing idea
Slow mail: rebrand snail mail... Make an email site that simulates the delay inherent in ground-based mail delivery... make it take 3+ days to get there... alert the receiver that it's coming... visualize the route it takes from server to server, but stretch the route over the target time period. You can watch its progress like a UPS or FedEx delivery. Make the email come with a visual envelope that you can save with the email... enable visual attachments as well as envelope decorations
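For fun, here's a minimal Python sketch of how the stretched route could be scheduled; the hop names and the 3-day target are made up for illustration, not a real mail path:

```python
from datetime import datetime, timedelta

def schedule_hops(sent_at, hops, total_days=3):
    # Spread the hop arrivals evenly across the target delay, so the
    # message can be "tracked" from server to server like a parcel.
    step = timedelta(days=total_days) / len(hops)
    return [(hop, sent_at + step * (i + 1)) for i, hop in enumerate(hops)]

# Hypothetical route; a real system would pull these from mail headers.
route = ["mail.example.org", "relay-west.example.net",
         "relay-east.example.net", "mx.friend.example.com"]
plan = schedule_hops(datetime(2010, 2, 10, 9, 0), route)
```

The tracking page would just compare the current time against each hop's scheduled arrival to decide how far along the route to draw the envelope.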

Data Loss
Lost a hard drive the other day full of movies... some backups of work but mostly movies and TV shows... A severe dent in the archive, maybe 800 gigs, but you can't really cry over losing someone else's data.

Digitization is Destruction

Decompressing after Wednesday's class

So our group met again on Monday. Most of us brought ideas and we spent a lot of time just listing them... I tried to limit each explanation to 5 minutes but with 9 ideas we took up the better part of an hour. The 9 ideas were as follows:

Dish Washing Game

Name That Tune

Double Bass Anti-Hero

Participation TV

Food Pantry Helper

Art Thief

Art Collector

Real Guitar Hero

Art Space Ship

During our first in-class ideation and sketch presentation, we spent way too much time on ideation, a small amount on selection, and very little time on implementation... I was hoping to change this with time limits on ideation, but I think we still fell short in the selection department. I asked everyone for their 2 favorite ideas and it came out:

3 votes for Art Collector, 3 for Participation TV, and 2 for the Food Pantry Helper... We then decided that Participation TV had more excitement behind it.

In class tonight the first presentation described formal selection criteria, and I thought that was something we were definitely missing. This became obvious to me because when we selected Participation TV, I became the de facto domain expert. It was not until Tuesday night that I posted my take on the project, and when I did it was amidst other emails full of opting-out phrases like:

"my friend who is also in HCI that we need not show a demo tomorrow. That is not required" and

"I think we are confused about the expected group participation for tomorrow night"

I really should have posted my take Monday night... I feel like we lost significant buy-in from the stakeholders between Monday and Tuesday. I spent Wednesday before class building props, getting the cameras and software talking, then running it all into school. Everything was built between 8 and 10 am and 1 and 4 pm Wednesday before class. It was definitely not feeling much like a group project.

We met briefly before class, figured out roles in the sketch, and headed into class. I was sure other groups were going to present something substantial and knew we had to as well. After class we decided to do an additional Friday meeting to correct what happened this week. We discussed time commitments to the class... Grad students agreed on 10 hrs/week and our undergrad agreed to do 5. I hope I'm not blowing our cover with this candid report. It's in the spirit of finding the means to tap all of our potential to the max. On Friday we're going to revisit our selection methodology and try to clarify our project choice with more formalized criteria. It's not that I'm unhappy with Participation TV; it's just that it's more important to get our engine firing on all cylinders.

Participation TV

We selected the Participation TV project for our Wednesday presentation. Here's my take that I posted to the group on Tuesday. Also to the right is one of our cardboard cameras.

We can use the projector to display a theoretical performance on stage... I suggest we have 2 cardboard cameras on tripods with one of them actually wired with laptop and camera. We should also have a cardboard mixing station.

Here's an example of how the sketch could run.

Short explanation before sketch:
We have designed a participant-based video documentation system for use in live performance venues, which gives audience members the tools and ability to document the show they're seeing in an immediate and collective way, for streaming live or archiving for posterity.

Scene 1:
Team 8 announces an innovation in audience participation and event documentation coming to a performance space near you. Introducing Participation TV. Let's see how it works.

Cut to scene 2:
(participant 1 approaches a camera chair while looking up at the performance on the screen)
participant 1:
Wow look I can sit at this cool camera chair and film the show while I watch! It's got effects and everything! I can even enter my name and it automatically goes into the credits... This is gonna be awesome!

Cut to scene 3:
(participant 1 and participant 2 are both in camera chairs filming the action on the screen... definitely enjoying themselves)
With Participation TV, your performances get live action archived by the volunteers right out of your audience. Participation TV makes documentation easy and fun. But wait there's more:

Cut to scene 4:
(participant 3 walks up to the mixing deck in the middle row, participants 1 and 2 are still busy filming away)
participant 3:
Wow I get to be the mixer tonight. This show is awesome and I'm gonna mix it up right. I think it will be cool to cut those hi-rez chair cams with all of the low-rez smartphone streams coming over bluetooth. Talk about audience participation!

Cut to scene 5:
(participants 1 and 2 keep filming and participant 3 is live mixing while the show continues on)
With Participation TV you can archive all live signals separately as well as stream the live mix out to other monitors or live over the internet. Participation TV helps you help your audience capture your events.

cut to scene 6:
(viewer at home watches mixed signal live on their home computer)
What a great show! It almost feels like I'm really there. What will Team 8 think of next?

Suggested to-do list:
create storyboards (cover at least the 6 scenes mentioned above)
construct cardboard cameras (coordinate with whoever's doing the camera screen shot)
construct cardboard mixer (coordinate with whoever's doing the mixer screen shots)
create screen shots:
camera user interface (login screen, live shot, user controls and effects); think netbook or laptop screen
mixer interface screen: maybe 3 or 4 iPads networked together into a touch screen interface; could simply be another laptop or computer with multiple screens
end user at home interface (laptop screen): could show the mixed signal with the independent signals selectable (could be shown on the laptop in the sketch)
create performance video for the screen (we have some footage)
write up the sketch (document the technology): include shots of all of the props, shots of the sketch, storyboards, preparation materials for the sketch, screen shots, team member photos and bios... Team 8 logo... be elaborate (this piece can be delivered after the sketch is done via our journals)

Who can do what:
I would like to build the cameras; I can use a laptop and webcam.

Example images from the venue and video
I've attached some pics; sorry they're not great.
So far, documentation has not been our strong suit.

Here are some videos as well.

Friday, February 5, 2010

Week 3 Presentation

Here's a slideshow I made for our group's presentation for week 3. This was our first deliverable since the in-class sketching experiment, and it was a bit rough around the edges. Paul seems to be coasting in towards graduation with a confidence that defies his potential engagement. Don has been overly worried about the form of our output, trying to get a formal wrapper around the whole thing so it could be presented in a rational, formal context. I rebelled against his slide show, feeling like we hadn't decided a whole lot about the meaning of our collaboration and that a looser approach seemed intuitive. I was compelled to describe our experience of the process as opposed to conclusions we haven't really reached yet... I feel like we're just getting together, the lights turned on and the rapport established... sure, we're brushing around ideas, but it feels more like general directions than scenarios. This next project has us each coming to the meeting with 3 scenarios, like we tried to do in class during the first sketch. Hopefully we can move on to idea selection and get to designing the experience during this meeting, giving us a couple days to implement some good features into the sketch... I've already got a head-mounted display. I feel like my class combination this semester is really good.

Thursday, February 4, 2010

The Sample Suit

A Multi-Modal Platform for Augmented Reality Gaming

This is the work I have done for my first project in my Computational Principles for Media Arts class. I'm posting it here because I may use it as the platform for my solo Graduate Student project for the class.

For this unit, I chose to explore the medium of sampling by designing a multi-modal augmented reality costume named the Sample Suit. The Sample Suit enables the wearer to mediate 5 of their senses while they interact with the world. The suit employs a single computational element, a head-mounted iPhone 3GS, to facilitate mediation of sight and sound via a digital sampling app (still in development). The senses of taste and smell are mediated by means of a palette of tastes and scents stored in various compartments of the suit. In the Sample Suit, the sense of touch is considered a special case: most if not all of the body's surface is covered by the costume, damping haptic engagement. Below I will discuss the suit's mechanisms in more detail, along with their relationship to the wearer's sensory perception. I will also describe an example scenario that utilizes Sample Suits as the primary interface in a multi-player augmented reality game called "Digital Plague".

The 5 Senses

Sight: The sense of sight is mediated by the iPhone directly. The wearer sees the outside world via the screen of a head-mounted iPhone. The image on the screen is a sampled stream of data coming directly from the iPhone's camera on the other side. The app delivering the video signal is capable of altering the sample rate of the video, the resolution of the image, and the depth of the color, as well as applying other digital graphic effects. The app can also provide heads-up display graphics that are rendered over the top of the video image. These overlay graphics can provide the wearer with GPS data as well as game state information and score.
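As a rough illustration of that kind of sampling (not the actual iPhone app, which is still in development), here's a Python sketch that down-samples a grayscale frame and crushes its color depth; the frame values are invented test data:

```python
def crush_frame(frame, block=2, levels=4):
    # Down-sample a grayscale frame (list of rows of 0-255 ints) by
    # keeping every `block`-th pixel, then quantize to `levels` gray levels.
    step = 256 // levels
    sampled = [row[::block] for row in frame[::block]]
    return [[(px // step) * step for px in row] for row in sampled]

# Tiny 4x4 test frame; a real frame would come from the camera stream.
frame = [[0, 60, 120, 180],
         [30, 90, 150, 210],
         [60, 120, 180, 240],
         [90, 150, 210, 255]]
low_rez = crush_frame(frame)  # 2x2 frame with 4 possible gray levels
```

The same two knobs (`block` for resolution, `levels` for color depth) are the parameters the app would expose to degrade the wearer's view.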

Sound: The sense of sound is also directly mediated by the iPhone. The wearer will have earbud headphones that stream audio recorded by the microphone in the iPhone. As with the video signal, the app will be able to control the sampling of the audio signal. Sample frequency and bit depth will be adjustable through the app. To prevent unmediated audio from reaching the wearer, sound-deadening earphones will cover each of the wearer's ears entirely.

Taste: The sense of taste is mediated by an array of representative flavor pockets designed into a tasteful vest. The flavor pockets are arranged in a two-dimensional array across the front of the wearer's vest. Each taste pocket contains a uniquely formulated flavor. Considerations have been made to prevent spoilage of the sample flavors, and the spectrum of tastes has been concentrated in the sweet range. Jelly beans have been selected for their unique combination of properties, including small, discrete size, large variety of flavors, ease of acquisition, and affordability. To experience a bite of flavor, the wearer simply reaches down into a specific pocket and picks up an individual jelly bean. The wearer then places the jelly bean in their mouth and chews.

Smell: The sense of smell is mediated by a string of test tubes worn around the wearer's neck. Each test tube contains spices and oils with strong and varied scents. The set of scents is a combination of kitchen spices and perfume oils. To access a particular scent, the wearer counts down from the initial tube and locates the sample to smell. The wearer then uncorks the tube, lifts it to their nose, and sniffs.

Touch: The sense of touch is mediated by the fabric of the costume itself. Most of the body is covered by the fabric of the costume, and the hands are covered with gloves. The fabric and gloves provide a layer of protection for the skin's sensitive surface. While the wearer interacts with the world, the suit acts to dampen haptic signals, effectively filtering the touch the wearer receives.

Example Scenario:

"Digital Plague" is a prototype game scenario based on the Sample Suit Augmented Reality Gaming Platform.

"Digital Plague" is a multi-player augmented reality game designed to be played in public spaces filled with other, non-playing people. The object of the game is to locate other players in the crowd using GPS and vocal messaging. Finding other players is made difficult by the mediation of all of the players' senses. The visual and audio signals are seriously degraded by the digital plague, so all the player sees of the world around them is low-res textures and all the player hears is noisy sound. Overlaid atop the low-res visuals is a GPS map of the other players in the game. Once you log in to the game, your GPS coordinates are broadcast to the other players. Over the audio noise, players can project their voices; the amplitude of the voices is scaled based on the distances between players. When two players eventually meet up, they remove their gloves and shake hands, establishing authentic human contact in a world laid waste by the digital plague. Taste and smell cues are used to express game states, send messages between players at a distance, and celebrate connection and build rapport between characters in contact.
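To make the distance-scaled voice idea concrete, here's a Python sketch: the haversine distance between two GPS fixes is standard math, but the 5 m / 200 m falloff thresholds are hypothetical tuning values, not part of the design:

```python
import math

def gps_distance_m(lat1, lon1, lat2, lon2):
    # Great-circle (haversine) distance in meters between two GPS fixes.
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def voice_amplitude(distance_m, full_at=5.0, silent_at=200.0):
    # Linear falloff: full volume within full_at meters, silence beyond
    # silent_at. Both thresholds are made-up numbers for illustration.
    if distance_m <= full_at:
        return 1.0
    if distance_m >= silent_at:
        return 0.0
    return 1.0 - (distance_m - full_at) / (silent_at - full_at)
```

In the game loop, each remote player's voice stream would simply be multiplied by `voice_amplitude(gps_distance_m(...))` before being mixed into the noise.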

While the entire Sample Suit is recommended for immersive game play, the "Digital Plague" app and headphones are the minimum requirements to play. The other components of the Sample Suit will be communicated to players as "do it yourself" (DIY) peripherals that conform to a standard addressing scheme. A website forum will collect example implementations in the spirit of open source development.

Appendix 1:
Flavors in the tasteful vest
1. Lemon
2. Toasted Marshmallow
3. Coconut
4. Orange Sherbet
5. Blueberry
6. Watermelon
7. Sour Pear
8. Lemon Lime
9. Green Apple
10. Pina Colada
11. Sour Blueberry

1. Cinnamon
2. Orange
3. Bubblegum
4. Peach
5. Strawberry Cheesecake
6. Cherry
7. Cotton Candy
8. Sour Raspberry
9. Sour Strawberry
10. Licorice
11. Juicy Pear
12. Sour Watermelon

Appendix 2:
Scents in scent belt
1. Cumin
2. Brownie
3. Coffee
4. Apple Cider Vinegar
5. Pepto-Bismol
6. Oregano
7. Honey
8. Rubbing Alcohol
9. Black Pepper
10. Cinnamon
11. Robitussin
12. Garlic
13. Horseradish
14. Peanut Butter

Appendix 3:
iPhone Development

The Sample Suit is built around a single iPhone 3GS and a customized application called Rez Goggles. The Rez Goggles app is designed to adjust the resolution, sample rate, and bit depth of the incoming video signal and portray the result on the iPhone's screen. The Rez Goggles app also supplies a component that adjusts the sample rate and bit depth of the incoming audio signal before delivering it to the headphones.
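The audio half of Rez Goggles isn't written yet, but the degradation it would apply can be sketched in a few lines (Python stand-in for illustration; the real app would run on the phone): sample-rate reduction by dropping samples, bit-depth reduction by quantizing what's left:

```python
def bitcrush(samples, keep_every=2, bits=4):
    # Drop samples to lower the effective sample rate, then quantize the
    # survivors to 2**(bits-1) levels per polarity to lower the bit depth.
    # Samples are floats in [-1.0, 1.0].
    levels = 2 ** (bits - 1)
    down = samples[::keep_every]
    return [round(s * levels) / levels for s in down]
```

Lowering `bits` makes the audio grittier and lowering the keep rate makes it duller, which is exactly the "noisy, degraded" world the digital plague calls for.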

Currently the Rez Goggles app is still in development. I have acquired an iPhone developer's license, created a provisioning profile on the phone, and built a sample camera app which does work with still images. Being new to iPhone development, I have had trouble with accessing the camera for video, among other things.

So as of this release the Rez Goggles app is not ready. This is a project that I intend to complete as the example game scenario is one I am working on for my independent project in my Human Computer Interface class with Win Burleson.

Wednesday, February 3, 2010

So our HCI group met on Monday to discuss the class and the presentation we were asked to provide in the next class. It sounds like kind of a status report from our group with respect to the group project. We spent some initial time discussing the class in general and then got down to business. The "ideation funnel" seemed to be a good initial point of reference for our brainstorm. I found a bunch of funnel images and this one seemed to fit with my understanding of where we might be in the design process.

At first it seemed like Don and I were hogging the floor with ideas and problems and heuristics and goals. Paul was listening but quiet. He wasn't sure where all of this was going and had some basic concerns and questions. For example:

How much of this do we have to actually make, given our finite time and "no resources"?
An answer to this referred to the Wizard of Oz talk from class, as well as suggesting that the more we can make, the better. I also felt the need to clarify that we have lots of resources. I mentioned the computers and iPhones as well as a performance venue downtown, multiple art spaces, and large groups of people we could test our ideas on.

What is Game as Life Life as Game?
As a volunteer dev team member last term, I was able to explain my understanding of the GALLAG system as well as a short list of scenarios I've been cooking up for it (i.e. a dishwashing community game, a creative productivity game, a community spaceship game, an artspace game, an AR game called Digital Plague, and an art collector game).

I felt like we were at the beginning of a process: getting the lights turned on, getting familiar with each other, our backgrounds, our skill base, setting up regular meeting times, and getting the ideas flowing.

After addressing Paul's concerns he began to flow too. By the end of our first meeting, the conversation was really balanced across the table. We met for longer than planned and I felt energized and inspired.

Monday, February 1, 2010

And so it begins...

Today I finally got around to setting up the blog for my HCI class. It is my hope that this blog persists beyond the class but we'll see how it goes. The name of the blog is "Digital Plague" which comes from a series of ideas related to the role that computers play in our lives. Yes, I do have a unique viewpoint... Sometimes I tell my AI students that I work towards understanding Artificial Intelligence so that I can fight effectively on behalf of the humans when the human/robot war comes :).

Imagine that the computer is already a life form... not so much an organism but more like a mold, one that is spreading across the surface, clustering around nodes of human expression and interaction. Every day more and more computational elements come online and stay there. Will we live to witness human limbs atrophy and perish as the interface becomes complete? Let us hope for metal legs in the bargain.