Friday, April 17, 2009

The Final Result of "Hand It"

Our Team PDF: Click Here

We did not deviate much from our original idea for "Hand It". It is still a space-themed rhythm game: the user collects matter from the universe, represented by "hands" triggering notes over portals, and at the end that matter is summed up and translated into images that depict the kind of universe the user has created. The basic concept is to use the four corners of the screen to hit notes and collect matter to build up a universe.



Instead of using gloves for colour detection in this project, we ended up using toilet cleaners, because the colours they gave off were much more solid than those of the gloves. The toilet cleaners are used to hit the notes on the screen by moving them toward the portals. Many different patterns can be used to hit the notes, which allows for a lot of variability (and exercise).

I would say the hardest part of this project was the initial connection of MAX/MSP to Flash. We had to use a Java-based server to open a gateway so data could flow from MAX into Flash. This created other problems on the Flash side: we had to re-code our entire script from AS3 to AS2, because the connection apparently only supported AS2. The MAX-to-Flash communication also created memory issues. The incoming data kept accumulating on the Flash side and never cleaned itself up, even after it had been processed, so we had to remove some storage functions from our main program to use memory more efficiently; otherwise the game would crash in the early stages of play. The programming itself was tedious, because every note created on the screen needed its own function, which expanded the code by hundreds of lines for that alone.
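For reference, the Flash end of that gateway boils down to an XMLSocket connection. Here is a minimal sketch of the idea, not our production code; the port number (7474) and the "note:corner" message format are assumptions for illustration, so substitute whatever your relay server actually sends:

// Minimal AS2 sketch of the MAX -> Java server -> Flash link.
// The port and the "note:corner" string format are assumptions.
var maxSocket:XMLSocket = new XMLSocket();

maxSocket.onConnect = function(success:Boolean) {
    trace(success ? "Connected to the relay server" : "Connection failed");
};

// onData fires once per string the Java server relays from MAX.
maxSocket.onData = function(msg:String) {
    var parts:Array = msg.split(":");
    if (parts[0] == "note") {
        trace("Spawn a note at corner: " + parts[1]);
    }
    // Deliberately not storing msg anywhere: keeping every message
    // around is exactly what made our memory balloon.
};

maxSocket.connect("localhost", 7474);

The key lesson for us is in that last comment: let each message go out of scope once it has been handled instead of appending it to any persistent structure.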

This is our video blog explaining the main components of our game. It includes an interview with a very prestigious game designer, Cody Church, who gave us very enlightening feedback to improve our game:

Click Here for Video





Code from Flash:


For score detection:

stageMC.onEnterFrame = function() {

    //TOPLEFT SCORE CHECK
    if ((HANDL._x > 60 && HANDL._x < 140 && HANDL._y > 50 && HANDL._y < 130) || (HANDR._x > 60 && HANDR._x < 140 && HANDR._y > 50 && HANDR._y < 130)) {
        hitS.gotoAndStop(2);
        if (s1._x > 60 && s1._x < 140) {
            pointS += 1;
            ScoreS.text = pointS;
        }
        if (s2._x > 60 && s2._x < 140) {
            pointS += 1;
            ScoreS.text = pointS;
        }
    } else {
        hitS.gotoAndStop(1);
    }

    //TOPRIGHT SCORE CHECK
    if ((HANDL._x > 500 && HANDL._x < 580 && HANDL._y > 50 && HANDL._y < 130) || (HANDR._x > 500 && HANDR._x < 580 && HANDR._y > 50 && HANDR._y < 130)) {
        hitP.gotoAndStop(2);
        if (s3._x > 500 && s3._x < 580) {
            pointP += 1;
            ScoreP.text = pointP;
        }
        if (s4._x > 500 && s4._x < 580) {
            pointP += 1;
            ScoreP.text = pointP;
        }
    } else {
        hitP.gotoAndStop(1);
    }

    //BOTTOMLEFT SCORE CHECK
    if ((HANDL._x > 60 && HANDL._x < 140 && HANDL._y > 350 && HANDL._y < 430) || (HANDR._x > 60 && HANDR._x < 140 && HANDR._y > 350 && HANDR._y < 430)) {
        hitM.gotoAndStop(2);
        if (s5._x > 60 && s5._x < 140) {
            pointM += 1;
            ScoreM.text = pointM;
        }
        if (s6._x > 60 && s6._x < 140) {
            pointM += 1;
            ScoreM.text = pointM;
        }
    } else {
        hitM.gotoAndStop(1);
    }

    //BOTTOMRIGHT SCORE CHECK
    if ((HANDL._x > 500 && HANDL._x < 580 && HANDL._y > 350 && HANDL._y < 430) || (HANDR._x > 500 && HANDR._x < 580 && HANDR._y > 350 && HANDR._y < 430)) {
        hitO.gotoAndStop(2);
        if (s7._x > 500 && s7._x < 580) {
            pointO += 1;
            ScoreO.text = pointO;
        }
        if (s8._x > 500 && s8._x < 580) {
            pointO += 1;
            ScoreO.text = pointO;
        }
    } else {
        hitO.gotoAndStop(1);
    }
};


Code for shooting the stars to the corners:


import mx.transitions.Tween;
import mx.transitions.easing.None;

//FUNCTIONS TO SHOOT THE STARS TO THE CORNERS.
function topLeft1() {
    var Tx1:Tween = new Tween(s1, "_x", None.easeIn, Stage.width/2, 0, 2, true);
    var Ty1:Tween = new Tween(s1, "_y", None.easeIn, Stage.height/2, 0, 2, true);
}

function topLeft2() {
    var Tx1:Tween = new Tween(s2, "_x", None.easeIn, Stage.width/2, 0, 2, true);
    var Ty1:Tween = new Tween(s2, "_y", None.easeIn, Stage.height/2, 0, 2, true);
}

function topRight1() {
    var Tx1:Tween = new Tween(s3, "_x", None.easeIn, Stage.width/2, 640, 2, true);
    var Ty1:Tween = new Tween(s3, "_y", None.easeIn, Stage.height/2, 0, 2, true);
}

function topRight2() {
    var Tx1:Tween = new Tween(s4, "_x", None.easeIn, Stage.width/2, 640, 2, true);
    var Ty1:Tween = new Tween(s4, "_y", None.easeIn, Stage.height/2, 0, 2, true);
}

function bottomLeft1() {
    var Tx1:Tween = new Tween(s5, "_x", None.easeIn, Stage.width/2, 0, 2, true);
    var Ty1:Tween = new Tween(s5, "_y", None.easeIn, Stage.height/2, 480, 2, true);
}

function bottomLeft2() {
    var Tx1:Tween = new Tween(s6, "_x", None.easeIn, Stage.width/2, 0, 2, true);
    var Ty1:Tween = new Tween(s6, "_y", None.easeIn, Stage.height/2, 480, 2, true);
}

function bottomRight1() {
    var Tx1:Tween = new Tween(s7, "_x", None.easeIn, Stage.width/2, 640, 2, true);
    var Ty1:Tween = new Tween(s7, "_y", None.easeIn, Stage.height/2, 480, 2, true);
}

function bottomRight2() {
    var Tx1:Tween = new Tween(s8, "_x", None.easeIn, Stage.width/2, 640, 2, true);
    var Ty1:Tween = new Tween(s8, "_y", None.easeIn, Stage.height/2, 480, 2, true);
}

topRight2();
bottomRight1();
bottomLeft2();

//FUNCTIONS THAT MOVE THE HANDS (moveLEFT/moveRIGHT are defined elsewhere in our script)

moveLEFT(10, 10);
moveRIGHT(501, 100);

//HANDL.startDrag();
//TEST PURPOSE
stageMC.onKeyDown = function() {
//topLeft1();
//var Tx1:Tween = new Tween(s7, "_x", None.easeNone, 0, 190, 3, true);

};
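Looking back, the eight near-identical shoot functions could have been collapsed into a single parameterized helper, which would also have kept the script from growing by hundreds of lines. A minimal sketch under the same mx.transitions imports as above, with the corner coordinates taken straight from the original functions:

// One generalized launcher instead of topLeft1() ... bottomRight2().
// star is any of s1..s8; targetX/targetY are the corner coordinates.
function shootStar(star:MovieClip, targetX:Number, targetY:Number):Void {
    var tx:Tween = new Tween(star, "_x", None.easeIn, Stage.width/2, targetX, 2, true);
    var ty:Tween = new Tween(star, "_y", None.easeIn, Stage.height/2, targetY, 2, true);
}

// Equivalent to the test calls above:
shootStar(s4, 640, 0);   // topRight2()
shootStar(s7, 640, 480); // bottomRight1()
shootStar(s6, 0, 480);   // bottomLeft2()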

Tuesday, March 10, 2009

Final Project Rough Idea

“Hand It”

We chose to design and program a rhythm game that allows for a lot of user-to-interface interactivity. The main idea is that the screen is divided into four sections. In each section, a thin rectangle moves from the bottom of the screen to the top, and the user has to make contact with the rectangle in the specified area to score points. Contact is made with the user's hands: each hand is covered with a glove taped with infrared tape, which helps MAX/MSP detect the hand and produce a result on the screen, with the aid of a Nintendo Wii controller. To avoid cheating, a hand must be moved out of a section before moving to the next one; this eliminates points gained by simply waving the gloves wildly to catch all the moving rectangles (see the sketch below). Our inspiration came from the Japanese rhythm game "Para Para Paradise", in which dots move up the screen toward different areas, and the player moves a hand into the specified area and removes it to activate it.
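The anti-cheat rule can be implemented as an edge trigger: a section only registers a hit on the frame the hand enters it, and it re-arms only once the hand has left. A rough AS2 sketch of that idea, where handInSection() stands in for a hypothetical zone test like the ones in our score checker:

// Sketch of the no-waving rule: a hand must leave the section
// before it can score there again. handInSection() is hypothetical.
var wasInside:Boolean = false;

stageMC.onEnterFrame = function() {
    var inside:Boolean = handInSection();
    if (inside && !wasInside) {
        // rising edge: the hand just entered, so count one hit at most
        trace("hit registered");
    }
    wasInside = inside; // leaving the section re-arms it
};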
Timeline:

Week 10: Project idea pitch and initial planning.
Week 11: Initial interface testing.
Week 12: Gameplay testing.
Week 13: Project Completion.
Week 14: Project Documentation.

Thursday, March 5, 2009

Sketch 2 Revised



For our sketch, we decided to create an interactive flower that responds to our movements. Our project revolves around the camera detecting only the "white" on the screen using subtractive elements: the more white there is, the larger the threshold, and thus the more the flower grows. We have 8 stages of flower growth, starting from a seed and ending with a fully bloomed flower.

The user's energy has to be substantial enough to produce a fully grown flower. This is a subtle metaphor: a flower needs energy to grow, and the closer it gets to full blossom, the more energy it requires. In the later stages the flower is hard to sustain and very sensitive, which is why the final blossom image decrements if not enough energy is applied to it.
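The growth logic amounts to quantizing the detected energy into one of the 8 stages while letting the later stages wilt when the energy drops. Expressed as code for readability (the real logic lives in our MAX/MSP patch; flowerMC, the 0..1 normalization, and the per-stage cutoffs are illustrative assumptions):

// Illustration only: maps camera "whiteness" (energy, 0..1) to one
// of 8 flower frames, growing when the energy supports a higher
// stage and wilting back when it does not.
var growthStage:Number = 1;

function updateFlower(energy:Number):Void {
    var supported:Number = Math.floor(energy * 8) + 1; // stage this energy can sustain
    if (supported > growthStage) {
        growthStage++; // enough energy: grow one stage
    } else if (supported < growthStage) {
        growthStage--; // not enough: the blossom decrements
    }
    growthStage = Math.max(1, Math.min(8, growthStage));
    flowerMC.gotoAndStop(growthStage);
}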

This is the seedling stage of the flower:

This is the blossomed flower:

Research


Our project was inspired by Sommerer and Mignonneau's The Interactive Plant Growing (1992). Their installation is about the growth of plants: people can become part of the installation by touching the real plants, affecting the plants inside the screen.

Friday, February 20, 2009

MAX/MSP, JITTER + SKETCH 2


MAX/MSP and Jitter


Well, we started out very confused about this new program. The syntax seemed a bit complicated, even though it was in visual form and there was not much coding, if any, to it. So at first, Sketch 2 seemed completely out of reach for us. However, after understanding the fundamentals of MAX/MSP and its relationship to Jitter, the pieces of the mystery slowly began to unfold. We started to grasp the basic concepts of MAX/MSP and eventually came up with a doable idea for Sketch 2.

Sketch 2

For Sketch 2, we decided to integrate our webcams with movement and colour to produce a changing result, such as switching images. We chose to have 4 thresholds, so that as the user interacts with the camera, more intense movement or colour change triggers new images across the Jitter output. The idea seemed complex at first, and it was mainly the syntax that troubled us to begin with, but after a while we figured out how the boxes in the code interact and put them together into working code so we could begin prototyping. The final result will be much more than a simple body movement triggering a change in an array of pictures; it will include more creative and interactive means of transitioning between images.
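Stripped of the Max boxes, the core of the patch is just bucketing the measured intensity against the 4 thresholds and picking an image. A sketch of that mapping (the cutoff values are placeholders, and the real version is a MAX/MSP patch rather than code):

// Illustrative 4-threshold bucketing: more movement/colour change
// selects a later image for the Jitter output.
var thresholds:Array = [0.1, 0.3, 0.6, 0.85]; // placeholder cutoffs

function pickImage(intensity:Number):Number {
    var index:Number = 0;
    for (var i:Number = 0; i < thresholds.length; i++) {
        if (intensity >= thresholds[i]) {
            index = i + 1;
        }
    }
    return index; // 0..4, the image to display
}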

Monday, January 12, 2009

IAT 320

Alfred Darakjian
Ada11@sfu.ca – 301041807

IAT 320 Week 1 Assignment

Reading 1:

Summary: In essence, the author compares the human body, the mind, and computers. The mind works hand in hand with the body, and computers are able to interpret human behaviour and give feedback. This is where HCI (human-computer interaction) comes in, defined as safe and usable interaction between humans and computers. The author gives the example of the body in cyberspace, much like in the movie The Matrix, where our minds can be freed through the internet rather than limited by our bodies alone.

Associative Phrase: Embodiment.

Quote: “The body should not be forgotten or separated from the subject in the new media design, because body is an essential part of our existence.”

Reading 2:

Summary:
The main idea of this reading is the complexity of the human senses. For example, design uses affordances to increase the usability of an object: just by looking at it, we can guess what it can do. Furthermore, when we examine an object with our senses, one sense triggers the next and we cannot help it; when we touch something, we eventually look at it as well. This is how our senses function.

Associative Phrase: Sensoriality.

Quote: “Take a scrubbing brush for example. Just by looking at it I know what would happen if I put it in my mouth without actually doing it.”

Brain Storm



IAT 320 Week 2 Assignment

Alfred Darakjian
301041807
ada11@sfu.ca

Application of new Fiber and Malleable Materials for
Agile Development of Augmented Instruments and
Controllers

Summary:

In the world of materials, new types are introduced every year to increase the efficiency of some machine, to be friendlier to the environment, or to lower production costs. This paper summarizes a few examples of materials used in different augmented instruments and controllers; one is the capacitive footswitch, which can be prototyped rapidly because of the efficient fabric used for it (Figure 7). The point of the paper is to show the variability and versatility that these fabrics and materials give instruments and controllers. The efficient application of new materials also calls for a new curriculum, based on the emerging design patterns, in a context where the experience of fiber/malleable-materials artists can be combined with that of material scientists and application developers.



Quote:

New fiber and malleable materials present interesting challenges and potential beyond the rapid prototyping advantages described here.



Associative Word:

Agile Development


Arduino Lab Activity




In the Arduino lab, we created a simple yet effective program that set 4 different modes for the LED (a rough reconstruction follows the list):

MODE 1: LED is OFF.
MODE 2: LED is ON.
MODE 3: LED is blinking fast.
MODE 4: LED is blinking slow.
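Something like this, assuming a pushbutton on pin 2 to cycle the modes (the pin choice and the plain, non-debounced button read are assumptions; the lab wiring may have differed):

// Hedged reconstruction of the 4-mode LED exercise.
const int buttonPin = 2;  // assumed mode button pin
const int ledPin = 13;
int mode = 0;             // 0: off, 1: on, 2: fast blink, 3: slow blink
int lastButton = LOW;

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int button = digitalRead(buttonPin);
  if (button == HIGH && lastButton == LOW) {
    mode = (mode + 1) % 4; // advance to the next mode on each press
  }
  lastButton = button;

  if (mode == 0) {
    digitalWrite(ledPin, LOW);  // MODE 1: off
  } else if (mode == 1) {
    digitalWrite(ledPin, HIGH); // MODE 2: on
  } else {
    int period = (mode == 2) ? 100 : 500; // MODE 3: fast, MODE 4: slow
    digitalWrite(ledPin, (millis() / period) % 2); // blink without delay()
  }
}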

Process for Project:








IAT 320
Alfred Darakjian
301041807
Week 5 Reading

Interactivity between humans and computers is a growing phenomenon, and tracking people has become another important part of it: taking real-time information from real people as they go about their daily lives produces the best results. Computer vision algorithms are commonly used in these interactive pieces. Many techniques map people's emotions, activities, gestures and more by building solid algorithms from repeated human actions. Reliable video quality is important for vision algorithms, and a well-designed physical environment helps with precise tracking, which in turn allows the software to work better. All in all, as the years progress, more and more sophisticated virtual interaction schemes are arising to replace old and outdated ones.

Quote: "the idea was to track a tragic social phenomenon which was not being counted — that is, doesn't count".

Associative Word: Computer vision.


Code for Sketch 1:



Arduino Code:

int potPin = 2;   // select the input pin for sensor 1
int potPin2 = 3;  // select the input pin for sensor 2
int ledPin = 13;  // select the pin for LED 1
int ledPin2 = 12; // select the pin for LED 2
int val = 0;      // variable to store the value coming from sensor 1
int val2 = 0;     // variable to store the value coming from sensor 2

void setup() {
  pinMode(ledPin, OUTPUT);  // declare both LED pins as OUTPUTs
  pinMode(ledPin2, OUTPUT);
  Serial.begin(9600);
  Serial.println("Systems On");
}

void loop() {

  val = analogRead(potPin);   // read the value from sensor 1
  Serial.println(val);
  val2 = analogRead(potPin2); // read the value from sensor 2
  Serial.println(val2);

  if (val > 100) {
    Serial.println("Object Detected on Sensor 1");
    digitalWrite(ledPin, HIGH);  // turn LED 1 on
  } else {
    digitalWrite(ledPin, LOW);   // turn LED 1 off
  }
  if (val2 > 100) {
    Serial.println("Object Detected on Sensor 2");
    digitalWrite(ledPin2, HIGH); // turn LED 2 on
  } else {
    digitalWrite(ledPin2, LOW);  // turn LED 2 off
  }
}