Thursday, March 19, 2015

What worked:
- The program ran without a hitch, though loading was slow due to the amount of data
- Each of the boxes worked well together to house their individual sensors and were not visually jarring to the viewer
- The push box distracted viewers from how their actions were being recorded but still activated properly every time
- Viewers were immediately aware that they controlled the content and kept the sensor active to see the images on screen
- Although it took a few attempts, viewers were able to determine the proper sensor to use as the prompts changed
- Viewers were able to pick up on the intentional similarity to Daft Punk's "Technologic"

What didn't work:
- The twist sensor read as purely functional rather than suggesting an intuitive action that would carry over to every variation of the twist
- Seeing all the possibilities for each sensor took quite a while; a full cycle was not visible until late in the interaction
- The gif playback was not smooth, and it took a few repeats to see every frame
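One way to smooth out the playback would be to advance gif frames on a fixed time step instead of once per draw loop, so a late draw skips ahead rather than stalling the animation. A minimal sketch of that idea; the 100 ms frame duration is an assumed value, not from the log:

```cpp
#include <cassert>

// Pick the gif frame from elapsed time rather than a per-loop counter,
// so playback speed stays constant even when draws are late or uneven.
// frameMs (assumed 100 ms) is the intended duration of one frame.
int frameAt(long elapsedMs, int frameCount, int frameMs = 100) {
    return (int)((elapsedMs / frameMs) % frameCount);
}
```

The same call works whether the sketch draws at 60 fps or stutters: the frame index depends only on the clock.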

Reception of the project:
- I believe the project was well received
- Most participants connected with the material and found the piece engaging enough to explore the limits of the programming

Expansion:
- The suggestion of adding more sensors to overwhelm the viewer was a great one and would absolutely help the project
- Combining two or more sensors would add depth to the project and its content

Tuesday, February 24, 2015

Final Project update

Final project: Thursday/Friday (code and test sensors)
Code:
Prompt:
Text center
Push it
Twist it
Open it
(find other words for these terms)


Reward:
Take sensor input and display image
Pot: sense input from pot = twist motion
Flex: start at the flex value, then when a lower value is seen create the action
PID: detect motion, show image
Pool of data
Spin:
Swing fails
Merry go round on motor
Box related:
It’s a trap
What's in the box
Dick in a box
Push related
Scooter down the stairs
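The flex trigger described above (start from the resting flex value, act when a lower value is seen) can be sketched as plain logic. A minimal sketch, assuming 10-bit analog readings and an invented drop threshold of 60:

```cpp
#include <cassert>

// Hypothetical flex-sensor trigger: remember the resting reading captured
// on the first sample, then fire once a reading drops a fixed margin below
// it (the "lower value" noted above). DROP = 60 is an assumed threshold.
struct FlexTrigger {
    int rest = -1;               // resting reading, set on first sample
    static const int DROP = 60;  // how far below rest counts as a bend

    bool update(int reading) {
        if (rest < 0) { rest = reading; return false; }  // calibrate
        return reading < rest - DROP;                    // bend detected
    }
};
```

Calibrating from the first sample keeps the trigger independent of the exact resting resistance of any one flex sensor.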

Lab day (finish building sensor housing)
Build
3 sensors
Flex
Box
Drilled hole
PID
Wood
Hole covered with symbol

Pot
Wood
Spring (return spring)
Nails
Screws

Tuesday, February 17, 2015

SEMIOTIT

Semiotic interaction: this is an experiment in the interpretation of symbols based upon common language prompts. Each input device will have a symbol or symbols directly related to the proper use of the device. The box will be the simplest prompt and symbol to understand, since its symbol draws upon pre-existing symbols related to the opening of a box. The potentiometer will be built up into a small plate sporting a square knob. This knob will have arrow-like symbols pointing in the direction the knob must be turned to register input. The final device will be a 20mm x 40mm sheet of wood with a hole drilled in the center, which will house the IR sensor. Its symbol will guide the user to move their hand over the sensor without explicitly stating where the sensor resides.

The game will tell the user to interact with a specific object, first using very obvious language; as each stage progresses, the language will become more complicated and convoluted but will still reference the same action.
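The escalating-language idea could be stored as a per-sensor list of prompts ordered from obvious to convoluted, indexed by stage. A minimal sketch; every phrase in the table is an invented example, not wording from the project:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical prompt table: one row per input device, ordered from the
// most obvious wording to the most convoluted. The stage picks the
// wording; the target action for a row never changes.
enum Sensor { PUSH, TWIST, OPEN };

std::string prompt(Sensor s, int stage) {
    static const std::vector<std::vector<std::string>> table = {
        {"Push it", "Press the square", "Apply force to the marked face"},
        {"Twist it", "Turn the knob", "Rotate along the arrows"},
        {"Open it", "Lift the lid", "Reveal what the box conceals"},
    };
    const std::vector<std::string>& row = table[s];
    if (stage >= (int)row.size()) stage = (int)row.size() - 1;  // clamp
    return row[stage];
}
```

Clamping at the last entry lets the game keep running past the hardest wording without special cases.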


This is an experiment to ascertain the depth of knowledge and the semiotics of language used to describe action.

Tuesday, February 3, 2015

Bop it with gif reactions

The Arduino will function as a controller with a flex sensor, potentiometer, and button.

When the user performs the proper action after being prompted via the screen, a video clip corresponding to their action is played. If they fail to do the requested action, the game stops and restarts. Each time they do something right they are rewarded with a score and a funny fail gif.

The receiver will constantly loop, looking for an OSC message packed by the controller.
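The prompt/score/restart loop described above can be sketched as a small state machine on the receiver side. This is only an illustration: the sensor ordering and the reset-to-zero rule beyond "the game stops and restarts" are assumptions:

```cpp
#include <cassert>

// Minimal sketch of the game loop: each incoming action (decoded from an
// OSC message elsewhere) either matches the current prompt, earning a
// point and advancing the prompt, or resets the game. Sensor IDs 0-2 and
// the round-robin prompt order are assumed, not from the log.
struct Game {
    int score = 0;
    int expected = 0;   // sensor the current prompt asks for

    // Returns true when the received action matched the prompt.
    bool onAction(int sensor) {
        if (sensor == expected) {
            ++score;                        // reward: score + fail gif
            expected = (expected + 1) % 3;  // next prompt (assumed order)
            return true;
        }
        score = 0;                          // wrong action: stop, restart
        expected = 0;
        return false;
    }
};
```

Keeping the game state in one struct makes it easy to drive from whatever OSC-decoding loop the receiver already runs.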

Topics: Internet physicality

https://www.youtube.com/watch?v=qA-aTZJNB1w#t=13

Sunday, January 25, 2015

Project possibilities

Over the last few labs we have worked with OSC. We have successfully sent and received data and triggered user-initiated events using two Arduinos. I cannot really see creating anything very interesting that incorporates user interaction beyond pre-packaged information sent by a variety of push buttons or trigger sensors.

I'm looking forward to seeing the capabilities of the XBee device. If possible I would like to use it to intercept packets from devices that are connected to the network. My overall idea is to create a passive flow of information that can be collected and used to create a visual or physical reaction not explicitly triggered by the user.

From what I have seen so far, data can be sent directly to the XBee device, and it does have the capability for interception, although a project that creates reactions based on signal strength is also a distinct possibility.
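A signal-strength reaction could be as simple as mapping RSSI onto a visual parameter. A minimal sketch, assuming RSSI is reported as attenuation (smaller numbers mean a stronger signal, as with the XBee's DB reading in -dBm) and with invented 40/92 endpoints:

```cpp
#include <cassert>

// Hypothetical RSSI-to-brightness map for a passive visual reaction.
// Input is attenuation in -dBm (smaller = stronger); output is 0-255.
// The STRONG/WEAK endpoints are assumed values, not measured ones.
int rssiToBrightness(int rssi) {
    const int STRONG = 40, WEAK = 92;
    if (rssi <= STRONG) return 255;   // at or above full strength
    if (rssi >= WEAK)   return 0;     // effectively out of range
    return 255 * (WEAK - rssi) / (WEAK - STRONG);  // linear in between
}
```

Clamping at both ends keeps the output sane when a radio briefly reports values outside the expected window.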

(Reading Xbee RSSI with Arduino)
https://davidbeath.com/posts/reading-xbee-rssi-with-arduino.html

Tuesday, January 6, 2015

Quarter Begins

First brush with OSC programming and active communication between computers. There is a possibility of integrating TouchOSC and Lemur as external phone controllers. Project ideas could range from sensor capture on machine events to human-triggered events. No significant problems or implementation ideas yet.