This is the muscle sensor project.

Highlighting the potential of muscle sensor usage, I created a fully wearable device that attaches to the arm and visually affects a real-time (Kinect) webcam image, using the Particle API connected to a JSON client in a Processing sketch, powered by a lithium battery. Flexing the full arm (pulling the wrist and forearm in toward the body) widens the visual threshold of red throughout the webcam video, while the opposite pull of the forearm away from the body decreases the red surface area. The muscles in the wearer's body mediate the moving-image results.
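The flexion-to-threshold mapping could be sketched as follows. This is a minimal sketch under my own assumptions: the function name, the 0–4095 ADC range (typical of a Particle analog pin), and the linear scaling are illustrative, not the project's actual code.

```cpp
#include <algorithm>

// Hypothetical mapping from a muscle-sensor ADC reading (assumed 0..4095,
// as on a Particle analog pin) to the red threshold (0..255) that the
// Processing sketch would apply to the webcam image.
// Stronger flexion -> higher reading -> wider red threshold.
int flexionToRedThreshold(int adcReading) {
    adcReading = std::clamp(adcReading, 0, 4095);  // guard out-of-range samples
    return (adcReading * 255) / 4095;              // linear scale to 0..255
}
```

On the Processing side, the received threshold would then decide which pixels count as "red" in the live frame.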
Design and execution by: Anais Morales
Repo here
This is the buzzing chair project. We created a simple device that uses positional data to detect the movement of the wheel of a Herman Miller Caper stacking chair
(the School of the Art Institute of Chicago's school chairs). With this data, we were able to 'visualize' the chair's movement through sound, using a piezo buzzer
and two IoT devices: the Particle Boron and Xenon. We experimented with this input device (an accelerometer) because we were curious about the rotational
data we could receive, which might tell us something about how we use the chair while at school.
When the chair is not moving, there is no sound; when it moves, sound plays.
The setup of this circuit also altered the way the chair was sat in, which further poses the question of 'chair behavior' and the chair's potential use as a controller.
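The moving/not-moving decision described above could be sketched like this. The struct, the function name, and the 0.15 g threshold are my assumptions for illustration, not the firmware we actually ran.

```cpp
#include <cmath>

// One accelerometer sample (in g) from the wheel-mounted device.
struct Accel { float x, y, z; };

// Hypothetical motion test: buzz the piezo only while the magnitude of the
// change between successive samples exceeds a small threshold (assumed 0.15 g).
bool shouldBuzz(const Accel& prev, const Accel& curr, float threshold = 0.15f) {
    float dx = curr.x - prev.x;
    float dy = curr.y - prev.y;
    float dz = curr.z - prev.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz) > threshold;
}
```

In a Particle loop, the device would sample the accelerometer, call a check like this, and drive the piezo pin accordingly.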
In collaboration with Boyoung Nam & Jungyun Koh - Make/Move Experiments in Urban Mobility Research Studio
Technology design and execution by: Anais Morales
This is a website designed and crafted by me for my art portfolio and art career. It was made before everyone else started using full pages for images, but to me the approach was appropriate: I was trying to deliver an aesthetic experience as close as possible to being in the same room with the work. It is a static website that relies on jQuery for transitions and design effects. My subdomains are git CI/CD version controlled.
Viewable
here
Portrait is an interactive art installation that is completed by the viewer's participation. When one passes the work, or stands in front of it to view it, the piece completes itself and emits light from its LED component. It runs on an ultrasonic sensor connected to a microcontroller, which actuates the LED light component. It is made of industrial steel and vacuum-formed thermoplastic, designed and executed by me. It was shown at Sullivan Galleries in Chicago, and a second version was shown in Berlin, Germany, at my art exhibition, "Moving Parts of an Effigy", where walls were built to produce an enclave for added viewing and experiential effect.
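The sensor logic might look like the following. The HC-SR04-style echo-time conversion and the 150 cm trigger range are my assumptions; the installation's actual microcontroller code may differ.

```cpp
// Hypothetical ultrasonic presence check for the installation: convert an
// echo pulse width (microseconds) to a distance, then actuate the LED when
// a viewer stands within range. The 150 cm range is an assumed value.
float echoToCm(unsigned long echoMicros) {
    // Sound travels ~0.0343 cm per microsecond; the echo covers a round trip.
    return echoMicros * 0.0343f / 2.0f;
}

bool viewerPresent(unsigned long echoMicros, float rangeCm = 150.0f) {
    return echoToCm(echoMicros) < rangeCm;
}
```

The microcontroller would poll the sensor in its main loop and switch the LED on while `viewerPresent` holds.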
Using the Yahoo! Finance API, I created a dashboard with Pyplot and Dygraphs, built as a Shiny app from RStudio, and isolated its interactive graph component from the rest of the Shiny application.
Live
here
Repo
here
S-line
This is an app I designed which hinged on wearable technology and would accept and track users' biometrics. I designed the user interface, with frames for the mobile web app that would communicate to users their
biometrics and goal markers, with the goal of altering the behavioral patterns of the wearer.