DRXYA

motion tracking robot

Process

The DRXYA robot is calibrated to change its colour and position based on the actions of a user scanned with a Kinect sensor. The on-screen Processing visualization relays this information (a series of commands and coordinates) via serial communication to the Arduino, which in turn drives both the lighting sequence of the addressable NeoPixel LED strips and the movement of the stepper motors in the X and Y directions. When a user is detected within 1.5 m of the facade, the colour of the central ring changes from a rainbow display to blue. When the user's hand is raised, the inner ring switches to an animated purple sequence. During this phase, the user can control the position of the ring in X and Y, with the Cartesian coordinates of the Processing sketch translated to those of the built facade. This vertical CNC device has many implications for responsive facades, interior partition lighting systems, 3D mapping, and visual communication.
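
The Arduino side of this hand-off can be pictured with a minimal sketch like the one below. It is an illustrative reconstruction, not the project's original code: the "mode,x,y" message format, the pin numbers, the pixel count, and the use of the Adafruit_NeoPixel and AccelStepper libraries are all assumptions made for the sake of example.

// Minimal Arduino-side sketch (assumed, for illustration only):
// reads a "mode,x,y" line sent by the Processing sketch over serial,
// updates the ring colour mode, and moves the X/Y steppers to the target.
#include <Adafruit_NeoPixel.h>
#include <AccelStepper.h>

const int LED_PIN  = 6;    // assumed NeoPixel data pin
const int NUM_LEDS = 60;   // assumed pixel count for the ring
Adafruit_NeoPixel ring(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);

// assumed step/dir driver wiring for the X and Y axes
AccelStepper stepperX(AccelStepper::DRIVER, 2, 3);
AccelStepper stepperY(AccelStepper::DRIVER, 4, 5);

int mode = 0;  // 0 = rainbow (idle), 1 = blue (user within 1.5 m), 2 = purple (hand raised)

void setup() {
  Serial.begin(115200);
  ring.begin();
  ring.setBrightness(80);
  stepperX.setMaxSpeed(800);
  stepperX.setAcceleration(400);
  stepperY.setMaxSpeed(800);
  stepperY.setAcceleration(400);
}

void loop() {
  // Processing is assumed to send lines such as "2,1200,-300\n":
  // a mode flag followed by X and Y targets already mapped to motor steps.
  if (Serial.available()) {
    mode       = Serial.parseInt();
    long xStep = Serial.parseInt();
    long yStep = Serial.parseInt();
    if (Serial.read() == '\n') {
      stepperX.moveTo(xStep);
      stepperY.moveTo(yStep);
    }
  }
  updateRing();
  stepperX.run();  // non-blocking: at most one step per call
  stepperY.run();
}

void updateRing() {
  for (int i = 0; i < ring.numPixels(); i++) {
    if (mode == 1)      ring.setPixelColor(i, ring.Color(0, 0, 255));    // solid blue
    else if (mode == 2) ring.setPixelColor(i, ring.Color(120, 0, 160));  // static stand-in for the purple animation
    else                ring.setPixelColor(i, wheel((i * 256 / NUM_LEDS + millis() / 20) & 255)); // rainbow
  }
  ring.show();
}

uint32_t wheel(byte pos) {
  // classic three-segment colour wheel from the NeoPixel examples
  if (pos < 85)  return ring.Color(255 - pos * 3, pos * 3, 0);
  if (pos < 170) { pos -= 85; return ring.Color(0, 255 - pos * 3, pos * 3); }
  pos -= 170;
  return ring.Color(pos * 3, 0, 255 - pos * 3);
}

In this reading, the Processing sketch does the heavy lifting of mapping the Kinect's skeleton data and its on-screen Cartesian coordinates to motor steps, so the Arduino only has to parse three numbers, animate the ring, and keep the steppers moving.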

Contact

Type: research project | physical computing, responsive facades

Developed at: IAAC | Institute for Advanced Architecture of Catalonia

Date: March 2014, Barcelona, Spain

Team: Ramin Shambayati
      Robert McKaye
      Luca Gamberini
      Christoffer Ryan Chua
      Tou Foo Wen Shan
      Sahil Sharma