I am currently working on various designs around wearable technology and fashion.
I am interested in energy harvesting, body-aware technologies, and smart materials.
I create designs which fuse fashion and technology rather than using fashion as a vessel for technology. Further information is forthcoming.
This project was inspired by the Oculus Rift
and the odd disembodied experience it provides:
Entering a virtual world using the Oculus Rift usually leaves you without a physical body.
You are reduced to a floating camera.
I wanted to explore what it would be like to provide the user with an avatar which they can control.
I created an inertial motion-capture system in order to control a simple game written in Processing.
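At the heart of an inertial motion-capture system is sensor fusion: a gyroscope integrates smoothly but drifts over time, while an accelerometer gives a drift-free but noisy angle reference. A minimal sketch of the standard complementary-filter approach (illustrative Python rather than the Processing code the game was actually written in; all sensor values here are made up):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse the integrated gyro rate (smooth but drifting) with the
    accelerometer-derived angle (noisy but drift-free).
    alpha close to 1.0 trusts the gyro more."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulated samples: a limb held steady at 30 degrees while the gyro
# reports a small constant bias of 0.5 deg/s.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.5,
                                 accel_angle=30.0, dt=0.01)
```

One such filtered angle per joint is enough to drive a simple avatar's pose each frame.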
While Heidegger describes technology as something which is in a sense greater than humans,
following its own laws and logic, Merleau-Ponty describes objects that become part of us.
According to Merleau-Ponty, perception is an active process, and we use our tools as media for perceiving the world. Objects thus become embodied.
This concept fascinates me. I wanted to turn the idea around.
If objects become a part of us, can we become part of objects?
I conducted a large number of interviews on this subject, talking to friends and colleagues and trying to systematize their relationships to the tools they use.
I eventually found that Don Ihde had created a taxonomy related to what I was trying to achieve. Ihde describes different types of relations we can have with objects and classifies them as (non-mutually exclusive) alterity, background, hermeneutic, and embodied relations.
Equipped with this vocabulary, I was interested in a more empirical approach. I decided to investigate the sensation of touch in a telerobotic system.
For more information, visit the Kickstarter page, and the project page.
It is not uncommon for people to be carrying multiple devices with them at any given moment.
(While some of us might limit our mobile devices to a phone and maybe a tablet or a laptop, others go completely overboard :-D...) Either way, these devices do not 'play well' together.
With DisplayPointers, I try to figure out how things could be different and what kind of seamless interactions between devices are possible.
DisplayPointers is an interactive system which explores how to use display devices as pointing devices. Imagine pointing at an icon and, instead of the application opening on the main display, the application will open on the pointing device. Imagine using your phone as a lens to zoom in on certain features of a map.
I created a simple hardware setup for enabling these interactions in addition to a series of demo-apps.
This was my first exploration into a potential telepresence application. I was interested in creating clothing which senses the pose and gestures of the body, in order to control a remote avatar.
I explored various sensing techniques and ended up working with stretch sensors.
These sensors fascinated me, as they could be set up in parallel to our muscles.
With WristFlicker I use stretch sensors to capture the movements of the wrist: just as the flexor and extensor muscles, together with the pronator teres, contract and expand to create the motion of the wrist, a series of stretch sensors contracts and expands, which allows me to measure the wrist's motion.
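Concretely, each stretch sensor can be read as one half of a voltage divider, and its raw ADC reading mapped linearly between two calibrated endpoints. A minimal sketch of that mapping (the ADC values and the 90-degree range are illustrative assumptions, not calibration data from the actual garment):

```python
def sensor_to_angle(reading, rest=512, stretched=820, max_angle=90.0):
    """Linearly map a raw ADC reading between the calibrated
    at-rest and fully-stretched values to a wrist angle in degrees,
    clamped to the valid range."""
    t = (reading - rest) / (stretched - rest)
    return max(0.0, min(1.0, t)) * max_angle
```

With one calibrated sensor per degree of freedom (flexion/extension, pronation/supination), these angles drive the remote avatar's wrist directly.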
The vision behind this project was to create a modular, individually customizable general purpose digital device.
This is a collaboration between Yann Leretaille and myself.
The hardware implementation consists of the Raspberry Pi Interface and a 'motherboard'.
The motherboard is host to individual modules which expand the functionality of the Raspberry Pi.
A sample of plugin modules (or Berries):
(Clockwise from top left) A GPIO module, an analog input module, a button module and an Arduino compatible ATmega328p module.
Each module has an additional ATtiny microcontroller which sends an instruction-set to the Raspberry Pi,
specifying communication protocols, hardware ID and functions of the corresponding module.
This allows a user to directly access the higher level functions of the module.
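As a rough illustration, the instruction set a module announces might look like the descriptor below. The field names, values, and JSON encoding are my assumptions for the sketch, not the actual wire format Yann and I used:

```python
import json

# Hypothetical self-description a plugin module (Berry) could
# announce to the Raspberry Pi on connection.
descriptor = {
    "hardware_id": "0x2A",        # assumed unique per module type
    "protocol": "i2c",            # how the Pi should talk to it
    "functions": [                # higher-level functions it exposes
        {"name": "read_analog", "args": ["channel"], "returns": "uint16"},
        {"name": "set_gain",    "args": ["level"],   "returns": None},
    ],
}

# Serialized for transfer; the host parses it to build its API.
payload = json.dumps(descriptor)
```

Once the host has parsed such a descriptor, it can expose `read_analog` and `set_gain` to the user without the user ever touching registers or protocols.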
We designed a programming environment intended to allow people otherwise not experienced with physical computing
to effortlessly create interactive devices. It is a graphical interface which creates nodes for each input and each output of the system.
Depending on which modules are connected, it will load the corresponding nodes. Users can create logic nodes, and connect them using a
logic-flow approach. Below you can see a demo of an early mock-up of this environment:
In the above demo, I have three outputs (let's assume they are connected to lamps) and two inputs (let's assume they are light switches).
The first lamp is turned on if the first switch is turned on. The second lamp is only turned on if *both* switches are turned on (due to the 'and' operator),
and the last lamp is controlled by the second switch.
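The wiring of that demo can be expressed as a tiny function over the graph's inputs (a Python sketch of the logic flow, not the environment's actual code):

```python
def evaluate(switch1, switch2):
    """Evaluate the demo graph: lamp1 follows switch1, lamp2 is the
    'and' of both switches, lamp3 follows switch2."""
    return {
        "lamp1": switch1,
        "lamp2": switch1 and switch2,
        "lamp3": switch2,
    }
```

The graphical environment effectively lets users build this function by dragging nodes, re-evaluating it whenever an input changes.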
Below is a screenshot of the circuits Yann and I designed:
A Flock of Birds is an art exhibit which explores the coupling of sensing, actuation, and interaction in a folded paper substrate.
When folding paper, the act of folding is the input. The fold acts as the algorithm: it computes the output, translating it from the input. The resulting shape is the output of the interaction.
The origami dove, which flaps its wings when its tail is pulled, is an example of a more complex folding interaction. We use a flock of these origami doves and two special "leaders". Pulling the leaders' tails flaps the leaders' wings, and in addition the flock reacts and copies the leaders' movements.
This art piece was the result of the COCA201 (Computing & Creative Arts) course at Queen's University.
There are various ways of actuating materials. I find materials which self-actuate far more compelling than materials driven by an external motor. Shape memory alloys (SMAs), like bimetals, change their shape depending on their temperature. There are various types of SMAs which behave differently. Here I was trying to familiarize myself with basic Nitinol, integrating it into paper.
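Nitinol wire is typically actuated by resistive heating, pulsing current through it so it contracts without overheating. A minimal duty-cycle sketch of that idea (the `set_pin` driver function is hypothetical; the duty and period values are illustrative, not tuned for any real wire):

```python
import time

def pulse_sma(set_pin, duty=0.3, period=0.02, cycles=5):
    """Drive an SMA wire with a PWM-like duty cycle.
    set_pin(state) is a hypothetical driver that switches the
    heating current on (True) or off (False); keeping the duty
    low limits heating so the wire is not overdriven."""
    on_time = duty * period
    for _ in range(cycles):
        set_pin(True)               # heat: wire contracts
        time.sleep(on_time)
        set_pin(False)              # cool: wire relaxes
        time.sleep(period - on_time)
```

Varying the duty cycle then gives rough control over how far the embedded wire, and the paper around it, bends.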
These concepts were later re-explored by my friends Antonio Gomes and Andrea Nesbitt, who used similar methods in actuating MorePhone.
This actually is an idea which factors into all of my projects.
Often the resources required to solve a problem are right before your very eyes.
They are simply hidden, because they have other pre-assigned functions. A big step in problem solving is seeing things for what they are rather than for what they are declared to be. For this project I used a round structural object to create a lamp.
Download the full CV as a PDF or click here for an interactive visualization