
Encode Engage


Final Collection of MArch 2013 Thesis. Encode Engage is a study and experimentation of the relationships between people, technology, and space.


Page 1: Encode Engage
Page 2: Encode Engage
Page 3: Encode Engage

ENCODEENGAGE

Page 4: Encode Engage
Page 5: Encode Engage

Acknowledgments

Poster

Abstract

Thesis Question

Vocabulary

Research Essay

Methodology

Future Visions

Experiments

Prototypes

Final Proposal

Documentation

Fabrication

Exhibition

Conclusion

Bibliography

Appendix

CONTENTS

Page 6: Encode Engage
Page 7: Encode Engage

Thank you to my advisors and peers.

A special thanks to:

Zenovia Toloudi, Rob Trumbour, Alex Cabral, Jen Lee-Michaliszyn

ACKNOWLEDGMENTS

Page 8: Encode Engage

POSTER

Page 9: Encode Engage

ENCODE_ENGAGE
Ryan Kahen
MArch 2012

Poster background: the Arduino control code for the sensing, fan, and LED system, set as a continuous column of text (the same code is reproduced, formatted, on Page 102).

Component labels, 1:1 Detail of the Mother Component with one of three Daughter Components:

Mother Component
Connection Wire
5V DC Brushless Fan Actuator
Light Emitting Diodes (LEDs)
Arduino Microcontroller
0.25 in. Plexiglass Structure Rings
0.25 in. Plexiglass Structure Ribs
Lightweight Inflatable Fabric Skin
Passive Infrared (PIR) Motion Detection Sensor
Connection Wires
Electret Microphone Sensor

Daughter Component
0.25 in. Plexiglass Structure Rings
0.25 in. Plexiglass Structure Ribs
Lightweight Inflatable Fabric Skin
Light Emitting Diodes (LEDs)
Connection Wires

1:4 Three States Diagram: A mapping of the various states that occur with engaging the object.

The poster also carries the thesis abstract, reproduced in full in the Abstract section that follows.

Page 10: Encode Engage

ABSTRACT

Our current digital age has considerably affected the ways in which we operate as humans. Information technologies have increased the speed of our cities through the ways we access, share, and communicate data. The use of mobile technology, specifically the smart phone, has been a key component in this progression into this digital age. These technologies have become situated within our daily lives, causing a shift in the way we engage with both our space and one another. Through the study and experimentation of sensorial technologies, this thesis looks to bridge the gap between the virtual and the physical. Our cities are embedded with sensing technologies, collecting environmental, social, and infrastructural data used as a way to monitor our cities, ensuring safety and efficiency. While these technologies are already situated within our urban fabric, we as the users of the city do not have a direct relationship with them. We become the observed rather than becoming a participant in our city. Rather than having our embedded technologies simply collect data, they can be used to create an environment that both recognizes and responds to us as the users. Through a dialogue initiated by an input/output system, we can create a new relationship between people, technology, and architecture. Through the medium of installation, a new artificial atmosphere is created encouraging curiosity, active participation, and exploration within the fabricated environment.

Page 11: Encode Engage

Fig. 1 Cirriform, Future Cities Lab

Page 12: Encode Engage
Page 13: Encode Engage

Encode /en ‘kōd/ - To convert information to a digital form

Engage /en ‘gāj/ - To occupy, attract, or involve as if to capture interest or attention

Page 14: Encode Engage

RESEARCH ESSAY

Page 15: Encode Engage

“The computer no longer needs to adapt to the user because the opposite is true.”

We are living in an era where society is immersed in a digital world, where we perceive our surroundings by means of technology. The rapid development of technology has created a wave in our culture. Just as the automobile changed the way our world worked, from the implementation of new infrastructure to increased mobility, computers are creating a similar effect. The automobile increased the speed at which our world moved. In the same way, the computer is increasing the speed of our lives. We are easily able to access and share information as well as instantaneously communicate with one another. These advancements in technology, specifically mobile technology, have shaped the way in which we, the users, interact on a social level. “The computer no longer needs to adapt to the user because the opposite is true.”1 Knowing the advancements in technology, we need to ask ourselves how we can continue to build our urban context to create a response to the growth in our digital age. Just as we use technology to communicate with one another, we can use a similar means to communicate with our space, creating an architecture that responds to our social participation.

1 Conrad (p. 63)

Page 16: Encode Engage

Responsive Environments

The utilization of technology in our built context has allowed for the design of responsive environments. Lucy Bullivant defines responsive environments as “spaces that interact with the people who use them, pass through them or by them.”1 These become environments that engage the user, redefining their experience of the space. Through the use of tech elements, the interaction takes on a digital dimension, creating a legible bridge between the virtual world and the physical. As the user senses or creates the input, they in return experience the output. “The power of the responsive environments in this book is precisely that they are not purely reactive or entirely predetermined. Both they and their users learn from experience and redefine their sense of place.”2 The back and forth dialogue between the user and the system is where the success of responsive environments lies. Datagrove by Future Cities Lab (Fig. 2) is an interactive installation designed for the Zero1 Biennial in San Jose, CA. “Datagrove thrives on information from its urban environment.”3 Collecting data from users and streaming Twitter feeds, Datagrove is a responsive architecture creating presence in both physical and virtual space. Because the system requires an initial stimulus, whether environmental or human induced, the response becomes a result the user can understand. These environments create a connection with the user as the user connects with the system.

1 Bullivant (p. 7)
2 Bullivant (p. 17)
3 Future Cities Lab

Page 17: Encode Engage

“Both they and their users learn from experience and redefine their sense of place.”

Fig. 2 Datagrove, Future Cities Lab

Page 18: Encode Engage

Interactive Technologies

When designing in a digital era, knowledge of technology is needed, including tools, fabrication methods, materials, and peripheral technologies. The use of this technology has begun to change the way architects and designers use and think about materials. Manufacturing tools, including computer numerical control (CNC), laser cutting, vacuum forming, and three-dimensional printing, allow rapid prototyping for testing and creating components. Interactive architecture is borrowing technologies from other fields. Sensors and actuators are applied to create a high-tech system resulting in an interaction between the user and the architecture. “Currently a change is taking place in interactive media whereby increased emphasis is being placed on designing and creating interfaces, experiences, and software that are customizable, re-programmable, and adaptable.”1 These tech elements allow the designer to shape both the input and the output of the system. Through the use of software and programmable entities, the designer has control over the system, creating interactions that speak to the digital age. Michelle Addington and Daniel Schodek, authors of Smart Materials and Technologies for the Architecture and Design Professions, speak of the multiple ways of achieving these tech systems. By understanding material properties and the capabilities of these tech systems, designers are able to push technology further to create a new interactive architecture.

1 Fox and Kemp

Fig. 3 Light Drift, Howeler + Yoon

Page 19: Encode Engage

“The issue of controlling physical change is central to issues of design and construction techniques, kinetics, and maintenance, as well as issues of human and environmental information gathering.”2 Michael Fox and Miles Kemp’s Interactive Architecture introduces tools used in creating responsive systems. Elizabeth Diller and Ricardo Scofidio created an interactive wearable technology, Braincoat (Fig. 4), for their Swiss Expo Blur Building. The coat uses information from the wearer to create a new form of communication within the installation. Rather than direct face-to-face communication, light is used to signal compatibility, ranging from antipathy to affinity, between people, as determined by the initial questionnaire filled out upon entering the Blur Building. These designed systems need a way to receive and control the information from their context, whether environmental or social. The first part of the responsive system is the input data. Sensors are used to recognize the information and send it to the next part of the system. Sensors can be placed within two categories: contact based and non-contact based. Contact based sensors deal with direct information exchange, including touch, moisture, pressure, or wind. Non-contact based sensors read information based on presence. These include infrared, sonar, accelerometers, light sensors, and microphones. This sensory information then needs to be processed through the microcontroller to create a response. Microcontrollers are similar to the computers we use every day, but rather than performing multiple tasks, they are designed to do one task very well. “A micro controller is especially good at three things: receiving information from sensors, controlling basic motors and other kinetic parts, and sending information to other computers. They act as an intermediary between the digital world and the physical world.”3

2 Fox and Kemp (p. 73)
3 Fox and Kemp (p. 78)

Fig. 4 Blur Braincoat, Diller Scofidio

Page 20: Encode Engage

Experience of Technologies

As we enter the world of ubiquitous computing, we need to understand how these technologies will alter our experience of the natural and built environments. Erik Conrad, in his article Embodied Space for Ubiquitous Computing, speaks of how ubiquitous computing has grown exponentially. “The average American already owns twenty or more computers.”1 A computer in these terms is an object that contains information processing components, such as televisions, microwaves, and cell phones. We tend to think of these technologies as solely tools, but we need to understand their effects on our culture. As these technologies are built into our environments, these ubiquitous systems alter our social interactions; Conrad argues that all interactions with computers are at some level social. Conrad mentions how we tend to think in a Cartesian way, meaning the properties of physical objects are quantifiable ones. However, the meanings we attribute to space are mainly based on qualitative, sensory experiences. This creates a duality between the qualitative and quantitative experience we have, forcing a pull between the two to understand our experiences of space. “Social space reconciles the physical and the mental, concrete and abstract, and if we consider all interactions with the computer systems ‘social’, then these interactions also have potential to be places where the physical and mental co-mingle.”2 These human-computer interactions can merge the two outlooks on space, creating a bridge between the sensory and tangible, the virtual and the physical.

1 Conrad (pgs. 61-62)
2 Conrad (p. 63)

Page 21: Encode Engage

“If we consider all interactions with the computer systems ‘social’, then these interactions also have potential to be places where the physical and mental co-mingle.”

Fig. 6

Page 22: Encode Engage
Page 23: Encode Engage


Fire Sensors for elevators are located on every floor and in the motor room.

Lighting Sensors turn lights off in rooms automatically if no motion is detected.

Water Sensors, Carbon Dioxide Sensors, and Surveillance Systems monitor building occupancy, which can be used to optimize energy use.

Advanced Smoke Detectors can sense smoke levels and temperature and communicate them to firefighters who are within 100 feet, via wireless sensors installed on the firefighters’ air tanks. In return, the firefighters are also being tracked: the smoke detectors automatically map the position of firefighters within range and communicate their location to the on-call incident commander.

Emergency Management & Communications Sensors detect abnormal biological, chemical, and radiological conditions.

Pavement Sensors keep track of road conditions. They’re most often found on bridges, because bridges have the tendency to freeze first.

Underwater Bridge Pier Sensors monitor the structural safety of bridges, especially older ones. Their use became widespread after the Interstate 35W bridge, which had only been visually inspected, collapsed over the Mississippi River in Minneapolis on Aug. 1, 2007.

River Sensors measure water conditions, currents, and levels, and report them to the U.S. Geological Survey.

Entryway Sensors screen and record people entering the building.

Anemometers measure wind speed and pressure.

Accelerometers measure building movement.

Audio Sensors detect gunfire. Police are alerted and surveillance video can be immediately transmitted.

Traffic Controllers switch light signals when pavement sensors alert them that a vehicle is waiting.

Motion Detectors function as part of security systems.

Parking Sensors report how many parking spots are in use and charge for parking accordingly. As this data is released, apps are being developed to help drivers find parking spaces remotely.

Inlaid Pavement Sensors can detect cars, motorcycles, and bicycles. They communicate with the traffic controllers wirelessly.

Weather and Air Quality Sensors

Triangulation Systems Sensors are crucial for any automatic door system.

Light Sensors turn street lights on as it gets darker.

MEASURING THE CITY: The Infrastructure of Data Collection

With passive and active sensors embedded throughout its infrastructure, the city is already sentient. Cities are sensored up to the gills, monitoring traffic, people, and weather. Chicago is one of the most sensored cities...

Image credit: Flickr user John W. Iwanski

THE NEW CITY

Inaba_2012

Our urban environment is filled with sensors that record data and monitor our city. Yet looking at how many sensors are in our city, none of them give anything back to the user through real-time interaction. These embedded sensors collect environmental information and traffic data, and monitor people, to ensure safety and efficiency in our cities. Looking at these embedded technologies, one must question how we can use these sensors to create a responsive architecture that not only recognizes us but also responds to us.

Page 24: Encode Engage
Page 25: Encode Engage

As technologies develop, professions adapt, utilizing these advancements in the progression of their field. Architecture, stuck in past ideologies, is still playing catch-up to these technologies. Few are taking strides forward, using new means of tooling and design methods. As technology progresses from past discoveries, architecture can advance through the use of integrated technologies and new means of production to situate itself within our digital age. In today’s society, the cultural norm is to have the latest trends in technology. With these rapid advancements, the demand and desire for technology is increasing. As more people own and use these technologies, they become ubiquitous. Due to this, our cities are becoming smart. Mobile phones offer information instantaneously in the palm of our hands, at any time and place. This use of technology is reshaping our culture and the way we perceive, not only ourselves, but also the space we occupy. By embedding our urban context with interactive technologies such as sensors, lighting, and kinetic systems, we can create an environment that recognizes and responds to our social participation. Through the design of responsive environments, architecture can embrace the technological movement and redefine itself in today’s culture.

Conclusion

Page 26: Encode Engage
Page 27: Encode Engage

In our current digital culture, how can we fabricate a new relationship between people, technology, and architecture?

Page 28: Encode Engage

Mind map keywords: SOCIALLY RESPONSIVE ARCHITECTURE, INTERACTION, USE, PHYSICAL, NONPHYSICAL, SENSORY, VISUAL, TACTILE, AUDITORY, CULTURE, OF THE PEOPLE, OF THE CITY, CONNECTIVITY, MOBILITY, INFORMATION, MOBILE TECHNOLOGY, TABLETS, SMART PHONES, SOCIAL MEDIA, INTERNET, NETWORKING, AWARENESS, FACTORS, ENVIRONMENTAL, HUMAN, TECHNOLOGICAL, THOUGHT, PERCEPTION, MEMORY, SITE CONTEXT, URBAN, PUBLIC SPACE, BUILT, DISPLAY, EASE, FUNCTION, TIME, SPACE, MOVEMENT, KINETIC ARCHITECTURE, CHANGE, SHAPE, FORM, CHARACTER

MIND MAPPING

Page 29: Encode Engage

Mind Mapping

Mind mapping was used for preliminary thesis topic discovery, as a method to organize thoughts and explore many avenues of interest. By focusing on responsive architecture, I developed keywords that led to my further research and development of the topic. Some of these main keywords were connectivity, interaction, and mobility.


Page 30: Encode Engage
Page 31: Encode Engage

Lost in iPhone City

Our smart phones have become the new lens through which we view society. In the palm of our hands lies a mobile computer that allows information to be accessed at any moment. This smart technology is becoming the new way of controlling our social being, but will it be the new way in which we control our city?

Page 32: Encode Engage
Page 33: Encode Engage

kahendesign.wordpress.com

Page 34: Encode Engage

METHODOLOGY

Page 35: Encode Engage

My methodology, or design procedure, looked towards a means of bridging the gap between our physical realm and our digital world. My initial interest in mobile smart technology and research of embedded technologies within our built environment led to an exploration of a new relationship between people, technology, and space. My methodology was a three-tiered system consisting of visions for a future city, explorations in technological systems, and prototyping these systems as responsive objects. The future city visions looked at different ways our space can sense us as the users and output a direct response. These visions tested interventions through scale, responses, and location. The technology based experiments looked to the tooling used in creating a responsive object. A micro controller was used as the means of creating a closed loop system. Through a study of sensors and actuators, I was able to collect real time data and translate it into an immediate response. Each of these tests became building blocks, allowing for a library of systems that can be applied towards a larger, more complex system. These experiments were applied to fabricated prototypes where they sensed specific data creating an output of various responses, including light and movement. Similar to the encoded tests, each tier of my methodology acted as an item in a closed loop system where each component would control the process of another.

Page 36: Encode Engage

Responsive System Diagramming

Through the study of responsive systems, set rules needed to be applied. To be considered an active feedback system, both input and output data need to be translated. Sensors are used in collecting the input data, while actuators, lights, or speakers can produce an output response. The means of collecting data needs to be predetermined in order to compute that information into a responsive outcome. Looking at different input scenarios, including communication data, human presence, or environmental data, can dictate an appropriate response, creating a dialogue with our space. Each scenario has its own corresponding sensory component, but the compilation of these sensors and output devices can create a dynamic system with visual, haptic, and spatial elements.
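The closed loop described above can be sketched in a few lines of Arduino code. This is a generic illustration rather than one of the thesis scripts: the analog sensor on pin A0, the PWM output on pin 9, and the threshold of 500 are assumed placeholders.

// A minimal closed-loop sketch: sense, compare, respond.
const int sensorPin = A0;   // input: any analog sensor (assumed)
const int outputPin = 9;    // output: LED or actuator driver on a PWM pin (assumed)
const int threshold = 500;  // predetermined trigger level (0-1023)

void setup() {
  pinMode(outputPin, OUTPUT);
  Serial.begin(9600);       // report readings for calibration
}

void loop() {
  int reading = analogRead(sensorPin);   // collect the input data
  Serial.println(reading);
  if (reading > threshold) {
    analogWrite(outputPin, 255);         // translate the input into a response
  } else {
    analogWrite(outputPin, 0);           // return to the resting state
  }
  delay(50);                             // sample roughly 20 times per second
}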

Page 37: Encode Engage

Environmental Sensing of Sunlight: Intervention in Space / Intervention Responds with Sunshading

Proximity Sensing of User Engagement: Intervention in Space / Intervention Responds with Change in Shape

Data Sensing of Mobile Technologies: Intervention in Space / Intervention Responds with Light Display

Page 38: Encode Engage

Intervention Responds with Change in Shape

A kinetic output response is applied to the system where proximity sensing is used to detect human presence. As people trigger the sensor, the component changes form, creating a new overhead condition within the given space.

Page 39: Encode Engage
Page 40: Encode Engage

Intervention Responds with Light Display

A light output is produced based on the input data of the users. Light is used as a means to communicate data visually within the system. As people inhabit the space, the illuminated response they create enlivens the space for them as well as others.

Page 41: Encode Engage
Page 42: Encode Engage

Intervention Responds with Sunshading

The responsive system can also produce indirect responses. As it changes shape, it can serve as a functioning system. The system has the possibility to act as shelter from the environment, creating sunshading or a rain screen.

Page 43: Encode Engage
Page 44: Encode Engage
Page 45: Encode Engage

Future Vision of Technology in Architecture

As we interact with our current mobile technology (smart devices), a future architecture will also create an integrated relationship between our technology and our space. An architecture that has the capability to recognize and respond fits within the constant flux of our current digital culture. The real-time change of spatial qualities, directed by people, creates a new experience of space. These responsive environments are meant to redefine our sense of place, both spatially and socially, through a playful construct. They have the ability to enhance the atmosphere of a space, creating a more dynamic public place within our cities.

Page 46: Encode Engage
Page 47: Encode Engage

This proposal looked to a suspended system along Winter and Summer Streets in Downtown Crossing. By developing a kinetic system, the form of each component would change based on the location of a person in space. Light is used in these representations to highlight the movement of the system. Responding to a person’s place, the system also works in a temporal manner. The light is delayed, leaving behind a moment in time even after the user has passed. The user can not only alter the shape of the system, but also leave a momentary instance in space. The system becomes a way to reshape our perception of the space based on human interaction.

Page 48: Encode Engage
Page 49: Encode Engage
Page 50: Encode Engage
Page 51: Encode Engage

Stages of Canopy Movement

Set within Dewey Square, in Boston’s Downtown District, this proposal looked at interactions with an architectural object. As the user engages the object, it would respond through transformation between different states. Each state would correspond to a spatial quality fitting the needs of the social experience happening in the area. Beginning with the object in its first state, it would resemble a column or a pillar. As a person engages it, the object would draw itself up, creating a canopy system. This scenario in Dewey Square can create shelter for the open plaza, allowing for a destination in the busy downtown area.

Page 52: Encode Engage
Page 53: Encode Engage
Page 54: Encode Engage
Page 55: Encode Engage

A further investigation of the change in states of the object as users engage it. This scenario continues to look at the original column-like state and its kinetic transformation as people inhabit the space around it. As people engage, the armatures grow outward, creating a space within the object itself. Rather than just a canopy system, the object can create enclosures to fit the needs of its users. Each object can respond to the density of people in real time, allowing for multiple social encounters to happen throughout the space, each with their own defined areas of engagement.

Page 56: Encode Engage
Page 57: Encode Engage
Page 58: Encode Engage
Page 59: Encode Engage

The Sensory Engagements

Using a series of sensors, users can engage with the installation. Based on both location in place and physical engagements, people can create a dialogue with their built environment through their interaction with the technologies.

Proximity_By establishing a range for the sensor, the object creates its own “personal space”, initiating a response within the installation.

Haptic Sensing_Using capacitive sensing technologies, physical engagement can be used to create a response. In this scenario, when the central core is touched, it allows the user to drive the actuator, altering the size of their immediate space.


Page 60: Encode Engage
Page 61: Encode Engage

A Study in Sensor Technologies

As part of my methodology, I used technology as a means for creating a response. A micro controller was used as a tool in conducting a series of sensory input/output systems. This study allowed for the exploration of emerging technologies, becoming a new medium in the architectural field. Each experiment provided experience in wiring and coding tech components to physically create a system that responds. Each test became a building block that allowed for the further investigation of this tool to be applied to a more complex sensorial system.

Page 62: Encode Engage

Arduino Test_1

LED Blink_Introduction to Arduino_initial understanding of coding and wiring

The script states that the LED light will turn on for one second, then turn off for one second. This action is set on a loop, creating a consistent blinking of light.
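The test script itself is not reproduced here; a minimal sketch matching the description, assuming the LED sits on the Arduino's built-in pin 13, would look like this:

const int ledPin = 13;   // built-in LED on most Arduino boards (assumed)

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  digitalWrite(ledPin, HIGH);  // LED on for one second
  delay(1000);
  digitalWrite(ledPin, LOW);   // LED off for one second
  delay(1000);
}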

Page 63: Encode Engage

Arduino Test_2

Proximity Sensing_Incorporating a PING))) sensor to create a feedback loop system

The script states that when the sensor detects an object within its set range, it initiates the response. This can be used as a means to detect the presence of a person as they engage an object.
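The full PING))) sketch used for the first prototype appears on Page 72; a condensed sketch of just the trigger, read, and range-check logic described here, with the sensor on pin 7 and an LED on pin 13 assumed, would be:

const int pingPin = 7;   // PING))) signal pin (assumed)
const int ledPin = 13;   // response output (assumed)

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // trigger the PING))) with a short HIGH pulse, then time the echo
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);
  pinMode(pingPin, INPUT);
  long inches = pulseIn(pingPin, HIGH) / 74 / 2;  // echo time converted to inches

  // respond only when someone enters the set range
  digitalWrite(ledPin, inches <= 36 ? HIGH : LOW);
  delay(100);
}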

Page 64: Encode Engage

Arduino Test_3

Passive Infrared Sensing (PIR)_Incorporating a PIR sensor to create a feedback loop system.

The script states that the PIR sensor will measure the change in heat levels in its range to initiate a response. This is used to detect motion in the space triggering a response as movement occurs.
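A minimal sketch of the PIR test as described, assuming a PIR module whose output goes HIGH on motion, wired to pin 2, with an LED on pin 13 as the response:

const int pirPin = 2;    // PIR sensor output (assumed)
const int ledPin = 13;   // response output (assumed)

void setup() {
  pinMode(pirPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // most PIR modules hold their output HIGH while a change in
  // sensed heat (motion) is detected within their range
  if (digitalRead(pirPin) == HIGH) {
    digitalWrite(ledPin, HIGH);   // respond while movement occurs
  } else {
    digitalWrite(ledPin, LOW);    // rest when the space is still
  }
  delay(100);
}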

Page 65: Encode Engage

Arduino Test_4

Arduino Controlled Servo Motor_Using a servo motor as an actuator to translate an output response

The script states that, based on the input data, the motor will turn between 0 and 180 degrees. This was used in creating a linear actuator to drive a system along the fabricated axis through its different states.
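A minimal sketch of a 0 to 180 degree servo response, assuming the standard Arduino Servo library, the servo signal wire on pin 9, and an analog input on A0 standing in for the input data (the continuous-rotation version actually used for Prototype 2 appears on Page 78):

#include <Servo.h>

Servo actuator;              // standard hobby servo
const int sensorPin = A0;    // input data source (assumed)

void setup() {
  actuator.attach(9);        // servo signal on pin 9 (assumed)
}

void loop() {
  int reading = analogRead(sensorPin);        // 0-1023
  int angle = map(reading, 0, 1023, 0, 180);  // scale the input to 0-180 degrees
  actuator.write(angle);                      // drive the motor to that position
  delay(20);                                  // give the servo time to move
}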

Page 66: Encode Engage

Arduino Test_5

Capacitive Sensing_Using capacitive sensing to create a response based on a haptic input.

The script states that as the set pad is touched, it will recognize a change in electrical current flow. If the flow is greater than a set amount, it will trigger the output response.
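A minimal capacitive sensing sketch of the kind described, assuming the commonly used CapacitiveSensor library with a send pin on 4, a receive pin on 2, and an LED on pin 13; the threshold value is a placeholder that would be calibrated from the serial readings:

#include <CapacitiveSensor.h>

// touch pad wired between pins 4 (send) and 2 (receive) through a
// high-value resistor (assumed wiring)
CapacitiveSensor pad = CapacitiveSensor(4, 2);
const int ledPin = 13;          // response output (assumed)
const long threshold = 1000;    // set amount that counts as a touch (assumed)

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  long reading = pad.capacitiveSensor(30);   // sample the change in current flow
  Serial.println(reading);                   // print for calibrating the threshold
  if (reading > threshold) {
    digitalWrite(ledPin, HIGH);              // trigger the output response
  } else {
    digitalWrite(ledPin, LOW);
  }
  delay(50);
}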

Page 67: Encode Engage

Arduino Test_6

LED Fade in Parallel_Using multiple LEDs wired in parallel to fade in and out through various degrees of brightness

The script states that the LEDs will loop through a determined level of brightness. Beginning in an off state, the light will move through five states turning brighter until it reaches a level of 50% brightness. This allows for a fading light that can be applied to a specific state of the object.
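A minimal sketch of the stepped fade described here, assuming the parallel LEDs share a single PWM pin (11) and stepping in roughly five increments between off and half brightness:

const int ledPin = 11;    // LEDs wired in parallel on one PWM pin (assumed)
int brightness = 0;       // current PWM level
int fadeAmount = 25;      // roughly five steps between off and half brightness

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  analogWrite(ledPin, brightness);
  brightness = brightness + fadeAmount;

  // reverse direction at the off state and at roughly 50% brightness (125 of 255)
  if (brightness <= 0 || brightness >= 125) {
    fadeAmount = -fadeAmount;
  }
  delay(200);   // hold each step so the fade reads clearly
}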

Page 68: Encode Engage

Arduino Test_7

Arduino Controlled DC Fan_Using a fan as the actuator to translate the output response

The script states that as the sensor is triggered on, it initiates the fan to turn on. This will begin to inflate the skin of the object. Using air as a means of changing state offers a softer atmospheric quality to the object rather than mechanized movement.
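A minimal sketch of the fan test, assuming a digital sensor output on pin 2 and the 5V fan driven through a transistor on PWM pin 9:

const int triggerPin = 2;   // any digital sensor output (assumed)
const int fanPin = 9;       // 5V DC fan driven through a transistor (assumed)

void setup() {
  pinMode(triggerPin, INPUT);
  pinMode(fanPin, OUTPUT);
}

void loop() {
  if (digitalRead(triggerPin) == HIGH) {
    analogWrite(fanPin, 255);   // fan on: begin inflating the skin
  } else {
    analogWrite(fanPin, 0);     // fan off: the skin slowly deflates
  }
  delay(100);
}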

Page 69: Encode Engage

Arduino Test_8

Electret Mic_Using an electret microphone to translate communication data into an output response.

The script states that as the sensor collects sound data, if the level is greater than a determined amount, it triggers the output response. Here light is used to output the sound data, creating a pulsating response that reacts to speech happening around the mic.
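A minimal sketch of the microphone test, assuming an electret microphone module read on A0 and a light on pin 11; the threshold of 600 mirrors the value used in the Prototype 3 code on Page 88:

const int micPin = A0;      // electret microphone module output (assumed)
const int ledPin = 11;      // light used to output the sound data (assumed)
const int threshold = 600;  // determined amount of sound that triggers a response

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int level = analogRead(micPin);   // sample the sound level
  if (level > threshold) {
    digitalWrite(ledPin, HIGH);     // pulse the light when speech is loud enough
  } else {
    digitalWrite(ledPin, LOW);
  }
  delay(10);                        // fast sampling keeps the pulse responsive
}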

Page 70: Encode Engage
Page 71: Encode Engage

Responsive System Prototyping

My methodology continued with the application of the Arduino experiments to functioning prototypes. These prototypes became tools in exploring how these technology tests can be applied to an architectural setting. Through the development of these responsive objects, implementations and tectonics were discovered.

Page 72: Encode Engage

// this constant won't change. It's the pin number
// of the sensor's output:
const int pingPin = 7;
const int ledPin = 13;

void setup() {
  // initialize serial communication:
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
}

void loop()
{
  // establish variables for duration of the ping,
  // and the distance result in inches and centimeters:
  long duration, inches, cm;

  // The PING))) is triggered by a HIGH pulse of 2 or more microseconds.
  // Give a short LOW pulse beforehand to ensure a clean HIGH pulse:
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);

  // The same pin is used to read the signal from the PING))): a HIGH
  // pulse whose duration is the time (in microseconds) from the sending
  // of the ping to the reception of its echo off of an object.
  pinMode(pingPin, INPUT);
  duration = pulseIn(pingPin, HIGH);

  // convert the time into a distance
  inches = microsecondsToInches(duration);
  cm = microsecondsToCentimeters(duration);
  Serial.print(inches);
  Serial.print("in, ");
  Serial.print(cm);
  Serial.print("cm");
  Serial.println();
  delay(100);

  if (inches <= 36) {
    digitalWrite(ledPin, HIGH);
  } else {
    digitalWrite(ledPin, LOW);
  }
}

long microsecondsToInches(long microseconds)
{
  // According to Parallax's datasheet for the PING))), there are
  // 73.746 microseconds per inch (i.e. sound travels at 1130 feet per
  // second). This gives the distance travelled by the ping, outbound
  // and return, so we divide by 2 to get the distance of the obstacle.
  // See: http://www.parallax.com/dl/docs/prod/acc/28015-PING-v1.3.pdf
  return microseconds / 74 / 2;
}

long microsecondsToCentimeters(long microseconds)
{
  // The speed of sound is 340 m/s or 29 microseconds per centimeter.
  // The ping travels out and back, so to find the distance of the
  // object we take half of the distance travelled.
  return microseconds / 29 / 2;
}

Page 73: Encode Engage

Responsive Prototype_1

The responsive system incorporated a proximity sensor and LED to create an interactive object. The PING))) sensor was given a set range that triggered the LED on as someone entered the determined distance. This was a test to observe how people interacted with both the object and the space around it as it responded to their location relative to it.

Page 74: Encode Engage
Page 75: Encode Engage

Prototype Study Models_

Page 76: Encode Engage
Page 77: Encode Engage

Prototype Study Models_

Page 78: Encode Engage

/*
 * Adapted from code by Tom Igoe
 * http://itp.nyu.edu/physcomp/Labs/Servo
 */

/** Adjust these values for your servo and setup, if necessary **/
int servoPin = 2;      // control pin for servo motor
int minPulse = 1170;   // maximum servo speed clockwise
int maxPulse = 1770;   // maximum servo speed anticlockwise
int turnRate = 75;     // servo turn rate increment (larger value, faster rate)
int refreshTime = 20;  // time (ms) between pulses (50Hz)

/** The Arduino will calculate these values for you **/
int centerServo;       // center servo position
int pulseWidth;        // servo pulse width
int moveServo;         // raw user input
long lastPulse = 0;    // recorded time (ms) of the last pulse

void setup() {
  pinMode(servoPin, OUTPUT);  // Set servo pin as an output pin
  centerServo = maxPulse - ((maxPulse - minPulse) / 2);
  pulseWidth = centerServo;   // Give the servo a stop command
  Serial.begin(9600);
  Serial.println("Arduino Serial Continuous Rotation Servo Control");
  Serial.println("  by Orfeus for GRobot.gr");
  Serial.println("  Press < or > to move, spacebar to center");
  Serial.println();
}

void loop() {
  // wait for serial input
  if (Serial.available() > 0) {
    // read the incoming byte:
    moveServo = Serial.read();

    // ASCII '<' is 44, ASCII '>' is 46 (comma and period, really)
    if (moveServo == 44) { pulseWidth = pulseWidth + turnRate; }
    if (moveServo == 46) { pulseWidth = pulseWidth - turnRate; }
    if (moveServo == 32) { pulseWidth = centerServo; }

    // stop servo pulse at min and max
    if (pulseWidth > maxPulse) { pulseWidth = maxPulse; }
    if (pulseWidth < minPulse) { pulseWidth = minPulse; }
    // Show me the keys I pressed
    //Serial.print("Key pressed: ");
    //Serial.println(moveServo);

    // print pulseWidth back to the Serial Monitor (comment to undebug)
    Serial.print("Pulse Width: ");
    Serial.print(pulseWidth);
    Serial.println("us");
  }

  // pulse the servo every 20 ms (refreshTime) with current pulseWidth
  // this will hold the servo's rotation and speed till we tell it to do something else.
  if (millis() - lastPulse >= refreshTime) {
    digitalWrite(servoPin, HIGH);   // start the pulse
    delayMicroseconds(pulseWidth);  // pulse width
    digitalWrite(servoPin, LOW);    // stop the pulse
    lastPulse = millis();           // save the time of the last pulse
  }
}

Page 79: Encode Engage

Responsive Prototype 2_

Utilizing a servo motor to raise and lower the structure, creating different states of space as it is engaged. This prototype was a test of both tech components and the scale of the object.

Page 80: Encode Engage
Page 81: Encode Engage

Prototype 2 In Motion Drawings_

The responsive prototype was documented through a series of drawings highlighting its different states. This scenario tests the object as free standing on a central column. Its movement is mapped as the armatures move from its column state to the canopy state.

Page 82: Encode Engage
Page 83: Encode Engage

Prototype 2 In Motion Drawings_

The responsive prototype was documented through a series of drawings highlighting its different states. This scenario tests the object as a suspended object. Its movement is mapped as the armatures move from its column state to the canopy state.

Page 84: Encode Engage

System Components_The overall system works as one unit, but it is comprised of several smaller systems working together: the central core, the structural elements, and the skin of the system.

The Core_This central system houses the ‘brain’ of the installation. The microcontroller, the motor, and the gear train are all housed within this system. The core also acts as the structure, rooting the installation in place. This core eliminates the need for dependency on a substructure or existing conditions. The core also allows for a new means of interaction. As the system moves in response to the user, it reveals the core. This new artifact can then be a new tool of communication between the user and the architecture.

Page 85: Encode Engage

System Components_The overall system works as one unit, but it is comprised of several smaller systems working together: the central core, the structural elements, and the skin of the system.

Structural Elements_As the system receives the input data from the sensor, it needs to turn that into an output response. The kinetics of the installation rely on components having the ability to move and reshape their structure. The skeletal members work as joints to create the movement between object and canopy. The joinery can be studied further, looking at how it works, its dominance in the system, and the material and scale of its components.

Armatures_Option 1 uses long elements to achieve movement, resulting in a large canopy space.

Armatures_Option 2 uses several small components to achieve movement, resulting in a more dynamic transition between states.

Page 86: Encode Engage

System Components_The overall system works as one unit, but it is comprised of several smaller systems working together: the central core, the structural elements, and the skin of the system.

The Skin_The installation has a skin that wraps around the armatures. It is the element that works with the movement. As the system moves, the skin will alter its appearance between the changes from one state to the other. Materiality is a key factor in this effect. The use of fabric can be useful for its properties: it is lightweight and has the capability of distorting its original shape. The skin would also act as an open loop system where it can integrate each component into one readable installation. As the user engages with one object, they will not only affect their immediate space but also the space of surrounding objects.

Page 87: Encode Engage

System Components_The overall system works as one unit, but it is comprised of several smaller systems working together: the central core, the structural elements, and the skin of the system.

Scaled Space_The frame that the armatures attach to can be scaled to create a space within the installation. This would reverse the response of the installation, from creating a canopy as it is engaged to creating an enclosure as users are within its proximity. This situation deals with social interactions, creating scenarios of interaction within the boundaries set by the installation.

Page 88: Encode Engage

const int pingPin = 7;   // the pin that the sensor is attached to
int motorPin = 9;        // the pin that the fan is attached to
int speed = 0;
int sensorPIN = 0;
int led = 11;            // the pin that the LED is attached to
int brightness = 0;      // how bright the LED is
int fadeAmount = 5;      // how many points to fade the LED by

void setup() {
  // initialize serial communication:
  Serial.begin(9600);
  pinMode(motorPin, OUTPUT);
  pinMode(led, OUTPUT);
  digitalWrite(led, LOW);
}

void loop()
{
  // establish variables for duration of the ping,
  // and the distance result in inches and centimeters:
  long duration, inches, cm;

  // The PING))) is triggered by a HIGH pulse of 2 or more microseconds.
  // Give a short LOW pulse beforehand to ensure a clean HIGH pulse:
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);

  // The same pin is used to read the signal from the PING))): a HIGH
  // pulse whose duration is the time (in microseconds) from the sending
  // of the ping to the reception of its echo off of an object.
  pinMode(pingPin, INPUT);
  duration = pulseIn(pingPin, HIGH);

  // convert the time into a distance
  inches = microsecondsToInches(duration);
  cm = microsecondsToCentimeters(duration);
  Serial.print(inches);
  Serial.print("in, ");
  Serial.print(cm);
  Serial.print("cm");
  Serial.println();
  delay(100);

  if (inches <= 5) {
    analogWrite(motorPin, 255);
    analogWrite(led, 255);
  } else {
    analogWrite(motorPin, 0);
    analogWrite(led, brightness);
  }

  // change the brightness for next time through the loop:
  brightness = brightness + fadeAmount;

  // reverse the direction of the fading at the ends of the fade:
  if (brightness == 0 || brightness == 120) {
    fadeAmount = -fadeAmount;
  }
  // wait for 30 milliseconds to see the dimming effect
  delay(100);

  if (analogRead(sensorPIN) > 600)
    digitalWrite(led, HIGH);
  else
    digitalWrite(led, LOW);
  // delay(250);
}

long microsecondsToInches(long microseconds)
{
  // According to Parallax's datasheet for the PING))), there are
  // 73.746 microseconds per inch (i.e. sound travels at 1130 feet per
  // second). This gives the distance travelled by the ping, outbound
  // and return, so we divide by 2 to get the distance of the obstacle.
  // See: http://www.parallax.com/dl/docs/prod/acc/28015-PING-v1.3.pdf
  return microseconds / 74 / 2;
}

long microsecondsToCentimeters(long microseconds)
{
  // The speed of sound is 340 m/s or 29 microseconds per centimeter.
  // The ping travels out and back, so to find the distance of the
  // object we take half of the distance travelled.
  return microseconds / 29 / 2;
}

Page 89: Encode Engage

Responsive Prototype_3

The responsive system compiled several experiments, including proximity sensing, LED outputs, and a DC fan actuator. This prototype tested the scale of the architectural object. At a smaller component scale, this prototype was designed to be dispersed in a field condition. The model was designed to house the electronics within, driving the form to fit the hierarchical needs of each system. The brain (Arduino) was centralized within a bulge, the sensor was pulled down, reaching out towards the people, while the fan was pulled up to draw in air to inflate.

Page 90: Encode Engage
Page 91: Encode Engage

Responsive Prototype_3

The LEDs were wired in parallel to incorporate multiple lights per system. The fade in and out code was incorporated to highlight the resting state of the object. While no activity is being sensed, the lights fade in a way that resembles breathing. As it senses a presence, it turns the LEDs to full brightness, as if the object woke up.

Page 92: Encode Engage
Page 93: Encode Engage

Responsive Prototype_3

The responsive system tested a skin that would be inflated and deflated as the system sensed the presence of a user. Here the fan is being tested to inflate the bag.

Page 94: Encode Engage
Page 95: Encode Engage

Responsive Prototypes_

These prototypes were investigations of how sensory technologies can be embedded within architectural objects to create a responsive environment. Each test pushed the implementation of technology, adding complexity to each system, in a way that produced a more dynamic outcome. The third prototype was expanded further by refining the technology within it and testing it in a field condition. As each Arduino test acted as a building block for the next study, the prototypes too left the opportunity for expansion and further investigation of responsive technologies within architecture.

Page 96: Encode Engage

FINAL PROPOSAL

Page 97: Encode Engage

Applying Technologies to Architecture

Based on the past prototypes exploring sensor technologies within a fabricated system, this final proposal was an extension of prototype 3. This proposal included multiple sensors used to collect data based on different human engagements. The PIR sensor collected data based on location in space. An electret mic was used to translate speech to an output response. LEDs were used as an output, as well as a fan actuator used in inflating and deflating the component's skin. This installation was tested as a field condition where multiple components were constructed to work as a unified system.

Page 98: Encode Engage

Mother Component
Connection Wire
5V DC Brushless Fan Actuator
Light Emitting Diodes (LEDs)
Arduino Microcontroller
0.25 in. Plexiglass Structure Rings
0.25 in. Plexiglass Structure Ribs
Lightweight Inflatable Fabric Skin
Passive Infrared (PIR) Motion Detection Sensor
Connection Wires
Electret Microphone Sensor

Daughter Component
0.25 in. Plexiglass Structure Rings
0.25 in. Plexiglass Structure Ribs
Lightweight Inflatable Fabric Skin
Light Emitting Diodes (LEDs)
Connection Wires

1:1 Detail: Mother Component with one of three Daughter Components
Page 99: Encode Engage

1:1 Detail: Mother Component with one of three Daughter Components (component labels repeated from the facing page)
Page 100: Encode Engage

1:1 Component in Motion: A time lapse of the object as it inflates and deflates

Page 101: Encode Engage

1:1 Component in Motion: A time lapse of the object as it inflates and deflates

Page 102: Encode Engage

// this constant won't change. It's the pin number
// of the sensor's output:
const int pingPin = 7;   // the pin that the sensor is attached to
int motorPin = 9;        // the pin that the fan is attached to
int speed = 0;
int led = 11;            // the pin that the LED is attached to
int brightness = 0;      // how bright the LED is
int fadeAmount = 5;      // how many points to fade the LED by

void setup() {
  // initialize serial communication:
  Serial.begin(9600);
  pinMode(motorPin, OUTPUT);
  pinMode(led, OUTPUT);
}

void loop()
{
  analogWrite(led, brightness);

  // change the brightness for next time through the loop:
  brightness = brightness + fadeAmount;

  // reverse the direction of the fading at the ends of the fade:
  if (brightness == 0 || brightness == 120) {
    fadeAmount = -fadeAmount;
  }
  // wait for 30 milliseconds to see the dimming effect
  delay(100);

  // establish variables for duration of the ping,
  // and the distance result in inches and centimeters:
  long duration, inches, cm;

  // The PING))) is triggered by a HIGH pulse of 2 or more microseconds.
  // Give a short LOW pulse beforehand to ensure a clean HIGH pulse:
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);

  // The same pin is used to read the signal from the PING))): a HIGH
  // pulse whose duration is the time (in microseconds) from the sending
  // of the ping to the reception of its echo off of an object.
  pinMode(pingPin, INPUT);
  duration = pulseIn(pingPin, HIGH);

  // convert the time into a distance
  inches = microsecondsToInches(duration);
  cm = microsecondsToCentimeters(duration);
  Serial.print(inches);
  Serial.print("in, ");
  Serial.print(cm);
  Serial.print("cm");
  Serial.println();
  delay(100);

  if (inches <= 36) {
    analogWrite(motorPin, 255);
  } else {
    analogWrite(motorPin, 0);
  }
}

long microsecondsToInches(long microseconds)
{
  // According to Parallax's datasheet for the PING))), there are
  // 73.746 microseconds per inch (i.e. sound travels at 1130 feet per
  // second). This gives the distance travelled by the ping, outbound
  // and return, so we divide by 2 to get the distance of the obstacle.
  // See: http://www.parallax.com/dl/docs/prod/acc/28015-PING-v1.3.pdf
  return microseconds / 74 / 2;
}

long microsecondsToCentimeters(long microseconds)
{
  // The speed of sound is 340 m/s or 29 microseconds per centimeter.
  // The ping travels out and back, so to find the distance of the
  // object we take half of the distance travelled.
  return microseconds / 29 / 2;
}

Arduino Code

The code written to create the responses

Initial Pin Inputs for each sensor and actuator

Pin Setup

Loop_Telling each component to react and what to react to

Page 103: Encode Engage

1:4 Three States Diagram: A mapping of the various states that occur with engaging the object.

Page 104: Encode Engage
Page 105: Encode Engage

Intimate Space: From Touching to 18”

Personal Space: From 18” to 4’

Social Space: From 4’ to 8’

Public Space: Greater than 8’

Personal Space of the Object

Using ranges based on the levels of human personal space, the object takes on a personal space of its own.

Using the dimensions given, it can detect presence as one enters its personal space, as sketched below.

PIR Sensor

Detection Range
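A sketch of how a measured distance could be sorted into these zones, assuming a reading in inches (for example from the PING))) sensor used in the prototypes); the zone function and the placeholder reading are illustrative only:

// classify a distance reading into the four zones of personal space
int personalZone(long inches) {
  if (inches <= 18) return 0;   // intimate space: touching to 18 in.
  if (inches <= 48) return 1;   // personal space: 18 in. to 4 ft.
  if (inches <= 96) return 2;   // social space: 4 ft. to 8 ft.
  return 3;                     // public space: greater than 8 ft.
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  long inches = 30;                      // placeholder reading for illustration
  Serial.println(personalZone(inches));  // prints 1: within the object's personal space
  delay(500);
}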

Page 106: Encode Engage
Page 107: Encode Engage

Translating Communication

Using the electret microphone, the microcontroller can process the frequency of talking into a light pulse.

Through this response, the installation creates a dialogue through the remapping of communication that occurs in physical space.

A study in Processing to test the variable pulses that can occur when paired with a microphone.
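The study itself was carried out in Processing; an equivalent Arduino-side sketch, assuming the microphone on A0 and the light on pin 11, would map the sound level to the pulse rate roughly like this:

const int micPin = A0;   // electret microphone output (assumed)
const int ledPin = 11;   // light output (assumed)

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int level = analogRead(micPin);             // how loud the talking is
  int pause = map(level, 0, 1023, 400, 40);   // louder speech, faster pulse
  digitalWrite(ledPin, HIGH);                 // one pulse of light...
  delay(pause);
  digitalWrite(ledPin, LOW);                  // ...followed by a matching rest
  delay(pause);
}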

Page 108: Encode Engage
Page 109: Encode Engage

Parametric Organization

Using a scripted definition, each mother object becomes an attraction point, allowing for a series of daughter components to become arranged around it, as sketched in the example below.

Attraction Point Selection / Grid Dimensioning / Attraction Point Computation / Component Mapping
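The definition itself was built in a visual scripting environment; as a rough stand-in only, the placement logic can be sketched in plain C++, with the mother's position, the offset radius, and the count of three daughters all assumed values:

#include <cmath>
#include <cstdio>

struct Point { double x, y; };

int main() {
  const double kPi = 3.14159265358979;
  Point mother = {0.0, 0.0};   // attraction point selected on the grid (assumed)
  double radius = 4.0;         // offset of daughters from the mother (assumed)
  const int daughters = 3;     // three daughter components per mother

  for (int i = 0; i < daughters; ++i) {
    double angle = i * 2.0 * kPi / daughters;   // evenly spaced around the mother
    Point d = {mother.x + radius * std::cos(angle),
               mother.y + radius * std::sin(angle)};
    std::printf("daughter %d at (%.2f, %.2f)\n", i + 1, d.x, d.y);
  }
  return 0;
}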

Page 110: Encode Engage
Page 111: Encode Engage

Fabrication Process_

Each component was 3D modeled using a radial rib structure. These ribs were nested and laser cut on 0.25” plexiglass. They were constructed by slipping the pieces together using their notched joints. A series of three systems were constructed. Each system consisted of one main object and three corresponding daughter components.

Page 112: Encode Engage
Page 113: Encode Engage

Fabrication Process_

The Arduino brains were wired and encoded to be placed within the central portion of each main component. Every object received 4 LEDs that were wired in parallel to translate light responses across the field.

Page 114: Encode Engage
Page 115: Encode Engage

Fabrication Process_

Here one full system of the entire prototype installation is completed. The main central component houses the microcontroller, 4 LEDs, a PIR sensor, and a 5V DC fan. Three daughter components are wired to the main system and translate output responses that occur from the sensors centrally housed.

Final Exhibition_

An exhibition was curated as the final presentation of our thesis work. My installation was displayed along with final drawings. The format of the exhibition was set in three acts, with three students in each. It offered the opportunity both to display my work for a larger audience and to present my ideas, initiating a discourse on integrated technologies within architecture.

WHY? Why push for this technology in architecture?

DISCUSSION

This thesis looks toward a future architecture that pushes the use of technologies as they become embedded within our culture. We have grown dependent on our technology, specifically our mobile smart devices, and we now have immediate access to information in the palm of our hands. As these technologies continue to develop, they become more of a prosthetic to our bodies than an additive item. The real-time data that these devices offer raises the question of why our physical space does not respond to us in a similar fashion.

My research for this thesis looked to others who are designing responsive architectural interventions. Based on this research, I gained an understanding of how they incorporated technology into their architecture. Many projects are surface oriented, where a building façade, wall panel, or overhead system is the responsive element. Other implementations of responsive technologies are installations that become a spectacle. While some responsive systems are purely playful and evoke a sense of wonder in the space, others work toward functionality. These functional projects look toward environmental stimuli, including façade panels that react to sunlight or overhead canopies that create shelter from sun or rain. My interests in responsive architecture lie outside of environmentally driven projects and toward an architecture that responds directly to people. Since people are the inhabitants of space, we should have the ability to communicate with our architecture and have it communicate back in real time. This thesis looked at the balance between playfulness and functionality, creating an awareness of these technologies in architecture in order to understand the possibilities they hold for our future spaces.

My experimentation throughout this thesis process allowed me to explore the many possibilities sensory technologies hold for the future of architecture. I gained insight into how others approach this design problem, which informed my own approach to how architecture can be responsive. I began by looking at the ways in which we use our smartphones, as individual objects that can act as controllers. I then moved toward looking at architecture as a machine that can perform with us. This development took a softer approach to responsive architecture and began to add a lifelike quality to these installations. If we look toward architecture with embedded intelligence, it should be able to respond both to us and to other architecture. This thought process unveiled the ideas of biomimicry, where natural phenomena inform the characteristics of fabricated systems. This led to my interest in creating a fully ‘living’ architectural environment through the use of embedded technology and a thoughtful tectonic material relationship as a responsive system.

BIBLIOGRAPHY

Addington, Michelle, and Daniel Schodek. Smart Materials and Technologies for the Architecture and Design Professions. Oxford: Architectural, 2005.

Beesley, Philip. Kinetic Architecture & Geotextile Installations. N.p.: Riverside Architectural, 2007.

Bullivant, Lucy. Responsive Environments: Architecture, Art and Design. London: V&A Publications, 2006.

Conrad, Erik. “Embodied Space for Ubiquitous Computing.” Responsive Architecture: Subtle Technologies 2006. N.p.: Riverside Architectural, 2006. 60-63.

Fox, Michael, and Miles Kemp. Interactive Architecture. New York: Princeton Architectural, 2009.

Inaba, Jeffrey. “Sensorial City.” Adaptation: Architecture, Technology, and the City (2012): 22-23.

Shepard, Mark, ed. Sentient City. Cambridge: MIT, 2011.

Blog URL

Kahen Design <http://kahendesign.wordpress.com/>

Precedent URLs

Future Cities Lab <http://www.future-cities-lab.net/>

Howeler and Yoon <http://www.mystudio.us/>

Diller and Scofidio <http://DSRNY.COM/>

Bjarke Ingels <http://big.dk/#projects>

APPENDIX

NeoPlayformZ_

As our final presentation, we put on an exhibition to display our thesis work. The exhibition was structured under the theme of a play, with three acts to organize the projects. Act I was titled Networked Terrains; this collection of projects centered around utopian architectural ideas ranging from wireless networks to rooftop inhabitation. Act II was titled Reactive Arrangements; these projects looked toward transformable architecture and restructured ways of thinking about spatial organization. Act III was titled Augmented Interludes; this collection of projects tested different environments that can be created through various mediums, including dream states, technology, and the dark. We constructed display panels to be arranged in specific configurations corresponding with each act. The exhibition itself became a performance as each act was taken down and reassembled.

NeoPlayformZ Act III Augmented Interludes_

During this final act, Augmented Interludes, there are three explorations of heightened social experiences manifested through very different methodologies and resulting in intriguing and temporal experiences for the users. The first part, Architecture Asleep, uses a theoretical and scientific approach to define a waking architectural reality through the use of dream functions and elements. The second part, Encode_Engage, activates the users’ senses with a responsive and reflexive architecture that explores both tectonics and technology. The final part, Social Darkness, explores the social and sensorial benefits of engaging with people, food, and architecture in complete darkness. These three theses present new social and experiential architectures that specifically engage the users in a moment of time in order to heighten their sense of place and self.

Special Topics Studio Fall 2012_

This installation is a student-designed project for the fall 2012 Special Topics Studio. The work is the product of a collaboration among the members of our studio class (Samantha Altieri, Viviana Bernal, Erblin Bucaliu, Katherine Bujalski, Brittany Carey, Kristen Giannone, Ryan Kahen, Mark Morin, Bao Nguyen, Samantha Partington, Charles Simmons, Liem Than, Robert Trumbour {instructor}, Alex Cabral). The installation is in response to an intensive 10-day travel component conducted at the start of the semester, including visits to design and fabrication studios in New York City, the landscape of Big Bend, Texas, and Marfa, Texas, to see the work of artist Donald Judd.

The studio offered the opportunity to work at a one-to-one scale. Through prototyping and fabricating systems at full scale, we were able to encounter and solve problems not seen in previous studio courses. Working through the design schemes, we each had the opportunity to focus on specific aspects of the design. I dealt with the lighting components to be integrated within the system, focusing on new fabrication techniques with the CNC machine and cast moldings. I also used generative design programs to develop algorithmic solutions for the system at both the scale of an individual component and the populated field.

Steel Conduit Pipe (Flattened Ends)

Metallic Two-Hole Strap

5/16” x 1-1/4” Galvanized Steel Carriage Bolt

1/2” x 10’ PVC Vertical Member

3/4” .020 Type 304 Stainless Steel Strapping

1/2” x 3’ PVC Vertical Member

Metallic Two-Hole Strap (Fastened with 5/16” Galvanized Steel Carriage Bolt)

1/2” x 10’ PVC Vertical Member

7/8” x 20” Steel Thinwall Conduit Pipe

Resin Cast Light Casing

PVC Piping

Halogen Light Bulb

Wiring for Light

Drilled Hole in PVC to House Wire

SITE PLAN
