

Proxemics Awareness in Kitchen As-A-Pal: Tracking Objects and Human in Perspective

Dipak Surie, Berker Baydan, Helena Lindgren User Interaction and Knowledge Modeling Group

Dept. of Computing Science, Umeå University Umeå, Sweden

{dipak, mcs10bbn, helena}@cs.umu.se

Abstract—Spatial relationships, or proxemics, play an important role in how humans interact with other people and objects in an environment, yet spatial relationships are not thoroughly exploited within smart environments. Objects designed to be aware of their proxemics facilitate implicit and explicit interaction with humans. Kitchen As-A-Pal is an interactive smart kitchen that provides an infrastructure for sensing and modeling proxemics among objects and humans using a sonar network and RFID technology. Position, movement, identity and location are the proxemics dimensions explored in Kitchen As-A-Pal. A pilot study of a breakfast scenario comprising 9 everyday activities in Kitchen As-A-Pal with 2 subjects yielded promising proximity tracking results, with a precision of 100% and a recall of 68.3% for spatial zones with high and medium average time percentages (ATP). Also, 53.21% ATP has more than 95% recall values.

Keywords - proxemics awareness; spatial modeling; proximity tracking; smart objects; smart environments

I. INTRODUCTION

Smart environments are everyday environments augmented with computational intelligence in the objects and people that occupy them, with the primary purpose of understanding human needs and facilitating human-centered services [1] that enhance people's activity performance, lifestyle and well-being. How people interact with computational intelligence that is distributed among several objects in the environment remains an interesting and open challenge. Context-awareness enables implicit human interaction by sensing and inferring context [2], and is also key to offering human-centered services. Contextual information about the different entities in an environment that are relevant for enhancing human interaction with computing applications is given top priority [3]. While context-awareness in general is a widely researched topic [4], the spatial relationships between the different entities in an environment offer valuable cues but are often poorly understood and underused in designing smart environments. Also, the lack of an infrastructure for implicit tracking of spatial relationships, or proxemics, among objects and humans within smart environments presents a challenge for facilitating proxemic interaction [5, 6].

Proxemics drives human interaction with surrounding objects and other people in an environment [7]. Humans understand and make use of proxemics naturally in everyday environments. How far a person is from an object, whether they are moving towards or away from it, turning the body to the left or right of an object, placing objects together on a surface or in a container, and moving an object from location A to B all have inherent meaning, and affect human-object interaction and human activity performance. Proxemics between objects and people changes dynamically and depends on context, making it hard to keep track of. However, smart environments worthy of the name are expected to be proxemics aware and to apply the inferred meaning in facilitating smart services.

The objects in a smart environment are no longer ordinary, passive objects. They are smart, "proxemics aware" objects belonging to a digital ecology that, apart from maintaining their original properties and functionality, offer the value-added capabilities of being proactive, adaptive and responsive to humans [8]. The challenge is to design such smart objects whose behavior and capabilities are easy for humans to understand and control. Since proxemics cues are natural to humans, they can be exploited to facilitate explicit interaction between smart objects and humans. Also, as the sophistication of a smart environment grows, the intensity of interactions between a human and the surrounding objects increases, making scalability a challenge in terms of human attention, cognitive load and distraction [9]. Proxemics is useful in separating irrelevant human-object interactions from those that are meaningful.

A proxemics aware environment can offer human-centered services such as: a) optimal selection of the smart object(s) for presenting information and obtaining explicit user input from an ecology of smart objects; b) locating objects that are misplaced or lost, like a coffee cup or cutting board, and guiding a human towards them; and c) activity guidance for people with cognitive disabilities by informing them about their current action, which objects to use, where they are located and how to use them for successful activity performance.

This paper explores proxemics among smart objects and humans in Kitchen As-A-Pal, a smart kitchen infrastructure. In particular, four dimensions of proxemics, namely position, location, identity and movement [5, 6], are addressed from a sensing and modeling perspective. Kitchen As-A-Pal is a proxemics aware infrastructure using a network of sonar sensors tracking proximity relationships and a network of RFID readers within smart objects of the type surfaces and containers [10] for identifying co-located objects. This paper is organized as follows: Section 2 describes the proxemics dimensions that are explored in Kitchen As-A-Pal, with insights into the modeling of the different dimensions, the object tracking approach and related work. Section 3 describes human proximity tracking in detail, presenting the sensing infrastructure and the studies conducted. Section 4 presents the evaluation results, while Section 5 discusses the challenges to be addressed. Section 6 presents the conclusion.

2013 9th International Conference on Intelligent Environments. 978-0-7695-5038-1/13 $26.00 © 2013 IEEE. DOI 10.1109/IE.2013.43. p. 157.

II. PROXEMICS AT KITCHEN AS-A-PAL

Kitchen As-A-Pal is an interactive smart kitchen where natural human abilities are used and their limitations are compensated for, enhancing human experience and supporting human activity performance. As-A-Pal means "like a friend". We envision the kitchen to be a companion that is always there to support humans, and a building block of a holistic approach to ambient assisted living [11]. Kitchen As-A-Pal is located at MIT-Huset, Umeå University, Sweden, as an environment for developing and testing ubiquitous computing technologies with potential users participating in the interaction design process.

Kitchen As-A-Pal has an area of 14.7 m2 with spaces for dining, kitchen appliances, mixing and cutting, cooking, dishing and storage. The kitchen environment supports everyday kitchen activities like preparing a sandwich, preparing coffee, having breakfast and doing the dishes. Kitchen As-A-Pal comprises an ecology of smart objects of three types, namely containers, surfaces and actuators [10]. Containers are placeholders for objects, like a refrigerator that contains a milk packet and yoghurt. Surfaces typically have objects on them while a person is performing activities, like the counter top that holds a salad bowl, veggies and a cheese packet while preparing salad. Actuators change the state of the world, like a coffee machine that brews fresh coffee and a stove that heats up the objects on it. Kitchen As-A-Pal contains 5 containers, 7 surfaces and 8 actuators, apart from 36 passively tagged objects. However, within the scope of this paper only the objects on the kitchen table are considered, which include 1 container, 4 surfaces and 5 actuators along with all the passively tagged objects.
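The three smart-object types described above can be summarized in a minimal data model. This is an illustrative sketch only; the class names and fields are assumptions, not the paper's implementation:

```python
from dataclasses import dataclass, field

# Hypothetical minimal data model for the three smart-object types
# (containers, surfaces, actuators); names are illustrative.

@dataclass
class SmartObject:
    name: str
    location: str  # designated location, e.g. "kitchen table"

@dataclass
class Surface(SmartObject):
    objects_on: set = field(default_factory=set)  # tagged objects on hotspots

@dataclass
class Container(SmartObject):
    objects_inside: set = field(default_factory=set)

@dataclass
class Actuator(SmartObject):
    state: str = "off"  # actuators change the state of the world

counter_top = Surface("counter top", "mixing area")
counter_top.objects_on.update({"salad bowl", "veggies", "cheese packet"})
fridge = Container("refrigerator", "storage area")
fridge.objects_inside.update({"milk packet", "yoghurt"})
```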

A. Proxemics Dimensions and Modeling

Edward Hall [7] describes proxemics in terms of the interpersonal relationships between people depending on their proximity to each other. Four spaces, namely intimate (<50cm), personal (0.5-1m), social (1-4m) and public (>4m), are described based on physical proximity. Proxemic Interaction [5, 6] provides a design framework for facilitating human interaction with smart objects in the landscape of ubiquitous computing. In our work we do not identify novel proxemics dimensions, but rather use the proposed dimensions to build a smart kitchen infrastructure that is aware of proxemics. The relative position of a person, in distance metrics, with reference to surrounding objects, the relative position of objects with reference to other objects, the movement patterns of humans and objects, the identity of surrounding objects, and the location of objects and humans offer valuable proxemics cues that are tracked in our work. Refer to Table 1 for further information about the different proxemics dimensions used in Kitchen As-A-Pal and the associated technologies. Of the different proxemics, human proximity to surrounding objects is given particular importance for the following reasons:

• Inspiration from Edward Hall’s theory of proxemics [7].

• Janlert’s proximity principle [12] which says that, “things that are close tend to matter; things that matter tend to be(come) close”.

• Other proxemics like human movement and orientation can be derived from human proximity measures over time.
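Edward Hall's four proximity spaces referenced above can be sketched as a simple distance classifier; this is a minimal illustration, not part of the system:

```python
def hall_zone(distance_m: float) -> str:
    """Classify a distance into Edward Hall's four proxemic zones:
    intimate (<0.5m), personal (0.5-1m), social (1-4m), public (>4m)."""
    if distance_m < 0.5:
        return "intimate"
    if distance_m < 1.0:
        return "personal"
    if distance_m < 4.0:
        return "social"
    return "public"
```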

Table 1. Proxemics description and the sensor technologies used in Kitchen As-A-Pal, inspired by proxemic interaction [5, 6].

The type of smart object in perspective handles spatial relationships among objects (see Section 2B for more information about the object tracking infrastructure). For instance, surfaces usually have several hotspots where tagged objects placed on them are detected. Humans within an activity context generally spread objects on a surface with purpose and inherent meaning. For instance, while preparing salad, the cutting board, kitchen knife, salad bowl and veggies are placed in a specific pattern, with the proxemics among the objects conveying certain meaning. Such patterns are modeled using simple logical rules. It should be noted that the exact location of objects, e.g. veggies on the counter top left, kitchen knife on the counter top right, etc., is less important than their co-location on the counter top. In that sense, the locations on the counter top are flexible for object placement; the key is which objects are together at a particular instant in time.

!"#$%&'()*+'&%,)'#,)*

-#,.%$.*+%)("'/0#,*1'.2',**************3'.(2%,*45646!47* 5%,)',89"%(#8,'0#,*.%(2,#:#8'%)*

!"#$%"&' •  !"#$%&$'(%)*+&,(-$./(&01+2.1%&32)4&,(5(,(%6(&)1&178(6)+&

•  9(-$./(&01+2.1%&7()3((%&)4(&178(6)+&1%&)4(&+#$,)&+",5$6(&:(;';&178(6)+&1%&)4(&#2<2%'&$,($=&

•  >1%)$2%#(%)&01+2.1%&15&178(6)+&32)42%&$&+#$,)&61%)$2%(,&:(;';&178(6)+&32)42%&)4(&?,@2%'&,$6A=&

•  B,1<2#2)@&),$6A2%'&"+2%'&$&%()31,A&15&+1%$,&+(%+1,+&

•  C,$6A2%'&)4(&,(-$./(&01+2.1%+&15&178(6)+&1%&$&+#$,)&+",5$6(&"+2%'&$&%()31,A&15&9DEF&,($?(,+&

•  C,$6A2%'&)4(&61%)$2%#(%)&01+2.1%&15&178(6)+&2%+2?(G1")+2?(&$&+#$,)&61%)$2%(,&"+2%'&$&9DEF&,($?(,&)4$)&,('2+)(,+&178(6)+;&&

()$*&+,%"&' •  !"#$%&$'(%)*+&4($?&$%?&71?@&1,2(%)$.1%&32)4&,(5(,(%6(&)1&178(6)+&

•  H26,1+1I&J2%(6)K7$+(?&4"#$%&4($?&$%?&71?@&1,2(%)$.1%&?()(6.1%&;#,8#',8*1#"<=*

-".*/*&+' •  >4$%'(+&2%&4"#$%&01+2.1%&1/(,&.#(&&

•  >4$%'(+&2%&178(6)&01+2.1%&1/(,&.#(&

•  B$)4&L%?2%'&"+2%'&$&%()31,A&15&+1%$,&+(%+1,+&1/(,&.#(&&

•  M()31,A2%'&15&9DEF&,($?(,+&$%?&?()(6.%'&178(6)&-16$.1%&64$%'(+&1/(,&.#(&

01*&%+2' •  N%2O"(-@&2?(%.5@&:(;';&9")&$%?&P8Q,%=R&?2+6,2#2%$)(&:(;';&B(,+1%&S&$%?&P=&$%?G1,&?(+6,27(&4"#$%&$'(%)+;&

&•  N%2O"(-@&2?(%.5@&178(6)+&:(;';&

D1,A&T&$%?&D1,A&U=&$%?&178(6)&6-$++(+&:(;';&D1,A&$%?&V$.%'&B-$)(=&

•  H26,1+1I&J2%(6)K7$+(?&2#$'(&$%$-@+2+&$%?&#$002%'&)1&(<2+.%'&4"#$%&0,1L-(+&:(;';&0(,+1%&1%&$&34((-&64$2,R&0(,+1%&32)4&H@102$R&()6;=&;#,8#',8*1#"<=*

•  B$++2/(&9DEF&),$6A2%'&)1&2?(%.5@&178(6)+&$%?&"+(&(<2+.%'&178(6)&0,1L-(+&

3"4,%"&' •  C4(&O"$-2)$./(&$+0(6)&15&$&0-$6(&2%&34264&4"#$%&2%)(,$6.1%&32)4&+#$,)&178(6)+&)$A(&0-$6(&:(;';&#2<2%'&$,($R&611A2%'&$,($R&?2+42%'&$,($&$%?&?,@2%'&$,($=&

•  E?(%.)@&15&)4(&+1%$,&+(%+1,&),$6A2%'&4"#$%&0,1<2#2)@&$%?&)4$)&+1%$,&+(%+1,*+&$++162$.1%&)1&+0(62L6&-16$.1%&:(;';&+1%$,&+(%+1,&U&2+&$++162$)(?&)1&)4(&#2<2%'&$,($=&


In Rule R7, the preparing coffee context is made true and the preparing tea context false. The need to make a context false emerges from the fact that objects are associated with several contexts at the same time. For instance, the coffee cup, sugar packet and teaspoon, when placed on the counter top, activate both the preparing coffee and preparing tea contexts. The fact that the coffee jar is co-located instead of the tea box confirms the preparing coffee context and rejects the preparing tea context.
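The kind of simple logical rule described above can be sketched as follows. The function name and rule encoding are illustrative assumptions, not the paper's actual rule syntax:

```python
# Sketch of a co-location rule: shared objects on the counter top
# activate one activity context and deactivate the competing one.

def infer_contexts(objects_on_counter_top):
    """Return the truth values of two competing activity contexts
    given the set of objects currently on the counter top."""
    contexts = {"preparing coffee": False, "preparing tea": False}
    shared = {"coffee cup", "sugar packet", "teaspoon"}
    if shared <= objects_on_counter_top:  # shared objects present
        if "coffee jar" in objects_on_counter_top:   # cf. Rule R7
            contexts["preparing coffee"] = True
        elif "tea box" in objects_on_counter_top:    # symmetric rule
            contexts["preparing tea"] = True
    return contexts
```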

Containers usually have only one hotspot where tagged objects are registered as being inside them or removed from them. The spatial relationship between objects inside a container is limited to sharing a common location, but temporal patterns related to their storage can be modeled. For instance, after doing the dishes as part of the breakfast scenario, the coffee cup, juice glass, porridge bowl and eating plate are added to the drying rack, which already contained the sauce pan and baking dish from the previous day's dinner. Such patterns are also modeled using simple logical rules.

In Kitchen As-A-Pal, the smart objects have designated locations while the tagged objects are mobile. The tagged objects create dynamic proxemics with co-located objects by associating with smart surfaces and containers. Such associations also reveal their identity, another useful proxemics dimension to keep track of. In the current implementation, actuators like the water heater and coffee machine have parts that are mobile, but as a whole those smart objects have well-defined locations on the kitchen table. However, in the future the aim is to allow such actuators to change their location and still be tracked. For instance, a coffee machine might be placed in a new location to facilitate cleaning, and having a fixed location for it would limit the flexibility of the proposed infrastructure.

Kitchen As-A-Pal has several locations (5 in total) around the kitchen table, like the mixing area, cooking area and dishing area, with inherent meaning attached to those locations, which constrain humans from performing certain actions while facilitating others. A network of 5 sonar sensors embedded on the kitchen table associate themselves with the respective locations. Based on which sonar sensor is responsible for human proximity tracking (Sections 3 and 4), and by applying appropriate distance thresholds, the human's location, at least whether they are within the mixing area or cooking area, is determined.
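The location inference described above can be sketched as follows. The sensor-to-location mapping is an assumption for illustration; the per-sensor thresholds are those reported in Section 3, where readings beyond the threshold hit the wall or furniture rather than the person:

```python
# Illustrative sensor-to-location mapping (not specified in the paper).
SENSOR_LOCATION = {1: "dining area", 2: "mixing area", 3: "cooking area",
                   4: "dishing area", 5: "storage area"}
# Per-sensor distance thresholds masking the virtual boundary (Section 3).
THRESHOLD_M = {1: 0.50, 2: 1.00, 3: 1.40, 4: 1.40, 5: 0.75}

def human_location(readings_m):
    """Return the location of the sensor with the closest in-range
    reading, or None if every beam hits the virtual boundary."""
    in_range = {s: d for s, d in readings_m.items() if d < THRESHOLD_M[s]}
    if not in_range:
        return None
    closest = min(in_range, key=in_range.get)
    return SENSOR_LOCATION[closest]
```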

Human movement patterns are obtained by applying human proximity measures over time, while human orientation is obtained from human movement patterns (Section 4B). Knowing movement patterns, like moving closer to an object, can establish communication with that object, while moving away from an object disconnects existing connections. Knowing human proximity measures, complemented by orientation, enables information presentation adjusted to human needs. Knowing identity measures facilitates personalization of a smart environment according to the human's profile. Kitchen As-A-Pal is designed to be proxemics aware both to offer valuable contextual cues to interested applications in the smart kitchen and to use the proxemics cues for facilitating human interaction with smart objects.
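Deriving a movement pattern from proximity measures over time can be sketched as below; the helper and its noise margin are assumptions for illustration, not the paper's algorithm:

```python
def movement_pattern(samples_m, eps=0.05):
    """Classify movement relative to an object from a short window of
    successive distance samples (metres) of one sonar sensor:
    'approaching', 'retreating' or 'stationary'. eps absorbs sensor
    noise (the EZ-1 resolution is 0.025 m)."""
    delta = samples_m[-1] - samples_m[0]
    if delta < -eps:
        return "approaching"
    if delta > eps:
        return "retreating"
    return "stationary"
```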

B. Object Tracking Infrastructure

The object tracking infrastructure is briefly presented here, while human proximity tracking is discussed in detail in the next section. A network of PhidgetRFID readers1 is used to create smart surfaces that are capable of tracking the objects placed on them. Smart surfaces read RFID tagged objects using the EM4102 protocol with a typical read range of 7cm. However, the type of passive RFID tag embedded in an object determines the read range, which is usually between 3cm and 11cm. The RFID readers are strategically positioned to maximize the object recognition coverage. Also, passively tagged objects typically contain more than one tag depending on the surface area of the object. For instance, a bread packet contains 4 embedded passive tags to improve the object tracking coverage, whereas a small knife or a fork contains just one passive tag. The number of RFID readers that should be part of a smart surface was empirically evaluated based on the RFID read ranges, such that the readers do not collide but at the same time offer maximum surface coverage. The precision and recall values for object detection are 100% when tagged objects are placed on the hotspots of the smart surface marked for user perception. Challenges associated with hotspot-based object tracking are described in Section 5.
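Collapsing raw tag reads from a surface's readers into object identities, with several tags per object, can be sketched as follows. The tag IDs and the mapping are hypothetical; only the multi-tag-per-object idea is from the paper:

```python
# Hypothetical tag-to-object mapping: a bread packet carries 4 passive
# tags for coverage, a small knife just one.
TAG_TO_OBJECT = {
    "04F1": "bread packet", "04F2": "bread packet",
    "04F3": "bread packet", "04F4": "bread packet",
    "0A01": "small knife",
}

def objects_on_surface(tag_reads):
    """Map a set of raw tag reads from a surface's RFID readers to the
    set of distinct objects currently on the surface."""
    return {TAG_TO_OBJECT[t] for t in tag_reads if t in TAG_TO_OBJECT}
```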

Standalone RFID readers are used for tracking containers in a semi-automatic manner, where implicit tracking is complemented by explicit human actions to improve accuracy. Since the object tracking granularity is in terms of whether an object is inside a container or not, with spatial relationships between objects inside a container left as future work, the object-tracking infrastructure currently available is extremely accurate. Three Toshiba AT100 tablets running Android 3.1 use the Phidget21 library to access the sensors. Several input/output boards are used, including the PhidgetInterfaceKit 8/8/8 with 6-port hub and the dongle-sized 2/2/21. Raw sensor data are converted to low-level proxemics cues, which are further combined to infer higher-level proxemics between objects and humans in Kitchen As-A-Pal.

1 http://www.phidgets.com/

!"#$%*!&$'#()*&!"#$%&'&(")&*+WX&&&&&&&&&&&&+,(#-,./*&,!"#$%&'()*$%+,-.$%'/01*-&4(-,.*&./001"23&4%1+35&'&6"7&1"23&4%1+38&9:&'&&+3&'&;&'<&%"&"$&&";&%:&&

&%/==&0&"23&4%<&&0&1)*02*&>?&7&==@&<&,!"#-5&4#A$=&12"/'0&,&'()*-5&</B/012"9B&,+,-.-5&

&C@%4:&$1C$@;&&,'/01*-&!SDEF&)'&)/'@$=1</B/014"$%&G%1!H&,*'2"-5&&)'&)/'@$=1</$09@4:14"$%&G%1!Y&,&,!3"-&&&

&&0&1)*03*&>?&4"I&&13/'&,!"#-5&4"I&&14#)&,&'()*-5&<#=/'1)/4C&%&,+,-.-5&

&%&/1<)""$&,'/01*-&!SDEF&)'&)/'@$=14"I&&14"$%&G%1!J&,*'2"-5&&)'&)/'@$=1%&/14"$%&G%1!K&,&,!3"-&&&

!"#$%*-,.%#!"#$%&!"#$%&&'()*&+,-.&&&&&-&"'(!")%&/!"#$%%&%%+'(-$#4%)*(0&4*+&"%&1(2234567)83,69&"7:4;734567)83,6<&=>7"7&,6&"7?7"@&84&4%7&4?&8>7&

&8(&&72&4567)8@&&,-.#*,/%&AB&+)4C773)DE9&6D$)73&F(@@9&E4""$2&7354=F9&7(G%&3EF(87.&

&/-%+$,"+(,4%-(-(+$!.09&HSIJK&24$%&32$@>7@3)4%87L83HMN&/$-2(09&&>(;$%&35"7(*?(@83)4%87L83HMM&/&,!#(0&&&

!"#$%*!&#'"%*(&)*"%+,&!"#$&"%$&"&$&"'$&"()&-".$/"#',*!*++,-*./01*20*$&3-4-.51*20*$&/667-.51*20*$&&

& & & & & &8-9:-.51*20*$&82;-.51*20*)&&


C. Related Work

There are several related works that attempt to keep track of human and/or object locations; they are discussed here in terms of the tracking technology used. Active Badges from AT&T Cambridge [13] use infrared beacons, which can be worn by a person or embedded in an object and are detected by a network of infrared sensors. Active Badges are affected by line-of-sight requirements and short-range signal transmission, but are a reasonable technology within the context of a kitchen environment. Active Bats [14] is an ultrasound positioning system using ultrasonic tags on a person or objects with a network of receivers mounted on the ceiling, making use of ultrasound time-of-flight. Ultrasound positioning usually has high accuracy, and both position and orientation information can be obtained from Active Bats. On the downside, this approach requires a large number of ultrasound receivers across the ceiling, and proxemics are not sufficiently modeled. The Cricket Location Support System [15] from MIT and Dolphin [16] are other available ultrasonic positioning systems.

Received signal strength indication (RSSI) of radiofrequency signals is used for indoor location tracking in RADAR [20]. Even though the system is easy to set up, making use of an existing WiFi network with only a few base stations, it is impractical to include small objects like a coffee cup or fork as WiFi clients. Spatial tracking of smart objects with WiFi client capabilities with reference to a human is described in [17]; however, the proxemics between objects were not addressed. Also, radiofrequency signal strength-based approaches are less accurate for smaller spaces like a kitchen in comparison to ultrasound-based approaches. RSSI values depend on the objects in an environment and change unreliably in the presence of specific objects, for example when a microwave oven or an induction stove is on, making them less suitable for a kitchen environment.

Radiofrequency identification (RFID) is a contact-less technology for locating objects in indoor environments that does not require line-of-sight, unlike ultrasound or infrared-based location tracking technologies. While active tags, which require batteries, are used in LANDMARC [18] and SpotON [19] for tracking objects using RSSI signal strengths, passive tags offer a cheaper, battery-free option for tagging everyday objects like a salad bowl, eating plate or milk packet in a kitchen environment. Passive tags support limited read ranges, but are useful for tracking applications where the objects will be in close range (about 10 cm) of the reader.

Moving on from tracking technologies, we discuss research works that attempt to use proxemics for smart environments. RELATE Gateways [21] use spatial references by capturing the spatial relationship of a person's mobile device with other co-located devices nearby in the environment to facilitate spontaneous interaction. Relative position tracking similar to our work is used, and the nearby objects are presented as a spatial layout of the room. The situative space model [22] is a spatial model that describes human relationships to surrounding objects in terms of the abilities to perceive and act on objects. While the situative space model is used with a certain degree of success within the easy ADL home [23], the model lacks awareness of spatial relationships among surrounding objects. The proxemics between humans and objects discussed in this paper can be complemented by spatial relationships driven by human cognitive abilities like perception, attention and action planning for more advanced spatial models in the future. Proximity measures between an interactive wall and a human are used to define the nature of human-wall interaction in Hello.Wall [24], where three spatial zones are defined using distance as an important proxemics cue. Proxemics awareness is used in a similar manner in the interactive public ambient display [25], where proximity measures facilitate four proxemic zones of interaction, and the concepts of explicit and implicit interaction, and of public and private display, are explored using physical proximity.

III. HUMAN PROXIMITY TRACKING: EXPERIMENTAL SETUP

A. Networking of Sonar Sensors

As mentioned in Section 2, human proximity tracking with respect to surrounding objects is important for proxemics awareness. A network of 5 sonar sensors is used for tracking human proximity around the kitchen table. The sonar sensors are connected to an Android tablet on the kitchen table through the PhidgetInterfaceKit 8/8/8. The MaxBotix EZ-1 sonar sensor2 is used, which operates at a frequency of 20 Hz and offers reliable, stable range finding at a low cost. The detection range is 0.15m to 6.45m, which is suitable for most kitchen environments. With a sensing resolution of 0.025m, the selected sensor is accurate enough for physical proximity measures. In the current implementation real proximity values are used, and an approach based on fuzzy logic is under development to improve reliability.

Initially the Sharp infrared distance sensor 2Y0A21 was also experimented with; however, its beam angle was smaller than that of the MaxBotix EZ-1 sonar sensor. The width of the sonar beam depends on the reflecting object's properties, but the approximate beam angle was empirically found to be around 43.3 degrees. Selecting a sensor with a larger beam angle offers better tracking coverage of an area. When all the networked sonar sensors operate in parallel, cross talk occurs between them, resulting in sensor flickering and inconsistent data. The reason for the cross talk is that the sensors are not triggered synchronously and that their triggering speeds vary. To avoid cross talk, each sensor was read sequentially by connecting the triggering device's 5V digital output to the RX pin of the first sonar sensor, and the TX output of the first sonar sensor to the RX of the next. This chaining was repeated to include all the sonar sensors in the network.
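On the software side, the daisy-chained trigger described above amounts to reading one sensor at a time. A minimal sketch, with the hardware trigger abstracted behind a hypothetical read_sonar(index) callable (not a Phidget21 API):

```python
def poll_sonar_network(read_sonar, n_sensors=5):
    """Trigger and read each sonar sensor in turn, so that only one
    sensor fires at a time and no cross talk occurs.
    Returns {sensor index: distance in metres}."""
    readings = {}
    for i in range(1, n_sensors + 1):
        readings[i] = read_sonar(i)  # blocks until sensor i has fired
    return readings
```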

B. Sensor Locations and Spatial Zones

Kitchen As-A-Pal contains several sensors and smart objects, with their locations represented on the floor map (Figure 1). The locations are stationary, but dynamicity is supported through tagged objects that get associated with smart surfaces and containers. The 5 sonar sensors were strategically placed around the kitchen table to provide wide sensor coverage in the kitchen environment. The distances between sonar sensors 1 to 5 are 1.2m, 0.5m, 0.7m and 0.9m respectively. The kitchen floor was divided into 21 spatial zones to determine the proximity tracking within individual zones and to measure the average time spent in those zones during a breakfast scenario.

2 http://www.maxbotix.com/

Figure 1. Floor map of Kitchen As-A-Pal with smart object locations and the 21 spatial zones.

C. Measuring Proximity Tracking Accuracies: Study 1

Human proximity tracking accuracies are determined within the individual spatial zones. Two subjects who were not part of the proxemics awareness research team took part in this pilot study with enthusiasm. The subjects were young people nearing their thirties who could visualize a future smart kitchen. The two subjects spent 21 minutes each in the 21 spatial zones, with natural body movements like bending, stretching and micro-movements within the individual zones. To determine the accuracy of human proximity tracking, precision and recall values were calculated.

Human proximity tracking values from the 5 sonar sensors were recorded as log files. For a single-person scenario, as in our case, one or two sensors actually track a person at a particular moment in time. The remaining sensor beams hit the kitchen wall or other objects, creating a virtual boundary. The following distance thresholds were used for sonar sensors 1 to 5: 0.50m (wall), 1.00m (refrigerator), 1.40m (glass frame), 1.40m (cupboard) and 0.75m (wall). This study provides the true positive and true negative values. Since the proximity measurement accuracy is high (0.025m), we assume that if a person is tracked, he/she is tracked accurately. Log file readings were used to confirm this assumption. Additionally, a 1-minute recording was performed for each sensor without the subject present to obtain false positives. Electronic devices like the microwave oven and refrigerator were turned on to check whether they influenced sonar-based human proximity tracking; they did not.
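The per-sensor virtual boundary described above can be sketched as a simple threshold test: a reading shorter than the sensor's static threshold is interpreted as a person, while readings at or beyond it are background (wall, refrigerator, etc.). The sensor labels and dictionary layout are assumptions for illustration.

```python
# Static thresholds (in meters) for sonar sensors S1..S5, as listed in
# Study 1: wall, refrigerator, glass frame, cupboard, wall.
THRESHOLDS_M = {"S1": 0.50, "S2": 1.00, "S3": 1.40, "S4": 1.40, "S5": 0.75}

def person_detected(readings_m: dict) -> bool:
    """True if at least one sensor reports an echo inside its boundary."""
    return any(dist < THRESHOLDS_M[s] for s, dist in readings_m.items())

person_detected({"S1": 0.48, "S2": 1.00})  # S1 sees someone inside 0.50 m
```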

D. Human Presence within Spatial Zones: Study 2

Humans in a kitchen do not spend an equal amount of time in all the 21 spatial zones. There are some spatial zones where humans spend more time than in others, driven by different reasons, especially the presence of specific objects that determine their activity performance. We tried to determine how much time the subjects spend in individual spatial zones, which can be correlated with the human proximity tracking accuracies to determine the effectiveness of the proposed solution. The average time spent (AT) in seconds and the average time percentage (ATP) were the measures used to characterize human presence within the 21 spatial zones during the breakfast scenario.
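The AT and ATP measures can be sketched as below. The >=5% cutoff separating medium/high from low ATP is stated in the evaluation; the 10% cutoff for high ATP and the zone times are illustrative assumptions.

```python
# Sketch: average time (AT) per zone across subjects, its share of the
# scenario (ATP), and a low/medium/high split. The 5% medium cutoff is
# from the paper; the 10% high cutoff and the times are assumptions.
def atp_per_zone(times_per_subject: list) -> dict:
    n = len(times_per_subject)
    zones = {z for t in times_per_subject for z in t}
    at = {z: sum(t.get(z, 0.0) for t in times_per_subject) / n for z in zones}
    total = sum(at.values())
    return {z: 100.0 * sec / total for z, sec in at.items()}

def classify(atp: float) -> str:
    return "high" if atp >= 10.0 else "medium" if atp >= 5.0 else "low"

# Two subjects, two zones, made-up seconds:
atp = atp_per_zone([{"Z1": 100, "Z2": 20}, {"Z1": 110, "Z2": 10}])
```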

Figure 2. A subject in Kitchen As-A-Pal performing breakfast activities. The spatial zones are marked on the kitchen floor for the purpose of evaluating human proximity tracking.

E. Breakfast Scenario at Kitchen As-A-Pal

The two subjects took part in a breakfast scenario comprising nine breakfast activities (Figure 2): a) preparation activities, such as preparing a sandwich, boiling an egg, preparing juice, preparing porridge and preparing a hot drink; and b) cleaning-up activities, such as cleaning the kitchen, cleaning dishes and emptying the recycle bin. There were several actions within individual activities. For instance, preparing a sandwich involved actions like toasting bread, applying butter, adding veggies, adding meat, adding cheese and adding dressing.


Table 1. Definitions used in measuring human proximity tracking accuracies. True Positive (TP): a person is in a particular zone and at least one of the two sonar sensors detects the person. False Positive (FP): a person is not in any of the 21 zones and at least one of the two sonar sensors detects the person. False Negative (FN): a person is in a particular zone and none of the two sonar sensors detects the person. Precision (P) = TP / (TP + FP); Recall (R) = TP / (TP + FN).

The subjects were free to perform the activities naturally, in their own way and without any fixed structure. A recipe booklet was available to assist the subjects if required; however, they were not forced to follow the recipes provided. One limitation is that the subjects could only perform the breakfast activities with the ingredients available in Kitchen As-A-Pal. Since there were 36 passively tagged objects complementing the smart objects, along with other ordinary untagged objects in Kitchen As-A-Pal, the subjects could choose from a wide range of options in performing the activities. One of the key parameters in this study is the time that the subjects spent performing the activities. The subjects spent time naturally, as they would at breakfast in their own homes. Subject A spent 909 sec while subject B spent 933 sec performing breakfast activities.

A video camera was used, with the subjects' permission, to passively record their activity performance through a glass window. Since the kitchen floor was marked with spatial zone numbers, these were also captured on camera. The video footage was analyzed for the time spent in each of the spatial zones, and the average time (AT) was quantified in seconds. Based on the AT values, the average time percentages (ATP) were calculated and the spatial zones were classified into low, medium and high ATP zones. The important point to note is whether all the medium and high ATP zones are covered and tracked by the network of five sonar sensors.

Table 2. Human proximity tracking accuracies represented as precision and recall values along the different spatial zones with their corresponding average time spent in those zones.

IV. EVALUATION RESULTS

A. Precision, Recall and Average Time Percentage

The human proximity tracking system has a precision of 100%, making it useful for facilitating proxemic interaction. Refer to Table 2 and Figure 3 for further information. However, the recall value is 65.22%, which indicates that some areas were missed by human proximity tracking. Considering only the spatial zones that actually matter during activity performance in a breakfast scenario, i.e. spatial zones with medium and high ATP (>=5%), increased the recall percentage by 3.1%. A closer examination of the spatial zones suggests that moving sonar sensor S1 (near spatial zone 21) closer to spatial zone 19, and sensor S5 (near spatial zone 1) closer to spatial zone 2, would have yielded significantly better recall results, since the ATPs of those zones are 9.29% and 11.28% respectively. This would have increased the coverage by 12.8% without increasing the number of sensors in the network. For spatial zones with above 95% recall (Z1, Z6, Z7, Z8, Z12, Z13, Z15, Z16 and Z21), the ATP is 53.21%. Increasing the number of sensors to 7 would also have improved the recall figures significantly. Overall, the human proximity tracking results are promising enough to explore proxemics awareness further in Kitchen As-A-Pal.

Figure 3. Floor map of Kitchen As-A-Pal segmented into low, medium and high ATP spatial zones with their corresponding human proximity tracking recall values.

B. Human Movement Pattern and Orientation

Human movement patterns and their orientation were detected during the breakfast scenario enacted by the two subjects, as described in Section III.E, with their occurrence counts


(Table 3). Human movement patterns are represented as paths in a graph, with each node representing an individual spatial zone as described in Figure 1. For instance, pattern P1 represents the path taken by a human actor from the mixing area in front of the kitchen counter to the appliance area in front of the coffee machine, where other appliances like the bread toaster can also be accessed, and back to the mixing area. Pattern P2 represents the path from the mixing area to the storage area, where food ingredients and utensils are stored, and back to the mixing area. Pattern P3 represents the path from the dishing area in front of the sink to the storage area and back to the dishing area. Of the 12 movement patterns determined, P1 to P4 have a strong occurrence count (>8), while the remaining patterns occurred too rarely to be significant. This shows that patterns P1 to P4 can be used for predicting the next human move and offering human-centered services within the smart kitchen. Knowing potential human movement patterns is useful for personalizing and adapting the smart kitchen to current and near-future contexts. Human movement patterns are also useful for explicit interaction with smart objects in the kitchen. In this paper, we highlight the possibilities of human movement pattern recognition; a detailed evaluation of the accuracy of these patterns, with error estimations for checking the validity of human movement pattern recognition, is future work.
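One way to use the frequent patterns for predicting the next human move is prefix matching of the recently visited zones against the pattern paths. This is a sketch, not the paper's implementation; the pattern paths below are hypothetical, since Table 3's actual zone sequences are not reproduced here.

```python
# Hypothetical pattern paths (zone numbers), standing in for Table 3.
PATTERNS = {
    "P1": [1, 2, 3, 4, 3, 2, 1],     # e.g. mixing -> appliance -> mixing
    "P2": [1, 6, 15, 16, 15, 6, 1],  # e.g. mixing -> storage -> mixing
}

def predict_next_zone(recent: list):
    """Return the zone following the longest matching pattern prefix."""
    best = None
    for path in PATTERNS.values():
        for k in range(len(path) - 1, 0, -1):  # longest prefix first
            if recent[-k:] == path[:k]:
                if best is None or k > best[0]:
                    best = (k, path[k])
                break
    return best[1] if best else None

predicted = predict_next_zone([1, 2, 3])  # matches the P1 prefix
```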

The orientation of a human can be indirectly inferred from the path traversed through the different spatial zones. For instance, in Table 3, if pattern P1 is detected, then current and future orientations can be determined from the previous and current spatial zone numbers. While it is possible to detect orientation without detecting human movement patterns, a person might remain in the same spatial zone but change orientation, which makes orientation hard to determine outside a known movement pattern.

Table 3. Human movement patterns detected during the breakfast scenario enacted by the 2 subjects with their occurrence count.

V. CHALLENGES TO ADDRESS

A. Portability to Different Dynamic Environments

Technologies developed for smart environments are often restricted to the specific environment for which they are designed and developed. While it is important to arrive at a "proof-of-concept" smart environment, the real value comes from technologies and techniques that are portable to other environments. Kitchen As-A-Pal is envisioned as a proxemics aware environment that is portable to other dynamic environments. However, the current implementation uses a static model of the environment, based on large stationary objects and kitchen walls, for determining the sonar sensors' maximum range thresholds. If the kitchen table with the sonar sensor network is moved to an environment whose interior changes dynamically, novel approaches for dynamically calibrating the maximum range thresholds and other environmental constraints are needed. Such a dynamic calibration feature would enable the adaptation of the proposed infrastructure to any kind of existing kitchen environment.
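A minimal sketch of such dynamic calibration, assuming an empty-room baseline recording per sensor and an arbitrary 5 cm safety margin (neither of which is taken from the paper):

```python
import statistics

# Sketch: record each sensor briefly in an empty room and set its
# maximum-range threshold slightly inside the observed background
# distance, instead of hand-measuring walls and furniture.
# The 5 cm margin is an assumption for illustration.
def calibrate_thresholds(empty_room_logs: dict, margin_m: float = 0.05) -> dict:
    """Per sensor: threshold = median background echo minus a margin."""
    return {s: statistics.median(d) - margin_m
            for s, d in empty_room_logs.items()}

thresholds = calibrate_thresholds(
    {"S1": [0.55, 0.55, 0.56], "S2": [1.05, 1.04, 1.05]}
)
```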

B. Tracking Coverage for Larger Environments

Human proximity tracking using a network of sonar sensors works accurately for relatively small environments. However, as the environment area increases, more sensors are required to maintain decent coverage. We used 5 sonar sensors for an area of 14.7 m2; however, larger kitchens are less efficient to track using sonar sensor networks. Approaches like radio-frequency signal strength attenuation [17] offer better efficiency, even though they compromise human proximity tracking accuracy. Using multiple complementary technologies is a way forward that preserves accuracy while supporting coverage of larger environments.
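A back-of-the-envelope estimate illustrates why sonar coverage degrades with area: modeling each sonar footprint as a circular sector with the ~43.3 degree beam angle and an assumed 1.4 m useful range (the largest threshold used in Study 1), full-floor coverage without overlap would need far more than 5 sensors even for the 14.7 m2 kitchen, which is consistent with the tracking gaps observed in the low-ATP zones.

```python
import math

# Naive coverage estimate: each sensor covers a circular sector.
# Ignores overlap and occlusion, so it is only a rough lower bound.
def sector_area(beam_deg: float, range_m: float) -> float:
    return (beam_deg / 360.0) * math.pi * range_m ** 2

def sensors_needed(area_m2: float, beam_deg: float = 43.3,
                   range_m: float = 1.4) -> int:
    return math.ceil(area_m2 / sector_area(beam_deg, range_m))

n = sensors_needed(14.7)  # kitchen area from the paper
```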

C. Automatic Tracking of Proxemics between Objects

Smart containers and surfaces in Kitchen As-A-Pal handle proxemics between objects. While the object tracking accuracy is high, the approach taken is semi-automatic, since the RFID-based sensing infrastructure does not offer 100% coverage on surfaces and inside containers. Semi-automatic sensing forces the occupants of a smart environment to use the hot spots as a reference when placing objects on a surface or inside a container, which can be obtrusive at times and affect natural human-object interaction. Automatic tracking of proxemics between objects usually suffers from difficulties in tracking the objects' identity, but it is an important challenge to solve if proxemics aware environments are to be user-friendly and support "invisibility" as in Mark Weiser's vision of ubiquitous computing [26].

D. Human-Centered Perspective of Proxemics

While proxemics between objects and humans is currently viewed from a "design for all" perspective, how humans perceive the space around them and the relationships they share with surrounding objects and other humans may vary considerably. The philosophy of defining the same proxemics for all occupants of a smart environment is questionable. Further studies are required to design smart environments that are adaptive and personalized to individual occupants, taking an egocentric viewpoint [27] on establishing the proxemic cues. "Design for individual" is a challenge to strive for, especially while providing support for a multitude of human profiles within smart environments.


E. Dimensions and Granularity of Proxemics

Proxemics is an abstract term, and the dimensions explored in this paper are limited to those proposed in Proxemic Interaction [5, 6]. While the five dimensions of position, orientation, motion, identity and location are useful as a starting point for understanding proxemics and using it within the context of smart environments, further research is required to establish concrete proxemics dimensions. While the granularity of proxemics between objects and humans depends on the smart environment and its applications, proxemics in general can be described at several different levels. What are the different levels of proxemics that are interesting and useful for smart environments? How to develop technologies for tracking and modeling proxemics at different levels of granularity, similar to existing location tracking approaches, is a challenge to address in the future.

F. Infrastructure for Ambient Assisted Living

One potential application area of Kitchen As-A-Pal is the development of ambient assisted living systems for the elderly population. Such population segments often have movement limitations as well as cognitive issues. In this context, Kitchen As-A-Pal could provide interesting solutions that predict such situations and offer support that maintains the safety and security of elderly people at home. Kitchen As-A-Pal could also host smart applications that interactively motivate the elderly to cook and eat healthy food regularly, a major problem within the elderly population. Since the infrastructure is based on sonar sensor networks and RFID tags in kitchen objects, it is unobtrusive and does not require wearable devices on elderly people, which is a decisive feature for the success of assistive systems.

VI. CONCLUSION

Proxemics between humans and surrounding objects offers important contextual cues for facilitating human-centered smart services. In this work, Kitchen As-A-Pal was presented as a proxemics aware infrastructure focusing on four dimensions, namely position, motion, identity and location, using RFID-based object tracking and ultrasound-based human proximity tracking. Human proximity tracking accuracies are promising (for 53.21% average time percentage, the precision is 100% and the recall is more than 95%) for further exploration of proxemics within smart environments.

ACKNOWLEDGMENT

Thanks to Arslan Qureshi for working with the RFID technology and to the two subjects for taking part in the study. This work is partly sponsored by Swedish Brain Power.

REFERENCES

[1] D. Surie, T. Pederson and L-E. Janlert. The easy ADL home: A physical-virtual approach to domestic living, J. Ambient Intell. Smart Environ., 2, pp. 287-310, 2010.
[2] A. Schmidt. Implicit human computer interaction through context, Personal and Ubiquitous Computing, 4, pp. 191-199, 2000.
[3] A. Dey. Understanding and Using Context, Personal Ubiquitous Comput., 5, pp. 4-7, 2001.
[4] M. Baldauf, S. Dustdar and F. Rosenberg. A survey on context-aware systems, Int. J. Ad Hoc Ubiquitous Comput., 2, pp. 263-277, 2007.
[5] T. Ballendat, N. Marquardt and S. Greenberg. Proxemic interaction: designing for a proximity and orientation-aware environment, ITS '10, ACM, pp. 121-130, 2010.
[6] N. Marquardt and S. Greenberg. Informing the Design of Proxemic Interactions, IEEE Pervasive Computing, Special Issue on Pervasive I/O, 2012.
[7] E.T. Hall. The Hidden Dimension, Doubleday, Garden City, N.Y., 1966.
[8] F. Kawsar, K. Fujinami and T. Nakajima. Prottoy Middleware Platform for Smart Object Systems, IJSH Special Issue on New Advances and Challenges in Smart Home, 2, pp. 1-18, 2008.
[9] M. Satyanarayanan. Pervasive Computing: Vision and Challenges, IEEE Personal Communications, vol. 8, pp. 10-17, 2001.
[10] D. Surie, H. Lindgren and A. Qureshi. Kitchen As-A-Pal: Exploring Smart Objects as Containers, Surfaces and Actuators, to appear in Proc. ISAmI, 2013.
[11] H. Lindgren, D. Surie and I. Nilsson. Agent-Supported Assessment for Adaptive and Personalized Ambient Assisted Living, Trends in Practical Applications of Agents and Multiagent Systems, Advances in Intelligent and Soft Computing, Springer, vol. 90, pp. 25-32, 2011.
[12] L-E. Janlert. Putting pictures in context, Proceedings of the Working Conference on Advanced Visual Interfaces, Venezia, Italy, ACM, 2006.
[13] R. Want, A. Hopper, V. Falcao and J. Gibbons. The Active Badge location system, ACM Transactions on Information Systems, vol. 10, no. 1, pp. 91-102, 1992.
[14] M. Hazas and A. Hopper. A Novel Broadband Ultrasonic Location System for Improved Indoor Positioning, IEEE Transactions on Mobile Computing, vol. 5, no. 5, 2006.
[15] N. Priyantha. The Cricket indoor location system, PhD Thesis, Massachusetts Institute of Technology, June 2005.
[16] M. Minami, Y. Fukuju, K. Hirasawa, S. Yokoyama, M. Mizumachi, H. Morikawa and T. Aoyama. DOLPHIN: A practical approach for implementing a fully distributed indoor ultrasonic positioning system, Ubicomp, pp. 347-365, 2004.
[17] D. Surie, F. Jäckel, L-E. Janlert and T. Pederson. Situative Space Tracking within Smart Environments, Proc. 6th International Conference on Intelligent Environments, Kuala Lumpur, Malaysia, IEEE Computer Society Press, pp. 152-157, 2010.
[18] L. Ni, Y. Liu, Y. Lau and A. Patil. LANDMARC: Indoor Location Sensing Using Active RFID, Wireless Networks, 10, pp. 701-710, 2004.
[19] J. Hightower, C. Vakili, G. Borriello and R. Want. Design and calibration of the SpotON ad-hoc location sensing system, UW CSE 00-02-02, University of Washington, Department of Computer Science and Engineering, Seattle, WA, 2001.
[20] P. Bahl and V.N. Padmanabhan. RADAR: An in-building RF-based user location and tracking system, Proc. IEEE INFOCOM 2000, Tel-Aviv, Israel, 2000.
[21] H. Gellersen, C. Fischer, D. Guinard, R. Gostner, G. Kortuem, C. Kray, E. Rukzio and S. Streng. Supporting device discovery and spontaneous interaction with spatial references, Pers. Ubiq. Comput., vol. 13, issue 4, pp. 255-264, 2009.
[22] T. Pederson, L-E. Janlert and D. Surie. A Situative Space Model for Mobile Mixed-Reality Computing, IEEE Pervasive Computing, 2011.
[23] D. Surie, L-E. Janlert, T. Pederson and D. Roy. Egocentric interaction as a tool for designing ambient ecologies – The case of the easy ADL ecology, Journal of Pervasive and Mobile Computing, Special Issue on Ambient Ecologies, Elsevier, vol. 8, issue 4, pp. 597-613, 2012.
[24] T. Prante, C. Röcker, N. Streitz, R. Stenzel, C. Magerkurth, D. van Alphen and D. Plewe. Hello.Wall – Beyond Ambient Displays, Adj. Proc. of Ubicomp '03, 2003.
[25] D. Vogel and R. Balakrishnan. Interactive public ambient displays: transitioning from implicit to explicit, public to personal, interaction with multiple users, Proc. of UIST '04, ACM, pp. 137-146, 2004.
[26] M. Weiser. The computer for the 21st century, Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, 1995.
[27] D. Surie. Egocentric Interaction for Ambient Intelligence, PhD Thesis, Dept. of Computing Science, Umeå University, Sweden, ISSN 0348-0542, 2012.
