
Metaphors for Linguistic Description of Data

Gracian Trivino and Daniel Sanchez

Abstract In this paper, we propose a formal representation of the meaning of sentences involving conceptual metaphors in the context of the research line of Computing with Words and Perceptions. Conceptual metaphors are mappings between conceptual domains that are common in everyday natural language usage. They are not just a matter of the lexico-grammar stratum but of representation and processing in the semantic stratum of language. Here, the Granular Linguistic Model of a Phenomenon is presented as a computational paradigm for representing the meaning of metaphorical sentences, with an application devoted to generating linguistic descriptions of data. The obtained results provide an approach to assign a fuzzy fulfilment degree to linguistic expressions with a more complex semantic and lexico-grammar structure than usually handled in Fuzzy Logic.

1 Introduction

The general goal of the research line of “Computing with Words and Perceptions” [16, 18] is to develop computational systems with the capacity of computing with the meaning of Natural Language (NL) expressions involving imprecise descriptions of the world, as humans do.¹

As part of this research line, this paper focuses on the development of computational systems able to produce linguistic summaries of data.

Gracian Trivino, European Centre for Soft Computing, e-mail: [email protected]

Daniel Sanchez, European Centre for Soft Computing, e-mail: [email protected]

¹ We have shared with Prof. Mamdani the profession of engineer and the passion for knowledge. We engineers study physical phenomena to create metaphors of reality that are initially expressed in natural language (NL). In this paper, we will see that a possible implementation of these metaphorical mappings is Mamdani's fuzzy inference method [8].


Fuzzy sets are especially well suited for filling the semantic gap between precise data and concepts expressed in linguistic terms. In Fuzzy Logic, the concepts of linguistic variable and linguistic label [14] allow generating sentences such as “the temperature is high” in order to linguistically summarize numerical sensor data. The concept of linguistic summarizer [12] is employed for generating sentences like “most of the day the temperature is low”. There are some other approaches [5], but in general, the semantic and syntactic complexity of the generated sentences is still limited to a mapping between a precise domain and a linguistic concept by means of a fuzzy set, and the assessment of quantified sentences involving those concepts.
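For concreteness, the following minimal Python sketch (not taken from the paper; the variable, the label set, and the trapezoid parameters are our own assumptions) shows the kind of mapping described above: a linguistic variable with fuzzy labels assigning a degree to the sentence “the temperature is high”.

# A minimal sketch (not from the paper) of a linguistic variable "temperature"
# with fuzzy labels; the trapezoid parameters are illustrative assumptions.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function with support [a, d] and core [b, c]."""
    if x < a or x > d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

temperature_labels = {                      # labels of the linguistic variable
    "low":    lambda t: trapezoid(t, -20.0, -15.0, 10.0, 18.0),
    "medium": lambda t: trapezoid(t, 10.0, 18.0, 24.0, 30.0),
    "high":   lambda t: trapezoid(t, 24.0, 30.0, 45.0, 50.0),
}

reading = 27.0                              # crisp sensor datum
print(f"'The temperature is high' holds to degree {temperature_labels['high'](reading):.2f}")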

However, when describing data or, in general, our perceptions about a phenomenon, we humans make intensive use of linguistic expressions on the basis of metaphors. For example, when describing our perception of temperature we could use metaphorical sentences like “the air is fire this morning” instead of non-metaphorical sentences like “the air is very hot this morning”. This is the norm as soon as one gets away from concrete physical experience and starts talking about abstractions or emotions [6]. Using metaphorical sentences is very important in order to transmit to the user the sensation that computational systems are more human-like, improving the empathy in human-machine interaction. Therefore, the objective is to provide linguistic expressions that are as indistinguishable as possible from those employed by human beings. Hence, the problem arises of how to obtain computational representations of the meaning of these metaphorical sentences.

The goal of this paper is to propose a representation of conceptual metaphors. Our proposal is based on establishing a relationship between two ideas belonging to apparently separated fields, namely, metaphors in Linguistics and Philosophy, and Fuzzy Logic. This is achieved through the concept of the Granular Linguistic Model of a Phenomenon (GLMP) [10], [9], in which the possible (metaphorical) linguistic descriptions of data obtained from a physical phenomenon are linked via a collection of mappings organized in a hierarchical way. In our approach, the monitored phenomenon consists of a set of interrelated objects (either mental or physical) that evolve in time within the domain of experience of the application. A GLMP is a model of the designer's perceptions of that phenomenon, i.e., a GLMP represents a designer's interpretation of the available data, aimed at fulfilling the requirements of the computational application. The calculation of the fulfilment of the different metaphor-based linguistic descriptions is based on applying tools of Fuzzy Logic.

2 Metaphors

Metaphors, rather than rhetorical resources in natural language (NL), are the way we conceptualize one mental domain in terms of another [7]. Metaphors are mappings f : X → Y from a Source Conceptual Domain (X), e.g., height, to a Target Conceptual Domain (Y), e.g., temperature, both conceptual domains being part of our conceptual model of the world.


This mapping is a set of ontological correspondences that characterizes epistemic correspondences by mapping our experience about X onto knowledge about Y. Such correspondences permit us to reason about Y using the knowledge we use to reason about X [6]. According to Lakoff, we name metaphors using the format TARGET-DOMAIN IS SOURCE-DOMAIN, for example QUANTITY IS HEIGHT [6]. In this case, the concepts in the source domain HEIGHT (like high, medium, low) are metaphorically mapped to groups of values in the target domain QUANTITY, so that the previous linguistic terms are applied in order to describe quantities in a more imprecise (human-like) way.

Linguistic expressions of metaphors, i.e., texts, are the realization of this cross-domain mapping in the lexico-grammar stratum of NL [4]. Continuing with the QUANTITY-HEIGHT metaphor, the sentence “The temperature is high” is a realization of this metaphor since temperature is measured by using quantities.

Metaphors are ubiquitous in everyday NL. They are so incorporated in NL that sometimes they are difficult to recognize. One very well-known example is the MORE-UP, LESS-DOWN metaphor that is applied to describe quantities. For example, “the temperature is going up this morning” or “the temperature has fallen this night”. Usually, we use a QUANTITY-LINEAR SCALE metaphor, e.g., “the temperature is increasing” or “the temperature today is lower than yesterday” (see [6] for more examples of metaphors used in everyday NL).

One important characteristic of metaphors is that they must respect the so-called Invariance Principle. It consists of a set of constraints on the possible conceptual correspondences, i.e., mappings do not always produce metaphors. In general, metaphorical mappings select a set of concepts in the Source Conceptual Domain with meaningful relations among them. The mapping must make it possible to establish parallel metaphorical relations among the corresponding elements in the Target Conceptual Domain. That is to say, metaphorical mappings must respect the inherent semantic structure of the Source Conceptual Domain in a way consistent with the inherent semantic structure of the Target Conceptual Domain. From a semantic point of view, the Invariance Principle guarantees that metaphors are understood by a certain specific human group when they are used in specific situation types. It is worth recalling that the meaning of expressions in NL is defined by their practical use [11]. The consistency of meanings imposed by the Invariance Principle depends on the typical use of linguistic expressions in both the source linguistic domain and the target linguistic domain.

Metaphors are usually recursive, i.e., we can establish a metaphor where the Source Conceptual Domain is the Target Conceptual Domain of another metaphor. An interesting example of this is the LINEAR SCALE-PATH metaphor. Typically, it is used to describe the evolution of time series. For example, “today, the temperature in Barcelona is ahead of the temperature in Madrid” or “the temperature has gone slowly up to 30 °C and then it has stopped”. Among other concepts, this metaphor maps:

• Starting point in travel → Initial value of quantity
• Distance traveled → Change of quantity
• Going up a slope → Increase of quantity
• Going down a slope → Decrease of quantity
• End of travel → Last value of quantity


Developing computational models of metaphors is a challenging task. The notions involved in the definition of metaphors are vaguely described, and hence it is not easy to computationally represent notions such as Conceptual Domain (not to mention mappings between such domains), the Invariance Principle, etc. Furthermore, metaphors are built in our minds and are intimately associated with our experience, our cultural framework and, in short, with our conceptual model of the world. The representation of such conceptual models and their relationship to the representation of metaphors is also a hard problem. Another source of difficulties is that we need to link the aforementioned models to mechanisms for the understanding/generation of metaphorical sentences in NL, with the difficulty that metaphors involve an unusual use (hence, semantics) of syntactical structures. Similar discussions by other authors have even led to the hypothesis that mappings between conceptual domains correspond to neural mappings in the brain, which are not easy to model [2].

Another remarkable issue when dealing with metaphors is that, as is usually the case with NL expressions, they are imprecise statements. Hence, in order to develop and to use computational models of metaphors, we must provide mechanisms for evaluating their degree of compatibility with the real world they refer to. This is a well-known problem in the area of Computing with Words, for which fuzzy sets are especially well suited. For example, the fulfilment degree of our previous example “the air is fire this morning” is obviously closely related to that of “the air is very hot this morning”. A simple solution in this example is to assign the fulfilment degree of the latter, which is easy to obtain, to the former. However, in many cases the situation is not so simple, the fulfilment of a metaphorical sentence being related to imprecise information coming from different sources, aggregated in complex ways.

As a first approach to the representation of the meaning of metaphors in practice, specifically for linguistic descriptions of a given phenomenon, we have chosen to employ the Granular Linguistic Model of a Phenomenon (GLMP), introduced in [10]. In this approach, we do not need a model for representing conceptual domains, mappings, or the Invariance Principle; instead, we consider that a person that we call the designer provides a representation of metaphorical sentences and a way to determine their fulfilment degree, the latter by means of Fuzzy Set Theory. This model allows the designer to represent metaphors by means of two kinds of expert information: i) by specifying how the fulfilment value of a certain linguistic statement can be obtained from simpler pieces of information, and ii) by translating the conceptual mapping defining a metaphor into a computational mapping between linguistic statements according to his/her conceptualization of the world, hence guaranteeing that the Invariance Principle holds. Notice that, in this way, we are representing the use of the metaphor by the designer and, in this sense, its meaning according to Wittgenstein [11]. Therefore, we could say that it is the designer who speaks when the machine speaks, i.e., the meaning of the generated sentences can be explained by the designer's personal experience in using NL.

In the next section, we introduce the GLMP on the basis of some relevant concepts of Zadeh's Computational Theory of Perceptions [16].


3 Granular Linguistic Model of a Phenomenon

In the research line of Zadeh's Computational Theory of Perceptions [16], the GLMP is a computational model that allows generating different linguistic descriptions of a phenomenon based on a designer's subjective description of perceptions and conceptual metaphors. The model allows the computational system to provide linguistic descriptions of data according to the designer's way of using NL.

The GLMP consists basically of two elements: computational perceptions (CP) and perception mapping protoforms (PMP). Computational perceptions provide linguistic descriptions of certain relevant aspects of a phenomenon. The concept of protoform was introduced by Zadeh as “[...] a symbolic expression which defines the deep semantic structure of an object such as a proposition, question, command, concept, scenario, case, or a system of such objects” [17]. Here, perception mapping protoforms are the basic tool for representing the computational mapping, generated by a given metaphor, between computational perceptions. In the next sections, we explain these concepts in more detail.

3.1 Computational perception

In our approach, a computational perception (CP) is the computational representation of a unit of information about a phenomenon in a certain context, consisting of a couple (A, w) where:

A is an NL sentence. This sentence can be either simple, e.g., “The velocity is high”, or more complex, e.g., “The feeling of health seems to be better this week”. Usually, as mentioned above, this sentence is the linguistic expression of a metaphor.

w ∈ [0,1] is the degree of fulfilment of A. Let us remark that w is not a degree of validity of a metaphor, but the fulfilment degree of the NL statement (the linguistic expression of a metaphor) in a certain context.
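A CP can be captured by a very small data structure; the Python sketch below, with names of our own choosing, is one possible rendering.

# Minimal sketch of a computational perception as the couple (A, w); field names are ours.
from dataclasses import dataclass

@dataclass
class ComputationalPerception:
    A: str     # NL sentence, usually the linguistic expression of a metaphor
    w: float   # fulfilment degree of A in the current context, in [0, 1]

cp = ComputationalPerception("The temperature in the room is high", 0.8)
print(cp)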

3.2 First-order perception mapping protoforms

A first-order perception mapping protoform (1-PMP) is a model for generating computational perceptions derived from the direct observation of a phenomenon. A 1-PMP is defined as a tuple (U, y, g, T) where:

U is a single, crisp variable defined in the input data domain, e.g., the value z ∈ R provided by a thermometer.

y is the output variable, e.g., the temperature in a room. The domain of y is the power set of a set of computational perceptions p_y, where p_y is a collection of pairs (A, w) with w ∈ [0,1] and A ∈ A_y, being A_y a set of linguistic expressions A_y = {A_y^1, A_y^2, ..., A_y^{n_y}}. Each linguistic expression A_y^j ∈ A_y is associated to a single fuzzy set on the domain of the input variable, with membership function μ_{A_y^j} : U → [0,1], ∀ j ∈ {1, ..., n_y}.

g is a function U × A_y → [0,1] defined as g(z, A_y^j) = μ_{A_y^j}(z).

T is a text generation algorithm employed for generating the output computational perceptions. T makes use of g for assessing the fulfilment of each possible linguistic expression, and determines the subset of perceptions of p_y that will be generated as output. Hence, y = T(U).

We call the output of a 1-PMP first-order computational perceptions (1-CP). They include linguistic descriptions with the maximum level of detail (the highest granularity) in the linguistic model of the phenomenon. 1-PMPs are models of the designer's perceptions of the immediate environment of the monitored phenomenon, e.g., data obtained from sensors.

Though they provide the simplest computational perceptions, 1-PMPs make use of metaphors very frequently. In the previous example, one possible computational perception generated as output could be (The temperature in the room is high, 0.8), that is, a linguistic expression of the QUANTITY-HEIGHT metaphor plus a fulfilment degree in a certain context. Hence, typical 1-PMPs are based on metaphors in which the Target Conceptual Domain is formed by the set of possible sensor data, i.e., numerical quantities, and the Source Conceptual Domain is a set of linguistic labels representing how we aggregate the values of the Target Conceptual Domain into imprecise granules. Notice that the Invariance Principle and the consistency in the assessment of the fulfilment of the metaphorical expression are guaranteed by the design as provided by the designer, while the assessment is supported by well-known Fuzzy Logic reasoning tools.
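The following Python sketch illustrates a 1-PMP of this kind for the room-temperature example; the membership functions and the output policy of T are illustrative assumptions, not the ones used by the authors.

# Illustrative sketch of a 1-PMP (U, y, g, T); shapes and the policy of T are assumptions.

def trapezoid(x, a, b, c, d):
    if x < a or x > d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

# A_y: linguistic expressions, each tied to a fuzzy set on the input domain U.
A_y = {
    "The temperature in the room is low":    lambda z: trapezoid(z, -20, -15, 10, 18),
    "The temperature in the room is medium": lambda z: trapezoid(z, 10, 18, 24, 30),
    "The temperature in the room is high":   lambda z: trapezoid(z, 24, 30, 45, 50),
}

def g(z, sentence):
    """g: U x A_y -> [0,1], membership degree of the crisp datum z in the fuzzy set of the sentence."""
    return A_y[sentence](z)

def T(z, threshold=0.0):
    """Text generation: emit the CPs (A, w) whose fulfilment exceeds a threshold (assumed policy)."""
    return [(A, g(z, A)) for A in A_y if g(z, A) > threshold]

print(T(27.0))   # e.g. the 'medium' and 'high' sentences, each with degree 0.5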

3.3 Second-order perception mapping protoforms

Second-order perception mapping protoforms (2-PMP) are protoforms that define the semantic structure of a mapping between sets of computational perceptions. 2-PMPs have no meaning as isolated entities; rather, they are usually part of a structure of perceptions in a domain of experience defined by the designer. A 2-PMP is defined by a tuple (U, y, g, T) where:

U is a set of input variables (u_1, u_2, ..., u_n). The domain of each variable is the power set of a set of computational perceptions CP_{u_i}, i.e., the value of u_i in a given context is V_{u_i} ⊆ CP_{u_i}. Every set CP_{u_i} is a collection of pairs (A, w) where w ∈ [0,1] and A ∈ A_{u_i}, with A_{u_i} a set of linguistic expressions A_{u_i} = {A_{u_i}^1, A_{u_i}^2, ..., A_{u_i}^{n_i}}.

y is the output linguistic variable. The domain of y is the power set of a set of computational perceptions CP_y, i.e., the value of y in a given context is V_y ⊆ CP_y, where CP_y is a collection of pairs (A, w) with w ∈ [0,1] and A ∈ A_y, being A_y a set of linguistic expressions A_y = {A_y^1, A_y^2, ..., A_y^{n_y}}.

g is a function CP_{u_1} × ··· × CP_{u_n} × A_y → [0,1] that provides the fulfilment degree of an output computational perception on the basis of the fulfilment of the input computational perceptions. The function g can be of any kind, e.g., an aggregation function, a set of fuzzy rules with a certain reasoning scheme, etc. The designer chooses the most adequate function in each case.

T is a text generation algorithm with the same characteristics as those of 1-PMPs.

Computational perceptions generated by 2-PMPs, whose meaning is based on other computational perceptions, are called second-order computational perceptions (2-CP). Most usually, 2-CPs are the expression of metaphors.
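As an illustration, the Python sketch below shows one possible 2-PMP in which g aggregates the fulfilment degrees of the input CPs; the “comfort” example and the particular aggregation chosen are our own assumptions, since the designer is free to pick any suitable g.

# Minimal sketch of a 2-PMP whose g aggregates the fulfilment degrees of input CPs.
# Input CPs: values of u1 (temperature) and u2 (humidity), as sentence -> degree maps.
V_u1 = {"The temperature is high": 0.8, "The temperature is medium": 0.2}
V_u2 = {"The humidity is low": 0.6, "The humidity is medium": 0.4}

def g(v_u1, v_u2, output_sentence):
    """Fulfilment of an output CP from the fulfilment of the input CPs (assumed design choices)."""
    if output_sentence == "The room feels uncomfortable":
        # assumed rule: discomfort follows "temperature high AND humidity not low"
        return min(v_u1.get("The temperature is high", 0.0),
                   1.0 - v_u2.get("The humidity is low", 0.0))
    if output_sentence == "The room feels comfortable":
        return min(v_u1.get("The temperature is medium", 0.0),
                   v_u2.get("The humidity is low", 0.0))
    return 0.0

def T(v_u1, v_u2):
    sentences = ["The room feels comfortable", "The room feels uncomfortable"]
    return [(A, g(v_u1, v_u2, A)) for A in sentences]

print(T(V_u1, V_u2))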

3.4 GLMP

A GLMP (Granular Linguistic Model of a Phenomenon) consists of a directed and acyclic network of PMPs built by the designer with the objective of generating linguistic descriptions of phenomena with different levels of granularity. Each PMP has as input the CPs generated by other PMPs, and generates CPs that are transmitted to other PMPs in the network. When a PMP takes as part of its input the output of another PMP, we say that the output of the receiving PMP is (partially) explained by the output of the PMP that feeds it.

Fig. 1 Simple example of a GLMP that includes metaphorical sentences.


We say that the receiving PMP is at a lower granularity level, in the sense that it provides computational perceptions with less granularity, i.e., based on more abstract concepts and a larger number of metaphors defined one over another recursively, as illustrated in Section 2.

Figure 1 shows an example where three 1-PMPs, namely p_1^1, p_2^1, and p_3^1, are used to explain p_4^2. Then p_1^1 and p_4^2 explain p_5^2 (hence, p_5^2 is at a lower granularity level in the network). Here p_5^2 is a top-order PMP, i.e., a PMP that generates answers to a general question about the phenomenon. Here, we have associated with perceptions p_4^2 and p_5^2 two clearly metaphorical sentences.

When creating a GLMP, the designer uses the two main functions of NL, namely, to build the structure of his/her personal experience and to communicate with others [3]. In particular:

• Using the structure of the GLMP, with suitable functions and linguistic expressions, the designer builds a computationally accessible representation of his/her experience about the phenomenon in a situation type and with specific application goals, most likely based on metaphors he/she usually employs.

• Additionally, the designer uses the structures of computational perceptions in the GLMP in order to provide the human user with meaningful linguistic descriptions of input data.

The way a GLMP can be used depends on the specific application. For example, in order to generate linguistic descriptions of a phenomenon in different contexts (given by values of particular input variables), each PMP may provide only the sentence A with the highest degree of fulfilment, finally choosing those sentences at the most suitable granularity level in the network.
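A minimal Python sketch of this selection policy, with illustrative CPs of our own, could look as follows.

# Sketch of the simple usage policy mentioned above: for each PMP, report only
# the sentence with the highest fulfilment degree (example CPs are illustrative).
def best_sentence(cps):
    """cps: list of (sentence, degree) pairs produced by one PMP."""
    return max(cps, key=lambda cp: cp[1])

pmp_outputs = [
    [("The temperature is high", 0.7), ("The temperature is medium", 0.3)],
    [("The feeling of comfort is low", 0.6), ("The feeling of comfort is medium", 0.4)],
]
print([best_sentence(cps) for cps in pmp_outputs])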

4 A GLMP for linguistic description of traffic in a roundabout

We illustrate our proposal in this section by describing a GLMP to generate linguistic descriptions, based on metaphors, of the traffic in an urban roundabout situated in Mieres (Asturias, Spain). Fig. 2 shows a diagram of the roundabout. Additional details about this application can be found in our previous work [10].

Fig. 2 Diagram of the roundabout indicating the Regions Of Interest (ROI).


For traffic data analysis, and specifically for the case of roundabouts, experts collect information to study the interactions among the vehicles involved, with regard to traffic operations and potential safety. The use of different types of cameras, imagery, and image processing techniques for these purposes has become very important [1]. We use this technical problem as a practical challenge in order to explore the possibility of automatically generating linguistic reports that could help traffic experts with the most tedious tasks.

In the following, we describe the components of this GLMP, graphically represented in Fig. 3.

4.1 Top-order PMP

The top-order PMP generates linguistic descriptions of the usual state of traffic in the roundabout as an answer to a potential question like:

What is usually the state of traffic in the roundabout?

Here, the linguistic expressions that may appear in the CPs generated by the top-order PMP take the form of a template with two fillers, each one to be filled with one of three linguistic labels, more specifically:

{Usually | Half of times | Few times}, the roundabout is {empty | medium filled | full}.

Here, the designer used a ROUNDABOUT-CONTAINER metaphor to label the state of occupancy of the roundabout. To perform this experiment, the designer created a GLMP that explains the meaning of this metaphor using lower-order perceptions (see Fig. 3).

4.2 Defining the first-order PMPs

As can be seen in Fig. 3, the designer defined six 1-PMPs: three describing the occupancy of each of the entries in the roundabout (O_n, O_s, and O_w), and three describing the movement in each entry (M_n, M_s, and M_w), where n, s, and w refer to the North, South, and West entries, respectively.

Fig. 3 Granular Linguistic Model of the behavior of traffic in a particular roundabout.


Using the QUANTITY-HEIGHT metaphor, the designer associated with each 1-PMP the following linguistic templates:

• For the 1-PMPs about occupancy, “The occupancy in the {north | south | west} entry is {high | medium | low}”.

• For the 1-PMPs about movement, “The movement in the {north | south | west} entry is {high | medium | low}”.

Fig. 4 shows the membership functions used in the case of occupancy, A_O = {high, medium, low}. Notice that a normalized physical measure in [0,1] of the degree of occupancy is assumed as the input domain. As we shall see, these degrees are obtained by means of video analysis of certain areas in the roundabout, called regions of interest (ROI), corresponding to the roundabout entering lanes as indicated in Fig. 2. The same applies to the case of movement, with A_M = {high, medium, low}.
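Since Fig. 4 is not reproduced here, the following Python sketch uses assumed trapezoidal parameters over the normalized domain [0,1] merely to illustrate how the occupancy labels could be evaluated; the exact shapes used in the prototype may differ.

# Assumed trapezoidal labels for occupancy on the normalized domain [0, 1];
# the same shape is assumed for movement.
def trapezoid(x, a, b, c, d):
    if x < a or x > d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

A_O = {
    "low":    lambda x: trapezoid(x, -0.1, 0.0, 0.2, 0.4),
    "medium": lambda x: trapezoid(x, 0.2, 0.4, 0.6, 0.8),
    "high":   lambda x: trapezoid(x, 0.6, 0.8, 1.0, 1.1),
}
occ_north = 0.35  # normalized occupancy measured in one ROI
print({label: round(mu(occ_north), 2) for label, mu in A_O.items()})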

4.3 Degree of saturation of an entry

The designer defined three 2-PMPs called degree of saturation of the x entry, where x ∈ {north, south, west}. These 2-PMPs are explained by the corresponding pair of 1-PMPs in each entry (see Fig. 3). For each entry x, the corresponding 2-PMP indicating the degree of saturation is a tuple (U_x, S_x, g, T) where:

U_x consists of two input variables {o_x, m_x}, where o_x (resp. m_x) takes values in the output domain of the 1-PMP O_x (resp. M_x).

S_x is a set of computational perceptions of the form (A, w), with A in the set of linguistic expressions A_{S_x} matching the template “The degree of saturation of the x ROI is {high | medium | low}”, associated to the linguistic labels A_S = {high, medium, low}, respectively; they express the degree of saturation of the corresponding entry.

g is implemented using the following set of fuzzy rules:

IF (m_x is low) AND (o_x is low) THEN S_x is low
IF (m_x is low) AND (o_x is medium) THEN S_x is high
IF (m_x is low) AND (o_x is high) THEN S_x is high
IF (m_x is medium) AND (o_x is low) THEN S_x is low
IF (m_x is medium) AND (o_x is medium) THEN S_x is medium
IF (m_x is medium) AND (o_x is high) THEN S_x is medium
IF (m_x is high) AND (o_x is low) THEN S_x is low
IF (m_x is high) AND (o_x is medium) THEN S_x is medium
IF (m_x is high) AND (o_x is high) THEN S_x is medium

Fig. 4 Trapezoidal membership functions used in the first-order perceptions once the input variables are normalized.

Here, g(m_x, o_x, A) = A(c), where m_x ∈ A_M, o_x ∈ A_O, A ∈ A_{S_x}, and c is the numerical value obtained after applying Mamdani's fuzzy control with defuzzification to the rules above (see, e.g., [13] for details).

T generates as output the computational perceptions:

• (The degree of saturation of the x ROI is high, w1)
• (The degree of saturation of the x ROI is medium, w2)
• (The degree of saturation of the x ROI is low, w3)

where w1, w2, and w3 are calculated by means of g.
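The Python sketch below illustrates this computation of g with the rule base of this section. It assumes that the fulfilment degrees of the input CPs play the role of the fuzzified inputs, and it uses assumed output fuzzy sets on [0,1] together with min-max Mamdani inference and centroid defuzzification; the actual membership functions of the prototype are not given in the paper.

# Sketch of g for the saturation 2-PMP using Mamdani-style inference (assumed shapes).
def trapezoid(x, a, b, c, d):
    if x < a or x > d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

# Output fuzzy sets for the degree of saturation (assumed).
S_LABELS = {
    "low":    lambda y: trapezoid(y, -0.1, 0.0, 0.2, 0.4),
    "medium": lambda y: trapezoid(y, 0.2, 0.4, 0.6, 0.8),
    "high":   lambda y: trapezoid(y, 0.6, 0.8, 1.0, 1.1),
}

# Rule base of Section 4.3: (movement label, occupancy label) -> saturation label.
RULES = {
    ("low", "low"): "low",        ("low", "medium"): "high",      ("low", "high"): "high",
    ("medium", "low"): "low",     ("medium", "medium"): "medium", ("medium", "high"): "medium",
    ("high", "low"): "low",       ("high", "medium"): "medium",   ("high", "high"): "medium",
}

def saturation_cps(m_degrees, o_degrees, entry="north", n_points=201):
    """m_degrees, o_degrees: dicts mapping labels to fulfilment degrees of the input CPs."""
    ys = [i / (n_points - 1) for i in range(n_points)]
    aggregated = [0.0] * n_points
    for (m_lab, o_lab), s_lab in RULES.items():
        alpha = min(m_degrees.get(m_lab, 0.0), o_degrees.get(o_lab, 0.0))  # rule firing strength
        for i, y in enumerate(ys):
            aggregated[i] = max(aggregated[i], min(alpha, S_LABELS[s_lab](y)))  # clip and join
    area = sum(aggregated)
    c = sum(y * mu for y, mu in zip(ys, aggregated)) / area if area > 0 else 0.0  # centroid
    # g(m_x, o_x, A) = A(c): evaluate each output label at the defuzzified value c.
    template = "The degree of saturation of the {x} ROI is {label}"
    return [(template.format(x=entry, label=lab), round(mu(c), 2)) for lab, mu in S_LABELS.items()]

print(saturation_cps({"low": 0.2, "medium": 0.8, "high": 0.0},
                     {"low": 0.0, "medium": 0.3, "high": 0.7}))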

4.4 Degree of saturation of the roundabout

The designer aggregated the perceptions of saturation in each entry in order to explain the perception of saturation of the whole roundabout. At a given time instant t, the input consists of nine CPs (three for each entry). The 2-PMP of the instantaneous degree of saturation in the roundabout is a tuple (U, S_rdt, g, T) where:

U consists of three variables, one for each entry in the roundabout.

S_rdt is a set of CPs of the form (A, w), with A in the set of linguistic expressions matching the template “The roundabout is {empty | medium filled | full}”; they express the degree of saturation of the roundabout at a given time instant.

g is again based on a set of Mamdani-type fuzzy rules. The rules are different from those of the immediately higher level, explained in the previous section; however, the reasoning mechanism is the same (see more details in [10]).

T generates as output the computational perceptions:

• (The roundabout is empty, w1)
• (The roundabout is medium filled, w2)
• (The roundabout is full, w3)

where w1, w2, and w3 are calculated by means of g. Notice that here the designer speaks about the roundabout as if it were a container, i.e., employing the ROUNDABOUT-CONTAINER metaphor.


4.5 Completing the top-order PMP

As we explained in Section 4.1, the output CPs have as first component linguistic statements matching the template {Usually | Half of times | Few times}, the roundabout is {empty | medium filled | full}. Here, the three linguistic labels that may fill the first filler in the template are linguistic quantifiers, i.e., linguistic terms for representing imprecise quantities. The domain of discourse of (Few times, Half of times, Usually) consists of the possible relative cardinalities, normalized in [0,1]. See [10] for additional details.

The top-order PMP is a tuple (U, y_as, g, T) where:

U consists of three variables containing the time series describing the evolution in time of the three CPs provided by the subordinate PMP.

y_as is a linguistic variable expressing the average saturation of the roundabout.

g determines the accomplishment degree of the quantified sentence “Q of the time instants are instants in which the roundabout is L” as the fulfilment of the CP “Q, the roundabout is L” by using Zadeh's method [15], where Q is a linguistic quantifier in the set {Usually | Half of times | Few times}, and L is a linguistic label in the set {empty | medium filled | full}. For more details see [10].

T provides pairs (A, w) where A matches the template {Usually | Half of times | Few times}, the roundabout is {empty | medium filled | full}, and w is obtained by using g.
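The Python sketch below illustrates this top-order g using Zadeh's approach to quantified sentences: the fulfilment of “Q, the roundabout is L” is the quantifier's membership evaluated on the relative sigma-count of L over the time instants. The quantifier shapes and the input series are illustrative assumptions.

# Sketch of the top-order g based on Zadeh's quantified sentences (assumed quantifier shapes).
def trapezoid(x, a, b, c, d):
    if x < a or x > d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

QUANTIFIERS = {   # fuzzy quantifiers on the relative cardinality in [0, 1]
    "Few times":     lambda p: trapezoid(p, -0.1, 0.0, 0.1, 0.3),
    "Half of times": lambda p: trapezoid(p, 0.3, 0.45, 0.55, 0.7),
    "Usually":       lambda p: trapezoid(p, 0.7, 0.9, 1.0, 1.1),
}

def top_order_cps(series):
    """series: dict label -> list of fulfilment degrees of 'The roundabout is label'
    over the time instants of the monitored period."""
    cps = []
    for label, degrees in series.items():
        rel_count = sum(degrees) / len(degrees)          # relative sigma-count of the label
        for q_name, mu_q in QUANTIFIERS.items():
            cps.append((f"{q_name}, the roundabout is {label}", round(mu_q(rel_count), 2)))
    return cps

series = {   # illustrative degrees over five time instants
    "empty":         [0.9, 0.8, 0.7, 0.9, 0.6],
    "medium filled": [0.1, 0.2, 0.3, 0.1, 0.4],
    "full":          [0.0, 0.0, 0.0, 0.0, 0.0],
}
for sentence, w in top_order_cps(series):
    print(f"{w:.2f}  {sentence}")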

5 Experimental Results

We have implemented a prototype that generates linguistic descriptions of the traffic in the roundabout in Fig. 2 on the basis of video information. We used an IP-based AXIS 210 video camera with VGA resolution (640×480) at 30 fps. In order to test the prototype, we recorded videos of 15 minutes each. We employed image analysis techniques in order to extract a set of relevant characteristics from the video images, related to the regions of interest (ROI) corresponding to the roundabout entering lanes as indicated in Fig. 2. At each time instant, our prototype measures the Occupancy Level and the Movement Level over each region. We used these data as input for the GLMP described in the previous section.

Table 1 shows the results of using the GLMP described above with data obtained by analyzing five video sequences. It shows the fulfilment degree of nine possible metaphorical sentences when they are applied to describe the behavior of traffic in a roundabout. In this case, we assumed that we are interested only in sentences generated by the top-order PMP. We can obtain a similar set of sentences and fulfilment degrees for each PMP in the GLMP.


Table 1 Results of analyzing five video sequences

V-1   V-2   V-3   V-4   V-5   Linguistic clause
0.79  0.88  0.18  0.47  0.41  Usually the roundabout is empty
0.00  0.00  0.00  0.00  0.00  Usually the roundabout is medium filled
0.00  0.00  0.00  0.00  0.00  Usually the roundabout is full
0.21  0.12  0.82  0.53  0.59  Half of times the roundabout is empty
0.18  0.09  0.22  0.00  0.03  Half of times the roundabout is medium filled
0.00  0.00  0.00  0.00  0.00  Half of times the roundabout is full
0.00  0.00  0.00  0.00  0.00  Few times the roundabout is empty
0.82  0.91  0.78  1.00  0.97  Few times the roundabout is medium filled
1.00  1.00  1.00  1.00  1.00  Few times the roundabout is full

6 Conclusions

This paper establishes a relationship between two ideas belonging to apparently separated fields, namely, metaphors in Linguistics and Philosophy, and Fuzzy Logic.

Designers and users of computational systems continuously use metaphors in everyday NL. For designers, it is natural to apply metaphors when building systems able to provide linguistic descriptions of phenomena. Using metaphorical sentences is very important in order to transmit to the user the sensation that computational systems are more human-like, improving the empathy of human-machine interaction.

The GLMP provides a tool for filling the semantic gap between raw data and linguistic descriptions of a phenomenon that is inspired by the way humans use the flexibility of NL to describe their environment. Complex and abstract concepts (second-order computational perceptions) are built using metaphors of more immediate perceptions in the physical environment (first-order computational perceptions). The GLMP allows the designer to implement computational representations of correspondences between linguistic expressions as induced by metaphors. In addition, we have described how to calculate the fulfilment degree of metaphorical linguistic descriptions of phenomena. These linguistic descriptions are similar to those typically employed by human beings.

This proposal is a short but significant step on a long research journey. The application example is only a demonstration of the concept of GLMP and its application. This example could easily be extended with new metaphorical descriptions, e.g., “Which entry supports the most traffic?”. We could generate a great number of sentences of the form “(Usually), the traffic in entry A is (heavier) than in entry C”, where the designer applies the QUANTITY-WEIGHT metaphor.

Acknowledgements This work has been funded by the Foundation for the Advancement of Soft Computing (Mieres, Asturias, Spain) and by the Spanish government (CICYT) under projects TIN2008-06890-C02-01 and TIN2009-08296.


References

1. Chae, K.: Simulation of pedestrian-vehicle interactions at roundabouts. Ph.D. thesis, North Carolina State University, North Carolina (2005)
2. Feldman, J., Narayanan, S.: Embodied meaning in a neural theory of language. Brain and Language 89(2), 385–392 (2004)
3. Halliday, M.A.K., Matthiessen, M.I.M.: Construing Experience through Meaning: A Language-based Approach to Cognition. Continuum, study edn. (1999)
4. Halliday, M.A.K., Matthiessen, M.I.M.: An Introduction to Functional Grammar. Oxford University Press, New York (2004)
5. Kacprzyk, J., Zadrozny, S.: Computing with words and Systemic Functional Linguistics: linguistic data summaries and natural language generation. In: Huynh, V.N., et al. (eds.) Integrated Uncertainty Management and Applications, AISC, pp. 23–36. Springer, Berlin (2010)
6. Lakoff, G.: The contemporary theory of metaphor. In: Ortony, A. (ed.) Metaphor and Thought. Cambridge University Press (1992)
7. Lakoff, G., Johnson, M.: Metaphors we live by. University of Chicago Press (1992)
8. Mamdani, E.H., Assilian, S.: An experiment in linguistic synthesis with a fuzzy logic controller. Int. J. Hum.-Comput. Stud. 51(2), 135–147 (1999)
9. Mendez-Nunez, S., Trivino, G.: Combining semantic web technologies and computational theory of perceptions for text generation in financial analysis. In: Proceedings of IEEE Fuzzy 2010, Barcelona, Spain (2010)
10. Trivino, G., Sanchez, A., Montemayor, A.S., Pantrigo, J.J., Cabido, R., Pardo, E.G.: Linguistic description of traffic in a roundabout. In: Proceedings of IEEE Fuzzy 2010, Barcelona, Spain (2010)
11. Wittgenstein, L.: Philosophical Investigations. Blackwell Publishing (1953/2001)
12. Yager, R.R.: A new approach to the summarization of data. Information Sciences 28, 69–86 (1982)
13. Yager, R.R., Filev, D.P.: Essentials of Fuzzy Modelling and Control. John Wiley & Sons (1994)
14. Zadeh, L.A.: The concept of a linguistic variable and its application to approximate reasoning. Information Sciences 8, 199–249 (1975)
15. Zadeh, L.A.: A computational approach to fuzzy quantifiers in natural languages. Computers & Mathematics with Applications 9, 149–184 (1983)
16. Zadeh, L.A.: From computing with numbers to computing with words - from manipulation of measurements to manipulation of perceptions. IEEE Transactions on Circuits and Systems 45(1) (1999)
17. Zadeh, L.A.: Toward a generalized theory of uncertainty (GTU) - an outline. Information Sciences 172, 1–40 (2005)
18. Zadeh, L.A.: Toward human level machine intelligence - is it achievable? The need for a paradigm shift. IEEE Computational Intelligence Magazine (2008)