
    I-SENSE: An Approach to Intelligent Sensory Data Fusion


    OVERVIEW

    WIRELESS SENSOR NETWORKS

    DATA AND SENSOR FUSION

    INTELLIGENT EMBEDDED SYSTEMS AND I-SENSE

    I-SENSE ARCHITECTURE

    CONCLUSION

    REFERENCES


    WIRELESS SENSOR NETWORKS

    A wireless sensor network is an infrastructure composed of sensing, computing, and
    communication elements that gives an administrator the ability to instrument, observe,
    and react to events and phenomena in a specified environment.

    Four basic components in a wireless sensor network:

    (1) an assembly of distributed or localized sensors

    (2) an interconnecting network

    (3) a central point of information clustering

    (4) a set of computing resources at the central point (or beyond) to handle data
    correlation, event trending, status querying, and data mining

    Sensor devices, or wireless nodes (WNs), are also sometimes called motes.


    WIRELESS SENSOR NETWORKS (contd.)

    A WSN consists of densely distributed nodes that support sensing, signal processing,
    embedded computing, and connectivity.

    Sensors range from nanoscopic-scale devices (1 to 100 nm in diameter) and
    mesoscopic-scale devices (100 to 10,000 nm in diameter) to microscopic-scale devices
    (10 to 1000 µm) and macroscopic-scale devices (millimeter to meter).

    Miniaturized sensors that are directly embedded in some physical infrastructure are
    called microsensors.


    WIRELESS SENSOR NETWORKS (contd.)

    Properties of WSN deployments

    Wireless ad hoc nature

    Mobility and topology changes

    Energy limitations

    Physical distribution

    WSN challenges

    Design and Deployment

    Localization

    Data Aggregation and Sensor Fusion

    Energy-Aware Routing and Clustering

    Scheduling

    Security

    Quality of Service(QoS) Management


    DATA FUSION

    Combination of several data sources with the goal of obtaining data that is of higher
    quality than the original data.

    The Defense Science and Technology Organization of the Australian Department of
    Defense describes data fusion as a multilevel, multifaceted process dealing with the
    automatic detection, association, correlation, estimation, and combination of data and
    information from single and multiple sources.

    Three models of data fusion are:

    JDL

    Multi-sensor Integration

    Waterfall


    DATA FUSION MODELS (contd.)

    The JDL model describes a number of levels for data fusion:

    (i) the location and identification of objects,

    (ii) the construction of an image from incomplete information,

    (iii) the provision of possible opportunities (i.e., prediction of effects on situations), and

    (iv) the optimization of sensor allocations.

    The Multi-sensor Integration model collects data from various sources and combines it
    in a hierarchical way within embedded fusion centers. Data collected at the sensor level
    is transferred to the fusion centers, where the fusion process takes place.

    The Waterfall model has the flow of data operating from the basic data level (data
    gathered from the sensors) to the abstract decision-making level. The system is updated
    continuously with feedback information from the decision-making level. These feedback
    elements advise the system on reconfiguration, recalibration and data-gathering aspects.


    SENSOR FUSION

    Block diagram of sensor fusion

    Sensor fusion, or multi-sensor fusion, is considered to be a subset of data fusion. It is
    defined as "the theory, techniques and tools which are used for combining sensor data,
    or data derived from sensory data, into a common representational format. In
    performing sensor fusion our aim is to improve the quality of the information, so that it
    is, in some sense, better than would be possible if the data sources were used
    individually."


    SENSOR FUSION (contd.)

    GOALS OF SENSOR FUSION

    Representation

    Certainty

    Accuracy

    Robustness

    Completeness

    SENSOR CONFIGURATIONS

    Complementary

    Competitive

    Cooperative


    SENSOR FUSION METHODS

    Aggregation Fusion Methods

    Bayesian Fusion Methods

    Dempster-Shafer Fusion Methods

    Neural Network Fusion Methods

    Fuzzy Fusion Methods

    Bayesian Fusion Method:

    A probabilistic sensor fusion approach that is closely linked to Bayes' theorem. The core
    idea of Bayesian fusion is to assume that the true value of the measurand as well as all
    sensor values are random variables. Given the sensor values and an a-priori distribution
    of the measurand, the most probable value of the measurand is determined.


    SENSOR FUSION METHODS (contd.)

    Bayes' theorem states that

    Pr(p | v) = Pr(v | p) Pr(p) / Pr(v)

    where

    p is a random variable describing the true value of the measurand

    v is a random variable describing the output of a sensor

    Pr(v | p) is called the likelihood, and describes the probability of a sensor returning the
    value v if the true value of the measurand is p

    Pr(p) is called the a-priori distribution and describes the probability that the
    measurand's true value is p.

    Thus, Bayes' theorem states that, given the likelihood of the sensor measurements and
    the a-priori distribution of the measurand's value, one can derive the a-posteriori
    distribution Pr(p | v).
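    As a minimal illustration (not part of the original slides), the a-posteriori distribution
    over a discretized measurand can be computed directly from an assumed Gaussian
    likelihood and a uniform prior; all names and numbers below are illustrative
    assumptions:

```python
import numpy as np

# Minimal sketch of Bayesian fusion over a discretized measurand p.
p_values = np.linspace(0.0, 10.0, 101)          # candidate true values of the measurand
prior = np.ones_like(p_values) / len(p_values)  # a-priori distribution Pr(p), here uniform

def gaussian_likelihood(v, p, sigma):
    """Pr(v | p): probability of a sensor reporting v when the true value is p."""
    return np.exp(-0.5 * ((v - p) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Two independent sensor readings with different (assumed) noise levels.
readings = [(4.2, 0.5), (4.6, 1.0)]             # (measured value v, sensor noise sigma)

posterior = prior.copy()
for v, sigma in readings:
    posterior *= gaussian_likelihood(v, p_values, sigma)   # combine likelihoods
posterior /= posterior.sum()                                # normalize: Pr(p | v1, v2)

fused_estimate = p_values[np.argmax(posterior)]             # most probable measurand value
print(f"fused estimate: {fused_estimate:.2f}")
```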

    Dempster-Shafer Fusion Method:

    This theory distinguishes between a degree of belief and a degree of plausibility. It
    allows reasoning in situations with uncertain knowledge and was initially developed as a
    generalization of Bayesian statistics. Dempster-Shafer theory is based on two ideas:
    obtaining degrees of belief for one question from subjective probabilities for a related
    question, and Dempster's rule for combining such degrees of belief when they are based
    on independent items of evidence.
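    A small sketch of Dempster's rule of combination over a two-element frame of
    discernment; the mass values and class names are made up for illustration:

```python
from itertools import product

# Frame of discernment: an observed object is a 'car' or a 'truck'.
# Mass functions assign belief to subsets of the frame, including the full frame (ignorance).
m1 = {frozenset({'car'}): 0.6, frozenset({'truck'}): 0.1, frozenset({'car', 'truck'}): 0.3}
m2 = {frozenset({'car'}): 0.5, frozenset({'truck'}): 0.2, frozenset({'car', 'truck'}): 0.3}

def combine(m1, m2):
    """Dempster's rule: fuse two independent bodies of evidence."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass assigned to contradictory evidence
    # Normalize by the non-conflicting mass.
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

fused = combine(m1, m2)
for subset, mass in fused.items():
    print(set(subset), round(mass, 3))
```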


    INTELLIGENT EMBEDDED SYSTEMS AND I-SENSE

    Embedded systems that automatically diagnose and plan courses of action in reactive
    time.

    Such a system reacts appropriately to changing situations without user input.

    Need for an intelligent solution:

    Dependability

    Efficiency

    Autonomy

    Easy Modeling

    Maintenance costs

    Insufficient alternatives

    I-SENSE is an intelligent multi-sensor fusion framework for embedded online data
    fusion.


    I-SENSE

    The I-SENSE framework is based on embedded intelligent sensor nodes with sufficient
    computing and communication performance and a suitable embedded architecture,
    which allows distributing tasks among geographically distinct sensor nodes.

    Features:

    Embedded architecture

    Embedded intelligent sensor nodes

    Dynamic reconfiguration

    Effective online optimization

    Sensory data fusion

    Reduced communication bandwidth

    Light-weight middleware


    I-SENSE Architecture

    The architecture is categorized into three parts:

    (i) the Distributed Embedded platform,

    (ii) the HW/SW Architecture, and

    (iii) the Multi-level Fusion framework.


    I-SENSE: DISTRIBUTED EMBEDDED PLATFORM

    The I-SENSE distributed platform consists of a two-level architecture:

    The top level is composed of a network of geographically distributed sensor nodes.

    The bottom level consists of the sensor nodes of the I-SENSE platform, which are the
    main processing components.


    I-SENSE: HW/SW ARCHITECTURE

    Hardware model

    Software model


    HARDWARE MODEL

    Set of connected hardware nodes (N1 . . . N3) with specific parameters (computing
    power, size of memory, different sensors).

    Each hardware node has at least one general-purpose CPU (parent) and optionally some
    digital signal processors (children) coupled via PCI, and various ports to interface
    sensors.

    The hardware topology of an I-SENSE network


    HARDWARE ACCESSORIES

    I-SENSE Multisensor platform: ePCI-101 Kontron board, together with a PCI backplane.

    The baseboard is equipped with an Intel Pentium M processor with a passive heat sink
    running at up to 1.6 GHz and 512 MB external memory; the current backplane offers
    four PCI slots.

    The ePCI-101 board provides two 100 Mbit/s Ethernet ports, two serial ports, several
    USB ports, a VGA connector and IDE connectors.

    The on-board CF slot is well suited to store the operating system, the fusion software
    framework and the initial configuration on an affordable 256 MB flash card.

    Network Video Development Kits (NVDK) from ATEME serve as the DSP platform,
    equipped with Texas Instruments TMS320C6416 DSPs running at 600 MHz and with a
    total of 264 MB of memory.


    HARDWARE ACCESSORIES

    The CMOS sensor KAC-9628 from Kodak is used to capture color images. This image
    sensor provides a high dynamic range of up to 110 dB at VGA resolution. To extend the
    visible spectrum, an infrared camera with night-vision features is connected to the
    NVDK.

    A professional audio card (Audiophile 2496 from M-Audio) allows the system to capture
    audio signals with up to 96 kHz sampling rate and 24-bit resolution.

    Sensors like inductive loop sensors or radar equipment can be connected via PCI or USB
    to the I-SENSE platform.

    KAC-9628 and ePCI-101 Kontron board


    Software model

    Set of communicating tasks which may be represented as a task graph G = (N, E).

    A weighted directed acyclic graph, consisting of nodes N = (n1, n2, ..., nm) which
    represent the fusion tasks, and edges E = (e12, e13, ..., enm) which represent the data
    flow between those tasks.

    Each node has properties describing the (hardware/resource) requirements of a task.
    Every edge from node u to node v (euv) indicates the required communication
    bandwidth between those two tasks.
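    A minimal data-structure sketch of such a task graph, with hypothetical task names,
    resource requirements and edge bandwidths (not the actual I-SENSE data model):

```python
from dataclasses import dataclass, field

@dataclass
class FusionTask:
    name: str
    cpu_load: float      # required computing power (normalized)
    memory_kb: int       # required memory

@dataclass
class TaskGraph:
    nodes: dict = field(default_factory=dict)   # name -> FusionTask
    edges: dict = field(default_factory=dict)   # (u, v) -> required bandwidth in kbit/s

    def add_task(self, task):
        self.nodes[task.name] = task

    def add_flow(self, u, v, bandwidth_kbps):
        self.edges[(u, v)] = bandwidth_kbps

g = TaskGraph()
g.add_task(FusionTask("capture", cpu_load=0.2, memory_kb=512))
g.add_task(FusionTask("feature_extract", cpu_load=0.6, memory_kb=2048))
g.add_task(FusionTask("decision", cpu_load=0.1, memory_kb=256))
g.add_flow("capture", "feature_extract", 8000)   # raw data stream
g.add_flow("feature_extract", "decision", 64)    # compact feature vectors
```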


    The Configuration Method


    HW/SW ARCHITECTURE (contd.)

    Optimizer: Both models are used as input to an Optimizer, which computes the optimal
    mapping of the real-time fusion tasks onto the sensor nodes using a genetic algorithm
    (see the sketch after this list).

    Configuration Synthesizer: It integrates the fusion tasks into the runtime environment
    of the distributed embedded platform in three steps:

    First, the dynamic link libraries for the fusion nodes are loaded on the specified
    hardware node.

    Second, the defined communication channels are established between the fusion tasks.

    Third, the initialization routines of all sensor and fusion nodes are called.

    Task Monitor: It runs on every sensor node of the embedded system. It periodically
    checks the health of its processor, the communication links and the utilization of the
    resources under its administration.

    (Re)Configurator: The (Re)Configurator is responsible for maintaining the fusion and
    hardware models.
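    A highly simplified sketch of a genetic-algorithm task mapping of the kind the Optimizer
    performs; the chromosome encoding, task loads and node capacities below are
    illustrative assumptions, not the I-SENSE implementation:

```python
import random

# A chromosome assigns each fusion task (index) to a hardware node (value).
TASK_LOADS = [0.2, 0.6, 0.1, 0.4]        # assumed CPU load per task
NODE_CAPACITY = [1.0, 1.0]               # assumed capacity per hardware node

def fitness(mapping):
    """Penalize overloaded nodes; lower is better."""
    load = [0.0] * len(NODE_CAPACITY)
    for task, node in enumerate(mapping):
        load[node] += TASK_LOADS[task]
    return sum(max(0.0, l - c) for l, c in zip(load, NODE_CAPACITY))

def evolve(pop_size=20, generations=50):
    pop = [[random.randrange(len(NODE_CAPACITY)) for _ in TASK_LOADS]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(TASK_LOADS))      # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:                       # mutation
                child[random.randrange(len(child))] = random.randrange(len(NODE_CAPACITY))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

print(evolve())    # best task-to-node mapping found
```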


    MIDDLEWARE

    Middleware usually sits below the application level and on top of the operating systems
    and network protocols.

    It gathers information from the application and network protocols and determines how
    to support the applications, while at the same time adjusting network protocol
    parameters.

    Middleware architecture for wireless sensor networks


    MIDDLEWARE FUNCTIONS AND TYPES

    Middleware functions for WSNs are as follows:

    standardized system service

    environment that coordinates and supports multiple applications

    adaptive and efficient utilization of system resources

    efficient trade-offs between the multiple QoS dimensions

    Existing middleware:

    MiLAN (Middleware Linking Applications and Networks)

    IrisNet (Internet-Scale Resource-Intensive Sensor Networks Services)

    AMF (Adaptive Middleware Framework)

    DSWare (Data Service Middleware)

    CLMF (Cluster-Based Lightweight Middleware Framework)

    DDS (Device Database System)

    SensorWare

    DFuse


    I-SENSE MIDDLEWARE

    Message router

    Handles data transfer from one fusion task to another, either on the same processor via
    shared memory, on the same node via PCI, or on a distant node via Ethernet.

    Supports message forwarding for tasks which have been migrated to another processor.

    Task loader

    It accepts requests to load, start, stop, migrate and remove fusion tasks.

    Services of the I-SENSE middleware
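    An illustrative sketch of the message-router idea (local delivery via an in-process queue,
    forwarding for migrated tasks, otherwise a network transport); the class and its methods
    are assumptions, not the real I-SENSE middleware API:

```python
import queue

class MessageRouter:
    def __init__(self, network_send):
        self.local_ports = {}            # task name -> queue on this processor
        self.forwarding = {}             # migrated task -> new location
        self.network_send = network_send # callback for the PCI/Ethernet transport

    def register(self, task_name):
        self.local_ports[task_name] = queue.Queue()
        return self.local_ports[task_name]

    def send(self, dst_task, message):
        if dst_task in self.local_ports:
            self.local_ports[dst_task].put(message)                 # same processor: shared memory
        elif dst_task in self.forwarding:
            self.network_send(self.forwarding[dst_task], message)   # task migrated elsewhere
        else:
            self.network_send(dst_task, message)                    # remote node via PCI/Ethernet

# Hypothetical usage: deliver locally if registered, otherwise hand to the transport.
router = MessageRouter(network_send=lambda dst, msg: print("forward", dst, msg))
inbox = router.register("feature_extract")
router.send("feature_extract", {"frame": 42})
print(inbox.get())
```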


    The I-SENSE middleware from the viewpoint of a fusion task


    FUSION TASKS

    The Fusion controller sends a request to load a specific task in the form of a
    dynamically loadable library.

    Creation and registration of the communication links.

    If all previous steps have been completed successfully, the task's main routine is called
    in its own thread. It is the Resource monitor's responsibility to keep a record of all
    resources consumed by a task (memory blocks, DMA channels, ...).

    DSP Monitor: to detect software failures.

    Diagnosis Unit block: to detect hardware failures.

    After an initialization phase, a message is taken from one port, its data is processed and
    the result is posted on another port. This procedure is repeated (see the sketch after
    this list). If the task is requested to prepare for a migration, it must store its context in
    the task environment, so that it can continue its work on the new processor without
    information loss.

    Time base module: provides a uniform time base for all nodes, which tasks can query.
    Tasks can fork new threads by using the Scheduler module.

    Memory Management module: standardizes and encapsulates the hardware-dependent
    memory management functions of the underlying operating system.

    DmaManager: provides a variety of functions to ease the programming of DMA
    transfers on TI DSPs.
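    An illustrative sketch of a fusion task's main loop with context hand-over for migration;
    the port/queue interfaces and the processing step are assumptions, not the real I-SENSE
    task interface:

```python
import queue

def fusion_task_main(in_port, out_port, control, context):
    context.setdefault("processed", 0)           # state that survives a migration
    while True:
        data = in_port.get()                     # take a message from one port
        result = sum(data) / len(data)           # placeholder processing step
        context["processed"] += 1
        out_port.put(result)                     # post the result on another port
        if not control.empty() and control.get() == "migrate":
            return context                       # hand back context so work resumes elsewhere

in_q, out_q, ctl = queue.Queue(), queue.Queue(), queue.Queue()
in_q.put([1.0, 2.0, 3.0])
ctl.put("migrate")                               # request migration after the next message
ctx = fusion_task_main(in_q, out_q, ctl, {})
print(out_q.get(), ctx)
```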


    I-SENSE: MULTI-LEVEL FUSION FRAMEWORK

    Multi-Level Fusion is a technique by which data from several sensors are combined
    through a data processor to provide comprehensive and accurate information.

    Three fusion methods:

    (i) raw-data fusion,

    (ii) feature-level fusion, and

    (iii) decision fusion

    Multi-level Data Fusion framework


    I-SENSE: MULTI-LEVEL FUSION FRAMEWORK

    Raw-data fusion

    Fusion of multi-sensor data to determine the position, velocity, and identity of a
    tracked object. Raw, uncorrelated data is provided to the user.

    Feature-level fusion

    Data fusion provides a higher level of inference and delivers additional interpretive
    meaning suggested by the raw data; data is fused at the feature level.

    Decision fusion

    Data fusion is designed to make assessments and provide recommendations to the user
    or human observer.


    MULTI-LEVEL FUSION METHODS

    Raw-data fusion

    Key problems which have to be solved at this level of data abstraction are

    (i) data association and

    (ii) positional estimation.

    Data association is a general method of combining multi-sensor data by correlating one
    sensor's observation set with another sensor's observation set.

    Positional estimation methods include Kalman filtering and Bayesian methods.
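    A minimal one-dimensional Kalman filter sketch for positional estimation; the noise
    variances and measurements are illustrative, not taken from the slides:

```python
def kalman_step(x, p, z, q=0.01, r=0.25):
    """One predict/update cycle.
    x, p : previous state estimate and its variance
    z    : new sensor measurement
    q, r : assumed process and measurement noise variances
    """
    # Predict (constant-position model).
    x_pred, p_pred = x, p + q
    # Update with the measurement.
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0
for z in [0.9, 1.1, 1.05, 0.95]:       # noisy position readings
    x, p = kalman_step(x, p, z)
print(round(x, 3), round(p, 3))        # fused position estimate and its variance
```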


    MULTI-LEVEL FUSION METHODS

    Feature fusion

    Approaches are typically based on

    (i) Bayesian theory and

    (ii) Dempster-Shafer theory.

    Bayesian theory is limited in its ability to handle uncertainty in sensor data, which
    hinders the application of this data fusion technique, as sensor data are by nature
    highly uncertain.

    Dempster-Shafer theory is a generalization of Bayesian reasoning that offers a way to
    combine uncertain information from disparate sensor sources, so it can handle
    uncertainty in the sensor data.


    MULTI-LEVEL FUSION METHODS

    Decision fusion

    Combines the decisions of independent sensor detection/classification paths.

    Two basic methods for making classification decisions are:

    Hard decisions, where each chain commits to a single, optimum choice, and

    Soft decisions, in which decision uncertainty in each sensor chain is maintained and
    combined into a composite measure of uncertainty.
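    A small sketch contrasting hard and soft decision fusion over three sensor chains; the
    class probabilities are made-up numbers:

```python
import numpy as np

posteriors = np.array([
    [0.7, 0.3],     # sensor chain 1: P(car), P(truck)
    [0.4, 0.6],     # sensor chain 2
    [0.8, 0.2],     # sensor chain 3
])
classes = ["car", "truck"]

# Hard decision: each chain commits to one class, then majority vote.
votes = posteriors.argmax(axis=1)
hard = classes[np.bincount(votes).argmax()]

# Soft decision: keep each chain's uncertainty and combine it (here by averaging).
soft = classes[posteriors.mean(axis=0).argmax()]

print("hard:", hard, "soft:", soft)
```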


    I-SENSE DATA-ORIENTED FUSION MODEL

    Three different layers: the sensing units, the fusion layer and the sensor control &
    management unit.

    Sensor control & management unit: controls the overall fusion process and provides
    access to a database where resource requirements for the different fusion tasks are
    stored.

    Sensing units: represent the intelligent sensors, which consist of physical sensors and
    suitable data pre-processors (e.g., resolution-based down-sampling, automatic gain
    control, etc.).

    Local feature extraction unit (LFE): extracts a single-source feature vector of an
    observed object.

    Local decision extraction unit (LDE): extracts a local decision from the individual
    object's features (e.g., classification of the object's identity).


    I-SENSE data-oriented fusion model

    The fusion layer includes the following five functional units:

    Data in data out unit (DIDO): raw-data fusion unit (RDF), where raw, uncorrelated data
    is fused from different and/or similar multiple sensors.

    Data in feature out unit (DIFO): feature extraction II unit (FEII), where raw data from
    the individual sensors and/or fused raw data is used to extract suitable features of the
    individual tracked objects.

    Feature in feature out unit (FIFO): feature fusion unit (FF), where features are fused
    into an overall feature vector for individual objects.

    Feature in decision out unit (FIDeO): decision fusion unit (DF), where a classifier based
    on support vector machines is trained with previously recorded and classified sequences.

    Decision in decision out unit (DeIDeO): decision fusion unit, where decisions extracted
    by the LDE units of multiple sensors are fused.
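    An illustrative sketch of chaining such units (DIDO, DIFO, FIFO, FIDeO); the function
    bodies are placeholders rather than the real fusion algorithms:

```python
def dido(raw_streams):
    """Data in, data out: fuse raw samples from multiple sensors."""
    return [sum(samples) / len(samples) for samples in zip(*raw_streams)]

def difo(raw):
    """Data in, feature out: extract a simple feature vector from raw data."""
    return {"mean": sum(raw) / len(raw), "peak": max(raw)}

def fifo(feature_vectors):
    """Feature in, feature out: merge per-sensor features into one vector."""
    merged = {}
    for fv in feature_vectors:
        for k, v in fv.items():
            merged.setdefault(k, []).append(v)
    return {k: sum(v) / len(v) for k, v in merged.items()}

def fideo(features, threshold=0.5):
    """Feature in, decision out: placeholder classifier (an SVM in the slides)."""
    return "object present" if features["peak"] > threshold else "no object"

streams = [[0.2, 0.9, 0.4], [0.3, 0.7, 0.5]]       # two synchronized sensors
decision = fideo(fifo([difo(dido(streams))]))
print(decision)
```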


    CONCLUSION

    Sensor fusion improves the quality and robustness of many applications. Since sensor,
    computing and communication devices are getting more capable, smaller and cheaper at
    a very fast pace, fusion will become an enabling technology for many embedded
    applications.

    The I-SENSE architecture is the development of a fully embedded, distributed, real-time
    data fusion system. Instead of performing the computation on a central server, it
    delegates the functionality to intelligent embedded sensor nodes.

    I-SENSE can be used in surveillance systems combining visual, acoustic, tactile or
    location-based information.

    I-SENSE can also be used in robotics, medical systems and chemical processes.


    REFERENCES

    [1] Andreas Klausner, Bernhard Rinner and Allan Tengg, I-SENSE: Intelligent Embedded
    Multi-Sensor Fusion, International Workshop on Intelligent Solutions in Embedded Systems, 2006

    [2] Allan Tengg, Andreas Klausner and Bernhard Rinner, I-SENSE: A Light-Weight Middleware
    for Embedded Multi-Sensor Data-Fusion, International Workshop on Intelligent Solutions in
    Embedded Systems, 2007

    [3] Andreas Klausner, Allan Tengg and Bernhard Rinner, Distributed Multilevel Data Fusion for
    Networked Embedded Systems, IEEE Journal of Selected Topics in Signal Processing, Volume 2,
    Issue 4, Aug 2008

    [4] Ananthram Swami, Qing Zhao, Yao-Win Hong and Lang Tong, Wireless Sensor Networks:
    Signal Processing and Communications Perspectives, John Wiley & Sons, 2007

    [5] Kazem Sohraby, Daniel Minoli and Taieb Znati, Wireless Sensor Networks: Technology,
    Protocols, and Applications, John Wiley & Sons, 2007

    [6] Feng Zhao and Leonidas Guibas, Wireless Sensor Networks: An Information Processing
    Approach, Elsevier Inc., 2005


    REFERENCES

    [7] James L. Crowley and Yves Demazeau, Principles and Techniques for Sensor Data Fusion

    [8] P. J. Escamilla-Ambrosio and N. Mort, Multisensor Data Fusion Architecture Based on
    Adaptive Kalman Filters and Fuzzy Logic Performance Assessment, 2002

    [9] Kai-Wei Chiang and Hsiu-Wen Chang, Intelligent Sensor Positioning and Orientation
    Through Constructive Neural Network-Embedded INS/GPS Integration Algorithms, 2010

    [10] Andreas Doblander, Arnold Maier, Bernhard Rinner and Helmut Schwabach,
    Improving Fault-Tolerance in Intelligent Video Surveillance by Monitoring, Diagnosis and
    Dynamic Reconfiguration, International Workshop on Intelligent Solutions in Embedded
    Systems, 2005

    [11] David Kortenkamp, Patrick Beeson and Nick Cassimatis, Sensor-to-Symbol Reasoning
    for Embedded Intelligence, 2010

    [12] Brian C. Williams, Michel D. Ingham, Seung H. Chung and Paul H. Elliott, Model-Based
    Programming of Intelligent Embedded Systems and Robotic Space Explorers, 2003

    [13] Sing-Yiu Cheung and Pravin Varaiya, Traffic Surveillance by Wireless Sensor Networks:
    Final Report, 2008