
  • S.BHARATHIMOHAN & B.RAJESHKANNA Page 1 of 16 REF: PRIST UNIVERSITY


M.TECH (ENV. ENGG), SEM – II : REMOTE SENSING AND GEOGRAPHICAL INFORMATION SYSTEM

UNIT - I (Part – A)

    1. DEFINE REMOTE SENSING?

Remote Sensing is the science and art of obtaining information about an object by a recording device (sensor) that is not in physical contact with the object, by measuring a portion of the reflected or emitted electromagnetic radiation from the earth's surface.

    2. GIVE TWO REMOTE SENSING APPLICATIONS OF NATURAL HAZARDS.

Mapping areas vulnerable to earthquakes, floods, cyclones, storms, drought, fire, volcanoes, landslides and soil erosion, which can then be used to help predict future disasters.

    3. GIVE TWO REMOTE SENSING APPLICATIONS OF ARCHAEOLOGY.

The preservation and exploitation of cultural heritage, particularly archaeological assets, has recently directed remote sensing applications above all towards the multi-temporal monitoring of existing archaeological sites and their surrounding areas, in order to study the evolution over time of the most significant environmental and anthropic parameters (changes in land use, characteristics of vegetation, urban sprawl, thermal anomalies, etc.).

    4. MENTION RECENT TWO GEOSTATIONARY SATELLITES SENT TO SPACE IN ISRO PROGRAM.

    GSAT - 16 on 07.12.2014

    GSAT - 14 on 05.01.2014

5. MENTION RECENT TWO SUN-SYNCHRONOUS SATELLITES SENT TO SPACE IN ISRO PROGRAM.

    SARAL on 25.02.2013

RISAT – 1 on 26.4.2012

    6. WHAT ARE THE BASIC PROCESSES AND ELEMENTS INVOLVED IN ELECTROMAGNETIC REMOTE

    SENSING OF EARTH RESOURCES?

    Two main processes involved in passive or electromagnetic remote sensing are

    (i) Data Acquisition (ii) Data Analysis

    (i) Data Acquisition: It comprises the following distinctive elements namely

    a. Energy sources


    b. Propagation of energy through the atmosphere

    c. Energy interactions with the earth surface features

    d. Air borne, space borne sensors to record the reflected energy

    e. Generation of sensor data as pictorial or digital information

(ii) Data Analysis: It can be broadly classified as

a. Visual Image Interpretation: This involves the examination of data with various viewing instruments to analyze pictorial data.

b. Digital Image Processing: When computers are used to analyze digital data, the process is called digital image processing.

    7. WRITE ABOUT ATMOSPHERIC WINDOWS.

These are certain regions of the electromagnetic spectrum which can penetrate through the atmosphere without any significant loss of radiation. Such regions are called atmospheric windows. In these regions the atmospheric absorption is low, i.e. the atmosphere is particularly transmissive of energy.

Atmospheric Windows (Wavelength)

Visible : 0.38 – 0.72 µm
Near and middle infrared : 0.72 – 3 µm
Thermal infrared sensing : 8 – 14 µm
Radar sensing : 1 mm – 1 m
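The windows in the table above can be captured in a small lookup. A minimal Python sketch (the function name and structure are illustrative, not part of the key) classifies a wavelength into its window:

```python
# Atmospheric windows as (name, lower bound, upper bound), in micrometres.
# Radar window: 1 mm = 1 000 um, 1 m = 1 000 000 um.
ATMOSPHERIC_WINDOWS = [
    ("Visible", 0.38, 0.72),
    ("Near and middle infrared", 0.72, 3.0),
    ("Thermal infrared", 8.0, 14.0),
    ("Radar", 1_000.0, 1_000_000.0),
]

def window_for(wavelength_um):
    """Return the window name for a wavelength in micrometres, or None."""
    for name, lo, hi in ATMOSPHERIC_WINDOWS:
        if lo <= wavelength_um <= hi:
            return name
    return None  # falls in an absorption band, not a window

print(window_for(0.55))   # green light -> Visible
print(window_for(10.0))   # -> Thermal infrared
```

A wavelength such as 5 µm falls between the windows, i.e. in an absorption band, so the function returns None.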

    8. BRIEFLY WRITE ABOUT SPECTRUM.

    The electromagnetic spectrum (EMS) may be defined as the ordering of the radiation according to

    wavelength, frequency or energy. The electromagnetic spectrum (EMS) can be explained as the

    continuum of energy that ranges from meters to nanometers in wavelength.

    9. BRIEFLY WRITE ABOUT THE ENERGY INTERACTION WITH ATMOSPHERE.

All electromagnetic radiation, irrespective of its source, has to pass through the atmosphere before and after interacting with the earth's surface and before it is detected by the remote sensors. This distance is called the atmospheric path length, and it varies depending upon the type of sensor.

In the case of space-borne sensors, the sunlight passes through the full thickness of the earth's atmosphere twice on its journey from source to sensor (two path lengths). In airborne sensors, the sensor detects energy emitted directly from objects on the earth, so only a single atmospheric path length is involved.


The net effect of the atmosphere depends upon the differences in path length, the magnitude of the energy signal being sensed, the atmospheric conditions and the wavelengths. These effects are due to the mechanisms of atmospheric scattering and absorption (atmospheric effects).

    10. DEFINE ACTIVE SYSTEMS OF REMOTE SENSING.

Description : Active Systems of RS
Energy Source : Own energy
Region of spectrum in which they operate : Microwave region of the electromagnetic spectrum
Wavelength : Longer than 1 mm
Example : SAR (Synthetic Aperture Radar)

    11. DEFINE PASSIVE SYSTEMS OF REMOTE SENSING

Description : Passive Systems of RS
Energy Source : Depend on solar radiation
Region of spectrum in which they operate : Visible and infrared regions of the electromagnetic spectrum
Wavelength : Range from 0.4 to 10 µm
Example : Any electromagnetic remote sensing system (camera without flash)

    12. DIFFERENTIATE BETWEEN SELECTIVE AND NON-SELECTIVE SCATTERING.

    Scattering is the unpredictable diffusion of radiation caused by the molecules of the gases, dust and

    smoke in the atmosphere. Scattering is based on the particle sizes in the atmosphere.

    SELECTIVE SCATTERING:

a. Rayleigh scattering

It happens in the upper part of the atmosphere, when radiation interacts with atmospheric molecules and other tiny dust particles that are smaller in diameter than the wavelength of the interacting radiation. Ex.: the blue colour of the sky.

b. Mie scattering

This is lower-atmosphere scattering, from 0 to 5 km. It happens when the diameters of the atmospheric particles are about the same size as the wavelength of the radiation being sensed. Spherical particles of water vapour, pollen grains and dust are the main causes of Mie scattering.

    NON-SELECTIVE SCATTERING:

Non-selective scattering occurs when the sizes of the effective atmospheric particles are much larger than the wavelength of the radiation. Water droplets, ice and snow crystals are the main causes of this scattering.


    13. DIFFERENTIATE BETWEEN GEOSTATIONARY AND SUN-SYNCHRONOUS SATELLITES.

GEOSTATIONARY SATELLITES
These satellites are stationary with respect to a given position on the earth's surface.
Purpose : Weather and communication
Altitude : approx. 35,790 km
Ex. : GSAT – 16

SUN-SYNCHRONOUS SATELLITES
These rotate at the same rate as the mean rotation of the earth around the sun, on a plane near to polar.
Purpose : Earth resources
Altitude : approx. 300 – 800 km
Ex. : RISAT – 1

14. WRITE ADVANTAGES OF REMOTE SENSING.

1. Remote sensing detects features which are not visible to the human eye, such as a dense forest, the Antarctic region and inaccessible areas.

2. It provides up-to-date and continuous information about an area, such as the changing pattern of wealth, land use etc.

3. It helps planners in formulating policies and programmes to achieve the holistic functioning of the environment, because of its speedy, accurate and up-to-date information.

4. It caters to the information needed by agriculturists to identify areas affected by pests, crop disease, water logging, wasteland etc.

5. It spots areas of natural disasters such as tsunami, drought-prone, flood-affected and cyclone-hit areas. It is highly useful for detecting damage, estimating the loss, providing relief and rehabilitation, and helps in reconstruction.

6. The most important utility of remote sensing is in the science of cartography. It enables cartographers to prepare thematic maps like geological maps, soil maps, population maps etc. with greater accuracy and speed.

    15. WRITE RELATION BETWEEN WAVELENGTH AND FREQUENCY.

    Wavelength

The distance from one wave peak to another is called wavelength. It is measured in meters or fractions of meters such as nanometers (nm, 10⁻⁹ meters), micrometers (µm, 10⁻⁶ meters) and centimeters (cm, 10⁻² meters). Wavelength is usually represented by the Greek letter lambda (λ).

Frequency

The number of wave peaks passing a fixed point in a given period of time. It is measured in hertz (Hz). The speed of EM energy is constant; its value is 3 × 10⁸ m/sec.

Wavelength and Frequency are related by the following formula:

c = λ × ν

c = speed of light (3 × 10⁸ m/sec)
λ = wavelength (m)
ν = frequency (cycles per second, Hz)
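The relation above can be checked numerically. A minimal Python sketch (illustrative only):

```python
# c = lambda * nu  =>  nu = c / lambda, with c the speed of light.
C = 3e8  # speed of light, m/s (as given in the text)

def frequency(wavelength_m):
    """Frequency in Hz for a wavelength given in metres."""
    return C / wavelength_m

print(frequency(3.0))      # 3 m radar wavelength -> 1e8 Hz (100 MHz)
print(frequency(0.55e-6))  # green light at 0.55 um
```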


    16. BRIEFLY DESCRIBE EFFECT OF ATMOSPHERE ON ELECTROMAGNETIC RADIATION

    When electromagnetic radiation travels through the atmosphere, it may be absorbed or scattered

    by the constituent particles of the atmosphere.

    Absorption converts the radiation energy into excitation energy of the molecules.

    Scattering redistributes the energy of the incident beam to all directions. The overall effect

    is the removal of energy from the incident radiation.

    17. BRIEFLY WRITE ABOUT SCATTERING?

    Scattering is the unpredictable diffusion of radiation caused by the molecules of the gases, dust and

    smoke in the atmosphere. Scattering is based on the particle sizes in the atmosphere.

    There are four different types of scattering. They are

a. Rayleigh scattering b. Mie scattering

c. Non-selective scattering d. Raman scattering

    18. DIFFERENTIATE BETWEEN RADIANT FLUX AND IRRADIANCE.

Radiant Flux: The amount of radiant energy per unit time either emitted, received or transmitted across an area is called radiant flux; units: J s⁻¹ = W (watt).

Irradiance: The irradiance is the radiant flux density received by a surface (W m⁻²).

    19. DIFFERENTIATE BETWEEN RADIANT ENERGY AND RADIANT INTENSITY

Radiant energy, denoted by the symbol Q, is the measure of all the energy received at a particular point, or all the energy contained in a particular radiation field. Radiant energy is measured in watt-seconds (joules).

Radiant intensity, denoted by the letter I, is the amount of power radiated per unit solid angle, measured in W/sr (watts per steradian).

    ================================================================================


UNIT - II (Part – A)

1. DEFINE RADAR.

    RADAR (RAdio Detection And Ranging) is an object detection system, which uses radio waves to

    determine the range, altitude, direction or speed of objects. It can be used to detect aircraft, ships,

    spacecraft, guided missiles, motor vehicles, weather formations and terrain.

    2. DEFINE LIDAR.

    LIDAR (Light Detection And Ranging or Laser Imaging Detection And Ranging) is an optical remote

    sensing technology that can measure the distance to, or other properties of, targets by illuminating

    the target with laser light and analyzing the backscattered light. LIDAR technology has applications

    in geometrics, archaeology, geography, geology, geomorphology, seismology, forestry, remote

    sensing, atmospheric physics, etc.

    3. WRITE RADAR EQUATION USED IN MICROWAVE REMOTE SENSING.

The power Pr returning to the receiving antenna is given by the equation:

Pr = (Pt × Gt × Ar × σ × F⁴) / ((4π)² × Rt² × Rr²)

where

Pt = transmitter power

Gt = gain of the transmitting antenna

Ar = effective aperture (area) of the receiving antenna

σ = radar cross section, or scattering coefficient, of the target

F = pattern propagation factor

Rt = distance from the transmitter to the target

Rr = distance from the target to the receiver.

In the common case where the transmitter and the receiver are at the same location, Rt = Rr and the term Rt²Rr² can be replaced by R⁴, where R is the range. This yields:

Pr = (Pt × Gt × Ar × σ × F⁴) / ((4π)² × R⁴)
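For the monostatic case (transmitter and receiver co-located), the equation above can be sketched in Python; the function name and all parameter values below are illustrative assumptions, not from the key:

```python
from math import pi

def received_power(pt, gt, ar, sigma, r, f=1.0):
    """Monostatic radar equation: Pr = Pt*Gt*Ar*sigma*F^4 / ((4*pi)^2 * R^4)."""
    return pt * gt * ar * sigma * f ** 4 / ((4 * pi) ** 2 * r ** 4)

# Illustrative values: 1 kW transmitter, antenna gain 100, 1 m^2 aperture,
# 5 m^2 radar cross section, targets at 10 km and 20 km.
p_near = received_power(1e3, 100, 1.0, 5.0, 1e4)
p_far = received_power(1e3, 100, 1.0, 5.0, 2e4)
print(p_near / p_far)  # doubling the range divides Pr by 2^4 = 16
```

The inverse fourth-power dependence on range is the key design constraint: doubling the range costs a factor of sixteen in returned power.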

    4. MENTION RADAR PRINCIPLE WITH FLOW CHART.


    A radar system has a transmitter that emits radio waves called radar signals in predetermined

    directions. When these are exposed to an object, they are usually reflected or scattered in many

directions. Radar signals are reflected especially well by materials of considerable electrical conductivity, especially by most metals, by seawater and by wet lands. Some of these make the use of radar altimeters possible. The radar signals that are reflected back towards the transmitter

    are the desirable ones that make radar work. If the object is moving either toward or away from the

    transmitter, there is a slight equivalent change in the frequency of the radio waves, caused by the

    Doppler effect.

    5. DEFINE BACK SCATTERING COEFFICIENT.

The backscattering, or backward scattering, coefficient, bb, has units of m⁻¹. It indicates the attenuation caused by scattering at angles from 90° to 180°. bb is commonly estimated from measurements of the VSF (volume scattering function) around a single fixed angle.

    6. WRITE ANY 4 BANDS USED IN MICROWAVE REM. SEN. AND MENTION THEIR WAVELENGTHS.

Band : Frequency range : Wavelength range : Typical uses

L band : 1 to 2 GHz : 15 cm to 30 cm : Military, GPS, mobile phones (GSM), amateur radio

S band : 2 to 4 GHz : 7.5 cm to 15 cm : Weather radar, surface ship radar and some communications satellites (microwave ovens, microwave devices/communications, mobile phones, wireless LAN, Bluetooth, ZigBee, GPS, amateur radio)

C band : 4 to 8 GHz : 3.75 cm to 7.5 cm : Long-distance radio telecommunications

X band : 8 to 12 GHz : 25 mm to 37.5 mm : Satellite communications, radar, terrestrial broadband, space communications, amateur radio

    7. WRITE ABOUT NADIR IN REMOTE SENSING

The surface directly below the satellite is called the nadir point. Nadir also refers to the downward-facing viewing geometry of an orbiting satellite, such as is employed during remote sensing of the atmosphere, as well as when an astronaut faces the Earth while performing a spacewalk.


    8. MENTION MAJOR COMPONENTS OF REMOTE SENSING TECHNOLOGY.

    1. Energy Source or Illumination

    2. Radiation and the Atmosphere

    3. Interaction with the Target

    4. Recording of Energy by the Sensor

    5. Transmission, Reception and Processing

    6. Interpretation and Analysis

    7. Application

    9. WHAT IS FORWARD LOOKING INFRARED SYSTEM (FLIR) IN REMOTE SENSING?

    An airborne, electro-optical thermal imaging device that detects far-infrared energy, converts the

    energy into an electronic signal, and provides a visible image for day or night viewing. Also called

    FLIR.

    10. WHAT IS MEANT BY SIDE LOOKING AIRBORNE RADAR (SLAR)?

    SLAR is an aircraft or satellite-mounted imaging radar pointing perpendicular to the direction of

    flight (hence side-looking).

    11. DEFINE RADIOMETRIC RESOLUTION?

    While the arrangement of pixels describes the spatial structure of an image, the radiometric

    characteristics describe the actual information content in an image. Every time an image is acquired

    on film or by a sensor, its sensitivity to the magnitude of the electromagnetic energy determines

    the radiometric resolution. The radiometric resolution of an imaging system describes its ability to

    discriminate very slight differences in energy. The finer the radiometric resolution of a sensor, the

    more sensitive it is to detecting small differences in reflected or emitted energy.
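Radiometric resolution is commonly expressed in bits: an n-bit sensor distinguishes 2ⁿ discrete energy levels. A minimal illustrative sketch in Python (names are illustrative):

```python
def gray_levels(bits):
    """Number of distinguishable digital levels for an n-bit sensor."""
    return 2 ** bits

print(gray_levels(8))   # an 8-bit sensor records 256 levels
print(gray_levels(11))  # an 11-bit sensor records 2048 levels
```

The finer (higher-bit) sensor can therefore separate much smaller differences in reflected or emitted energy within the same dynamic range.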


    12. DIFFERENTIATE BETWEEN FAR RANGE AND NEAR RANGE IN RADAR USING REMOTE SENSING.

The near-field and far-field regions of an isolated source of electromagnetic radiation are terms generally used in antenna measurements; they describe regions around the source where different parts of the field are more or less important. The boundary between these two regions depends on the geometric dimensions of the source and on the dominant wavelength λ emitted by the source. In the near-field region of an antenna, the angular field distribution depends on the distance from the antenna. The parts of energy emitted by different geometric regions of the antenna have different travel times, so the resultant field cannot interfere constructively into an even wavefront.

    13. SHOW INCIDENCE ANGLE AND DEPRESSION ANGLE USING DIAGRAM.

    Angle of incidence

    The angle formed by a ray or wave, as of light or sound, striking a surface and a line

    perpendicular to the surface at the point of impact.

    Depression angle

    In aerial photography, the angle between the optical axis of an obliquely mounted air camera and

    the horizontal.

    14. MENTION THE OPERATING TEMPERATURES OF THREE PHOTON DETECTORS, WHICH ARE

    COMMON IN USE.

Type : Spectral range : Operating temperature

Mercury-doped germanium (Ge:Hg) : 3 – 14 µm : 35 K
Indium antimonide (InSb) : 3 – 5 µm : > 77 K
Mercury cadmium telluride (HgCdTe, MCT) : 8 – 14 µm : 77 K

    ================================================================================


UNIT - III (Part – A)

1. WHAT IS MEANT BY THE VISUAL IMAGE INTERPRETATION?

    Image interpretation is defined as the act of examining images to identify objects and judge their

    significance. An interpreter studies remotely sensed data and attempts through logical process to

    detect, identify, measure and evaluate the significance of environmental and cultural objects,

    patterns and spatial relationships. It is an information extraction process.

    2. WRITE ANY FOUR ELEMENTS OF VISUAL IMAGE INTERPRETATION PROCESS.

    1. Shape 2. Size 3. Tone 4. Texture 5. Pattern 6. Shadow 7. Location 8. Association

    3. WHAT ARE THE TWO DIFFERENT TYPES OF KEY USED IN VISUAL IMAGE INTERPRETATION?

    a. Selective key b. Elimination key

    4. WHAT ARE THE TWO IMAGE PRE-PROCESSING TASKS NEED TO BE PERFORMED BEFORE

    PROCESSING AN IMAGE?

    Before an interpreter undertakes the task of performing visual interpretation, two important issues

    should be addressed.

    The first is the definition of classification system or criteria to be used to separate the

    various categories of features occurring in the images.

    The second important issue is the selection of minimum mapping unit (MMU) to be applied

    on the image interpretation. MMU refers to the smallest size areal entity to be mapped as a

    discrete area.

    5. WHY THERE IS NEED OF IMAGE ENHANCEMENT?

    Low sensitivity of the detectors, weak signal of the objects present on the earth surface, similar

    reflectance of different objects and environmental conditions at the time of recording are the

    major causes of low contrast of the image. The main aim of digital enhancement is to amplify these

    slight differences for better clarity of the image scene. This means digital enhancement increases

    the separability (contrast) between the interested classes or features.

    6. WHAT IS THE DIFFERENCE BETWEEN SUPERVISED AND UNSUPERVISED CLASSIFICATION OF

    IMAGE?

    1. Supervised classification - The analyst identifies in the imagery homogeneous representative

    samples of the different surface cover types (information classes) of interest. These samples are

    referred to as training areas.


2. Unsupervised classification – It in essence reverses the supervised classification process. Spectral classes are grouped first, based solely on the numerical information in the data, and are then matched by the analyst to information classes (if possible).
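The training-area idea behind supervised classification can be sketched with a minimum-distance-to-means rule, a common supervised classifier; the class names and pixel values below are illustrative, not from the key:

```python
def class_means(training):
    """training: {class_name: [pixel vectors]} -> {class_name: mean vector}."""
    means = {}
    for name, pixels in training.items():
        n = len(pixels)
        means[name] = [sum(p[i] for p in pixels) / n for i in range(len(pixels[0]))]
    return means

def classify(pixel, means):
    """Assign a pixel to the class whose mean is nearest (squared distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(means, key=lambda name: dist2(pixel, means[name]))

# Analyst-chosen training areas: [red, NIR] samples per information class.
training = {
    "water":      [[10, 5], [12, 6], [11, 4]],
    "vegetation": [[30, 80], [28, 85], [32, 78]],
}
means = class_means(training)
print(classify([11, 5], means))   # -> water
print(classify([29, 82], means))  # -> vegetation
```

An unsupervised approach would instead cluster the pixel values first (e.g. by k-means) and leave the analyst to label the resulting spectral clusters afterwards.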

    7. WITH ONE EXAMPLE PLEASE EXPLAIN THE NEED OF IMAGE TRANSFORMATION?

All the transformations in image processing of remotely sensed data allow the generation of a new image based on arithmetic operations, mathematical statistics and Fourier transformations. The new image, or composite image, is derived by combining two or more bands, applying arithmetic to individual band data and/or applying mathematics to multiple band data. The resulting image may well have properties that make it more suited to a particular purpose than the original.

    Example:

The near-infrared and red bands of an image set are widely used to form a vegetation index, an attribute of vegetative cover related to the biomass and green leaf area index of the area covered by the image.
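This index is commonly computed as the Normalised Difference Vegetation Index, NDVI = (NIR − Red) / (NIR + Red). A minimal Python sketch with illustrative reflectance values:

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index for one pixel."""
    if nir + red == 0:
        return 0.0  # avoid division by zero over no-signal pixels
    return (nir - red) / (nir + red)

print(ndvi(0.50, 0.08))  # healthy vegetation: high NIR, low red -> near +1
print(ndvi(0.10, 0.09))  # bare soil / sparse cover -> near 0
```

Values near +1 indicate dense green vegetation; values near 0 or below indicate bare soil, built-up areas or water.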

    8. DEFINE SELECTIVE KEY USED IN VISUAL IMAGE PROCESSING.

    Selective keys are arranged in such a way that an interpreter simply selects the example that closely

    corresponds to the object the interpreter is trying to identify.

    Ex. Industries, landforms, etc.

    9. DEFINE ELIMINATIVE KEY USED IN VISUAL IMAGE PROCESSING.

    Elimination keys are arranged such that the interpreter follows a precise stepwise process that

    leads to the elimination of all items/targets, except the one that the interpreter is trying to identify.

    Ex. Agricultural studies and Forestry applications.

    10. DIFFERENTIATE BETWEEN GEOMETRIC CORRECTION AND RADIOMETRIC CORRECTION.

    Geometric Correction Methods

    Frequently information extracted from remotely sensed images is integrated with map data in a

geographical information system. The transformation of a remotely sensed image into a map with scale and projection properties is called geometric correction.

    Radiometric Correction Methods

    The primary function of remote sensing data quality evaluation is to monitor the performance of

    the sensors. The performance of the sensors is continuously monitored by applying radiometric

    correction models on digital Image data sets.


    11. WHAT IS BIAS IN DIGITAL IMAGE PROCESSING?

    If the sky is clear with no scattering, then the radiance reflected from the earth surface feature in

    any of the region of the electromagnetic spectrum should be the same. This is the ideal case. In

    reality, because of the presence of haze, fog, or atmospheric scattering, there always exists some

    kind of unwanted signal value called bias.

    The bias is the amount of offset for each spectral band.

    12. WHAT IS RANDOM NOISE IN DIGITAL DATA OF REMOTE SENSING?

    Image noise is any unwanted disturbance in image data that is due to limitations in the sensing and

    data recording process. The random noise problems in digital data are characterized by

    nonsystematic variations in gray levels from pixel to pixel called bit errors.

    ================================================================================

UNIT - IV (Part – A)

1. DEFINE GIS?

    GIS can be defined as a system which involves collecting/capturing, storing, processing,

    manipulating, analyzing, managing, retrieving and displaying data which is, essentially, referenced

    to the earth.

    2. WHAT ARE THE VARIOUS SOURCES FROM WHICH DATA CAN BE DERIVED TO BE USED FOR GIS?

    The data are usually derived from a combination of hard copy maps, aerial photographs, remotely

    sensed images, reports, survey documents, etc.

    3. MENTION THREE KINDS OF DATA REQUIRED IN GIS?

    a. Raster Data : Remotely Sensed Imagery, Aerial Photographs, Scanned images, etc

    b. Vector Data : Point, Line and Polygon

c. Attribute Data : Also called non-spatial data, aspatial data or tabular data

    5. WRITE TYPES OF BUFFERING IN GIS.

    Point Feature Buffering : Round

    Line and Polygon Feature Buffering : Round and Flat

    6. WRITE ABOUT NEIGHBORHOOD FUNCTIONS IN GIS.

Neighborhood functions analyze the relationship between an object and similar surrounding objects. For example, in a certain area, this function can be used to analyze what kinds of land use lie next to a given kind of land use. This type of analysis is often used in image processing. A


    new map is created by computing the value assigned to a location as a function of the independent

    values surrounding that location. Neighborhood functions are particularly valuable in evaluating the

    character of a local area.

    7. WRITE ABOUT MAP OVERLAY IN GIS.

    The combination of several spatial datasets (points, lines or polygons) creates a new output vector

    dataset, visually similar to stacking several maps of the same region.

    8. WRITE ABOUT FEATURE IDENTIFIER IN GIS.

    Point Data: these are single geometric positions. Spatial locations of points are given by their

    coordinates (x, y). Features such as wells, buildings, survey control points, monuments and

    mines etc

Line and String Data: These are obtained by connecting points. A line connects two points and a string connects two or more lines. These are formed by features such as highways, railways, canals, rivers, pipelines, power lines etc.

    9. WHAT ARE FILTERS IN GIS?

    The Filter is a tool which can be used to either eliminate spurious data or enhance features

    otherwise not visibly apparent in the data. Filters essentially create output values by a moving,

    overlapping 3x3 cell neighborhood window that scans through the input raster.

    There are two types of filters available in the tool: low pass and high pass.
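The moving 3×3 window described above can be sketched as a mean filter, the low-pass case; the edge handling and names below are illustrative:

```python
def low_pass_3x3(grid):
    """Apply a 3x3 mean filter to a 2-D list of numbers.

    Each interior output cell is the mean of its overlapping 3x3
    neighbourhood; edge cells keep their original values for brevity.
    """
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]  # copy; edges stay unchanged
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = [grid[r + dr][c + dc]
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
            out[r][c] = sum(window) / 9.0
    return out

raster = [
    [1, 1, 1],
    [1, 10, 1],  # a spurious spike in the data
    [1, 1, 1],
]
print(low_pass_3x3(raster)[1][1])  # spike smoothed to 2.0
```

A high-pass filter would instead subtract the local mean from each cell, enhancing edges and local detail rather than suppressing them.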

    10. WHAT IS RECLASSIFICATION PROCESS IN GIS?

    Reclassification operations merely repackage existing information on a single map.
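Such a repackaging can be sketched as a simple lookup over the cells of a raster; the class codes and names below are illustrative:

```python
# Illustrative reclassification table: detailed land-use codes are
# repackaged into broader categories; no new measurement is involved.
RECLASS = {1: "urban", 2: "urban", 3: "agriculture", 4: "forest", 5: "forest"}

def reclassify(grid, table):
    """Map every cell value of a 2-D list through a reclassification table."""
    return [[table[v] for v in row] for row in grid]

landuse = [[1, 3], [5, 2]]
print(reclassify(landuse, RECLASS))
# [['urban', 'agriculture'], ['forest', 'urban']]
```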

11. DIFFERENTIATE RASTER AND VECTOR DATA?

1. Vector: Represented by point, line and polygon. Raster: Point, line and polygon are all represented in the form of pixels.

2. Vector: Relatively small file size (small data volume). Raster: Large file size.

3. Vector: Excellent representation of networks. Raster: Networks are not so well represented.

4. Vector: A large number of attributes can be attached, hence more information intensive. Raster: Only one pixel value represents each grid cell.

5. Vector: Features are more detailed and accurate. Raster: Generalization of features (like boundaries), hence accuracy may decrease.

6. Vector: Assigning projections and transformations takes less time and consumes less memory. Raster: Coordinate-system transformations take more time and consume a lot of memory.


    12. WRITE TWO ADVANTAGES OF GIS OVER OTHER METHODS.

    (a) Information can be stored, manipulated and retrieved with the help of computer and software

    within no time, which is the essence of GIS.

    (b) It removes the need of paper plans and associated documents and speeds up the production of

    information in the form of maps, tables, etc by rapidly updating and editing the data in computers.

    13. DIFFERENTIATE SPATIAL AND NON-SPATIAL DATA.

Spatial Data (graphical data): Consists of natural and cultural features that can be shown with lines or symbols on maps, or that can be seen as images or photographs.

Non-Spatial Data (Attribute Data): Describes geographic regions or defines characteristics of spatial features within geographic regions. These data are usually alphanumeric and provide information such as colour, texture, quantity and quality.

    ================================================================================

UNIT - V (Part – A)

1. MENTION FOUR MAJOR APPLICATIONS OF REMOTE SENSING IN CIVIL ENGINEERING.

    In civil engineering projects, RS and GIS techniques can become potential and indispensable tools.

    Various civil engineering application areas include

    terrain mapping and analysis,

    water resources engineering,

    town planning and urban infrastructure development,

    transportation network analysis,

    landslide analysis, etc.

2. WHAT ARE IMAGING AND NON-IMAGING SENSORS?

A sensor classified as passive, non-scanning and non-imaging is a type of profile recorder, for example a microwave radiometer.

A sensor classified as passive, non-scanning and imaging is a camera, such as an aerial survey camera or a space camera.

    3. DIFFERENTIATE BETWEEN BLACK BODY AND GRAY BODY?

    Black body

    A blackbody allows all incident radiation to pass into it (no reflected energy) and internally absorbs

    all the incident radiation (no energy transmitted through the body). This is true of radiation for all


    wavelengths and for all angles of incidence. Hence the blackbody is a perfect absorber for all

    incident radiation.

    Gray body

A body that emits radiant energy with the same relative spectral energy distribution as a blackbody at the same temperature, but in a smaller amount.

    4. WHAT ARE THE COMPONENT SUBSYSTEMS OF GIS?

    Data Input Subsystem

    Data Storage, Editing and Retrieval Subsystem

    Data Manipulation and Analysis Subsystem

    Data Output and Display Subsystem

    5. WHAT ARE THE FUNCTIONALITIES OF GIS?

    Capture

    Transfer

    Validate and edit

    Store and structure

    Restructure

    Generalize

    Transform

    Query

    Analyze

    Present

    6. WHAT IS MMU IN RASTER DATA MODEL?

    The linear dimension of each cell defines the spatial resolution of data or the precision with which

    the data is presented. Thus, the size of an individual pixel or cell is determined by the size of the

    smallest object in the geographic space to be represented. The size is also known as Minimum

    Mapping Unit (MMU).

    7. WHAT IS MEANT BY THE SPATIAL RESOLUTION?

    Spatial Resolution : Defined by area or dimension of each cell

    Spatial Resolution : (cell height) X (cell width)

    High resolution : cell represents small area

    Low resolution : cell represents larger area


    8. WHAT IS FCC?

    FCC = False Colour Composite

    FCC refers to a group of color rendering methods used to display images in color which were

    recorded in the visible or non-visible parts of the electromagnetic spectrum. A false-color image is

    an image that depicts an object in colors that differ from those a photograph (a "true-color" image)

    would show.

    9. DRAW DIAGRAMS FOR ANY TWO SURFACE SCATTERINGS.

    10. DISCRIMINATE BETWEEN BLACK BODY AND WHITE BODY.

    A black body is an idealized physical body that absorbs all incident electromagnetic radiation,

    regardless of frequency or angle of incidence.

A white body is one with a "rough surface [that] reflects all incident rays completely and uniformly in all directions".

    ================================================================================
