Mission Critical: Perception and Cognition


VOLUME 3 NO. 4, November 2013. AUVSI, 2700 S. Quincy St., Suite 400, Arlington, VA 22206, USA

    Inside This Issue:

    Getting a Feel for Haptics

    Can You Pass the Turing Test?

    Robots That Sense More Like Humans

    Perception and Cognition



    CONTENTS

On the Cover: Kinova Research's Jaco robotic arm leverages haptic technology to be able to grasp objects as delicate as an egg. Photo courtesy Kinova Research. Page 12.

19 Q & A

    Katsu Yamane, Disney Research

6 Brain in a Box: European Project Kicks Off With Ambitious Goal: Understand and Replicate the Human Brain

4 Essential Components: Perception and Cognition News

    10 State of the Art

Thinking About Perception and Cognition Around the World

16 Technology Gap: From the Mouths of Bots: Natural Language Learning in AI


Mission Critical is published four times a year as an official publication of the Association for Unmanned Vehicle Systems International. Contents of the articles are the sole opinions of the authors and do not necessarily express the policies or opinion of the publisher, editor, AUVSI or any entity of the U.S. government. Materials may not be reproduced without written permission. All advertising will be subject to publisher's approval and advertisers will agree to indemnify and relieve publisher of loss or claims resulting from advertising contents. Annual subscription and back issue/reprint requests may be addressed to AUVSI.

12 Do You Feel Like I Do? Robots Leverage Haptics for a More Human Touch

20 Timeline: Artificial Intelligence: A Timeline

22 Uncanny Valley: The Turing Test

24 Testing, Testing: Getting Robots to Perceive More Like Humans

26 Spotlight: How Susceptible Are Jobs to Automation?

29 End Users: Ghostwriter: Algorithms Write Books


This issue of Mission Critical tackles some big issues in the world of robotics: How do robots and unmanned systems perceive the world around them? And, having perceived that, how do they respond?

If you take a look at the Timeline story beginning on Page 20, you can see that the modern idea of artificial intelligence has taken several twists and turns since researchers first began thinking about how machines think.

We spend a little time in this issue talking about the Turing Test, posited by Alan Turing, which concludes that if a machine can fool a human into thinking it's not a machine, then it's intelligent.

We also spend some time looking at how researchers have moved beyond that idea, instead pushing toward massive data sets that computers can sift through to, say, beat Garry Kasparov at chess or Ken Jennings at Jeopardy! As Ken Ford, CEO of the Institute for Human and Machine Cognition, says on Page 23, people didn't learn how to fly until they quit trying to fly like birds and instead discovered the laws of aerodynamics.

Perception is a key part of this issue. Beginning on Page 12, writer Rich Tuttle takes a look at haptics, or the science of touch, and how it can lead to robots that are better equipped to navigate the world around them. Beginning on Page 24, we also take a look at how knowledge databases can help robots perceive the world around them by giving them an idea of what to expect when they roll into an office or classroom. That idea continues in the Q & A on Page 19, where a Disney research institute uses a similar idea to help robots interact with people.

The idea of replicating the way humans think hasn't gone away, however. It has just gotten more sophisticated. Beginning on Page 6, we take a look at a major new European initiative to replicate a human brain inside a supercomputer and then figure out how it works. That could lead to better ways to fight brain disease as well as the creation of new types of computers and robots that could be more intelligent. The United States also has kicked off a new research project to help understand the brain.

So, in the near term, computers, and by extension some robots, will probably think along the lines of Deep Blue or Watson, the IBM supercomputers that rely on massive databases. In the future, however, they might think more like humans. Rather than being programmed, they can learn, and they can do that more efficiently.

Coupled with new sensors, such as fingertips that can actually feel, mobile robots of the future could be useful in ways we can only imagine today.

Editor's Message

    Editorial

Vice President of Communications and Publications, Editor
Brett Davis, [email protected]

Managing Editor
Danielle Lucey, [email protected]

Contributing Writers
Rich Tuttle
Ashley Addington

    Advertising

Senior Business Development Manager
Mike [email protected]
+1 571 255 7787

    A publication of

    President and CEO

    Michael Toscano

Executive Vice President
Gretchen West

AUVSI Headquarters
2700 S. Quincy St., Suite 400
Arlington, VA 22206 USA
+1 703 845 9671
[email protected]

    Brett Davis

How Can We Have Robots Think Like Us? And Do We Really Want Them To?


Brain Scans Digitally Remastered in MRI

Virginia Tech's Carilion Research Institute has found a way to make better use of brain scans with the help of computer imaging.

Researchers are using real-time functional magnetic resonance imaging, which noninvasively measures dimensions of activity in the brain and allows real-time thought to be immediately transformed into action. By using this technology, the hope is to better treat a variety of brain disorders with simple mind-reading capabilities.

"Our brains control overt actions that allow us to interact directly with our environments, whether by swinging an arm or singing an aria. Covert mental activities, on the other hand, such as visual imagery, inner language or recollections of the past, can't be observed by others and don't necessarily translate into action in the outside world," Stephen LaConte, an assistant professor at the Virginia Tech Carilion Research Institute, said in a press release.

In the study, scientists used whole-brain imaging to observe how subjects think when given a particular command. They found that subjects who were in more control of their thoughts produced better brain scans than those who simply let their minds wander.

"When people undergo real-time brain scans and get feedback on their own brain activity patterns, they can devise ways to exert greater control of their mental processes," said LaConte. "This, in turn, gives them the opportunity to aid in their own healing. We want to use this effect to find better ways to treat brain injuries and psychiatric and neurological disorders."

The Key to Teaching Computers to See? Thinking Like a Computer

Researchers at the Massachusetts Institute of Technology have discovered that the key to successful object recognition is to redo the recognition algorithms so they allow researchers to view the software from the computer's perspective.

This new process lets the objects being processed and identified be translated into a mathematical view, then translated back again into an image for recognition.

This allows researchers to better understand why recognition software only has a success rate of 30 to 40 percent. With the goal of creating a smaller error margin, computer recognition software can then continue to make breakthroughs in the realm of artificial intelligence.

The main object detection feature used in this research is called HOG, for histogram of oriented gradients. HOG breaks an image into several pieces and then identifies the gradients within each separated piece. The software summarizes the strength and orientation of those gradients and labels each piece accordingly.

"This feature space, HOG, is very complex," says Carl Vondrick, an MIT graduate student in electrical engineering and computer science. "A bunch of researchers sat down and tried to engineer, 'What's the best feature space we can have?' It's very high dimensional. It's almost impossible for a human to comprehend intuitively what's going on. So what we've done is built a way to visualize this space."

The MIT researchers hope the visualization will be a great research tool for better understanding how algorithms and software recognition intertwine, and that it will improve students' research experience.
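For readers who want to see the mechanics, the gradient-histogram idea can be sketched in a few lines of Python. This is a simplified illustration of the general technique only, assuming numpy is installed; it is not MIT's software or the reference HOG implementation.

```python
import numpy as np

def hog_descriptor(image, cell=8, bins=9):
    """Compute a simplified HOG-style feature vector for a grayscale image.

    The image is broken into cell x cell pieces; within each piece the
    gradient orientations are accumulated into a small histogram,
    weighted by gradient strength.
    """
    gy, gx = np.gradient(image.astype(float))     # intensity gradients
    magnitude = np.hypot(gx, gy)
    # Fold orientation into [0, 180) degrees, as standard HOG does.
    orientation = (np.degrees(np.arctan2(gy, gx)) + 180.0) % 180.0

    h, w = image.shape
    histograms = []
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            mag = magnitude[y:y + cell, x:x + cell].ravel()
            ori = orientation[y:y + cell, x:x + cell].ravel()
            hist, _ = np.histogram(ori, bins=bins, range=(0, 180), weights=mag)
            # Normalize so lighting and contrast changes do not dominate.
            histograms.append(hist / (np.linalg.norm(hist) + 1e-6))
    return np.concatenate(histograms)

# Example: a 32x32 synthetic image with one vertical edge.
img = np.zeros((32, 32))
img[:, 16:] = 1.0
print(hog_descriptor(img).shape)  # (144,) = 16 cells x 9 orientation bins
```

The resulting vector is exactly the kind of high-dimensional feature space Vondrick describes, which is why a visualization tool is needed to reason about it.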

Stephen LaConte (right) and members of his lab. Photo courtesy Virginia Tech.

Click on this QR code to see a video presentation of this research: http://bit.ly/16KIoyw

    Essential Components

    Illustration courtesy Christine Daniloff/MIT.

    Center for Brains, Minds and Machines Founded

The National Science Foundation has funded a new research center at the Massachusetts Institute of Technology that will study artificial intelligence. The Center for Brains, Minds and Machines is an interdisciplinary study center that will focus on how the human brain can be replicated in machines.

The center will be a multi-institution collaboration, with professors from MIT, Harvard and Cornell, to name a few. The center will also have industry partners, such as Google, IBM, Boston Dynamics, Willow Garage and Rethink Robotics. Research will focus on how the human body can be further integrated into computers, with topics revolving around vision, language and motor skills; circuits for intelligence; the development of intelligence in children; and social intelligence.

"Those thrusts really do fit together, in the sense that they cover what we think are the biggest challenges facing us when we try to develop a computational understanding of what intelligence is all about," says Patrick Winston, the Ford Foundation Professor of Engineering at MIT and research coordinator for CBMM.

With the research interests being so closely linked together, the likelihood of progress is higher, according to the researchers. All of the senses work together in the body for it to be able to understand and grasp its surroundings.

The idea for the new center was launched on MIT's 150th anniversary in 2011. The monetary donation going to the center will be given out over the next five years.

"We know much more than we did before about biological brains and how they produce intelligent behavior. We're now at the point where we can start applying that understanding from neuroscience, cognitive science and computer science to the design of intelligent machines," says Tomaso Poggio, the Eugene McDermott Professor of Brain Sciences and Human Behavior at MIT.

The center will also play a key role in the new BRAIN Initiative, an effort by federal agencies and private partners to better understand how the brain works. For more on that effort, see the story beginning on Page 6.

The Human Brain Easily Tricked by Artificial Finger

A recently published study has found that the brain does not need multiple sensors to decide whether an artificial finger belongs to the body. This finding is a major breakthrough for neuroscience research.

In an experiment conducted by Neuroscience Research Australia, participants held an artificial finger with their left hand that was placed above their right index fingers.

The participants were put under an anesthetic and vision was eliminated, so the hand went numb and feeling in the joints was removed. When the patients were able to look and both fingers were moved simultaneously, the body instantly believed that the artificial finger was its own.

"Grasping the artificial finger induces a sensation in some subjects that their hands are level with one another, despite being 12 centimeters apart," Prof. Simon Gandevia, deputy director of NeuRA, said in a press release. "This illusion demonstrates that our brain is a thoughtful, yet at times gullible, decision maker. It uses available sensory information and memories of past experiences to decide what scenario is most likely."

This finding gives a brand new understanding of how the brain identifies its body. Unlike past experiments, where the main focus was on the brain's association with the main five senses, this experiment proved that muscle receptors are a key component in communication with the brain.


BRAIN IN A BOX
European Project Kicks Off With Ambitious Goal: Understand and Replicate the Human Brain
By Brett Davis

An image of a neuron cluster, which the Human Brain Project hopes to understand and replicate. Image courtesy HBP.


A new European Commission initiative that kicked off in October seeks to unravel one of the greatest challenges facing science: simulate the human brain to be able to understand how it works and replicate it.

The results could help develop new computing technologies that would finally allow computers and robotic systems to have brain-like intelligence, meaning they could learn and think much the way we do.

"What we are proposing is to establish a radically new foundation to explore and understand the brain, its diseases and to use that knowledge to build new computer technologies," says Henry Markram, a professor at École Polytechnique Fédérale de Lausanne and coordinator of the project, in a project video.

The Human Brain Project is part of a new initiative launched as part of the EC's Future and Emerging Technologies initiative. It will involve thousands of researchers from more than 130 research institutes and universities around the world. It's intended to be a decade-long effort. The ramp-up phase, which just kicked off in October and runs through March 2016, has been funded at 54 million euros; the overall effort has been earmarked about 1 billion euros by the European Commission.

Put simply, the main goal is to create an artificial replica of a human brain in a supercomputer. The project consists of three broad areas: neuroscience, aimed at understanding how the brain works; medicine, aimed at battling the diseases that can affect it; and computing, aimed at creating electronic versions of brain processes.

"It's an infrastructure to be able to build and simulate the human brain, objectively classify brain diseases and build radically new computing devices," Markram says.

The human brain is able to perform computations that modern computers still can't, all while consuming the same energy as a light bulb.

"One of the Human Brain Project's most important goals is to develop a completely new category of neuromorphic computing systems," says a project video. "Chips, devices and systems directly inspired by detailed models of the human brain."

As Karlheinz Meier, codirector of the project's neuromorphic computing effort, puts it in one project video, "What we build is physical models of human circuits on silicon substrates."

Such systems "will transform industry, transportation systems, health care and our daily lives," the video says.

The event kicked off at a conference held 6 to 11 Oct. at the campus of Switzerland's EPFL. The plan is to launch six research platforms and test them for the next 30 months.

The platforms will be dedicated to neuroinformatics, brain simulation, high-performance computing, medical informatics, neuromorphic computing and neurorobotics.

Beginning in 2016, the platforms are to be available for use by both Human Brain Project scientists and other researchers around the world. The resources will be available on a competitive basis, similar to the way astronomers compete to use large telescopes.

13 Areas

The three main focus areas are further divided into 13 sub-areas, which include neuromorphic computing and neurorobotics.

The computing effort will be to develop the Neuromorphic Computing Platform, a supercomputer system that will run brain model emulations. The system will consist of two computing systems, one in Heidelberg, Germany, and one in Manchester in England.

Platform users will be able to study network implementations of their choice, including simplified versions of brain models developed on the Brain Simulation Platform or generic circuit models based on theoretical work, says the project's website.

The neurorobotics effort is led by the Technische Universität München, or Technical University of Munich, and EPFL, along with Spain's University of Granada.

It will provide a platform for taking brain models and plugging them into a high-fidelity simulator that includes a simulated robot. Researchers can take behaviors based on the human brain model, apply them to the robot, and see if they work and what happens in the brain model when they are carried out.

"We consider this a starting point for a completely new development in robotics," says Alois Knoll of Munich. "It will be much more powerful than anything we have had before in robotics simulation."

The computing part of the project won't try to develop classical artificial intelligence.

"The challenge in artificial intelligence is to design algorithms that can produce intelligent behavior and to use them to build intelligent machines," the project website says.


It doesn't matter if the algorithms are realistic in a biological sense, as long as they work. The brain project, however, wants to create processors that actually work like the human brain.

"We will develop brain models with learning rules that are as close as possible to the actual rules used by the brain and couple our models to virtual robots that interact with virtual environments. In other words, our models will learn the same way the brain learns. Our hope is that they will develop the same kind of intelligent behavior," the project says on its website. "We know that the brain's strategy works. So we expect that a model based on the same strategy will be much more powerful than anything AI has produced with invented algorithms."

The resulting computer systems would be different from today's computers in that they won't need to be programmed, but instead can learn. Where current computers use stored programs and storage areas that contain precise representations of specific bits of information, the system the project hopes to create will rely on artificial neurons modeled after human ones, with all their built-in capabilities and weaknesses.

"Their individual processing elements, artificial neurons, will be far simpler and faster than the processors we find in current computers. But like neurons in the brain, they will also be far less accurate and reliable. So the HBP will develop new techniques of stochastic computing that turn this apparent weakness into a strength, making it possible to build very fast computers with very low power consumption, even with components that are individually unreliable and only moderately precise," the website says.

Ultimately, such systems could be available for daily use, according to project researchers. They could be standalone computers, integrated into other systems, even brains for robots.

    Competition

European researchers aren't the only ones interested in delving into the mysteries of the human brain. The White House has announced a somewhat similar effort.

Earlier this year, the White House announced the Brain Research through Advancing Innovative Neurotechnologies, or BRAIN, Initiative, which also seeks to replicate brain structures and functions.

Such cutting-edge capabilities, applied to both simple and complex systems, "will open new doors to understanding how brain function is linked to human behavior and learning and the mechanisms of brain disease," says the White House news blog.

The effort is launching with more than $100 million in funding for research supported by the National Institutes of Health, DARPA and the National Science Foundation. Foundations and private research institutions are also taking part, including the Allen Institute for Brain Science, which plans to spend about $60 million a year on projects related to the initiative, and the Kavli Foundation, which plans to spend $4 million a year over the next decade, according to the White House.

NIH has announced the initial nine areas of its research, which will be funded at $40 million in 2014. They include generating a census of brain cell types, creating structural maps of the brain and developing large-scale neural network recording capabilities.

DARPA plans to allocate $50 million to the work in 2014, mainly with an eye toward creating new information processing systems and mechanisms that could help warfighters suffering from posttraumatic stress, brain injury and memory loss. The NSF plans to spend $20 million to work toward molecular-scale probes that can sense and record the activity of neural networks; help make advances in systems to analyze the huge amounts of data that brain research can create; and understand how thoughts, emotions, memories and actions are represented in the brain.

The Allen Institute, founded in 2003 by Microsoft cofounder Paul Allen, has launched a 10-year initiative to understand neural coding, or a study of how information is coded and decoded in the mammalian brain, according to the White House. It is also a formal partner in the European Human Brain Project.

Brett Davis is editor of Mission Critical.

An image of the brain, still poorly understood. Image courtesy HBP.


    An infographic of the BRAIN Initiative. Image courtesy the White House.


Facebook has set up a team to develop applications based on deep learning, the same technique Google's software uses, to improve the news feeds of its users by giving them more relevant content and better-targeted advertising.

    CALIFORNIA

Google researchers have made a breakthrough for recognition software on mobile and desktop computers. The machine vision technique can recognize more than 100,000 different types of objects in photos in a matter of minutes.

    UNITED KINGDOM

Robotic seals have been shown to help increase the quality of life and cognitive activity of patients with dementia in a new U.K. clinical study. Paro, a robotic harp seal made in Japan, interacts with patients with artificial intelligence software and sensors that allow it to move, respond to touch and sound and display different emotions.

    ITALY

Smart homes are now becoming realities with new technologies capable of making everyday tasks new and futuristic. Companies like Italy's Hi Interiors are creating concepts to wildly change every aspect of a home, such as the HiCan, or high-fidelity canopy, a bed that is built with portable blinds, Wi-Fi, an entertainment system and a projector screen that emerges from the foot of the bed.

SPAIN, PORTUGAL

A new, smarter way to water golf courses has arisen in Spain and Portugal. The EU WaterGolf project intends to save water and find a smarter way to keep the playing greens green. New wireless technology laced throughout a course will suggest parameters of irrigation with 3-D mapping, drainage and weather forecasts.

THINKING ABOUT PERCEPTION AND COGNITION AROUND THE WORLD

EVERYWHERE

Researchers around the world are finding ways to create machines that can better sense their environment and react to the people around them, ranging from robotic harp seals that aid elderly patients to golf courses that know when they need to be watered.


HUNGARY

The Hungarian Academy of Sciences and Eötvös Loránd University have found that dogs interact better with robots when the robots are being socially active towards them. PeopleBot, a human-sized robot, got along better with canines when it behaved the way a human would behave.

ISRAEL

Scientists who work for the Centers for Disease Control and Prevention are finding ways to use artificial intelligence to help prevent the next global flu outbreak. A branch of artificial intelligence researchers, including ones from Tel Aviv University, is composing algorithms based off of past outbreak data to recognize key properties of dangerous new flu strains.

    JAPAN

Epsilon, a Japanese rocket that relies heavily on artificial intelligence to do the final safety checks before takeoff, recently did that and then launched into space. Using the new software allowed the rocket to take off with only eight people at the launch site instead of the usual 150.

    GERMANY

Bielefeld University has begun to analyze the body language of customers in bars with the new robotic bartender James, for Joint Action in Multimodal Embodied Systems. James is capable of making eye contact with customers and receiving drink orders as well as delivering them with his one arm and four fingers. No word yet on whether James is a good listener or if he will cut off customers who have had too many.

    INDIA

IPsoft has created a humanoid robot capable of answering 67,000 phone calls and 100,000 emails every day. She handles the office's dirty work, is capable of solving IT diagnostic problems and can be seen and interacted with on a customer's computer screen.

State of the Art


Do You Feel Like I Do? Robots Leverage Haptics for a More Human Touch
By Rich Tuttle

Haptics and robots were made for each other. Haptics, the science of touch, allows robots to feel as well as see, making them more effective at jobs they do today, like some kinds of surgery, and potentially able to do things they don't do today, like aerial refueling.

"To control a remote system, whether it's in space or just next door, you'd like to be able to interact with things in the same way you interact with things in the real world, so in that sense haptics is a very well suited technology for robotic telemanipulation," says Jason Wheeler, head of Sandia National Laboratories' Cybernetics group.

One way to interact is with tactile sensors. They feel what a robot finger, for instance, is touching and prompt the robot, or its human operator, to react accordingly.

Kinova Research's Jaco robotic arm, which the company is fitting to wheelchairs to assist those with mobility problems. Photo courtesy the company.


But such sensors either haven't existed until recently or have been too costly to be embedded in the robots that researchers have been working with so far, says Mark Claffee, principal robotics engineer at iRobot. Now, with sensor technology advancing, and with computers becoming more capable and cheaper, iRobot and others are closing in on robots that, at least on some level, understand how to adjust themselves to better interact with objects, Claffee says. And they're going to do that through tactile sensing and through haptic-type feedback, either to themselves or to a human operator.

    Haptic Challenge

One big program taking advantage of such technology is DARPA's Robotics Challenge, or DRC, which aims to develop robots that can help victims of natural or man-made disasters. Among other things, the DRC robots will have to be dexterous, which implies an ability to feel.

IRobot and Sandia have supplied robotic hands to teams involved in the DRC; the teams are also getting Atlas robots from Boston Dynamics. The hands and other systems will compete on these robots in a series of tasks in December at the Homestead-Miami Speedway. Other teams that have been developing robots from scratch also will compete.

IRobot's hand has three fingers and Sandia's has four, but both feature a skin with embedded tactile sensors. Sandia's has fingerprints to help in gripping, plus fingernails; "that's how you can pick up a flat key from a surface," says George "Sandy" Sanzero, manager of Sandia's Intelligent Systems, Robotics and Cybernetics Department.

IRobot and Sandia developed the hands for another DARPA program, Autonomous Robot Manipulation-Hardware, or ARM-H. When that program ended a couple of years ago, DARPA decided to use these hands for the DRC, says Sandia's Wheeler. iRobot's Claffee says hands are the critical interface between the robot system and the world around it.

He says a robot in the competition typically won't relay information to a human operator, but rather "understand where it has touched an object and how hard it is touching it at [a specific] point on the hand," and use its own software to understand what the right grasping strategy is. That would be too much information to try to relay to a human operator, he says. "Our vision in terms of haptics and tactile sensing is let the robots understand the sensory input that's coming in to them and make decisions for themselves on how to adjust their grasp, or how to move their fingers to get a more stable grasp on the object."

    Researching Haptics

Canada's Kinova Research is linking tactile sensors and robotics in another way: to help those in wheelchairs. Its robotic manipulator arms, fixed to wheelchairs, increase the mobility of people with upper spinal cord injuries, for example. Fitted with new tactile sensors, high-tech manipulators like Kinova's Jaco and Mico will be even more effective, says Francis Boucher, chief business development officer of the Montreal company.

The sensor is the product of work by Kinova and the University of Quebec's École de Technologie Supérieure. Prof. Vincent Duchaine of ETS says it's probably the most sensitive tactile sensor. "It can feel everything from a gentle breath to very high forces, so it has a very wide range."

Carnegie Mellon University in Pittsburgh is using Kinova's Mico in a program for several national agencies called Smart and Connected Health. The idea, according to a recent government announcement, is to spur next-generation health and healthcare research through "high-risk, high-reward advances in the understanding of and applications in information science, technology, behavior, cognition, sensors, robotics, bioimaging and engineering."

"Mico is a beautiful piece of technology," says Sidd Srinivasa, associate professor of robotics and director of the Personal Robotics Lab at Carnegie Mellon. "We're going to be building cutting-edge technology for this robot arm that will hopefully very soon reach real people who need this."

He also recognizes that while humans take for granted their ability to touch and feel, it's incredibly hard for robots, because they don't have anywhere close to the resolution or fidelity of haptic sensing that humans do.

The interaction between mind and fingertips in a human is a wonderful and, at present, not-duplicable feat, says Frank Tobe of The Robot Report.

Henrik I. Christensen, KUKA Chair of Robotics at Georgia Tech, illustrated this interaction in an experiment. He showed that it takes a person about five seconds to strike a match. But with fingertip anesthesia, it takes about 25 seconds and a lot of fumbling. "There's no haptic feedback," Christensen said in a recent TED presentation. "You have your regular muscle control and everything else. I've just taken away your fingertip feeling. We're incredibly [reliant] on this fingertip sensing."

But Srinivasa says human-like fingertip feeling for robots isn't likely to be developed soon. This means that while it's important to continue to develop better haptic technology, it's also important to come up with ways to more effectively compensate for robots' relative lack of touch. Pressure sensors, for instance, could help fill the gap.

Srinivasa hypothesizes that humans use this very technique. He says we infer forces through deformation. In other words, "when I fold a piece of paper, if I press too hard, it deforms more than if I press less, and that perceptual channel, deformation channel, is acting as a proxy for the haptic channel. It's the same thing when you're operating on squishy stuff. If the stuff squishes, then you know that you're exerting some amount of force." He says humans "are really good at compensating for missing channels with other channels," and the lab is trying to get robots to do the same thing.

Tactile sensors are one way to close the haptic loop. Another is to put sensors on a robot's human operator. Cambridge Research and Development of Boston has developed a linear actuator called Neo that's about the size of a watch and that a person can wear on a headband or armband. There's no vibration or force feedback in remote surgery or other uses, and adaptation takes mere minutes as the brain rapidly associates the Neo's pressure application with the sense of touch, according to the company. Cambridge CEO Ken Steinberg says surgeons and others can operate "[a] robot freely and they can feel whatever the robot's feeling."

Steinberg sees all kinds of applications, including aerial refueling. Today, he says, a refueling boom is guided visually by an operator from a tanker to the plane being refueled. With Cambridge's technology, a boom operator would have feeler gauges to make a more deft contact with the other plane. "It's not purely a visual experience," he says. "It's also a haptic feedback experience." The same technique might be even more appealing if both the tanker and the plane being refueled were robots, Steinberg says. The operator in that case could be on the ground.

Rich Tuttle is a longtime aerospace and defense journalist and contributor to AUVSI's Mission Critical and Unmanned Systems magazines.

The back of DARPA's Atlas robot, which will use haptics research to help develop robots that could aid in the wake of disasters. Photo courtesy DARPA.




From the Mouths of Bots: Natural Language Learning in AI

IBM's Watson can beat out any human competitor on Jeopardy! It can mine patient data to help doctors make more accurate diagnoses. It can even analyze thousands of pages of financial data published every day to make more informed investment choices. But does Watson need a teenager to help translate phrases like OMG?

Natural language learning is a big challenge in artificial intelligence, though it has seen some success with applications like speech-to-text typing programs. Researchers at IBM wanted to take that a step further with Watson, introducing it to slang phrases. And the initial results weren't quite what researchers bargained for.

Eric Brown, the researcher in charge of the project, introduced Watson to Urban Dictionary, a wiki-style online lexicon of common, and oftentimes not so common, vernacular inputted by visitors to the site. The intention was to make Watson understand that OMG meant "Oh, my God," and that "hot mess" doesn't mean there was an accident in the kitchen.

The actual result was that Watson accidentally picked up a potty mouth. "Watson couldn't distinguish between polite language and profanity, which the Urban Dictionary is full of," said Brown in an interview with Fortune magazine. Watson picked up some bad habits from reading Wikipedia as well. In tests, it even used the word "bulls---" in an answer to a researcher's query.

After that incident, the 35-person team working on the project had to construct a filter to wash Watson's proverbial mouth out with soap. They also scrapped Urban Dictionary from its memory entirely.

Natural Language at the Press of a Button

Arguably the most common natural language computing available to technology consumers today is Apple's Siri personal assistant.

The origins of Siri come from an artificial intelligence project by SRI International, developed for DARPA, that Apple bought in 2010 for $200 million. But the actual voice of Siri was recently revealed to be Susan Bennett, a voice-over actress living in Atlanta, Ga. Apple has not confirmed this; however, audio forensics experts have verified the voice is hers. For four hours a day in July 2005, she read phrases documenting the English language's myriad vowel sounds, consonants, blends, diphthongs and glottal stops. However, simply recording a person's voice and playing it back is not how Siri works.

The first step is understanding the user's command. When you press the home button on an iPhone and speak, your voice gets digitally coded and the signal gets relayed through a cell tower to a cloud server that has speech models ready to analyze the noise. The phone also performs a local speech evaluation to identify whether the command can be handled by the phone itself, like cueing up a song stored on the device, and if that's the case, it informs the cloud that the signal is no longer needed.

Then a server compares the noises in the speech pattern to its series of known human language sounds, and runs the result through a language model to estimate words. It then determines what the most probable commands might mean.
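A toy sketch of that two-stage scoring might look like the following Python, where an acoustic score for each candidate word is combined with a bigram language-model score for the word sequence. The vocabulary and probabilities here are invented purely for illustration; production systems like Siri use vastly larger statistical models.

```python
import math

# Hypothetical acoustic scores: P(observed sounds | word) per time slot.
acoustic = [
    {"play": 0.6, "pray": 0.4},
    {"some": 0.5, "sum": 0.5},
    {"music": 0.9, "musing": 0.1},
]

# Hypothetical bigram language model: P(word | previous word).
bigram = {
    ("<s>", "play"): 0.3, ("<s>", "pray"): 0.05,
    ("play", "some"): 0.4, ("pray", "some"): 0.1,
    ("play", "sum"): 0.05, ("pray", "sum"): 0.05,
    ("some", "music"): 0.5, ("sum", "music"): 0.05,
    ("some", "musing"): 0.01, ("sum", "musing"): 0.01,
}

def best_sentence(acoustic, bigram):
    """Exhaustively score every word sequence; return the most probable."""
    candidates = [(["<s>"], 0.0)]          # (words so far, log probability)
    for slot in acoustic:
        new_candidates = []
        for words, score in candidates:
            for word, p_acoustic in slot.items():
                p_lm = bigram.get((words[-1], word), 1e-6)
                new_candidates.append(
                    (words + [word],
                     score + math.log(p_acoustic) + math.log(p_lm)))
        candidates = new_candidates
    best_words, _ = max(candidates, key=lambda c: c[1])
    return best_words[1:]                  # drop the <s> start symbol

print(" ".join(best_sentence(acoustic, bigram)))  # -> play some music
```

Note how "sum" and "some" sound identical to the acoustic model; it is the language model that settles the tie, which is the point of the two-stage design.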

The second step, Siri's response, comes in the form of computer-generated speech, which leverages much of the same knowledge as analyzing the speech.

And although this process can sound rather dry, like Watson, Siri is also prone to interesting replies.

"There were many conversations within the team about whether it should have an attitude," says Norman Winarsky, vice president of SRI International, talking about the pre-Apple work on Siri to The Wall Street Journal.

And sometimes this sass is intended to make those hip to artificial intelligence smirk. For instance, if a user tells Siri, "Open the pod bay doors," a reference to a command given to the malcontent sentient computer HAL 9000 of 2001: A Space Odyssey, the program will respond, "We intelligent agents will never live that down, apparently."

    Technology Gap

Watson loaded onto a smartphone. Photo courtesy Jon Simon/Feature Photo Service for IBM.






Katsu Yamane, Senior Research Scientist at Disney Research

Q: WHAT PROBLEMS DOES THIS RESEARCH HELP SOLVE?

A: This research helps robots make physical interaction with humans natural. Motion-planning algorithms are becoming quite powerful, but they still have to spend a long time to generate robot motions compared to normal human reaction time. Robots would have to react to human motions much more quickly to make interactions natural.

Q: BY USING A DATABASE OF HUMAN MOTIONS, DO YOU SIDESTEP THE NEED FOR GREATER PERCEPTION OR INTELLIGENCE ON THE PART OF THE ROBOT?

A: Yes, that's exactly the idea of this research. Obviously, humans can react to other persons' motions instantly, and humans expect the same speed when they interact with robots. By using a human motion database, we can leverage the human motion planning process as a black box and just use its results.

Q: IS THE ROBOT ABLE TO LEARN TO ACCEPT DIFFERENTLY SHAPED OBJECTS, OR IS THIS MOSTLY FOCUSED ON ALLOWING IT TO KNOW WHEN IT IS BEING OFFERED SOMETHING AND TO SYNCHRONIZE ITS MOTION?

A: We are currently focusing on recognizing the handoff motion and synchronizing the robot arm motion. However, if the different object shapes result in different arm motions, the robot can recognize those shapes. In the future, we could combine this technique with a vision system to recognize the shape of the object being handed.

Q: WHAT IS THE BENEFIT OF TEASING THE ROBOT, AS SHOWN IN A VIDEO?

A: This demo was just to demonstrate that the robot can quickly react, even if the human motion changes abruptly.

Q: CAN YOU DESCRIBE THE HIERARCHICAL DATA STRUCTURE THAT YOU DEVELOPED? HOW DOES THAT WORK, AND HOW DOES IT HELP THE ROBOT?

A: The data structure is an extension of the classical binary tree data structure from information theory. Conventionally, this data structure has been used for quickly searching and sorting data that can be easily ordered, such as text. Human motion, on the other hand, is in a very high-dimensional space and therefore not easy to order. We developed a method for organizing human poses into a binary data structure. The data structure allows the robot to search for poses in the database that are similar to what it is seeing right now.

Q: WHAT ROBOT MOTIONS AND COMPONENTS WOULD YOU LIKE TO ADD IN THE FUTURE?

A: We would like to add natural hand motions to grab the handed object. We would also like to have the robot do additional tasks after receiving an object, such as putting it into a bag.

Q: IS THERE A NEED FOR ROBOTS TO HAVE HAPTIC SENSORS, OR WOULD THAT HELP?

A: Haptic sensors would certainly help the robot recognize that the object is indeed in the hand.

Q: WHAT OTHER TECHNOLOGY ADVANCES COULD AID IN THIS RESEARCH OR IN ITS EVENTUAL USE BY ROBOTICS IN A VARIETY OF ROLES (IN A FACTORY, IN A HOME, ETC.)?

A: Computer vision technology to recognize the human motion and object shape would be essential to put this research into practical use, because we can't use motion capture systems in such environments.

Q: WHY IS DISNEY RESEARCH INTERESTED IN THIS WORK?

A: My expertise is in motion synthesis and control of humanoid robots. We are interested in exploring autonomous, interactive robots and physical human-robot interaction using whole-body motions.

Q: WHY IS THIS RESEARCH IMPORTANT FOR THE FUTURE?

A: If robots are to work in factories and homes in the future, interactions with humans must be intuitive, seamless and natural. By learning from human-human interaction, we can model how humans interact, which in turn makes the robot motions and interactions look natural to humans.

A test subject interacts with a robot that can receive an object handed to it. Image courtesy Disney Research.

    Q & A

Katsu Yamane is senior research scientist at Disney Research in Pittsburgh, Pa., where he has worked since 2008. His main area of research is humanoid robot control and motion synthesis. In a recent study, he worked on ways for robots to perceive that humans are handing them objects.


Artificial Intelligence: A Timeline

1950: Mathematician and codebreaker Alan Turing devises the Turing Test, which involves a computer attempting to trick a person into believing it is another human.

1951: Christopher Strachey writes one of the first machine learning game programs, with the use of checkers. Studies found that the use of games helped scientists to learn and respond when training computers to think for themselves.

1956: The term "artificial intelligence" is recognized as an academic discipline at Dartmouth College during a technology conference.

1966: MIT's Joseph Weizenbaum creates one of the earliest natural language processing programs, called ELIZA. This program took users' answers and processed them into scripts with human-like responses. Versions of the program are still available today.

1969: The Stanford Research Institute directs experiments with Shakey, one of the first mobile robot systems. It had the ability to move and observe its environment as well as do simple problem solving.

1979: Jack Myers and Harry Pople at the University of Pittsburgh develop the INTERNIST knowledge-based medical diagnosis program, which was based on clinical knowledge. It is able to make multiple diagnoses related to internal medicine.

1985: The first autonomous drawing program, AARON, is demonstrated at the AAAI national conference. The program was created by Harold Cohen.


1993: Ian Horswill advances behavior-based robotics with Polly, the first robot capable of navigating with the use of vision. Polly was able to move at a speed of 1 meter per second.

1995: ALVINN, or Autonomous Land Vehicle In a Neural Network, steers a car coast-to-coast under computer control. ALVINN is a semiautonomous perception system that could learn to drive by watching people do it.

1996: Chess champion Garry Kasparov defeats IBM's Deep Blue computer in a chess match. In 1997, an upgraded Deep Blue defeats Kasparov.

2000: The Nomad robot explores remote parts of Antarctica looking for meteorite samples. Nomad autonomously finds and classifies dozens of terrestrial rocks and five indigenous meteorites.

2005: Honda's ASIMO robot gains the ability to walk at the same gait as a human while delivering trays to customers in a restaurant setting.

2011: IBM's Watson defeats the two greatest Jeopardy! champions, Brad Rutter and Ken Jennings. Watson is an artificially intelligent computer that is capable of answering questions in natural language. It was developed by David Ferrucci in IBM's DeepQA project.

2012: Apple releases Siri (Speech Interpretation and Recognition Interface) for the first time on the iPhone 4S. Siri is a spinoff of DARPA's CALO project, which stands for Cognitive Assistant that Learns and Organizes.

    TIMELINE


The Turing Test: Party Game, Turned Philosophical Argument, Turned Competition

The Turing Test, introduced by legendary mathematician and codebreaker Alan Turing, is a means of determining a machine's ability to exhibit intelligent behavior that is at least equal to a human being's.

As introduced in 1950, the test involves a person who is communicating with another person and a machine. Both attempt to convince the subject that they are human through their responses. If the subject can't tell the difference, then the computer wins the game, dubbed the Imitation Game, which was based on a party game of the time.

The Turing Test has been held up as the very definition of artificial intelligence, although it arguably hasn't been met yet.

That's not for lack of trying. Work on Turing's idea led, in the 1960s, to the development of chatbots, or computer programs that would communicate back and forth with human interrogators. The best known is ELIZA, developed in 1966, which replicated the communication of a psychotherapist. One modern descendant of ELIZA is Apple's Siri, which may have helped give you traffic directions this morning.

A yearly competition, the controversial International Loebner Prize in Artificial Intelligence, seeks to find the best of such chatbots and has been rewarding them since 1991. So far, all have won a bronze medal and $4,000.

Illustration courtesy iStock Photo.


Uncanny Valley

Should any program fool two or more judges when compared to two or more humans, the competition will then begin requiring multimodal entries that incorporate music, speech, pictures and videos. Should a computer program win that, by fooling half the judges, its creators will win a $100,000 grand prize and the competition will end.

This year's winner, not of the big prize but of another bronze medal, is a chatbot named Mitsuku, programmed by Steve Worswick of Great Britain, who told the BBC that he initially built a chatbot to attract users to his dance music website, only to discover they were more interested in parrying with the chatbot.

On a blog entry on Mitsuku's website in September, after the award was announced, he noted that winning even the annual competition has its benefits: people are paying attention.

"Today has been a bit of a strange day for me," he wrote. "I usually have around 400-500 people a day visit this site, and at the time of writing this, I have had 9,532 visitors from all corners of the globe, as well as being mentioned on various sites around the net. I was even interviewed by the BBC this morning."

In prepping for the prize, he wrote that he had been working to remove Mitsuku's robotic responses to some questions, along the lines of, "I have no heart, but I have a power supply" or, "Sorry, my eye is not attached at the moment." Funny, but not likely to fool humans into thinking she's a real girl.

    Birdwatching

The Turing Test has been controversial throughout its history.

Ken Ford, CEO of the Institute for Human and Machine Cognition, a not-for-profit research institute of the Florida University System, says the test was part of a maturing process for the field of AI.

"AI went through a maturing process where the initial goals for AI were sort of philosophical and posed by Alan Turing, in some ways, as a thought experiment. ... They weren't technical goals," he says. "He was arguing with philosophers about the possibility of an intelligent machine. The philosophers said no, it's not possible. We would never admit it."

In the early days, the focus was very much on building mechanical simulacrums of human reasoning. "It became a notion of building a machine whose dialogue was indistinguishable from a human. I believe that was a mistake. At this point, very few serious AI researchers believe they are trying to pass the Turing Test. The Turing Test is both too easy and too hard at the same time," he says.

"Early flight pioneers tried to mimic birds. The laws of aerodynamics were never discovered by birdwatching. The blind mimicry of the behavior of the natural system without understanding the underlying principles usually leads one astray. We didn't have flight until we built things that didn't fly like birds or look like birds. This is very much analogous to AI."

Ford says now, as the industry matures, "we are more in the Wright brothers stage. We are going in that direction rather than in birdwatching and bird building. If we could build an AI that always passed the Turing Test, and it could do it perfectly, I can't imagine what use it would be."

In recent days, there has been sort of a Turing Test in reverse. The quirky Twitter feed of @Horse-ebooks, thought to be a malfunctioning spambot that for years appeared to fire off random passages of electronic books, was revealed to instead be the work of two artists from the website BuzzFeed.

ABC News reported that thousands who followed the account "were surprised to learn today that the account was not run by a spambot or a robot but by two human beings as an elaborate conceptual art installation."

    Further Advances

Several deep-pocket companies today are funding what they hope will be breakthroughs in AI, which initially may resemble chatbots such as Mitsuku, albeit much smarter ones.

In the most recent example, Microsoft cofounder Paul Allen recently announced that he had hired Oren Etzioni as executive director of the Allen Institute for Artificial Intelligence in Seattle. Etzioni was previously director of the Turing Center at Seattle neighbor the University of Washington.

In a press release, Allen said he hired Etzioni because he "shares my vision and enthusiasm for the exciting possibilities in the field of AI, including opportunities to help computers acquire knowledge and reason."


Testing, Testing

From Top to Bottom
Getting Robots to Perceive More Like Humans

Where are you? It's a simple question for a human to answer, but passing that knowledge on to a robot has long been a complex perception task.

Cognitive Patterns aims to leverage what researchers already know about how humans perceive their environment and exploit that information when building autonomous systems. The prototype software developed by Massachusetts-headquartered company Aptima Inc. leverages the open-source Robot Operating System, or ROS, to enable this kind of processing on any type of platform.

"One of the things that's not particularly commonly known outside of cognitive science is that humans perceive and really think about very selective aspects of the world and then build a whole big picture based on things we know," says Webb Stacy, a cognitive scientist and psychologist working for Aptima.

In essence, humans derive a large portion of knowing where they are from their brains, not from their surroundings. Past experiences feed future expectations about, for instance, what objects might be in a room called an office or a cafeteria.

"You don't have to really pick up information about exactly what a table, phone or notebook looks like; you already kind of know those things are going to be in an office, and as a result the process of perceiving what's in an office or making sense of the setting is as much coming from preexisting knowledge as it is directly from the senses," he says.

This way of perceiving is called top-down, bottom-up processing. However, machine perception is typically driven by what Stacy terms bottom-up processing, where data from sensors are streamed through mathematical filters to make sense of what an object might be and where a robot is in relation.

Cognitive Patterns is revolutionary, says Stacy, because it provides a knowledge base for a robot, which the robot can combine with visual data from its sensors to extrapolate information about its environment in a fashion more akin to top-down, bottom-up processing.
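As a minimal sketch of what that fusion can look like, imagine that a room label supplies a prior over which objects are likely present and a bottom-up detector supplies a score, with Bayes' rule combining the two. The OFFICE_PRIOR table, the fuse function and all of the numbers below are invented for illustration; this is the general idea, not Aptima's code.

```python
# Minimal sketch of top-down, bottom-up fusion (not Aptima's code).
# A room label supplies a prior over likely objects; a bottom-up
# detector supplies evidence; Bayes' rule combines the two.

OFFICE_PRIOR = {"desk": 0.9, "phone": 0.8, "notebook": 0.7, "cow": 0.01}

def fuse(obj, detector_score, room_prior):
    """Posterior that obj is present, treating the detector score in
    [0, 1] as P(evidence | present). Illustrative, not calibrated."""
    prior = room_prior.get(obj, 0.1)
    p_evidence = detector_score * prior + (1 - detector_score) * (1 - prior)
    return detector_score * prior / p_evidence

# The same weak detection is believable for a phone in an office ...
print(round(fuse("phone", 0.55, OFFICE_PRIOR), 2))  # ~0.83
# ... but not for a cow.
print(round(fuse("cow", 0.55, OFFICE_PRIOR), 2))    # ~0.01
```

The point of the toy numbers is the asymmetry: the same mediocre sensor evidence is interpreted very differently depending on what the robot already knows about offices.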

Aptima secured funding for Cognitive Patterns through a Small Business Innovation Research grant from DARPA, and Stacy stresses that what the company is doing is practical and can be done without a multimillion-dollar budget.

The first phase of the project was limited to simulated models, whereas the second phase put the software on a robot.

Aptima's Cognitive Patterns architecture enables robots to make sense of their environment, much like how humans do. This robot has Cognitive Patterns integrated onto it for a test. Photo courtesy Aptima.


"When you move from simulation to physical hardware, you're dealing with an awful lot of uncertainty," he says. "So it's one thing in the simulation to say, here are some features that get presented to the system for recognition. It's another thing on a real robot wandering around perceiving things."

However, Stacy says using ROS makes it easier to go from simulation to real-world application.
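That portability is much of ROS's appeal: a node written against simulated sensor topics can later subscribe to real ones unchanged. Below is a minimal rospy sketch of the pattern; the /detections and /semantic_labels topic names and the message contents are assumptions made for illustration, not Aptima's interfaces.

```python
#!/usr/bin/env python
# Minimal ROS node sketch: the same code runs against simulated or real
# sensors, because only the source feeding the topic changes. Topic
# names and message contents here are illustrative assumptions.
import rospy
from std_msgs.msg import String

def on_detection(msg):
    # msg.data might be an object label emitted by a vision pipeline.
    rospy.loginfo("Perceived: %s", msg.data)
    label_pub.publish(String(data="seen:" + msg.data))

rospy.init_node("cognitive_patterns_demo")
label_pub = rospy.Publisher("/semantic_labels", String, queue_size=10)
rospy.Subscriber("/detections", String, on_detection)
rospy.spin()  # identical whether /detections comes from a simulator or hardware
```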

In phase two, Aptima integrated the cognitive aspects of its software with visual information provided by sensors built by iRobot.

To test the prototype, Aptima secured the use of a local Massachusetts middle school, so the company could test the principles of its software in a real-world environment.

For this second phase, Aptima's DARPA agent worked at the Army Research Laboratory. This allowed the company to integrate Cognitive Patterns with a robotic cognitive model the ARL had, called the Sub-Symbolic Robot Intelligent Controlling System, or SS RICS. That program combines language-based learning with lower-level perception. Stacy says Cognitive Patterns aligns somewhere between the two.

The system has an operator interface that Stacy says allows the user to communicate with the robot on a high level.

"The operator might say, 'I don't have a map. Go find the cafeteria and see if there are kids there,'" he says. "Now that's a very high level of command, because there's all kinds of stuff that a machine needs to figure out there. And one of the really interesting things we did here is we looked to see if we couldn't generate new knowledge whenever we encountered something that we didn't understand."

The operator can place augmented reality, or AR, tags on certain objects, and they act as a proxy for using computer vision to recognize those objects. Then the robot is able to learn those features and apply them to future scenarios. For instance, the team did this once when a robot using Cognitive Patterns was in a room with a weapons cache, so the next time it entered a similar scenario, it would have a cognitive model on which to base its perceptions. The operator can also tell the robot to ignore certain surroundings or label special things in its environment, such as determining what a student desk is, because that object is a blend of a traditional school chair and a separate desk.
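One way to picture the bookkeeping behind that tag-based learning: each tag sighting labels the features seen nearby, and a later, tag-free sighting is matched against the stored examples. The data structures and function names below (on_tag_sighting, recognize) are invented for this sketch and are not Aptima's implementation.

```python
# Sketch of using AR-tag sightings to grow a knowledge base
# (invented data structures; not Aptima's implementation).

knowledge_base = {}  # object label -> list of stored feature vectors

def on_tag_sighting(tag_label, nearby_features):
    """An AR tag labels whatever the camera sees; store those features
    so a similar scene can be recognized later without the tag."""
    knowledge_base.setdefault(tag_label, []).append(nearby_features)

def recognize(features):
    """Nearest-stored-example lookup, a stand-in for a real classifier."""
    best_label, best_dist = None, float("inf")
    for label, examples in knowledge_base.items():
        for example in examples:
            dist = sum((a - b) ** 2 for a, b in zip(features, example))
            if dist < best_dist:
                best_label, best_dist = label, dist
    return best_label

on_tag_sighting("weapons_cache", [0.9, 0.1, 0.4])
print(recognize([0.85, 0.15, 0.42]))  # -> weapons_cache, no tag needed
```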

This higher level of communication benefits the operator, says Stacy, because, aside from software developers, most operators won't care about the detailed features being extracted from the robot's camera system.

    Finding New Uses

Now Aptima is working on another prototype as a follow-on to its DARPA contract: implementing Cognitive Patterns on prosthetics.

The company is working on an intelligent prosthetic through a contract with the Office of Naval Research and the Office of the Secretary of Defense, where a prosthetic would be controlled by the neurosignatures of a person's brain, interfacing the limb with the user's mind.

"The arm's going to have sensors on it, and so it's going to be doing the same kind of thing the robot is doing with Cognitive Patterns, which is perceiving its environment, knowing the things it can reach for," he says.

Through collaboration with a prosthetics expert, Aptima is using its software to let the arm communicate with its user in a more natural way, such as through muscle twitches.
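A common way for a limb to "listen" for a twitch is to watch for a sustained burst in a rectified electromyography (EMG) signal. The sketch below illustrates that general idea; the detect_twitch function, its thresholds and the sample values are made up for illustration and are not Aptima's method.

```python
# Generic sketch of twitch detection by thresholding a rectified EMG
# signal (made-up numbers; not Aptima's method).

def detect_twitch(samples, threshold=0.6, min_run=3):
    """Report a twitch when |signal| stays above threshold for min_run
    consecutive samples, which filters out single-sample noise spikes."""
    run = 0
    for i, s in enumerate(samples):
        run = run + 1 if abs(s) > threshold else 0
        if run >= min_run:
            return i  # index where the twitch is confirmed
    return None

signal = [0.1, 0.05, 0.9, 0.2, 0.7, 0.8, 0.75, 0.1]  # one spike, one burst
print(detect_twitch(signal))  # -> 6 (the sustained burst, not the spike)
```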

    Perfecting Perception

The end goal of this work addresses the current disconnect in how robots could best perceive their environments, with the machine vision community pushing optimal mathematical models to perfect bottom-up processing, while the cognitive science community is seeking out the best cognitive model to apply to get a robot to think, says Stacy.

"We are starting to see the need to hook up with each other, and Cognitive Patterns really is the intersection between those two. If that happens, we'll have robots that can really see and understand the world the way that humans do, and that's been elusive for a long time in robotics."

Click on this QR code to see a video on Cognitive Patterns: http://bit.ly/1gSjvAc


The Future of Employment: How Susceptible are Jobs to Automation?

    By Ashley Addington


Personal assistant robots, like PAL Robotics' REEM-H1, could be used for service jobs in the future. Photo courtesy PAL Robotics.


    Spotlight

A new Oxford study has emerged addressing the changes in the future job market.

In the study, conducted by Carl Benedikt Frey and Michael A. Osborne and entitled "The Future of Employment: How Susceptible are Jobs to Computerisation?" the researchers broke down the future evolution of jobs and what employment could mean for the rest of the world.

"Before I was even conducting this study, I was really interested in how technology was advancing, especially in the job market," Osborne said.

The conclusion: Over the next two decades, 47 percent of the jobs in more than 700 fields could be affected by automation. The main jobs at risk are the low-level jobs that don't require higher education.

Jobs such as librarians, telemarketers and database managers will be among the first jobs that could be replaced by automation, the researchers say. Higher-education jobs that require creativity, such as in the fields of health care, media, education and the arts, would be less likely to be handled by computers because of the need for spontaneous thought.

"People need to be aware of how important education is. People with jobs that require more education will be least likely to lose their job," Osborne said. Now, for the first time, low-wage manual jobs will be at risk. Jobs in agriculture, housekeeping, construction and manufacturing could be some of the areas handed over to computers and robots.

Jobs that require perception and manipulation are safer, because they involve having to interact with other human beings. Computers would have a harder time taking over these jobs, because much of the work involved in these careers changes on a daily basis. It would be close to impossible to program a computer with every possible scenario when dealing with situations and communications between individuals.

"Most management, business and finance occupations, which are intensive in generalist tasks requiring social intelligence, are largely confined to the low-risk category," the study says.

Science and engineering jobs are also not at risk, because they require a large amount of creative intelligence. The higher the salary and education demands, the less likely a job will be taken over by a machine.
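The study itself trained a probabilistic classifier over detailed occupation data, but the intuition is simple: the more a job leans on the bottleneck skills of perception and manipulation, creative intelligence and social intelligence, the lower its automatability. The toy sketch below caricatures that intuition; the WEIGHTS, BIAS and job profiles are invented for illustration and are not the study's model.

```python
import math

# Toy illustration of the bottleneck-skills intuition (invented
# weights and profiles; NOT the Frey-Osborne classifier).

WEIGHTS = {"perception_manipulation": -2.0,
           "creative_intelligence": -2.5,
           "social_intelligence": -2.2}
BIAS = 2.5  # baseline pushes routine jobs toward high risk

def automation_risk(job):  # skill requirements scored in [0, 1]
    score = BIAS + sum(WEIGHTS[k] * job[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-score))  # squash to a 0-1 "risk"

telemarketer = {"perception_manipulation": 0.1,
                "creative_intelligence": 0.1,
                "social_intelligence": 0.3}
surgeon = {"perception_manipulation": 0.9,
           "creative_intelligence": 0.7,
           "social_intelligence": 0.8}
print(round(automation_risk(telemarketer), 2))  # high risk (~0.8)
print(round(automation_risk(surgeon), 2))       # low risk (~0.06)
```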

"Awareness is essentially what we have tried to produce from this study. We want to make more people aware of what the future potentially holds and that education is the key for people to keep their careers," Osborne said.

Jobs in transportation are also at risk due to the automation of vehicles. Sensor technology has continued to improve vastly over the past few years, which has led to increased safety and security in vehicles. It has also offered enough data to help engineers get past the problems of robotic development. "These will permit an algorithmic vehicle controller to monitor its environment to a degree that exceeds the capabilities of any human driver. Algorithms are thus potentially safer and more effective drivers than humans," the study says.

As with any technological change, for every change to a current job there is the possibility of new growth in other areas. In order for computers and robots to be able to do particular jobs, there must be someone to set up and monitor the computers to make sure everything functions accordingly. Jobs will also be added to assist and manage the technology.

"First, as technology substitutes for labor, there is a destruction effect, requiring workers to reallocate their labor supply; and second, there is the capitalization effect, as more companies enter industries where productivity is relatively high, leading employment in those industries to expand," the study says.

Computerization has the potential to change the job market and society by being more thorough and doing a vast majority of tasks faster. Algorithms are able to make decisions with an unbiased mind, thus allowing them to come up with conclusions in a more timely fashion than human operators.

"Even though the job market is changing, there is no need to panic. The best thing people can do for themselves is to stay current with technology and be thoroughly educated," Osborne says.

There are plans to continue the Oxford study and focus more on the wide range of jobs that have the potential to change, which ones are most susceptible and how this should be handled.

Click on this QR code to read "The Future of Employment: How Susceptible are Jobs to Computerisation?": http://bit.ly/1g707TS

    End Users

    Ghostwriter: Algorithms Write Books

Philip Parker, a marketing professor at the international graduate business school INSEAD, is also a published author, and a very published author at that. Published so many times, in fact, that he makes prolific writers like Shakespeare or Stephen King seem downright lazy.

Parker has penned more than one million report-length books, which he self-publishes as paperbacks or print-on-demand books. Unlike Shakespeare or Stephen King, they tend not to be about love or fear. Instead, they might be about such topics as "The 2007-2012 Outlook for Tufted Washable Scatter Rugs, Bathmats and Sets That Measure 6-Feet by 9-Feet or Smaller in India" or "The 2009-2014 World Outlook for 60-Milligram Containers of Fromage Frais." There are some dictionaries in there too, and medical sourcebooks and textbooks. He also writes poetry: 1.4 million poems so far.

He doesn't produce these articles, books and poems just to see his name on the shelf, but rather to serve very small, niche markets, particularly in the developing world, that traditional publishers have ignored because they are so small.

Parker doesn't sit and crank these out himself, at least not in the traditional book-writing sense. Instead, he uses a series of computer programs to compile information on a given topic and then combines it into book form.

The above-mentioned book titles might seem extremely arcane, but if you're a small business in a developing country making a specific market, he noted in a recent Seattle TED Talk, "you can't purchase market data for some of these products. It's out there, but no traditional publisher is going to package it."

"I discovered there is a demand. It's a very narrow demand, but the key problem is the author," he said. "Authors are very expensive. They want food and things like that. They want income."

The methodology for each type of book is studied and then copied into algorithms. As for how they work, "it depends on the subgenre," he writes. "For example, the methodology for a crossword puzzle follows the logic of that genre. Trade reports follow a completely different methodology. The methodology of the genre is first studied, then replicated using code."

The programs are able to provide "the whole value chain of the publishing industry," he says, by automating "the collecting, sorting, cleaning, interpolating, authoring and formatting."
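A crude sketch of what the authoring step can look like once the data are assembled appears below; the TEMPLATE string, author_report function and all figures are invented for illustration, and Parker's production system is, by all accounts, far more elaborate.

```python
# Crude sketch of template-driven authoring (illustrative only;
# Parker's production system is far more elaborate).

TEMPLATE = (
    "The {start}-{end} Outlook for {product} in {region}\n\n"
    "This report estimates the latent demand for {product} in {region}. "
    "Across the {n} markets studied, projected demand grows from "
    "${base:.1f} million in {start} to ${peak:.1f} million by {end}."
)

def author_report(data):
    # "Collect, sort, clean, interpolate, author, format" collapses here
    # to a single fill-in step once the data table is assembled.
    return TEMPLATE.format(**data)

print(author_report({
    "product": "fromage frais containers", "region": "India",
    "start": 2009, "end": 2014, "n": 28, "base": 4.2, "peak": 6.8,
}))
```

The same template can be refilled for every region and product in a data set, which is how one methodology yields thousands of titles.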

Parker says he began working on the programs-as-writer idea in the 1990s, trying various techniques, but "around 1999-2000 I was able to create the first commercial applications." The idea came from "being able to do genres that otherwise would not have happened."

He hit on the economic model of selling "very expensive, high-end market research studies, completely computer generated, to subsidize the creation of language learning materials, mathematics books, etc. for the underserved languages."

"We've published over about a million titles; most of them are high-end industry studies, but a lot of them are language learning," he said in the talk while showing a video of what it looks like when a computer writes a book (many Web pages pop up in quick succession).

EVE is capable of creating textbooks for students in Africa who have never seen one in their language, of collating weather reports for farmers in remote locations, even of helping develop video games for agriculture extension agents to help them plan planting regimens for regions they have never seen. Much of the poetry is aimed at helping non-English speakers learn the language.

It's a way of corralling existing data in new ways to serve tiny markets. The practice could be used in a variety of ways, he noted. Say, for example, you're a football player and you don't like your physics book because you can't relate to it.

"Why not have a football player physics book? Why not have a ballet dancer's physics book?" he asked.

Going forward, he'd like to do even more, such as by having virtual professors who can "diagnose, teach and write original research in fields that do not have enough scientists or researcher[s] available." Fun stuff.


BECOME A CORPORATE MEMBER TODAY
MAXIMIZE YOUR VISIBILITY

Listings online and in Unmanned Systems magazine
Members-only networking, education and VIP events
AUVSI Day on Capitol Hill
Local chapters around the world
Members-only online content
AUVSI's online community
Exhibits
Sponsorships
Advertising
Event registration
Complimentary job listings on AUVSI's career center
Unmanned Systems magazine subscription
Unmanned Systems e-Brief subscription
Advocacy Action Alerts
Knowledge resources