DARPA Agent Markup Language (DAML)
Dr. Mark Greaves
May 2005
Ontologies for the Web
Jun 2000 Program start
Feb 2004 OWL accepted by W3C as a Web Standard
Dec 2004 SWRL FOL and OWL/S submitted to W3C
May 2005 Program complete
What is DARPA?
DARPA = Defense Advanced Research Projects Agency
Long-Range R&D Organization of the US Department of Defense
– Established 1958 as a US response to the Soviet launch of Sputnik
– Pursues high-risk, high-payoff basic and applied research
– Organizationally part of USD(AT&L) and DDR&E
– Operates in coordination with, but independent of, the military research and development establishment (ARL, AFRL, ONR)
– Committed to maintaining U.S. military technology superiority

Chartered to Prevent Technological Surprise
– Funds work that is a counterpoint to traditional thinking and approaches
– Noteworthy programs include VELA HOTEL, the M-16, stealth aircraft, GPS, the ARPANET, unmanned aircraft, most early AI, and MEMS

FY05 research budget is ~$3B
DARPA Description and active solicitations at www.darpa.mil
DARPA, DAML, and Google…
[Screenshot: Google results for “darpa” on 10/21/04, with hits #2 and #3 highlighted]
DAML Program Summary
Solution: Augment the web to link machine-readable knowledge to web pages
– Extend RDF with Description Logic
– Use a frame-based language design
– Create the first fully distributed web-scale knowledge base out of networks of hyperlinked facts and data
Approach: Design a family of new web languages
– Basic knowledge representation (OWL)
– Reasoning (SWRL, OWL/P, OWL/T)
– Process representation (OWL/S)
Build definition and markup tools
Link new knowledge to existing web page elements
Test design approach in the Intelligence Community
Standardize the new web languages in the W3C
Problem: Computers cannot process most of the information stored on web pages. People use implicit knowledge to reason with web pages; computers require explicit knowledge to do the same.

[Diagram: Existing Web (HTML/XML over HTTP), linked via URLs, vs. Semantic Web (OWL over HTTP)]
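The “networks of hyperlinked facts and data” described above reduce to subject-predicate-object triples whose terms are web identifiers. As a rough illustration only (the URIs and facts below are invented for the example, not drawn from the program), a minimal sketch in Python:

```python
# Minimal sketch of machine-explicit knowledge as subject-predicate-object
# triples, the data model underlying RDF/OWL. All URIs are hypothetical.
triples = {
    ("http://example.org/people#Alice", "rdf:type", "http://example.org/ont#Analyst"),
    ("http://example.org/people#Alice", "ex:worksAt", "http://example.org/orgs#NGA"),
    ("http://example.org/orgs#NGA", "rdf:type", "http://example.org/ont#Agency"),
}

def objects(subject, predicate):
    """Follow one 'hyperlink' in the knowledge network."""
    return {o for (s, p, o) in triples if s == subject and p == predicate}

# A machine can now answer what a person would infer by reading the page:
print(objects("http://example.org/people#Alice", "ex:worksAt"))
```

Because each fact is explicit and each term is a URL, facts published on different pages can link into one distributed, web-scale knowledge base.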
Program Elements
• Web Ontology Language (OWL)
– Enables knowledge representation and tractable inference across the web
– Based on Description Logics and RDF

• OWL Reasoning Languages
– SWRL Rules Language: supports business rules, policies, and linking between distinct OWL ontologies
– OWL/P Proof Language: allows software components to exchange chains of reasoning
– OWL/T Trust Language: represents confidence that OWL and SWRL inferences are valid
– Based on Description Logic Programming

• Semantic Web Services (OWL/S)
– Allows discovery, matching, and execution of web services based on action descriptions
– Unifies semantic data models (OWL) with process models (agent) and shows how to dynamically compose web services
– Based on process algebra and NIST PSL

• OWL Tools
[Diagram: DAML Program Technical Flow: the Web Ontology Language (OWL) feeding SWRL (Rules), OWL/P (Proof), OWL/T (Trust), and OWL/S (Semantic Web Services); legend distinguishes completed standards process, started standards process, and under development]

Each DAML Program Element includes specifications, software tools, coordination teams, and use cases.
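To make the reasoning-language element concrete: a SWRL-style rule fires when its body pattern matches existing triples and adds the inferred triple to the knowledge base. A toy forward-chaining sketch in Python (the rule and facts are invented for illustration, not taken from the SWRL specification):

```python
# Toy forward-chaining over triples, in the spirit of a SWRL rule such as
# hasParent(x,y) ∧ hasBrother(y,z) → hasUncle(x,z). All names are hypothetical.
facts = {
    ("Alice", "hasParent", "Bob"),
    ("Bob", "hasBrother", "Carl"),
}

def apply_uncle_rule(kb):
    """Add every triple the rule can infer from the current knowledge base."""
    inferred = {
        (x, "hasUncle", z)
        for (x, p1, y1) in kb if p1 == "hasParent"
        for (y2, p2, z) in kb if p2 == "hasBrother" and y1 == y2
    }
    return kb | inferred

closed = apply_uncle_rule(facts)
print(("Alice", "hasUncle", "Carl") in closed)  # True
```

A real engine would iterate rules of this shape to a fixed point; the Description Logic Programming basis mentioned above is what keeps that iteration tractable.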
2004 Technical Progress
• Web Ontology Language (OWL)
– W3C accepted OWL; formed the Semantic Web Best Practices Working Group to maintain the standard
– W3C agreed to host Ontaria, a permanent public OWL ontology registry/download site
– OWL gained traction (250K RDF/OWL pages, 20M+ triples, 10K classes available on-line)
– W3C Workshop on OWL in the life sciences

• OWL Reasoning Languages
– SWRL 0.6 released 24 May 2004 by the US/EU Joint Committee; being tested at JWAC, IMO, NSA
– SWRL-FOL submitted to the W3C
– SweetRules complete
– W3C Rules Workshop April 27-28

• Semantic Web Services
– OWL/S Web Services Specification submitted to W3C
– Semantic Web Services Interest Group chartered by W3C
– Semantic Web Services Initiative (~45 organizations) coordinates commercial, DAML, and EU Framework 6 output
– SWSL and SWSL-FOL submitted
– W3C SWS Workshop June 9-10

• OWL Tools
– DAML sponsored a new open-source website, www.semwebcentral.org
– Over 70 OWL tools released by DAML contractors
– New OWL plugins for Eclipse
– Currently 84 hosted projects, 3M hits, and >100GB of downloads since Dec 2003
Transition
Intelligence Community
– 6 funded pilots at different IC agencies

DoD
– AF AMC Foreign Clearance Guide
– FCS SOSCOE OWL/S use in the TIN
– AF AMC NOTAMs
– Joint Explosive Ordnance Detection ACTD
– DISA Discovery Metadata Repository
– Center for Army Lessons Learned Prototype

Federal
– CIO Council Semantic Interoperability Community of Practice formed, 2 conferences
– SWANS conference April 7-8, 2004 (300+ attendees, 40 trade show participants)

Commercial
– 43 companies in SWSI working on OWL/S
– 19 commercial OWL implementations, including IBM and HP

More evidence of uptake…
– 58 “Semantic Web” books on Amazon.com
– NCI Thesaurus is 100% OWL
– NIH and NIST are sponsoring work to define a comprehensive protein chemistry taxonomy
– DARPA XG using OWL for its policy language vocabulary
DAML Schedule
• Complete OWL versioning tools, Ontaria, OWL/T 1.0
• Deliver OWL/S 1.1 to W3C and complete OWL/S editors
• Complete SWRL 0.6 reasoning environment and submit SWRL FOL
• Tools and Outreach
– Semantic Web Applications for National Security (SWANS) and SWIG meetings
– Stabilize and transfer semwebcentral.org and daml.org to W3C
– Complete SWeDE, IE plugin, and reference application
[Gantt chart, FY01-FY05: program elements tracked against meetings and reviews. Web Ontology Language: revisions (DAML+OIL), OWL Lite, OWL DL, OWL Full. Rules Specification: create v0.6 and v0.7; logic mappings, Descriptive Logic Programming, tool development. Proof Specification: proof language and query engines. Trust Specification: trust algorithms. Semantic Web Services: process representation, brokering, profile, and grounding ontologies; Semantic Web Services Initiative. IC transitions: JWAC, ALV, Saturn, Horus, Saturn II, NOTAMS, NGA. Milestones include PI Meetings, SWMU, SWANS meeting, external conferences, W3C delivery, and FY05 remaining tasks (Ontaria, OWL versioning, SWRL reasoners, OWL/S editors), ending May 2005]
DAML’s Legacy
Success = creating the conditions for early adopters to allow the semantic web revolution to succeed

DAML has had incredible success
– We have gone from DARPA-hard challenge to accepted industrial standard in four years
– The PM has lost control of the technology

It is time for OWL to leave the DARPA nest and fly. There is more work to be done:
– OWL 2.0, Semantic Web Services, rules, query languages, tools, documentation, killer apps, proof exchange, trust
– Domain-specific ontologies and applications
– More standards, collaboration with Europe, funding organizations
– More nonacademic conferences

DAML’s intellectual thread will be carried by other programs and organizations

So… what kind of new DARPA program would complement DAML?
How Does a New DARPA Program Start?
New Programs
– Must result in or point to a new military capability
– Must be about removing a technological barrier, not a policy barrier
   • Problem must be “DARPA-hard”; typically a 10x improvement
   • Barrier to capability must be primarily technical, not policy
– Must start from a specific new immature technology idea or ideas
   • Specific = must be identified at the program approval phase
   • New = typically based on work that is < 5 years old
[Diagram: “The Particle Accelerator”: OLD vs. NEW models for moving a 6.1 tech idea from the technology base through DARPA tech offices, systems offices, and program offices to a 6.4 prototype]
The Heilmeier Catechism

What are you trying to do? Articulate your objectives using absolutely no jargon.
– Example: “take anthrax off the table as a threat to our forces”
– What is the new military capability that your technology could provide?

How is it done today, and what are the limits of current practice?
– Why is this specifically a technology problem?

What’s new in your approach, and why do you think it will be successful?
– All software is Turing-equivalent, so software methodology is usually not relevant
– What is your argument/analysis that a 10x difference in a technology will result in a new capability?

Who cares? If you are successful, what difference will it make?
– Who is the customer for the new idea, and what evidence do you have that any transition will be successful?

What are the risks and the payoffs? How much will it cost? How long will it take?

What are the midterm and final exams to check for success?
– Metrics and experimentation plans must be defined up front
Other Program Questions
What is DARPA’s Transition Strategy? How does a new capability transfer to a Service or Agency?
Gold: DARPA work leads to a direct acquisition
Silver: DARPA work leads to a direct maturation effort by a DoD PEO
Bronze: DARPA work leads to a new capability that a contractor will try to sell back to DoD
Tin: DARPA work leads to a better state of the world
Is there an MOU / MOA and funding in the POM?
Why is this different from other DARPA and DoD programs?
What are our metrics for measuring our progress?
– Always difficult for software; exceptionally difficult for architectures

What are the phases of the Program?
– Phase I is typically 12-18 months
– Phase II funding is contingent on meeting specific agreed-upon Phase I milestones
Program Creation Basics
DARPA PM finds new technology idea(s) and links it to a capability

Seedling funding to explore the idea and create a program brief
– Typically $200K-$300K / 4-6 months / 1-3 contractors
– Solidify program argument, financials, milestones, phases, metrics, experimentation strategy, and program deliverables/transitions/MOUs
– Seedling output is the new-start brief, not jumpstart technology

Brief to DARPA Director
– Repeat a few times

Solicitation construction and publication

Source selection (and possible plan revision)
– Multiple contractors, teams, areas of expertise

Contracts awarded via an agent
Program Phase I with milestones
DARPA Director brief for go/no-go
Program Phase II with milestones
Sample Program: Dynamic User Interfaces
MAIN OBJECTIVE
– Replace current mass-produced, general-purpose UIs with task-sensitive, user-specific interfaces
– Customize each user’s I/O with the data sources
– User interacts with the web at the problem level
– User does not have to master all the data sources and algorithms that are available
– UI is automatically built for each user’s unique cognitive/perceptual talents, training, experience, and current problem context

TECHNICAL APPROACH
– Use situation theory to quantify the information content of a UI
– Decompose the user’s info tasks into UI task specifications
– Leverage OWL to create a tractable logic language that can express analytic tasks and data semantics
– Apply constraint-based solvers, CBRs, and other planning technologies to yield task-specific UI specs
– Map UI tasks onto available graphical elements
– Build a semantically characterized set of UI graphical elements by using OWL/S and SWRL
– Use a planner/shape grammar and machine learning to derive the UI layout for a user’s individual profile
– Dynamically create the new UI on the user’s desktop

EXPECTED IMPACT
– Allow UIs to better support independent hypothesis generation and unconventional concept exploration
– Faster and higher-quality analytic output
– Embrace individual styles and competencies
– Tune core UI planners to allow rapid confirmation or disconfirmation of different uncommon hypotheses
– Increase user satisfaction
– Increased agility in response to new missions
– Restructure planners and interfaces on the fly to handle new information requirements and data sources
– Interaction with the DBs structured around user task requirements, not data structures
– Late binding of UIs relative to the individual user, task, and problem context allows for rapid learning and evolution of interface paradigms

[Diagram: NGA Task SME → Cognitive Task Analyst → Interface Designer → Computable Interface Spec → Interface Generator → NGA Analyst’s interface; the Interface Generator uses machine learning and reasons over domain and task semantics to dynamically adapt the interface, which is automatically tailored for each user and adapts to the user’s context]
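The step of mapping UI task specifications onto semantically characterized graphical elements could be sketched as capability matching: each element declares what it can express, and the generator selects elements covering the task’s requirements. The element names and capability tags below are hypothetical, invented for illustration:

```python
# Hypothetical sketch of matching a UI task spec against semantically
# characterized graphical elements, in the spirit of the approach above.
ELEMENTS = {
    "line_chart":  {"compare", "trend", "time_series"},
    "scatterplot": {"compare", "correlation"},
    "data_table":  {"lookup", "exact_values"},
}

def elements_for(task_requirements):
    """Return every element whose declared capabilities cover all requirements."""
    return sorted(
        name for name, caps in ELEMENTS.items()
        if task_requirements <= caps  # set inclusion: requirements ⊆ capabilities
    )

print(elements_for({"compare", "time_series"}))  # ['line_chart']
```

In a full system the capability sets would be OWL class descriptions and the matcher a constraint solver or planner, but the selection problem has this same shape.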
How Is DARPA Different?
Lightweight and nimble organizational model
– “120 PMs with a common travel agent”
– Currently organized into 8 tech offices plus the Director
– Technology: DSO, MTO, IPTO; Systems: ATO, TTO, IXO, SPO, J-UCAS
– Offices come and go fairly frequently, and the tech/systems boundary is fluid
– No institutional incentives to collaborate
– No technical interdependencies
– No dedicated facilities beyond simple office space in Arlington, VA
– http://www.darpa.mil has programs, solicitations, lists, areas of interest

4-year personnel rotation policy embedded in the culture
– No institutional biases
– No empire building

Always looking for new Program Managers with great technical ideas
– PMs come from academia, industry, government, military
– Must be a US citizen with the ability to hold a clearance
– Must be willing to work incredibly hard, travel extensively, and have a national-scale vision
– You will be changed by the experience, and you might change the world
Come Join Us!