
Page 1: De liddo & Buckingham Shum jurix2012

A Collective Intelligence Tool for Evidence Based Policy & Practice

Anna De Liddo & Simon Buckingham Shum Knowledge Media Institute

The Open University, Milton Keynes, MK7 6AA, UK

Jurix 2012 Workshop on

Argumentation Technology for Policy Deliberations

http://evidence-hub.net/

Page 2: De liddo & Buckingham Shum jurix2012

[Overview diagram] Human-Centred Computing for CCI (Argumentation-based Collective Intelligence):

•  Computational Services/Analytics: NLP, XIP, discourse analysis; structured searches/agents

•  UX design: Cohere/free semantics; EH/simplified model; widget/threaded interface

•  Human Dynamics of Engagement

•  Policy: OLnet, Ed; Research: RCYP Hub; Practice: CoPHV; Future

Page 3: De liddo & Buckingham Shum jurix2012

Collective intelligence is an umbrella term for the augmented capabilities that the existence of a community can enable. It refers to the intelligence that emerges from the coexistence of multiple people in the same environment, which can be either a real-world or a virtual one. We look at the Web, and at what intelligent hints, supports or behaviours can be tracked, and can emerge, from the coexistence of a collective of people online.

Collective Intelligence: How do we Crowdsource Policy Deliberation?

Page 4: De liddo & Buckingham Shum jurix2012

Successful CI tools have been developed, especially in the IT business and e-commerce sectors, that use these user "traces" to build user profiles and suggest actions based on them. These CI tools support fairly simple user objectives, such as deciding which book to buy (Amazon), finding the picture or video that matches a need (Flickr and YouTube), or choosing what music to listen to (Last.fm). Collecting fragmented user traces works well for collecting and exploiting CI in the business and commerce sector.

Collective Intelligence

Page 5: De liddo & Buckingham Shum jurix2012

On the other hand, if we look at the social and political sector, or at higher-level organizational and business strategy issues, we need to support more complex user goals, such as: understanding policy actions; learning environmental responses in order to adapt organizational actions; or understanding the economic crisis and its possible implications for the community.

Contested Collective Intelligence

CI tools that aim to support users in more complex knowledge work need to be conceived and designed so that users' collective intelligence can be captured and shared in a much richer and more explicit way.

Page 6: De liddo & Buckingham Shum jurix2012

In the design space of CI systems:

•  where there is insufficient data to confidently compute an answer,

•  when there is ambiguity about the trustworthiness of environmental signals,

•  and uncertainty about the impact of actions,

…then a more powerful scaffolding for thinking and discourse is required, in order to support the emergence of CI around complex socio-political dilemmas.

Contested Collective Intelligence

Page 7: De liddo & Buckingham Shum jurix2012

With Cohere users can make their thinking visible and sharable with online communities by:

✓ collaboratively annotating the Web,
✓ leveraging lists of annotations into meaningful knowledge maps, and
✓ engaging in structured online discussions.

First Prototype Tool for Contested Collective Intelligence (CCI)

Page 8: De liddo & Buckingham Shum jurix2012

Cohere Conceptual Model

Cohere builds on a conceptual model which consists of four main user activities through which users can make their thinking visible and contribute to the development of Collective Intelligence around specific issues:

Page 9: De liddo & Buckingham Shum jurix2012

Collaboratively Annotate Web Resources

Page 10: De liddo & Buckingham Shum jurix2012

Make Semantic Connections

Page 11: De liddo & Buckingham Shum jurix2012

Watch the demo video at: http://www.youtube.com/watch?v=Fcn2ab9PYo4 Watch the Open Deliberation model video at: http://www.youtube.com/watch?v=vthygbKA2Mg

Explore, Filter and Make Sense

Page 12: De liddo & Buckingham Shum jurix2012

Despite the success of the web annotation paradigm, people seem to struggle to make semantic connections; moreover, too many semantic options often produce redundancy and duplication.

This led to the second design iteration:

A new simplified data model and a new interface for connection making…

Page 13: De liddo & Buckingham Shum jurix2012

The issue: lack of evidence of OER effectiveness

Experimenting with CCI in a real case of Educational Policy: The OLnet project

olnet.org

Page 14: De liddo & Buckingham Shum jurix2012

OLnet project: The wider research question

RQ: How can we help researchers and practitioners in the OER field to contribute evidence of OER effectiveness and to investigate this evidence collaboratively?

Page 15: De liddo & Buckingham Shum jurix2012

Our approach to CI focuses on capturing the hidden knowledge of the OER movement and leveraging it so that it can be:

✓ debated (building and confronting arguments),
✓ evaluated (assessing evidence), and
✓ put to use (distilling claims that inform OER policy and practice)

Approach: Contested Collective Intelligence

Page 16: De liddo & Buckingham Shum jurix2012

What & Why: The Evidence Hub provides:

✓ the OER community with a space to harvest the evidence of OER effectiveness, and
✓ policy makers with a community-generated knowledge base for making evidence-based decisions on Educational Policy.

Page 17: De liddo & Buckingham Shum jurix2012

Social Ecosystem
•  People (Contributors)
•  Projects
•  Organizations

Discourse Ecosystem
•  Key challenges
•  Issues
•  Solutions
•  Claims
•  Evidence
•  Resources

The Evidence Hub: Mapping the social and discourse ecosystem


Themes  
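The two ecosystems can be sketched as a tiny data model. This is a hypothetical Python illustration; the class and field names are our own invention, not the Evidence Hub's actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Discourse element types from the simplified model (illustrative)
NODE_TYPES = {"Key Challenge", "Issue", "Solution", "Claim", "Evidence", "Resource"}

@dataclass
class Node:
    """One discourse element (issue, solution, claim, evidence, ...)."""
    node_type: str
    title: str
    themes: List[str] = field(default_factory=list)        # OER themes ("Theming")
    location: Optional[Tuple[float, float]] = None         # geo-location ("Locating")

@dataclass
class Connection:
    """A semantic link between two discourse nodes, e.g. Evidence -> Claim."""
    source: Node
    label: str       # e.g. "supports", "refutes"
    target: Node

claim = Node("Claim", "OER improves learner retention")
evidence = Node("Evidence", "Completion statistics from an open course")
link = Connection(evidence, "supports", claim)
```

Connections carry the argumentative semantics, so the evidence-to-claim relationship is explicit rather than buried in free text.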

Page 18: De liddo & Buckingham Shum jurix2012

Social Ecosystems (Org and Projects)


Page 24: De liddo & Buckingham Shum jurix2012

The Discourse Ecosystem Elements: A Simplified Data Model


Page 31: De liddo & Buckingham Shum jurix2012

Evidence Types: From simple data, anecdotes and stories to literature analysis and experimental results

Page 32: De liddo & Buckingham Shum jurix2012

CI: Community Actions

Social Ecosystem: People (Contributors), Projects, Organizations
Discourse Ecosystem: Key challenges, Issues, Solutions, Claims, Evidence, Resources

•  Locating: adding geo-location info

•  Following: expressing interest

•  Connecting: adding to widget

•  Promoting: ordering/prioritizing lists; voting connections

•  Theming: adding OER themes

Page 38: De liddo & Buckingham Shum jurix2012

A simplified UI for connection making….

Page 39: De liddo & Buckingham Shum jurix2012

Widget Interface for Connection Making


Page 41: De liddo & Buckingham Shum jurix2012

The Evidence Hub: Some Facts and Figures

The Evidence Hub alpha version launched in April 2011, with 50 users from 35 different countries, including key OER people.

Page 42: De liddo & Buckingham Shum jurix2012


Some Facts and Figures

Opened to the public at OpenEd11 in Utah.

Page 43: De liddo & Buckingham Shum jurix2012

Some facts and figures: Engagement


- 108 contributors,

- received 3,054 visits from 1,053 unique visitors from 57 different countries

Page 44: De liddo & Buckingham Shum jurix2012

Some facts and figures on Content


- 304 OER projects and organizations
- 129 OER research claims
- 79 OER issues
- 89 proposed solutions
- 323 pieces of evidence
- 553 resources

Page 45: De liddo & Buckingham Shum jurix2012

Reflection on initial User Testing & Interviews


Feedback from users shows that the EH is perceived as:

"relevant", "organized", "desirable" and "engaging",

but sometimes "sophisticated" and "complex".

Directions for improvement:

•  creating summary views,
•  facilitating and simplifying content seeding,
•  better displays and filters on the content.

Page 46: De liddo & Buckingham Shum jurix2012

Feedback from Lab-Based User Testing

Fragmented approach to argument construction: widget interface

•  Easy to contribute to, but
•  increases miscategorization: interpretation biases on how content should be labeled under specific argumentation categories;
•  increases duplication of content;
•  decreases argumentation coherence.

This led to the third design iteration…

Page 47: De liddo & Buckingham Shum jurix2012

Third design Iteration: The CoPHV Evidence Hub

Page 48: De liddo & Buckingham Shum jurix2012

Research by Children and Young People Evidence Hub: A mixed threaded/widget interface

Page 49: De liddo & Buckingham Shum jurix2012
Page 50: De liddo & Buckingham Shum jurix2012

Collective Intelligence

Development Trajectories: Facilitating content seeding

1) Web Annotation to support seeding: an Evidence Hub bookmarklet lets people capture evidence by highlighting and annotating free web resources and OERs.

Page 51: De liddo & Buckingham Shum jurix2012

2) Combining Human and Machine Annotation: The Hewlett Grant Reports Project

RESULTS

template

report

XIP-annotated report

De Liddo, A., Sándor, Á. and Buckingham Shum, S. (2012). Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work (CSCW) Journal, 21(4), pp. 417-448.

Page 52: De liddo & Buckingham Shum jurix2012

Discourse analysis with the Xerox Incremental Parser

Detection of salient sentences based on rhetorical markers:

BACKGROUND KNOWLEDGE: "Recent studies indicate …"; "… the previously proposed …"; "… is universally accepted …"

NOVELTY: "… new insights provide direct evidence …"; "we suggest a new … approach"; "… results define a novel role …"

OPEN QUESTION: "… little is known …"; "… role … has been elusive"; "Current data is insufficient …"

GENERALIZING: "… emerging as a promising approach"; "Our understanding … has grown exponentially"; "… growing recognition of the importance …"

CONTRASTING IDEAS: "… unorthodox view resolves … paradoxes"; "In contrast with previous hypotheses …"; "… inconsistent with past findings …"

SIGNIFICANCE: "studies … have provided important advances"; "Knowledge … is crucial for … understanding"; "valuable information … from studies"

SURPRISE: "We have recently observed … surprisingly"; "We have identified … unusual"; "The recent discovery … suggests intriguing roles"

SUMMARIZING: "The goal of this study …"; "Here, we show …"; "Altogether, our results … indicate"
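A crude approximation of marker-based salient-sentence detection can be sketched with regular expressions. The lexicon below is a hypothetical toy subset covering three categories; XIP itself performs rich incremental linguistic analysis, not flat pattern matching:

```python
import re

# Hypothetical toy lexicon, loosely paraphrasing three rhetorical
# categories; the patterns and category names are illustrative only.
MARKERS = {
    "NOVELTY": [r"\bnew insights?\b", r"\bnovel\b", r"\bwe suggest a new\b"],
    "OPEN_QUESTION": [r"\blittle is known\b", r"\bhas been elusive\b",
                      r"\bdata is insufficient\b"],
    "SUMMARIZING": [r"\bthe goal of this study\b", r"\bhere,? we show\b",
                    r"\baltogether\b"],
}

def classify(sentence: str) -> list:
    """Return the rhetorical categories whose markers match the sentence."""
    return [category for category, patterns in MARKERS.items()
            if any(re.search(p, sentence, re.IGNORECASE) for p in patterns)]
```

A sentence with no matching markers is simply not salient under this sketch, which mirrors how marker-based filtering discards most of a report's text.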

Page 53: De liddo & Buckingham Shum jurix2012

PROBLEM_CONTRAST: "First, we discovered that there is no empirically based understanding of the challenges of using OER in K-12 settings."

XIP annotations to Cohere

Report Node → Annotation Node

•  Named entities extracted by XIP convert into Tags

•  CONTRAST converts into the semantic connection label: "describes contrasting ideas in"

•  PROBLEM converts into the annotation node icon: "Issue" = light bulb
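The mapping just described can be sketched as a small conversion function. The dictionaries and output fields here are illustrative assumptions, not Cohere's actual import format:

```python
# Illustrative sketch of the XIP -> Cohere mapping; field names are
# our own assumptions, not the real pipeline's data structures.
LABEL_MAP = {"CONTRAST": "describes contrasting ideas in"}
ICON_MAP = {"PROBLEM": "Issue"}  # rendered in Cohere as a light-bulb icon

def to_cohere(xip_tags, entities, sentence):
    """Convert one XIP-annotated sentence into an annotation-node dict."""
    node = {
        "text": sentence,
        "tags": list(entities),       # named entities become Cohere tags
        "icon": None,                 # set from PROBLEM-style tags
        "connection_label": None,     # set from CONTRAST-style tags
    }
    for tag in xip_tags:
        node["icon"] = ICON_MAP.get(tag, node["icon"])
        node["connection_label"] = LABEL_MAP.get(tag, node["connection_label"])
    return node

node = to_cohere(["PROBLEM", "CONTRAST"], ["OER", "K-12"],
                 "First, we discovered that there is no empirically based "
                 "understanding of the challenges of using OER in K-12 settings.")
```

A compound tag like PROBLEM_CONTRAST thus contributes both an icon and a connection label to the resulting annotation node.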


Page 54: De liddo & Buckingham Shum jurix2012

Human annotation and machine annotation

1. ~19 vs. 22 sentences annotated; 11 sentences coincide with the human annotation, plus 2 consecutive sentences of human annotation.

2. 71 vs. 59 sentences annotated; 42 sentences coincide with the human annotation.

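Agreement figures like those above can be computed by treating each annotation run as a set of sentence indices. A minimal sketch, with a function name and toy data that are our own rather than the study's:

```python
def annotation_overlap(human: set, machine: set) -> dict:
    """Compare human- and machine-annotated sentence index sets."""
    agreement = human & machine
    return {
        "human": len(human),
        "machine": len(machine),
        "agreement": len(agreement),
        # share of machine-selected sentences that a human also selected
        "precision": len(agreement) / len(machine) if machine else 0.0,
    }

# Toy data (not the study's): sentences identified by index in a report
stats = annotation_overlap(human={1, 4, 7, 9}, machine={1, 4, 5, 9, 12})
```

This set-based view ignores near-misses such as adjacent sentences, which the study counts separately.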

Page 55: De liddo & Buckingham Shum jurix2012

3) Collaborative PDF annotation


Future developments could power the Evidence Hub with PDF annotation, so that users can share evidence of policy arguments and impact while working directly with PDFs.

A high percentage of policy reports and documents are in PDF format: we have a concept demo of direct PDF annotation shared back to Cohere.

Steve Pettifer, Utopia: http://getutopia.com/

Page 56: De liddo & Buckingham Shum jurix2012

Collective Intelligence Development Trajectories:

4) Better visualization and filtering of content
5) Discourse Analytics

When social and discourse elements become too many in number and complexity, how can we make sense of them?

Toward CI visualization and analysis: adding more formal logics to evaluate arguments, and developing discourse analytics to create summaries, identify gaps, localize interests, and focus contributions.

Page 57: De liddo & Buckingham Shum jurix2012


Discourse Network Visualization

Page 58: De liddo & Buckingham Shum jurix2012

Social Network Visualization

Page 59: De liddo & Buckingham Shum jurix2012

Theoretical questions for future work

•  How to evaluate arguments? Automatic mechanisms (based on argument computation) vs. community-led mechanisms (such as voting and reputation systems)

•  How to make optimal use of both human and machine annotation & argumentation skills?
  –  How to exploit machine consistency while reducing information overload and noise?
  –  How to exploit the unique human capacities to abstract, filter for relevance, etc.?

•  How to cope with visual complexity (new search interfaces, focused and structured network searches, collective filtering, identifying argument structures)?

•  How do we crowdsource Policy Deliberation? What is the right interface? What is the architecture of participation?

Page 60: De liddo & Buckingham Shum jurix2012

References

•  De Liddo, A., Sándor, Á. and Buckingham Shum, S. (2012). Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work (CSCW) Journal, 21(4), pp. 417-448.

•  Buckingham Shum, Simon (2008). Cohere: Towards Web 2.0 Argumentation. In: Proc. COMMA'08: 2nd International Conference on Computational Models of Argument, 28-30 May 2008, Toulouse, France. Available at: http://oro.open.ac.uk/10421/

•  De Liddo, Anna and Buckingham Shum, Simon (2010). Cohere: A prototype for contested collective intelligence. In: ACM Computer Supported Cooperative Work (CSCW 2010) - Workshop: Collective Intelligence in Organizations - Toward a Research Agenda, February 6-10, 2010, Savannah, Georgia, USA. Available at: http://oro.open.ac.uk/19554/

•  Buckingham Shum, Simon and De Liddo, Anna (2010). Collective intelligence for OER sustainability. In: OpenED2010: Seventh Annual Open Education Conference, 2-4 Nov 2010, Barcelona, Spain. Available at: http://oro.open.ac.uk/23352/

•  De Liddo, Anna (2010). From open content to open thinking. In: World Conference on Educational Multimedia, Hypermedia and Telecommunications (Ed-Media 2010), 29 Jun, Toronto, Canada. Available at: http://oro.open.ac.uk/22283/

•  De Liddo, Anna and Alevizou, Panagiota (2010). A method and tool to support the analysis and enhance the understanding of peer-to-peer learning experiences. In: OpenED2010: Seventh Annual Open Education Conference, 2-4 Nov 2010, Barcelona, Spain. Available at: http://oro.open.ac.uk/23392/

•  Buckingham Shum, Simon (2007). Hypermedia Discourse: Contesting networks of ideas and arguments. In: Priss, U.; Polovina, S. and Hill, R. (eds.) Conceptual Structures: Knowledge Architectures for Smart Applications. Berlin: Springer, pp. 29-44.

Thanks for Your Attention! Anna De Liddo

[email protected]

http://people.kmi.open.ac.uk/anna/