
UNESCO Expert Meeting on Aspects of Literacy Assessment


Aspects of Literacy Assessment

Topics and Issues from the UNESCO Expert Meeting

(ED-2005/WS/23) 10-12 June 2003, Paris


Preface

Literacy is a central concern for UNESCO and is a key element of the fundamental right to education. The importance of literacy has been confirmed through successive international conferences, the most recent ones being: the World Conference on Education for All (Jomtien, 1990), CONFINTEA V (Hamburg, 1997) and the Dakar Framework for Action (Dakar, 2000).

However, the present situation of literacy in the world is of great concern. This serious challenge has been recognized in the Dakar Framework, which aims at achieving a 50% improvement in levels of adult literacy by 2015. By the same token, the importance of quality and excellence of education has been recognized, so that measurable learning outcomes are achieved by all, in particular in literacy, numeracy and life skills.

The UN Literacy Decade, which was launched in February 2003 in New York, gives new impetus to the goal of literacy for all. The Literacy Decade adopts a new approach and vision of literacy, which is no longer seen as a single concept, but rather as 'literacies', referring to multiple types and levels.

Given the present situation, urgent action is needed to ensure that the Dakar goals will be reached. Action must be taken by countries, the international community, civil society and development partners. The aim of this meeting was to open the debate on many important issues and to lead to concrete action where it is most needed. This meeting, which is firmly embedded in the UN Literacy Decade, is a starting point, although it stems from previous initiatives, and UNESCO's intention is to continue the work already started.

The meeting addressed two crucial issues: the elaboration of a general understanding of the renewed vision of literacy, and that of literacy assessment.

There is a perceived need for better statistical data on literacy leading to a deeper understanding of literacy acquisition and practice. A considerable amount of research and effort has already been undertaken by a number of institutions to develop literacy assessment methodologies which lead to reliable data. A number of methodologies have been developed and tested, and national surveys have been conducted. These methodologies served as starting points for the discussions of the meeting.

This meeting's aim was to develop a conceptual framework and an operational definition of literacy for literacy assessment, and to contribute to a major literacy assessment programme, the Literacy Assessment and Monitoring Programme (LAMP), initiated by the UNESCO Institute for Statistics (UIS) in 2003.

This document deals in particular with the deliberations of the meeting on literacy assessment. It is expected that, based on its outcomes, further work in this domain will be undertaken.

Section for Literacy and Non-Formal Education
Division of Basic Education

UNESCO

Paris, June 2005

Acknowledgements

This document is the outcome of the expert meeting on aspects of literacy assessment, which was jointly organised by the Division of Basic Education, the UNESCO Institute for Education and the UNESCO Institute for Statistics and held from 10 to 12 June 2003 at UNESCO Paris. The document, which has been written by Clinton Robinson, consultant to UNESCO, summarizes the various topics and issues discussed during the meeting and also introduces some reflections and discussions that are presently taking place on literacy and literacy assessment, in order to give a broader perspective to this paper. One of the outcomes of the meeting is a proposed operational definition of literacy for literacy assessment, which is an excellent starting point for further work in this area.

We would like to thank in particular the author for his outstanding work, and Margarete Sachs-Israel for the editorial coordination of this document, as well as the international experts who made this document possible, namely Ms. Amina Ibrahim, Mr. Ahmed Oyinlola, Mr. Laouali Moussa, Mr. Abdulwahid Abdulla Yousif, Mr. R. Govinda, Mr. Ilapavuluri Subbarao, Mr. Ochirkhuyag Gankhuyag, Ms. Isabel Infante, Ms. Sophia Valdivielso Gomez, Mr. Siddiqur Rahman, Ms. Chilisa Bagele, Mr. Alan Rogers, Mr. Patrick Werquin, Mr. Scott Murray, Mr. Irwin Kirsch, Mr. David Archer, Mr. Jean-Pierre Jeantheau, Mr. Kazi Rafiqul Alam, Ms. Vera Maria Masagao Ribeiro, and Mr. Chander Daswani.

Section for Literacy and Non-Formal Education
Division of Basic Education

Paris, June 2005


PREFACE

ACKNOWLEDGEMENTS

INTRODUCTION

BACKGROUND AND PURPOSE

    NATURE OF THIS PAPER

RENEWED VISION OF LITERACY

    LITERACY IN USE

DEFINITIONS OF LITERACY

ASSESSMENT

    WHAT ARE THE PURPOSES OF ASSESSMENT?

    WHAT IS TO BE ASSESSED?

        Assessing literacy skills and competencies
        Assessing literacy use
        Assessing the impact of literacy
        What can be compared?

    HOW WILL ASSESSMENT BE CARRIED OUT?

        Developing national assessment processes
        Selecting assessment methods

DOMAINS

    DETERMINING DOMAINS

    LITERATE ENVIRONMENT: AN ALTERNATIVE CONCEPT

LEVELS

THE ROAD AHEAD

REFERENCES

ANNEX: LIST OF PARTICIPANTS

Introduction

Literacy assessment is a critical component of monitoring educational progress. In adopting the Education for All (EFA) goals and the Millennium Development Goals (MDGs), individual countries and the international community recognise the need for consistent data based on clear indicators and reliable methods. It is well known that literacy rates currently draw on a variety of indicators, many of them serving as proxies for levels of actual literacy use. This situation is in need of urgent attention, given the renewed emphasis on enhancing literacy learning opportunities for the world's 800+ million adults who at present have no direct access to the world of written communication.

This situation led UNESCO to convene an Expert Meeting on Literacy at UNESCO Headquarters in Paris, 10-12 June 2003, to look at the way forward. The meeting focused on an examination of UNESCO's draft Literacy Position Paper and on literacy assessment - the latter being the focus of the current paper. The meeting was timely in that it responded to a growing international consensus that little progress could be made in our knowledge of the use, distribution, acquisition and programming of adult literacy without significantly improved assessment processes. This consensus had found expression in a number of international statements and declarations.

Literacy as a learning tool is a cross-cutting theme of the six Dakar EFA goals, and is a specific component of three of them; the MDGs do not include literacy as a goal, but see literacy rates as an indicator of effective and quality primary schooling and of the elimination of gender disparities in education. Meaningful use of these MDG indicators, as well as knowledge of how well the world is implementing the EFA goals, will depend on literacy assessment processes which are transparent, useable, reliable and adaptable to a wide variety of contexts.

For this reason, the Dakar Framework for Action calls for work on ‘establishing targets and indicators’ and states that ‘robust and reliable education statistics, disaggregated and based on accurate census data, are essential if progress is to be properly measured, experience shared and lessons learned’ (UNESCO 2000: 21). The international Strategy to put the Dakar Framework for Action into operation (UNESCO 2002a) identified literacy as one of the areas of EFA where better indicators were needed. With particular reference to literacy, the 1997 Hamburg Declaration and Agenda for the Future of CONFINTEA V linked monitoring and evaluation with the improvement of the quality of literacy programmes; this would be achieved


...by designing an international programme for the development of literacy monitoring and evaluation systems and feedback systems that promote local input and participation by the community in the improvement of the programme at the international, regional and national levels, and by establishing a worldwide information base for promoting policies and management and for improving the quality, efficiency and sustainability of such efforts. (UNESCO 1997: 31)

The emphasis on local and contextualised approaches raises questions of the purposes, content and methods of literacy assessment which are crucial to the debate and which will find echoes later in this analysis. It is in the planning of the UN Literacy Decade that we find the most explicit concern expressed regarding the need for better ways of assessing literacy:

For the success of the Literacy for All Programme, it is necessary to build functional monitoring information systems across various programmes and various levels (institutional/sub-national/national/international). The systems should be designed to provide reliable and meaningful information on the status of literacy among the population, on the uses and impact of literacy and on the performance and effectiveness of literacy programmes. Relevant actions proposed are:

a) Refine literacy indicators and methodologies to enable countries to systematically collect and disseminate more and better information, with particular attention to providing information on gender gaps;

b) Promote widespread and better use of population data, for example through demographic censuses and surveys, in monitoring literacy status, use and impact among the population;

c) Develop cost-effective methods for assessing literacy levels of individuals for use in literacy surveys, as well as in the regular evaluation of learning outcomes at the programme level;

d) Build information systems to support policies and management of non-formal education among agencies, programmes, learners and educators;

e) Establish long-term tracking systems of new literates for studying the impact of literacy on the quality of life.

(UN 2002: 8)

This focus on the methods and information systems of literacy assessment leaves open the question of what is to be assessed in different contexts and whether the results will be comparable within and between countries. The EFA Global Monitoring Report 2002 drew attention to this, as well as to the question of how far local (i.e. sub-national) realities should shape assessment processes:


At the national and international levels, [assessment] implies the identification of typical literacy scales, practices and criteria, and ways and means to apply them in national contexts and local environments. [...] Today most countries use a 'functional definition' of literacy in national assessments - one that captures the ability of people to use literacy to carry out everyday tasks. However, these common tasks vary according to local context, culture and requirements, which may not make the results strictly comparable between one place and another. (UNESCO 2002b: 66)

The following year’s edition of the same report (EFA GMR 2003/4) notes, once again, the lack of reliable data on literacy, bemoaning the dubious validity of literacy rates that are

...based on self-proclaimed literacy, or on the assumption that an individual is literate when he or she has completed a certain number of years of basic education. From various school surveys it can be concluded that this assumption is too optimistic. (UNESCO 2003b: 86)

The report therefore calls for 'genuine literacy data, based on direct assessment and supported by information on the "educational history" of the individual', and refers to the Literacy Assessment and Monitoring Programme (LAMP) initiative as a step in this direction - LAMP is a focus of discussion throughout this paper.

These references to international meetings and reports serve to underline the importance and attention accorded to the need for real change in literacy assessment towards more solid, comparable, reliable and transparent methods. They also highlight some of the issues - definition, context, comparability, indicators - which the Expert Meeting was convened to address.


Background and Purpose

The specific objectives of the meeting were twofold:

• Drafting a general conceptual framework for the development of an assessment methodology for literacy, in particular in the context of developing countries, thus contributing to the Literacy Assessment and Monitoring Programme (LAMP);

• Reaching consensus on an operational definition, including levels and domains of literacy to be assessed in a given socio-cultural context, on which literacy assessment methodologies can be based.

LAMP - the Literacy Assessment and Monitoring Programme - is an initiative of the UNESCO Institute for Statistics in cooperation with UNESCO Headquarters, UIE, the World Bank and other international and technical partners. Its scope and shape were presented at the meeting in order that participants' insights and contributions might give further input. There is therefore reference to LAMP throughout this paper, with the understanding that steps have already been taken to implement a pilot phase in four countries.

The Expert Meeting thus took place as an important event within a broad and ongoing international debate about the nature of literacy, its links to development, its place within practices of communication, as well as how to assess it. Participation in the meeting reflected the range of stakeholders in these debates: academics and researchers, civil society and NGO representatives, national educational planners and administrators, representatives of UNESCO institutes and headquarters. The first part of the meeting considered the draft of the UNESCO Literacy Position Paper, and this provided particular input for discussion of an operational definition of literacy. In its turn, the Position Paper situates itself in the maelstrom of ideas surrounding the redefinition of literacy as social practice - the first time UNESCO has sought to grapple with the implications of such a concept. Thus the conceptual undercurrents of the discussion of assessment were strongly marked by what was for UNESCO as an institution a new way of conceptualising literacy.

NATURE OF THIS PAPER

This paper draws on the input and discussions of the UNESCO Expert Meeting on Literacy, particularly its second theme of assessment. However, it is not strictly a report of that meeting. While remaining faithful to the underlying principles expressed at that event, and presenting the essential elements of the debate and the points of view of participants, it draws out some of the implications of the discussions and elaborates on a number of concepts which, for lack of time, could not be discussed fully during the meeting.


In this paper we use literacy, in the singular, as a generic concept which implies the plural notion of literacies. Thus the use of the singular should not be taken to mean that literacy takes a single form, nor that it can be acquired or used in one particular way. It implies literacy-in-context, multiple purposes and manifestations, multiple languages and ways of learning - in fact a multi-dimensional concept. This is in line with the sense of the Expert Meeting, which endorsed an understanding of the multi-dimensional nature of literacy, but wished to maintain the singular use of the term 'literacy' in order to preserve some continuity with past practice. How far 'literacies' and 'multi-dimensional literacy' convey divergent notions will be explored in a preliminary way below.


Renewed Vision of Literacy

The UN Literacy Decade clearly states the need for a renewed vision of literacy if real progress is to be made in enabling the excluded to gain access to the means of written communication. It calls for such a vision to go beyond earlier conceptions of literacy, echoing the message of Jomtien that an expanded vision of basic education must be the basis for Education for All. The renewed vision of literacy...

...situates Literacy for All at the heart of Education for All. Literacy is central to all levels of education, through all delivery modes - formal, non-formal and informal. Literacy for All encompasses the educational needs of all human beings in all settings and contexts, in the North and the South, the urban and the rural, those in school and those out of school, adults and children, boys and girls, and men and women. (UN 2002: 4)

Literacy must be seen as a tool for communication and for learning, not as a technique or skill valuable for its own sake. Literacy has no meaning apart from what it enables communities and individuals to do better. This vision goes beyond a mere re-assertion of the essential need for literacy, or a response to the scandalously large numbers without access to literacy, or a commitment to re-invigorate tired programmes. It calls for a re-connection of literacy with the rest of life and a recognition that literacy practices are embedded in social structures, political processes, personal circumstances, economic opportunity and globalising influences. In this sense it espouses a plural and multidimensional view of literacy. Thus the vision for literacy is about promoting relevant and meaningful learning for social transformation, justice, and personal and collective freedom. It is part of a broad debate about learning, education, diversity, development and power. These multiple connections shape the value and use of literacy and so influence how it is acquired on the ground. It is not so much a matter of promoting literacy as of offering literacy as one means of communication and learning in specific contexts and places.

There is a consensus in the international community that poverty alleviation should be the leitmotiv of development efforts and cooperation. Just as poverty alleviation must address all aspects of life, and not just economic levels, so literacy is intersectoral and must connect with people’s goals, values, aims, aspirations, challenges, difficulties, hopes and fears, as well as with the physical, social, cultural and political realities of the local and broader contexts.


LITERACY IN USE

Literacy is not in itself liberating - that depends on the way it is acquired and used, aspects that are socially determined. Literacy may be a means of domination, for example when it is taught to promote particular ideologies or where new readers are served a diet of propaganda. More subtly, literacy promotion often serves to socialise learners into the dominant social discourse, rather than opening up new opportunities of expression and creative diversity. While the theory of literacy has moved to a social view of literacy, policies of literacy promotion have lagged far behind. Many official policies continue to deliver literacy as a functional and standardised skill, with little attention to differences of social context and scant regard for dominating or liberating effects.

In the Expert Meeting, this broader and socially embedded view of literacy was further linked to citizenship, identity, the use of indigenous languages and knowledge, and the promotion of equity (gender, minorities) and human rights. Improving the quality of literacy learning opportunities figured significantly in the suggestions for action: through enhancing delivery (training of literacy personnel, more relevant learning processes, languages and materials), through building stronger connections with fields of literacy use (health, justice, urban and rural development), and through better feedback, monitoring and evaluation systems.¹ As the outlook broadened to take in the whole social canvas into which the literacy thread was woven, the influence of context, purpose, language, ideology and other parameters became clear and compelling. It was no longer possible to speak of a singular literacy - the uses and practices of literacy were clearly plural. The communication landscape consists, as far as the written dimension is concerned, of literacies.

The distinctiveness of different kinds of literacies derives from the multiplicity of their connections with the social contexts in which they are used. Thus literacy, for example, in the rural nomadic and pastoral communities of northern Kenya has a different shape and meaning from that of employees in government tax offices in Nairobi - the languages, purposes, medium, and mode of acquisition may all differ. There is nevertheless a dynamic continuity between the two: first, because the two social groups may need to interact in some way through written communication, and second, because a member of the northern Kenyan community today may tomorrow become a government employee in Nairobi, reshaping their literacy practices. Literacies are dynamic and inter-related, even as they are observably different. The concept and practice of "literacies" are in constant and dynamic evolution, with new perspectives reflecting societal change, globalising influences on language, culture and identity, and the growth of electronic communication.

In opting for a multi-dimensional concept of literacy the Expert Meeting accepted and endorsed the notion that literacy is plural. The meeting also agreed, however, that it preferred to use the expression 'literacy: a plural notion' rather than the term 'literacies', as the way for UNESCO to articulate its position on literacy.² What significance may be attributed to the reluctance to use the term 'literacies'? Two considerations appeared to be pertinent in the participants' deliberations:

1. See also UNESCO 2003a.
2. See UNESCO 2004 [UNESCO Education Position Paper].


• First, terminological unease: the leap to the use of the rather unusual plural of an English abstract noun indicated, in the view of some, rather too radical a discontinuity with previous UNESCO and international usage, outside of academic circles;

• Second, conceptual reserve: the newness of the idea, coupled with an uncertainty as to its full possible implications, caused some to prefer a qualified version of the familiar 'literacy'.

This hesitancy, however, did not detract from UNESCO's and the participants' recognition that a monolithic and universal view of literacy belongs to an outmoded discourse.


Definitions of Literacy

The last fifty years have seen a constant succession of attempts to define literacy, with full engagement of UNESCO in that process. International conferences on education, lifelong learning and adult education have made particularly strong contributions.³ These attempts signal the desire, on the part of some, to come up with a universal definition of literacy, as well as the difficulty in doing so. The reasons for seeking a single definition are rarely spelled out - they appear to relate to the need for those involved in promoting literacy both to define their own goals and approaches, and to establish a consensus around which active cooperation can be built.

The Expert Meeting did not wish to launch yet another general or universal definition of literacy, but rather to agree on an operational definition of literacy for assessment purposes - a definition therefore which would serve as a basis for the elaboration of indicators, domains and levels in assessment in different contexts. Thus the definition was to serve as no more than a starting point for application to assessment design - agreement on it would enable assessment results to be compared internationally.

It is worth noting the reasons why the search for a universal definition is so difficult, although the reasons themselves beg the question of what literacy is, in a rather circular way:

• A plural or multidimensional understanding means that one definition will never be valid or applicable across the multiple literacies to be found in a community, or even within the repertoire of an individual;

• Literacy is a language-based activity and thus is shaped by and affects the range of other events and practices which are language-based - oral communication, multilingualism, dialogue, interaction and relationships, argument and discussion, propaganda and polemics, poetry, artistic creation and wordplay;

• The multiple and close connections of literacy use with other learning outcomes mean that a single definition is never adequate to express its impact - for instance, literacy links with lifeskills, critical thinking, community participation and political voice, and could be defined in relation to any or all of such impacts (and of course others as well);

• The process of acquiring, teaching or organizing the provision of literacy is complex - for instance, involving both the individual and the social, a wide variety of disciplines in intervention, and cooperation among many institutions;

• Literacy is an evolving concept, whose nature and uses constantly change and adapt to new technologies, new circumstances and new demands; in this sense not just societies, but also individuals, manifest and use an evolving range of literacies.

3. See Yousif 2003.


Literacy is thus a multi-dimensional concept which defies neat and brief definition. Some in the Expert Meeting called for a more radical recognition of this characteristic, positing the concept of 'integral literacy'. This notion seeks to encompass both the individual and social aspects, together with change over time and embeddedness in, and sensitivity to, context. Its dimensions may be outlined as follows:

Interior dimensions:

• Intentional and cultural
• Subjective and inter-subjective knowledge
• Construction of the individual and collective identity
• Emotional literacy and cultural literacy
• Communicative learning
• Learning to be
• Learning to live together

Exterior dimensions:

• Behavioural and social
• Objective knowledge
• Transmission of objective knowledge
• Functional literacy
• Instrumental learning
• Learning to do
• Learning to know how

Source: Valdivielso Gomez 2003.

Discussion of these notions led to asking how international assessment can capture this complex and multidimensional reality. At this stage of our knowledge there is no single instrument which can reflect such complexity, but rather multiple approaches and tools which assess selected dimensions of literacy. These perspectives are not 'integral' in the sense of the above model, but rather seek to make a whole out of a part. According to this scheme, any definition of 'integral' literacy should be all-encompassing and integrate all the dimensions shown in the chart.

In working towards a definition of literacy for the purposes of assessment a number of specific questions were raised at the Expert Meeting, of which the following were prominent in the presentations and debate:

• How broad should the definition be? Should it include forms of communication, such as the spoken word, which are concomitant upon literacy and without which literacy cannot have its maximum impact? Is literacy 'learning to participate in communicative events'?⁴

• What place should context have in defining literacy? Literacy cannot be defined in the abstract, but only within the cultural and national context, which is linked to languages, scripts, and socio-cultural realities. Thus, definitions of literacy will differ according to national contexts. If literacy is a moving target, if it is relative and contextual, is a common definition possible, and, crucially for assessment purposes, is comparison of literacy levels between countries possible?

4. See Infante 2003.


• Is numeracy to be included in literacy? If so, can it be assumed that assessments result in comparative literacy rates that mean something in terms of levels of numeracy also?

• Is it valid to maintain the literate / non-literate dichotomy? Can assessment move away from this towards seeing literacy as a continuum which does not therefore put people into negative categories and label them with such a stigma?

• How far should the results of literacy use (lifeskills, active citizenship, political voice, ...) be evoked in the definition, compared to an emphasis on abilities or skills? How true is it to say: 'Literacy is what literacy does'?

• Can the definition be broad enough to allow for assessment in in-school and out-of-school contexts?

The division of people into 'literate' and 'illiterate' has often been part of the discussion of the definition of literacy: one of the reasons to define literacy was to find out how many had not got it. The categorisation of people by what they lacked has increasingly been seen as unacceptable, quite apart from the difficulty of trying to determine what 'being illiterate' actually means. Too often it has been used as a portmanteau term to include a whole range of elements deemed to be missing from those who do not or cannot participate in a so-called 'modern' society (see also 'Levels' section below). Stigmatised 'illiterates' were considered to be on the margins of society and passive consumers of culture which they had no part in shaping. Greater awareness of the nature of societies with a strong oral tradition, as well as an emphasis on human rights, has moved the debate away from a 'literate/illiterate' dichotomy to an understanding of literacy as a continuum of communication. This too has resulted in moving definitions of literacy towards a focus on how it is used, rather than merely as an individual skill.

Initial discussions of LAMP suggested that the definition used by the International Adult Literacy Survey (IALS) should serve as a starting point. The IALS defines literacy as:

The ability to understand and employ printed information in daily activities, at home, at work and in the community - to achieve one's goals, and to develop one's knowledge and potential. (IALS n.d.: 4)

It is worth noting that the IALS also recognised the links which literacy has with other domains of individual and social action - couching these not in terms of what literacy is, but what it does:

Literacy skills yield many benefits. For individuals, literacy contributes to personal development through improved participation in society and the labour market. Literacy also contributes to the economic and social performance of society. It is a necessary ingredient for citizenship and community participation, and shapes the labour force of a country, through higher participation rates, higher skill composition and lower chances of unemployment. (IALS n.d.: 4)


The IALS thus reflects an understanding of literacy as a multi-faceted and complex phenomenon, rather than as a condition which people do or do not have, to be assessed by a single measure. Other suggestions in the Expert Meeting explicitly linked the definition of literacy to its broader impact, with echoes of Freire's process of 'transforming the world' and conscientisation:⁵

Literacy [...] refers to the set of essential learning tools comprising of knowledge, values and skills that enable the learner to better understand her or his environment and transform it to improve the quality of life, both individually and collectively. (Subba Rao 2003)

On the basis of its definition and in order to measure proficiency levels, the IALS set up three different kinds of literacy domain: prose, document and quantitative literacy. In each case the IALS presented the definition in terms of the ‘knowledge and skills required’ to undertake the literacy activities. In further definition of the criteria of assessment IALS details what ‘the reader’ should be able to do within each domain. This discourse sets the survey firmly in the framework of an autonomous view of literacy, where the focus is on individual competencies rather than socially contextualised practices, and where literacy is conceived essentially as consuming text (reading), rather than producing it (writing, keyboarding). The domains themselves are defined as what already exists in the environment which ‘the reader’ may have to cope with, such as books, forms, magazines and so on. There is little sense of who or what lies behind the production of these texts, or what kinds of literacy practices led to their creation. In other words, the patterns of institutional and social literacy behaviour are assumed, and unexplained. Literacy of certain kinds is assumed, in this framework, to be part of the way (Western) society necessarily functions - alternative or critical literacies, creative literacies, personal literacies, or literacies which may challenge the existing power relations between institutions and the individual were not part of the survey. Thus the IALS definition raises more questions about the nature of literacy than it answers - if it is taken as a general definition. However, it is more appropriate to see it as an example of a contextualised definition, given the settings in which IALS was carried out and its purposes. It was undertaken in OECD countries with the purpose of understanding more fully the links between literacy levels and individual participation in the market economy. The IALS definition illustrates the need to define literacy with the purposes and intended outcomes of the assessment in mind.

These issues raise the question of how far literacy is defined as 'functional', a term which has been used over the years to indicate everything from development information to vocational skills to literacy for social purposes. Within the debate on the plural nature of literacy, the Expert Meeting raised questions as to what constitutes functional and dysfunctional literacy. This distinction was seen to relate to the participation or non-participation of certain groups in society - whether they have voice or not. It was also suggested that functional literacy is not related to skills that people have or do not have, but to the capacity to access and learn new literacies when they need to. Functionality asks the question: can a person process and use the written materials to be found in her or his community? On this basis the ultimate conclusion is that literacy can only be defined, in functional terms, by individuals in their own circumstances - that, however, would be to minimise the importance of the social context.

5. Cf. Freire 1972.

In discussion, a number of key elements were identified as essential to an operational definition. Literacy will be understood as having the following characteristics:

• Understood in the framework of communication, as one strategy among others;

• Related to text and the written word;

• Including the manipulation of numbers - numeracy;

• Giving importance to context;

• Implying some ability, skill or knowledge;

• Concerned with use in relation to life goals;

• Linked to participation in society;

• Multi-dimensional, with connections among all of these characteristics.

Based on these elements a proposed operational definition was agreed - it is hoped that those designing assessments in particular contexts will avail themselves of this definition as a starting point for elaborating indicators and methods which lead to internationally comparable results (see box).


Assessment

WHAT ARE THE PURPOSES OF ASSESSMENT?

Everyone agrees that there is an urgent need for more reliable and more accurate data on literacy, and it is clear that the current unsatisfactory situation derives in large measure from the use of inadequate or inappropriate assessment methods. However, the choice of methods depends on what the purposes of the assessment are.

The question of 'why' was linked strongly to 'who' - who is the assessment carried out for, and what are their purposes? This relates closely to levels, for example:

• For purposes of comparison, international agencies look for ways of generating comparable statistics across countries;

• National governments may look for literacy assessment data as input into policy and programme planning;

• Provincial authorities, education departments and NGOs may want data for planning or improving the delivery of literacy at community or programme level;

• Instructors/teachers may want to know what is happening at group/classroom level in order to reflect on and improve educational practice;

• Assessment also serves learners themselves, providing feedback - in some cases learners actively desire evaluation and certification.

These actors have a variety of uses for assessment data:

• To set priorities for resource allocation at national or international level;

• To survey achievement and the effectiveness of delivery systems;

• To inform and improve literacy acquisition methodologies;

• To identify neglected or hard-to-reach population groups and provide a basis for innovative programmes;

• To illuminate areas for concerted action and cooperative ventures;

• To strengthen the movement for literacy and provide a basis and incentive for good practice;

• To give feedback to learners and raise their levels of confidence;

• To motivate literacy workers, showing them that they are able to make a difference.


Assessment processes and data often serve as means of establishing accountability - and thus increased trust - between government, civil society, and funding agencies. It is clear that different actors have very different purposes for assessment, and yet often the same instruments are expected to produce data for some or all of these purposes. This is problematic and probably cannot be done. A strong caveat must therefore be stated: no single method or package of methods will be able to satisfy all the purposes of assessment, and no such claims or expectations should be entertained for a particular survey.

Indeed, this problem is rendered even more complex by the conflicting nature of these different purposes. It may be of little more than academic interest to learners or instructors to know how an aggregated literacy rate for the whole country relates to that of countries in other continents. What is more pressing is to know what sorts of literacy will give access to broader opportunities in the local environment and other environments which people wish to access. Thus an understanding of how literacy is used and of the literacy demands in the context will be more useful than being able to rank learners on a national/international scale. In practical terms this may lead to competing claims on the resources available nationally for literacy. It may interest the government to be able to present statistics which are internationally comparable, giving it a measure of its relative progress and its place in the international league tables of development. Resources devoted to those purposes may not then be available for assessing the contextual fit of literacy programmes to local situations or for the support of local organisations. Governments and their partners have to make difficult choices - the main concern is that they should do so in full awareness of the purposes that a certain kind of assessment will and will not achieve.

The goal of the IALS was to "create comparable literacy profiles across national, linguistic and cultural boundaries", with special reference to the links, if any, between levels of literacy and economic potential, with respect both to individuals and to societies as a whole. The survey concludes that there is a "strong association between a country's literacy skills and its economic performance", while not claiming direct causality. If this is so, then it is in a country's economic interest to know in which direction literacy is heading; this may not, however, be the same reason why an individual or an NGO wishes to assess uses of literacy. An individual may be concerned about broader participation in the literate environment, including personal communication, cultural expression or religious devotion. Similarly, an NGO programme which includes literacy may aim as much at enabling ongoing learning through using literacy as one of the tools, as at developing self-confidence and solidarity through participation in a learning event. In the latter cases literacy will not be measured in terms of skills or competencies, but in terms of participation in social life, the capacity to take initiative, or the growth and vitality of community organisations.

Again, the importance of context emerges, now as a factor in determining the purposes of literacy assessment. This context is not only the social environment, existing literacy practices and institutional behaviour, but also the motivations, personal and community-wide, which lie behind a desire to acquire and use literacy. These considerations have more to do with using assessment as a tool for evaluation and planning at the programme level, than with a national assessment for the purposes of international comparison. However, they mean that any attempt to organize assessment at a national level will find the picture increasingly complex the closer it comes to the level of community and learner. Further, a national literacy assessment which relies on the cooperation of local agents, such as NGOs, must first ascertain how far there is agreement on what is to be assessed.

Another factor in defining the purposes of assessment is the policy context: how can assessment of literacy and numeracy serve as a basis for policy formulation at national level, or more widely at a regional or international level? The Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) took this as a starting point, defining its assessment process out of a concern to collect and analyse data which would be maximally useful as policy input. Before the research was designed, consultations with decision-makers in education ministries identified high-priority policy concerns which would define and set limits to the data, methods and analysis of the assessment.

LAMP sets the purposes of assessment at two levels: national and international. At national level, literacy data will serve to inform policy-making and programme design, while at international level they will be part of educational monitoring, particularly of the EFA goals, and will give input into policy-making in multilateral institutions and funding agencies. Two further goals are related to the LAMP process: the elaboration of new and better methodologies for literacy assessment, and the building of capacity in their use.

WHAT IS TO BE ASSESSED?

If it is possible to approach the definition of literacy in a number of ways, as discussed earlier, it is no surprise that there are questions about precisely what is to be assessed. Three large areas of literacy assessment were distinguished by the Expert Meeting, although there is overlap between them: literacy skills and competencies, literacy use, and the impact of literacy.

Assessing literacy skills and competencies

Current views of literacy are at pains to demonstrate that literacy is socially situated and should not be construed solely as an individual skill or competency. While the Expert Meeting developed a consensus around the need to understand literacy as social practice and to use these perceptions in assessment design, the participants nevertheless maintained a focus also on the fact that the effective use of literacy requires some skill acquisition. The proposed operational definition of literacy for assessment purposes demonstrates this by its use of the word ‘ability’: ‘Literacy is the ability to identify, understand, . . .’

As will be clear from the later discussion of assessment domains and levels, the notion of skills and competencies is prominent in designing assessment methods and tests. In fact, more attention is given to this kind of test than to assessing the use and impact of literacy. This is true both of the existing schemes and experiences and of the proposed LAMP approaches.

To discover how far individuals have mastered the techniques of literacy serves its own purposes, among which may be the improvement of acquisition processes based on psycho-linguistic insights. However, it must be clear what such assessment does and does not tell us. If the communication functions of literacy are in focus, a purely technical assessment of decoding and encoding skills will reveal nothing about what literacy can do for people: how it is of advantage to them, how it enhances their daily lives or changes their relationships with individuals and institutions.

Some testing of skills and competencies focuses on the completion of certain levels of tasks - reading a simple story, understanding a procedural manual, filling in a form, expressing one's own ideas, for instance. This goes some way towards bringing skills together with use, enabling an assessment of how far people are able to participate in the literate environment, and in the wider society, through written communication. The interpretation of the results of such assessments may maintain a focus on technical competence by highlighting the gradations of language (sentence length, accuracy of punctuation, range of vocabulary), or examine the capacity to use skills in terms of what is achieved. Some examples of the latter may be: successful application to an institution through filling in a form, regular contact with distant relatives by writing letters/e-mails, participating in meetings through reading/writing minutes, and so on. What will make the difference in focus is the extent to which the broader social context of literacy use is taken into account, as the next section shows.
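To make the idea of 'gradations of language' concrete, the following minimal sketch computes a few crude surface metrics from a writing sample. It is an illustration only, not an instrument from the meeting or from any survey discussed here; the metric names, the sample text and the choice of measures are all invented for this example.

```python
import re

def language_gradations(text: str) -> dict:
    """Crude surface metrics of a writing sample (illustrative only)."""
    # Split on sentence-final punctuation; keep non-empty fragments.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    internal_punct = re.findall(r"[,;:]", text)
    return {
        "sentences": len(sentences),
        "mean_sentence_length": len(words) / max(len(sentences), 1),
        # Type-token ratio as a rough proxy for range of vocabulary.
        "vocabulary_range": len(set(words)) / max(len(words), 1),
        "internal_punctuation_per_sentence": len(internal_punct) / max(len(sentences), 1),
    }

sample = ("I went to the market. I bought rice, oil and salt. "
          "The prices were higher than last month.")
print(language_gradations(sample))
```

Such measures capture only technical competence; they say nothing, by themselves, about what the writer achieves with the text.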

Assessing literacy use

The Expert Meeting's proposed operational definition of literacy included the words 'interpret, create, communicate and compute', thus signalling that it is what people do with literacy that is going to count - it is their use of literacy, therefore, which must be assessed.⁶

The first is that new measures need to be found which focus on use, rather than merely counting those who have succeeded in or attended school or adult literacy programmes. In examining the growth of literacy in Europe from the eighteenth to the twentieth centuries, Vincent (2000) made extensive use of the growth in the use of postal services. Other indicators of literacy use in today’s world might include newspaper and magazine publishing and consumption, book publication, computer ownership, e-mail and phone texting volumes. As these reveal different kinds of literacy, they may provide a basis for the development of additional useful indicators.
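As a sketch of how proxy indicators of this kind might be combined into a single measure of literacy use, the snippet below min-max normalises a handful of proxies and averages them. Every figure, indicator name and range is hypothetical; a real indicator would require careful sourcing, validation and a substantive decision about weights.

```python
proxies = {
    # indicator: (country_value, plausible_min, plausible_max) - invented numbers
    "newspapers_per_1000": (120, 0, 600),
    "books_published_per_100k": (8, 0, 100),
    "mail_items_per_capita": (14, 0, 300),
    "sms_per_capita_per_month": (45, 0, 200),
}

def normalise(value: float, lo: float, hi: float) -> float:
    """Min-max scale a raw value onto [0, 1]."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

# Equal weighting is the simplest choice; since each proxy reflects a
# different kind of literacy, real weights would be a policy decision.
index = sum(normalise(v, lo, hi) for v, lo, hi in proxies.values()) / len(proxies)
print(f"Composite literacy-use index: {index:.2f}")
```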

If literacies are defined by their use in context, then assessing how literate a community has become will differ from one community to another. Literacy means different things in different places and to different people. Locally relevant assessment procedures will give the best picture of the growing use of literacy, but their context-specific nature will make the results less meaningful and more difficult to generalise across provinces/states, let alone arrive at national estimates. At an individual level, the processes of reading and writing involve a perception of the relationship between text and context - one aim of literacy being to enable learners to reflect critically on this dynamic and understand the power relations underlying literacy use in society.

6. The dimensions of literacy use are spelled out in the section above: Literacy in Use.

Communication practices, both oral and written, personal and institutional, are part of the local context, and so they should be taken into account in assessing literacy use. There are few models of how this can be done, beyond an emphasis on ethnographic methods. Part of the communication context is the pattern of dialogue which pertains between institutions and the individual, particularly important in developing countries where agencies, governmental or non-governmental, intervene directly at community level and thus have an active interface with the population. Assessing the use of literacy would involve asking questions such as:

• What is the relative importance given to oral and written communication?

• What is the level of communicative access by the population to the institutional leaders and decision-makers?

• Is communication one-way or two-way?

• How open are the structures and processes of the institution to external input?

Answers to these questions will have a significant impact on the range and quality of literacy use in communities where development institutions exercise considerable influence or wield significant resources.

The way in which literacies are acquired may also be a factor in deciding how to assess use. Frequently, school-based literacy is taken as the norm - witness assessments which use the number of years of schooling as a measure of literacy achievement. People acquire literacy in many other ways - in family circles, as part of daily interpersonal relationships, or in religious contexts. Literacies in non-dominant languages have often been ignored or downplayed in assessment, giving priority to enumerating only those who could be considered literate in a dominant, official or educational language. Assessing all kinds of literacy practices will be important to obtain a clear picture of national and sub-national patterns of use, even if it makes international comparison more complex and demanding.

The example of a programme from Nepal (Community Literacy Project) demonstrates a structured approach to the promotion and assessment of the uses of literacy, rather than a focus on skills or competencies as such. The programme identified three areas in which people would engage to enhance their own development:

• Group literacy processes, such as obtaining a birth/marriage/death certificate or taking the minutes of meetings;


• Social action movements, here related to forestry management;

• Individual literacy practices.

Participants in the project could 'enter' at any one of these areas, adding others as they felt the need. Literacy use in communication was the focus, not the acquisition of the skill, although people ended up learning and acquiring literacy as part of it. The point was not to focus on acquisition opportunities and delivery mechanisms as such, but to embed those processes in the contexts which demanded literacy in this community. Thus the question of a Nepali parliamentarian - "How many people have you made literate?" - could not be directly answered. The project could however point to an increased number of community legal scribes, increased participation in social areas requiring literacy use, and so on. The project had its own logical framework and reported formally, though not by stating numbers of literates. Because it focused on use, it was not seen as a literacy project, but rather as 'post-literacy'. It provides an example of designing and assessing literacy from the perspective of use.

Assessing the impact of literacy

For purposes of social analysis there are reasons to make an assessment of the impact of literacy, in terms of how far the acquisition and use of literacy gives access to opportunities and privileges. This was indeed one of the purposes of the IALS survey - seeking to understand how literacy levels relate to individual economic opportunities in the industrialised countries. In developing countries, correlations have been observed between literacy levels and fertility rates, infant and maternal mortality, poverty alleviation, and other indicators. For women particularly, the effects of literacy acquisition and use appear to facilitate other positive impacts in their lives and circumstances. Impact assessments of this kind are crucial to understanding where literacy as written communication fits in the broader scheme of social conditions and development.

However, care must be taken with such correlations, since there are many other factors outside of literacy which determine social and economic opportunities. There is a danger of using literacy levels as a proxy for other phenomena, such as socio-economic level or participation in society. In developing countries where literacy use is less widespread it is easy to assume that literacy opens doors for individuals and communities because of a superficial correlation between literacy and, for example, level of income. While these connections and correlations should be explored, the data must be sufficiently ‘thick’ to explain the interrelationships between a number of variables, including but by no means limited to literacy. As discussed below, these concerns argue for the inclusion of ethnographic methodologies in the design of literacy assessment.

Assessment may look at skills, use or impact - these imply the investigation of quite different phenomena, as the following chart exemplifies. These differences underline the need to select appropriate methods to assess different aspects of literacy, and to be clear, first, about what is meant by literacy and, second, about which phenomena are understood to be significant in providing an assessment of its state:


Table XX: A contrastive view: examples of skills, use, and impact

Skills:
• decoding (reading)
• encoding (writing)
• counting
• calculating
• using punctuation

Use:
• understanding instructions
• filling in forms
• extracting information
• writing down ideas
• keeping accounts

Impact:
• social participation
• political voice
• economic opportunity
• access to institutions and networks

■ What can be compared?

One of the principal aims of LAMP is to provide data for educational monitoring, particularly of EFA goals, as input into policy-making in multilateral institutions and funding agencies. The aim is to achieve tables of literacy rates, such as those in the EFA Global Monitoring Report, which are more reliable, which mean something in terms of the national reality they represent (i.e. are more than guesses, proxies or estimates) and which enable a sensible comparison for the purposes of allocating resources and effort.

This implies data which are comparable across countries and regions of the world. Given that this is the purpose, LAMP has a strong focus on facilitating standardised testing, where a basic set of literacy levels (q.v.) can be assessed in different contexts, allowing for some adaptation to local languages and circumstances. The extent and nature of these adaptations will emerge in the pilot projects. At this point, some considerations are already agreed, such as:

• What is being tested should relate to basic competencies and their use;

• Item development should be based on the socio-economic and cultural context of the country;

• Actual results of tests should contain messages on what to do to improve results.

However, we may ask the question: what will ultimately be compared? There seem to be two possibilities on the basis of the proposed approaches and levels envisaged by LAMP:

• If competencies and skills are measured in a standardised way, it will permit a ranking of countries along a scale which is as identical as possible. This will show, for instance, that X percent of the population of country A are at level 4, while Y percent in country B have achieved that level. It will not indicate, however, what such achievement means in countries A and B. It is quite possible, because of differing contexts, that level 4 in one country confers more or fewer opportunities and social advantages than in another. Nevertheless, the comparison is technically sound, because it is made on the same basis.

• If competencies and skills are measured with regard to their use in context, it will permit a comparison of the extent to which literacy enhances opportunity. However, the scales would be adjusted for the context in terms of literacy demands, the literate environment and so on. Thus countries A and B reaching a comparable level using standardised tests might demonstrate divergent scores on locally determined scales; level 4 (on the standardised scale) in country A may give greater opportunities than in country B, which would therefore have a lower literacy rate in terms of what literacy does for people. The basis for assessment would be decided at national level, and international comparisons would need to make local parameters explicit.

It will be important to lay out the implications of choosing methodologies and approaches so that countries may determine exactly what they wish literacy assessment to show and in what ways the resulting data will be comparable with those of other countries, if the latter is a concern.
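The mechanics of the first option can be made concrete with a minimal sketch, in Python, of how a country-by-level table is derived. The scores and score bands below are invented for illustration (the bands loosely echo an IALS-style 0-500 scale); LAMP's actual cut-offs were not fixed at the meeting.

    from collections import Counter

    # Hypothetical score bands on a shared 0-500 scale; illustrative only.
    BANDS = [(0, 225, 1), (226, 275, 2), (276, 325, 3), (326, 375, 4), (376, 500, 5)]

    def level(score):
        """Map a standardised score to a level using the same bands everywhere."""
        for lo, hi, lvl in BANDS:
            if lo <= score <= hi:
                return lvl
        raise ValueError(f"score {score} outside the 0-500 scale")

    def distribution(scores):
        """Percentage of sampled respondents at each level."""
        counts = Counter(level(s) for s in scores)
        return {lvl: round(100 * counts.get(lvl, 0) / len(scores), 1)
                for lvl in range(1, 6)}

    # Identical bands for both countries make the comparison technically sound,
    # even if level 4 confers different opportunities in each context.
    print(distribution([210, 250, 310, 340, 390, 280, 300]))   # 'country A'
    print(distribution([190, 240, 260, 330, 350, 270, 230]))   # 'country B'

The second option would keep the same machinery but replace the shared bands with nationally determined ones, which is precisely why the local parameters would then need to be made explicit.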

■ HOW WILL ASSESSMENT BE CARRIED OUT?

The processes and methods of assessment relate to issues of purpose and content; as emphasised earlier, no single assessment method will be able to serve all the purposes of assessment. This section will look at two considerations which are important in building the framework for assessment - it will not propose specific methods in terms of instruments or techniques. The two areas are the development of sustainable and high-quality processes at national level, and the balance between qualitative and quantitative approaches.

Developing national assessment processes

One of the key messages of the Dakar World Education Forum in 2000 was the emphasis on strengthening national ownership of EFA. This is crucial in literacy assessment, since it is important that national governments and civil society believe and accept the results of assessment and use them for further planning, priority-setting and programme implementation. However, the results of assessment will only be accepted when the methods and processes involved are deemed to be valid, transparent and trustworthy. The SACMEQ process demonstrated this in a positive way, with more countries joining in once they saw that the assessment was able to capture the realities on the ground in a systematic, rational and open way (Saito 1999).

In order to foster a high level of ownership, one suggestion at the Expert Meeting was to set the development of assessment processes at national level in the context of a more general reflection on the nature and purposes of literacy in that country. This might involve the following:


• Reflecting critically on past practice and concepts of literacy;

• Developing a new national understanding of literacy/‘literacies’;

• Recognising that literacy is a continuum;

• Considering the practical uses of literacy;

• Examining and understanding the structure of the literate environment;

• Re-positioning literacy within a framework of broader communication practices;

• Developing rigorous assessment methods on the basis of these considerations.

As the international community gives support for developing methods and undertaking pilot projects in a number of countries, the processes by which the methods will be generalised to other countries must be addressed, so that national assessments can be sustained in the future and in an ever-increasing number of countries. This will involve consideration of the following, at least:

• How the assessment framework will first be introduced into a country - to whom? Government, NGOs, academics, etc.? And how? E.g. in a workshop - lasting how long?

• How the framework will be completed/adapted in each country - for example, in a workshop? With whom? Over how long? Adding what sort of contextual details to the framework? How will this space be used to encourage wider reflection/analysis of literacy? What are the non-negotiables if comparability is still to be ensured? What will be done in situations where there is a contradiction between national ownership and statistical rigour?

• How capacity building will be carried out in the use of the nationally adapted framework - how many people will be trained, and for how long? What prior background would they need? In the SACMEQ assessment, emphasis was laid on building capacity through a ‘learning by doing’ approach - this has fostered the adoption of common approaches across countries of the Southern Africa region and therefore improved the comparability of results.

• How long will data collection take?

• How will national-level analysis of data be done: by whom? Over how long? With what help?

• How national publication will be ensured, and how national debate/reflection will follow on from publication. What might be the connections to changes in policy and practice?

• How sustained capacity will be ensured in each country so that the process can be repeated five or ten years later.

• How these national-level publications will be consolidated into an international publication that can provide meaningful comparative results across countries.


• How countries beyond the first pilot countries will go through all these processes without external UIS/LAMP support - or will there always be dependency?

Selecting assessment methods

Literacy statistics have often been based on self-reported data, where questions have been put in censuses and other surveys asking about the literacy competence or use of the respondent and her or his household. At its most basic, the question may be: do you consider yourself to be literate? Other forms of the question may address the amount of printed material in the household, when the respondent last read or wrote something, or how many people in the household have completed primary education. Surveys such as these are open to all kinds of bias, particularly from over-reporting the level of literacy competence in the household. Moreover, the understanding of what ‘being literate’ means is unlikely to be uniform across households or individuals.

Current efforts are therefore based on a concern to use a more objective basis of assessment, primarily by testing actual competence among a sample of the population. There is broad consensus around the need to find new methods which are more reliable, more accurate and which measure in more objective ways. There is less agreement on what should be measured, as noted earlier, with regard to the relative importance of measuring competencies/skills and use. There is even less agreement on how to carry out this measurement.

In general, literacy assessment requires both quantitative and qualitative methods. The aim of quantitative measurement is to establish standard levels of literacy, at least for a particular context, if not for a whole country or, eventually, for the whole world. Qualitative data, on the other hand, focus on the use of literacy in context - what it enables people to do, how it affects their lives and how literacy use connects with other aspects of the social fabric.

Statistical data are given a prominent place in representing social phenomena in all parts of the world. Numbers are often seen as neutral, objective and authoritative, and their validity is rarely questioned. The representation of a social phenomenon by a number enables further independent manipulation of the numbers, detached entirely from the parameters of the context. Psychometric testing (e.g. item response theory) is one approach which gives prominence to a statistical output; it tends to treat learners’ competencies, such as literacy, as uni-dimensional, capturing little of the complexity involved. Questions arise as to what a decontextualised number can really tell us. Could a description - be it of literacy use, learner performance or a national situation - often be a more adequate way of representing progress and change?
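The uni-dimensionality can be made concrete. In the simplest item response model - the one-parameter logistic, or Rasch, model - each respondent is reduced to a single ability number and each item to a single difficulty number, whatever the real-world complexity of literacy. A minimal sketch in Python (the parameter values are invented for illustration):

    import math

    def p_correct(theta, b):
        """Rasch (one-parameter logistic) model: probability that a respondent
        of ability theta answers an item of difficulty b correctly. Literacy is
        collapsed into the single dimension theta."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    # A respondent of average ability (theta = 0) facing three items of
    # increasing difficulty: the model's output is a context-free number.
    for b in (-1.0, 0.0, 1.0):
        print(f"item difficulty {b:+.1f}: P(correct) = {p_correct(0.0, b):.2f}")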

This leads to the question of what qualitative methods might be employed. Much work which documents literacy as social practice adopts ethnographic methods which generate ‘thick data’ - exploring and documenting the multiple facets and connections of literacy with communication practices, social relations, practical outcomes and the institutional landscape. With its emphasis on the socio-cultural context an ethnographic approach:


• Helps get right the ‘construct’ which is to be assessed, by providing input into its conceptualisation;

• Helps make the ‘construct’ useful to those who are being assessed;

• Generates ownership of change, by giving attention to context;

• Enables proposals to build on what people know, since ethnography discovers this;

• Helps understand the nature of the differences between contexts and thus how to transform them (and judge the impact of literacy within that);

• Facilitates grounded comparison, again by giving attention to context;

• Starts from practice - how communication (including literacy) is patterned on the ground - whereas assessment usually starts with setting standards (e.g. levels of literacy);

• Improves quality control mechanisms (rigour, validity), by making explicit the assumptions underlying analysis and interpretation.

At the Uppingham Seminar (UK) on literacy assessment, participants discussed the distinction between ethnographic and statistical approaches. The view was expressed that this may not be the essential dividing line; nor is the divide the one between natural science and social science. Rather, the distinction is between:

• positivism - the belief that the world is there, simply ready to be observed;

• universalism, the belief that the categories with which one looks at the world are universally applicable.

This relates to the distinction between a ‘model of’ and a ‘model for’:

• A ‘model of’ is an analysis of phenomena, shaped by a conceptual framework - the stuff of research;

• A ‘model for’ relates to policy and the search for change and improvement.

Educational modelling is almost always a ‘model for’, as policy-makers look for ideals. It was clear that ‘models for’ (policy) are best based on ‘models of’ (research and analysis), although this is not always the case. When the two become detached from each other, policy comes to be based on other agendas (political, social, ...) as a way of ignoring complexity or avoiding risk and uncertainty.

This same meeting proposed adding an ethnographic component to the LAMP process, by documenting one of the LAMP experiences from an ethnographic point of view, particularly in order to understand how far dominant and non-dominant literacies/issues are captured or missed. This may be a helpful way to evaluate the combined use of quantitative and qualitative assessment methods as a basis for further methodological development.


Ethnographic methods could provide case studies of literacy use and of literate environments as a supplement to the quantitative data which LAMP will generate. Taken together, these two sources of data will not only generate further useful questions about the place and impact of literacy in particular contexts, but also allow a careful analysis of how far each method leads to similar or different conclusions.

One of the fundamental aims of LAMP is to develop a better methodology for assessing literacy in developing countries. This will be a gradual process, starting with experimental work in a small number of countries, but leading over time to instruments and a process which can become the standard survey method for gathering data on literacy across the world. This echoes the goal, regional rather than international, of the SACMEQ process which set out to use state-of-the-art methodologies and tools, involving the creation or adaptation of computer-based methods of entering, cleaning, storing and analysing data. SACMEQ progress so far indicates useful advances in these methods, enabling more reliable and accurate capture and processing of educational data in a developing country context. While the methods were elaborated essentially for use in assessing formal schooling, performance in literacy was a key focus of interest. In view of the effectiveness of SACMEQ’s tools, it is important to consider how far they may be applicable to the goals of literacy assessment outside the school system.
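The data-cleaning step referred to can be sketched minimally in Python. The field names and valid ranges below are invented for illustration; SACMEQ's actual software and validation rules are not reproduced here.

    # Hypothetical validation rules; flagged records would be re-checked
    # against the original paper forms rather than silently entering the analysis.
    VALID_RANGES = {"age": (6, 18), "reading_score": (0, 59), "sex": (1, 2)}

    def flag_suspect_values(records):
        """Return (record index, field, value) for every missing or out-of-range value."""
        flagged = []
        for i, rec in enumerate(records):
            for field, (lo, hi) in VALID_RANGES.items():
                value = rec.get(field)
                if value is None or not lo <= value <= hi:
                    flagged.append((i, field, value))
        return flagged

    records = [
        {"age": 12, "reading_score": 45, "sex": 1},
        {"age": 42, "reading_score": 61, "sex": 1},   # both values suspect
    ]
    print(flag_suspect_values(records))   # [(1, 'age', 42), (1, 'reading_score', 61)]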

As LAMP draws on previous surveys in the design of its methodologies and instruments, the cultural context in which they were elaborated must be taken into account. This may particularly apply in using the IALS experience, which was designed for an industrialised context and made assumptions about how comparable the different national contexts were. The LAMP pilot projects should demonstrate what adjustments need to be made for different contexts and cultures, with particular attention to environments where social research is less common, such as rural contexts. The very act of asking a question, or of writing down people’s responses, can have different meanings and imply widely different purposes and relationships according to local cultural norms and practices. Without sensitivity to these issues, the results of surveys may be skewed or even invalidated.

If an item-response methodology is adopted for assessment surveys, as suggested by LAMP, the question arises as to which areas of social life the items should be drawn from. Some participants in the Expert Meeting felt that items should be drawn in preference from those areas which literacy learners are most likely to encounter, for example the mass media.

A further choice lies between self-assessment and external assessment. Learners can assess themselves if they wish to know whether they can meet the literacy needs of their environment, but self-assessment cannot serve decontextualised national or international comparison. This leads back to the purposes of assessment.

Among other methods, the next section explores the notion of the literate environment as an alternative assessment tool - one which is context-sensitive and can assess production as well as consumption of written communication.


As soon as attempts are made to assess literacy, it becomes clear that literacy is used in many circumstances, for many purposes, in many forms and in many different environments. If such variation is not taken into account, an assessment of literacy risks being of little use, unable to indicate how the use of literacy relates to other aspects of life. After all, literacy is only as valuable as the uses to which it can be put.

As indicated earlier, these insights have contributed to the plural notion of literacies, which participants in the Expert Meeting encouraged UNESCO to take forward as a multi-dimensional view of literacy. The idea of ‘domains’ seeks to capture this notion in terms of assessment. Rather than assessing literacy use across the board in a single approach, its use in certain domains may be more readily investigated. How are such domains to be defined or delimited?

■ DETERMINING DOMAINS

One of the simplest ways of defining domains is to use the basic functions or skills of literacy: reading, writing and numeracy. However, this begs the question of how and in what context these functions are exercised, and for what purposes. Since context and purpose are fundamental determining factors of literacy use, it would seem more helpful to define domains more specifically.

The IALS identified three domains within which literacy was assessed, on the basis of the type of materials used in socio-economic activities:

• prose literacy: understanding and using texts such as newspapers, novels and poems;

• document literacy: locating and using information from timetables, maps, employment applications, forms of all kinds and graphics;

• quantitative literacy: applying arithmetic operations and using/manipulating numbers embedded in printed materials, for example calculating a tip or interest, or completing an order form.

This approach has the advantage of including numeracy firmly within literacy assessment, but almost entirely neglects writing and creative literacies. It defines domains by the type of printed material (cf. the IALS definition of literacy noted earlier) which a person is likely to encounter, with, of course, the attendant difficulty of defining exactly where distinctions are to be drawn.


Other approaches define domains by the context or environment in which written material is used, such as:

• daily life;

• work and social participation;

• educational settings.⁷

Others define domains more by the nature of the language used or the style of writing than by the kind of materials:

• document literacy - distinct from the IALS definition above in that here the reference is to the use of bureaucratic or administrative style and language, not strictly to the kind of document;

• narrative literacy - the kind of language found in consecutive text, such as novels, descriptions or articles;

• expository literacy - language used in ways that are intended to explain, exhort or convince, including procedural manuals, pamphlets or leaflets for propaganda or advertising, for example.

The SACMEQ surveys use basically these same three domains, calling them ‘narrative prose’, ‘expository prose’ and ‘documents’. However, they define narrative and expository prose in terms both of the kind of language used (to entertain, to describe or explain) and of the nature of the material (continuous text); document literacy, on the other hand, is defined largely in terms of the kind of material (tables, maps, graphs, lists, sets of instructions).

The Expert Meeting listed three basic domains: reading, writing and numeracy. Further optional domains included: uses of acquired literacy skills, literate environment, and lifeskills. It would be left to individual countries to decide whether they wished to include these optional domains. In the three core areas, the following indications were given for putting assessment into operation:

• The assessment of reading should use different types of supports and formats of written texts as used in different contexts of everyday life (e.g. texts, letters, newspapers, forms, leaflets, posters, signs/labels, tables, notices, magazines, books, charts/graphs) at different levels;

• The assessment of writing should consider a range of everyday writing tasks (e.g. messages, letters, forms), used in the context of everyday life at different levels. NB: the interrelationship between reading and writing should be taken into consideration when constructing the test;

• The assessment of numeracy should cover oral and written operations for calculation (problem solving) in everyday situations at different levels.

7. Cf. Infante 2003: 7.


Rather than focusing closely on skills and text materials, domains may also be delimited in terms of the learning outcomes which literacy may lead to, for instance in terms of generic or specific life skills. Generic skills might include planning, problem-solving, critical thinking, negotiating and relationship-building. Specific skills depend on context and might, in a rural context of the developing world, include understanding how government works, setting up an organisation, knowledge of the official language, or aspects of health and childcare. Little work has been done on how to assess lifeskills of any kind, let alone on how to assess the part that literacy plays in acquiring and using them.

■ LITERATE ENVIRONMENT: AN ALTERNATIVE CONCEPT

A different way of conceiving the domains of literacy use is to develop the notion of the literate environment. A literate environment is a context (or set of contexts) within which written communication is used, and so:

• It situates literacy within the wider context of communication, including oral and non-verbal practices;

• It focuses on the production of text as much as on consumption;

• It sets the acquisition and use of literacy by individuals and communities in the context of patterns of existing use of text;

• It facilitates an understanding of the opportunities and constraints which literacy users face;

• It links local contexts dynamically with broader contexts (and eventually global contexts), since literate environments overlap and lead into one another.

In terms of assessment, a focus on the literate environment would include the following questions:

• What are the existing literacy practices and who participates in them?

• In what languages do people communicate and how are these distributed across oral and written communication?

• How is the use of multilingual literacies structured and what kinds of access to information, education or other benefits do they enable?

• What text-related materials exist and how are they produced (newspapers, books, web pages, posters, TV/video programmes, and so on)?

• What are the opportunities for sustained, active use of literacy, writing as well as reading?

• What facilities exist - or are lacking - for the production of local text-related materials?

• Which institutions are the users and producers of text and who controls them?


• What institutions promote literacy acquisition and what are their goals in doing so?

• What methods of literacy acquisition are employed and what space do they give for expression of local knowledge and culture, and for broader awareness and knowledge of the world?

An analysis of these factors might lead to a quite different kind of assessment, with a focus on community and societal literacy practices, rather than an assessment of individual competencies. The advantage of examining the literate environment is that it may explain how and why people use literacy, the extent to which literacy is a valued means of communication, and thus the place that its acquisition and use actually has in people’s lives. This approach avoids making assumptions about the need (or lack of need) for literacy in particular contexts and provides pointers for designing programmes for literacy acquisition based on the potential of use.


In the past literacy assessment has resulted in a categorisation of people into literates and illiterates. The purpose in establishing levels of literacy is to move away from that approach and adopt instead a continuous scale. This not only avoids invidious categories, but also attempts more accurately to reflect current and evolving competencies. The debate on assessing skills and/or use continues to flow through the analysis of levels.

In the past levels have been determined according to the complexity of texts to be read - based on sentence length, range of vocabulary, overall length of text, and so on. This approach reflects a technical, ‘autonomous’ view of literacy, and assessment now rarely focuses so exclusively on decoding skills. In many programmes, a mix of criteria is used to set levels, combining decoding skills with the use to which they are put in the local environment. Writing also now receives greater prominence, for which levels are set both on the basis of encoding skills and the kinds of writing which the learner may need in their context. The five levels of the literacy programmes of the Dhaka Ahsania Mission (DAM) in Bangladesh illustrate this mix of criteria:⁸

Table XX: Literacy Assessment Levels of the Dhaka Ahsania Mission, Bangladesh

Level 4: Reading - able to read text with clear and correct punctuation. Writing - able to fill in simple forms.

Level 5: Reading - able to read different books, magazines and daily newspapers and explain the matter to others. Writing - able to write at least one page on a specific issue expressing their own ideas.

Undertaking this kind of assessment would require further work to define terms like ‘simple’ and to determine what kind of writing is relevant to particular contexts. Other projects divide assessment levels differently and may assign categories to learners in addition. The Literacy Assessment Project case study details four levels - none, prerequisite, basic, advanced - and three stages for the learners - beginner, progressing and completed.⁹ There is no explanation of how these various categories combine or relate to one another. Such schemes are an attempt to organise what are basically messy data, dependent on interlocking literate environments, undergoing constant change and facing shifting demands from learners.

8. See Alam 2003.
9. ILI 2002; Subba Rao 2003.

DAM offers the example of an approach closer to that of the Nepal example cited earlier, where neither literacy skills as such nor the direct use of literacy are assessed, but rather the impact or outcomes of such use. A community water management programme developed a set of indicators in which literacy is embedded or implied in the particular skills which the programme required. This moves away from assessing literacy skills as such, towards an assessment of whether people are equipped to function in accordance with the learning needs of water management. Although the following indicators relate to the literacy programme associated with the water management programme, it is noteworthy that literacy itself does not figure among them:

• Extent to which the participants acquire the skills to handle the water problem locally;

• Types of skills acquired;

• Are the skills transferable and sustainable?

• Is there scope for further improvement of skills?

• Are the skills in use?

• What effects result from skill development?

There is an observable progression (for instance: acquisition, use, transfer, ...), but these indicators are not cast in the form of levels. The reason for this is their embedding in a particular context of use: what matters is whether people are able to manage water effectively, not whether they have reached a particular level of literacy. In fact, it is likely that those involved in water management have different levels of literacy which, for individuals and collectively, meet the literacy demands of the situation.

LAMP takes the IALS as its starting point in establishing levels of literacy, particularly for the higher levels. Based on its definition of literacy as the ‘ability to understand and employ printed information in daily activities’ in various contexts and for various purposes, the IALS set up five levels:

• Level 1 indicates persons with very poor skills, where the individual may, for example, be unable to determine the correct amount of medicine to give a child from information printed on the package;

• Level 2 respondents can deal only with material that is simple, clearly laid out, and in which the tasks involved are not too complex. It denotes a weak level of skill, but more hidden than Level 1. It identifies people who can read, but test poorly. They may have developed coping skills to manage everyday literacy demands, but their low level of proficiency makes it difficult for them to face novel demands, such as learning new job skills;

• Level 3 is considered a suitable minimum for coping with the demands of everyday life and work in a complex, advanced society. It denotes roughly the skill level required for successful secondary school completion and college entry. Like higher levels, it requires the ability to integrate several sources of information and solve more complex problems;

• Levels 4 and 5 describe respondents who demonstrate command of higher-order information processing skills.

The assumptions seem to be that Levels 1 and 2 are concerned mainly with the passive use of text (reading), while writing is implied from Level 3 upwards, since the ‘demands of everyday life’ in industrialised societies would include writing or keyboarding. At each level, it is the use of literacy skills which is to be tested, rather than merely the decoding of text.

In its current stage of development, LAMP also proposes five levels of literacy skills, with emphasis on what it calls the ‘lower levels of the literacy scale’, corresponding to an expansion of the IALS Level 1. This was felt necessary to allow for greater differentiation at this level for developing countries. However, IALS level 1 gives rise to some methodological problems, since it is defined above as a negative concept, in terms of what a person can hardly do. This level, like the others, could only be tested by assessing how well a subject can cope with a literacy task, such as the example quoted. Thus, again, it is the use of literacy which is in focus, even at this level.

The five LAMP levels are as follows:

• Component 1: listening comprehension (in the language of assessment) - comprehension of vocabulary in context and comprehension of the overall text;

• Component 2: recognition of graphemes (letter, syllable, word-component symbol or other, depending on the writing system) - speed and accuracy;

• Component 3: word recognition - speed and accuracy;

• Component 4: sentence reading - speed and accuracy;

• Component 5: passage reading - speed, accuracy and comprehension.

The focus is on reading, with little evidence that writing will be part of the assessment - the latter depends partly on whether the test is administered orally or on paper. The numeracy component also remains to be elaborated. The first four components refer only to decoding skills, not to the use of literacy for informational or communication purposes. Only the fifth component mentions comprehension as a way of assessing how useful literacy is to the learner. The Expert Meeting, in common with current thinking on literacy, brought literacy use into focus in its definition for assessment purposes, calling therefore for an appreciation of how far learners can ‘identify, understand, interpret, create, communicate and compute, using printed and written materials’. The five components above do not as yet represent this full range of uses, being limited to identifying and understanding written text. In effect, only Components 4 and 5 could be used to assess literacy which is of practical use: the identification of graphemes is of little value, and the understanding of words has only limited application.
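Since Components 2 to 5 are scored on speed and accuracy, the kind of summary statistic they imply can be sketched minimally in Python. The function and the sample figures below are hypothetical; LAMP's actual scoring rules were still to be elaborated at the time of the meeting.

    def component_score(items_attempted, items_correct, seconds):
        """Summarise a timed component as accuracy (proportion correct) and
        speed (correct items per minute) - the two quantities the components name."""
        accuracy = items_correct / items_attempted
        correct_per_minute = items_correct / (seconds / 60.0)
        return {"accuracy": round(accuracy, 2),
                "correct_per_minute": round(correct_per_minute, 1)}

    # Component 3 (word recognition), for example: 50 words attempted in
    # 90 seconds with 46 read correctly.
    print(component_score(50, 46, 90))   # {'accuracy': 0.92, 'correct_per_minute': 30.7}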

It should be noted that LAMP envisages an extensive background questionnaire where the profile of the respondent and the nature of her/his practical use of literacy will be provided. With this and the tests on the above five levels, a three-way approach to assessing literacy was noted by some participants in the meeting:

• assessing decontextualised skills - decoding and encoding;

• assessing skills in practice - tasks such as letter writing, following instructions or engaging in transactions with money;

• assessing the use of literacy - daily practices, participation in the literate environment.

Component 1 of the LAMP levels raises questions about language issues in the implementation of the assessment. It appears not to assess literacy at all, but rather understanding of the language of assessment. This raises three issues:

• It suggests that the language of assessment may not be the language which the subject knows best;

• If the language of assessment is one in which the subject is less than fluent or knows only partially, the assessment of literacy per se, through the subsequent components, becomes a mixture of a language and a literacy assessment, where the results may not be attributable to the level of literacy alone;

• If the language of assessment is the usual or dominant language of the subject, this component is redundant or, if employed, may be seen rather as an assessment of the subject’s knowledge or intelligence.

The importance of context, repeatedly underlined in the Expert Meeting, implies that the dominant or first language of the subject should be used in the assessment, in order not only that the highest level of literacy use is identified, but also that responses to the background questionnaire are as full and as explicit as possible. Specifically, participants agreed on the following parameters for establishing levels:

• Levels should reflect a range of abilities in a population. This range will depend on the use of literacy in a given society.

• Levels should be useful and refer to the definition of literacy.

• Levels should inform policy, instruction and learning. They should be based on what information is required and for what uses. Empirical studies need to be undertaken for their construction.

These comments and observations argue for a review of the levels of assessment and the way they are implemented in the assessment process; the LAMP pilot projects afford an opportunity for learning lessons and adjusting the definition of levels. The SACMEQ


experience also proves illuminating. After a careful research design process in the context of developing countries, the SACMEQ surveys established levels according to the use of literacy, rather than focusing on decoding. These surveys, undertaken in fifteen countries in Africa, required the subject or pupil (SACMEQ is entirely school-based) to respond to 59 items:

• ‘Verbatim’ level: the pupil is required to identify the exact word or phrase in the passage. We should note that this is not word recognition in the sense intended by LAMP, but an ability to respond with the correct piece of text to a question on the content of the passage;

• ‘Main idea’: the pupil is required to grasp the overall meaning of the passage. Again, this goes beyond the ability to read whole sentences or paragraphs, demanding skills in understanding and freely re-phrasing the text;

• ‘Apply rules’: the pupil is required to use the newly acquired information in the passage as a rule to solve a new problem in a question.

Although these are systematically referred to as ‘reading skills’, the very process of taking the test involves writing skills also. Each of the three levels addresses how literacy - as written communication - may be used in everyday, practical ways, an approach entirely consonant with the operational definition of the Expert Meeting. This approach was echoed at the meeting, where a proposed simple-to-complex scale would involve:

• Identifying or finding explicit information in a text;

• Establishing relationships between ideas or situations located in different parts of a text;

• Inferring one’s own ideas from the text.

Even so, each of these points on the scale needs to be defined and specific criteria designed to obtain meaningful results in particular contexts, both school and non-school.

While the SACMEQ levels may need adapting for the purposes of adults and contexts outside of school, they provide a relevant basis for setting up levels for broader literacy assessment.

All these schemes beg the question, again, of how far universal levels of literacy can be established as criteria for assessment across different contexts. Participants in the Expert Meeting recalled the inevitable influence of context, even when seeking to use pre-established levels:

Despite the fact that those levels can be set according to some defined degrees of ability, they will not be able to be homogeneous and they will vary in meaning according to the context. (Infante 2003: 3).


All these approaches to defining levels envisage a sliding scale of literacy competences and use. They move away from the dichotomy of illiterate/literate, for two reasons:

• The use of the term ‘illiterate’, or its more recent euphemism ‘non-literate’, has pejorative connotations which are often taken to indicate other negative characteristics. Those thus labelled may be further stigmatised as ignorant, backward, unintelligent, uninformed or naive. Lack of access to literacy or lack of use does not imply any of these characteristics - those who use oral communication entirely should not be tarred with these brushes;

• To some extent, everyone is engaged with their literate environment. Describing someone as illiterate can give the impression that they neither participate in nor are impacted by the use of text. This is plainly not the case, since text is most likely to be present in their immediate environment in some way, and decisions based on text certainly affect their lives.

Nevertheless, there was some feeling that certain institutions would always look for or require a two-way categorisation, even though LAMP is constructed to avoid doing so. Where categorisation is on the agenda, the notion of a literacy threshold provides a more positive and flexible approach. This puts literacy use in context firmly at the centre of literacy assessment since a threshold can only be set up for certain situations and uses. In each context the question would be posed: what kinds of literacy do people require to function as active citizens participating on an equal basis with others in social life? This dovetails with the concept of a literate environment since a threshold would depend on the nature, amount and circulation of text-related material in a given situation. A threshold approach, applied mechanically, leads back to the literate/illiterate dichotomy, but it need not do so. Literate environments are dynamic and linked to other kinds of literate environment - for example the literate environment at a community level is organically linked to the national literate environment. Thus a series of thresholds, nested one within the other, would provide a scale along which learners are constantly moving as their use of literacy increases and changes.
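A minimal sketch of how such nested thresholds might be operationalised follows. The threshold names and the literacy demands are invented for illustration; no such scale was adopted at the meeting.

    # Hypothetical nested thresholds: each wider environment's demands
    # include all the demands of the narrower one.
    THRESHOLDS = [
        ("community", {"read notices", "sign documents"}),
        ("district", {"read notices", "sign documents", "fill in forms"}),
        ("national", {"read notices", "sign documents", "fill in forms",
                      "extract information", "write correspondence"}),
    ]

    def highest_threshold_met(demonstrated_uses):
        """Return the widest literate environment whose demands the person's
        demonstrated uses of literacy satisfy, or None if none is met."""
        met = None
        for name, demands in THRESHOLDS:
            if demands <= demonstrated_uses:   # set inclusion: all demands met
                met = name
        return met

    # A learner moves along the scale as their use of literacy grows:
    print(highest_threshold_met({"read notices", "sign documents"}))                    # community
    print(highest_threshold_met({"read notices", "sign documents", "fill in forms"}))  # district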


This analysis has raised the issues relating to the new approaches to literacy assessment, based on the papers, deliberations and output of the UNESCO Expert Meeting. In line with the aim of the meeting to shape the LAMP process, discussion, critique and suggestions for that initiative have been woven into the themes of the analysis. As LAMP is piloted in a number of countries, the issues raised here will form part of the discussions around its design and implementation in specific contexts. Other initiatives and projects of literacy assessment may also benefit from considering what the participants in the meeting identified as key issues. Expressed as necessary steps on the way forward, these are:

• Defining what concept of literacy is to underlie assessment;

• Deciding on the focus of assessment: skills and competencies, use or impact, or a balance between two or more of these aspects;

• Achieving clarity on the purpose of the assessment - what kinds of results or outputs, for whom and for what practical use?

• Considering what relative weight to give to quantitative and qualitative methods;

• Defining domains, on the basis of the focus and purposes of the assessment, and recognising that these choices, coupled with the methods selected, will produce only certain kinds of data;

• Possibly defining levels, again depending on the focus and purpose of the assessment.

Literacy assessment, like literacy itself, is a complex and multi-faceted process, with connections to many other aspects of society and of life. This should not deter us. People continue to be excluded from using literacy as one of their communication and learning tools - we need to know where and why, so that everyone may enjoy the opportunities literacy may offer in each situation. As the UN Literacy Decade moves forward, this endeavour deserves the energetic commitment and strenuous effort of UNESCO, its member governments and all their partners.


Alam, Kazi Rafiqul. 2003. Operational definition of literacy for assessment purposes: literacy to meet basic needs. Paper presented to the Expert Meeting on Literacy Assessment, Paris, 10-12 June 2003.

Freire, Paulo. 1972. Pedagogy of the Oppressed. Harmondsworth: Penguin.

IALS. n.d. Highlights from the Final Report of the International Adult Literacy Survey: Literacy in the Information Age - Skills for the Twenty-First Century.

Infante, Isabel. 2003. On the measurement of literacy. Paper presented to the Expert Meeting on Literacy Assessment, Paris, 10-12 June 2003.

ILI (International Literacy Institute). 2002. Analytic Review of four LAP (Literacy Assessment Practices) country case studies. Philadelphia: ILI.

Rahman, Siddiqur. 2003. Literacy in Bangladesh: findings of a study conducted by Education Watch. Paper presented to the Expert Meeting on Literacy Assessment, Paris, 10-12 June 2003.

SACMEQ (Southern Africa Consortium for Monitoring Educational Quality). 2000. Translating Educational Assessment Findings into Educational Policy and Reform Measures: Lessons from the SACMEQ Initiative in Africa. Paper presented to the World Education Forum, Dakar, Senegal, 26-28 April 2000.

Saito, Mioko. 1999. “A Generalizable Model for Educational Policy Research in Developing Countries”. Journal of International Cooperation in Education 2(2): 107-117.

Subba Rao, I.V. 2003. Learning tools and learning contents: texts, contexts and assessment. Paper presented to the Expert Meeting on Literacy Assessment, Paris, 10-12 June 2003.

UIS (UNESCO Institute for Statistics). 2003. LAMP: Background, objectives, and progress to date. Paper presented to the Expert Meeting on Literacy Assessment, Paris, 10-12 June 2003.

UNESCO. 1997. The Hamburg Declaration and the Agenda for the Future. Fifth International Conference on Adult Education. Paris: UNESCO.

UNESCO. 2000. The Dakar Framework for Action. Paris: UNESCO.

UNESCO. 2002a. An International Strategy to put the Dakar Framework for Action into operation. Paris: UNESCO.

UNESCO. 2002b. Education for All: is the World on Track? EFA Global Monitoring Report 2002. Paris: UNESCO.

UNESCO. 2003a. Literacy as Freedom: a UNESCO Roundtable. Paris: UNESCO.

UNESCO. 2003b. Gender and Education for All: the Leap to Equality. EFA Global Monitoring Report 2003/4. Paris: UNESCO.

UNESCO. 2004. The Plurality of Literacy and its Implications for Policies and Programmes [UNESCO Education Position Paper]. Paris: UNESCO.

United Nations. 2002. United Nations Literacy Decade: Education for All; International Plan of Action; implementation of General Assembly Resolution 56/116. [UN General Assembly A/57/218].

Valdivielso Gomez, Sophia. 2003. From literacies to integral literacy. Paper presented to the Expert Meeting on Literacy Assessment, Paris, 10-12 June 2003.

Vincent, David. 2000. The Rise of Mass Literacy: Reading and Writing in Modern Europe. London: Polity Press.

Yousif, Abdelwahid Abdalla. 2003. Literacy: an Overview of Definitions and Assessment. Paper presented to the Expert Meeting on Literacy Assessment, Paris, 10-12 June 2003.


List of Participants

■ Africa

Ms. Amina Ibrahim National EFA Coordinator Education for All Unit Office of the Honourable Minister Abuja, Nigeria Tel. 0803 4080983 Fax: 234 9 3143990 Email: [email protected] or [email protected]

Dr. Ahmed Oyinlola Executive Secretary, National Mass Literacy Commission, PMB 295 AREA III, Federal Ministry of Education Garki-Abuja, Nigeria Tel: 234 9 2344031 Fax: 234 9 3143990

Dr. Laouali Moussa Direction de l'Alphabétisation Ministère de l'Education de base et de l'Alphabétisation BP 525, Niamey, Niger Fax: 227 72 21 37 Email: [email protected]

■ Arab States

Dr. Abdulwahid Abdulla Yousif Advisor to the Minister of Education Kingdom of Bahrain Tel. 973 687287 Fax: 973 680161 Email: awahidyousif@bahrain.gov.bh

■ Asia

Dr. R. Govinda Professor, National Institute for Educational Planning and Administration (NIEPA) New Delhi, India Tel: 91 11 26510135 Fax: 91 11 26 85 30 41 Email: [email protected] Dr. Ilapavuluri Subbarao Principal Secretary to Government, Education Department Government of Andhra Pradesh, Hyderabad, India Tel: 91 40 2 34 52 403 (office) Tel: 91 40 2 33 58 675 (home) Fax: 91 40 2 34 50 563 Email: [email protected]

Mr. Ochirkhuyag Gankhuyag Education Officer Mongolian National Commission for UNESCO PO. 38 Ulaanbaatar, Mongolia


Tel: 976 11 315652 Fax: 976 11 322612 Email: [email protected]

■ Latin America

Dr. Isabel Infante Ministry of Education, Chile Fax: 56 2 65 51 046 Email: [email protected]

■ Universities/Institutes

Dr. Sophia Valdivielso Gomez Universidad de Las Palmas de Gran Canaria Facultad de Formación del Profesorado c/ Santa Juana de Arco no. 1 E-35004 Las Palmas de Gran Canaria, Spain Tel: 34 928 37 2097 (home) Mobile: 34 646 490678 Fax: 34 928 45 2880 (university) Email: [email protected] (home) or [email protected] (university)

Dr. Siddiqur Rahman Professor of Education, Institute of Education and Research, University of Dhaka, Dhaka , Bangladesh Tel: 880 2 8628757 Email: [email protected] Dr. Chilisa Bagele Head of Department of Education Foundation University of Botswana, P/Bag 0022 Gaborone, Botswana Tel: 267 355 2408 Fax: 267 318 5096 Email: [email protected]

Prof. Alan Rogers Uppingham Seminars in Development 5 Adderley Street, Uppingham Rutland UK LE15 9PP Tel: 01572 821282 Email: [email protected]

Dr. Patrick Werquin OECD 2, rue André Pascal - 75016 Paris, France Tel: 33 (0)1 45 24 97 58 Fax: 33 (0)1 45 24 90 98 Email: [email protected]

Dr. T. Scott Murray Statistics Canada Main Building Tunney's Pasture Ottawa, Ontario, K1A 0T6 - Canada Tel: 1 613 951 9035 Fax: 1 613 951 9040 Email: [email protected]

Page 44: UNESCO Expert Meeting on Aspects of Literacy Assessment

Dr. Irwin Kirsch Educational Testing Service (ETS) Rosedale Road M/S 02-R Princeton, New Jersey 08541 - USA Tel: 1 609 734 15 16 Fax: 1 609 734 13 09 Email: [email protected]

■ NGOs

Mr. David Archer Action Aid Hamlyn House Macdonald Road, Archway London, N19 5PG, UK Tel: 44 20 75 61 75 61 Fax: 44 20 72 72 08 99 Email: [email protected]

Dr. Jean-Pierre Jeantheau Agence nationale de lutte contre l'illettrisme 1 place de l'Ecole, BP 7080 69348 Lyon Cedex 07, France Tel: 33 (0)4 37 37 18 65 / 33 6 86 87 94 41 Fax: 33 (0)4 37 37 16 81 Email: [email protected] Mr. Kazi Rafiqul Alam Executive Director Dhaka Ahsania Mission House No. 19, Road No. 12 (New) Dhanmondi, Dhaka 1209, Bangladesh Tel: 8802 811 5909 / 811 9521 22 / 912 3420 / 912 3402 Fax: 8802 811 3010 / 811 8522 Email: [email protected]

Dr. Vera Maria Masagao Ribeiro NGO Acao Educativa Rua General Jardim, 660 01223 010 Sao Paulo - SP, Brazil Tel/Fax: 55 11 31512333 Email: vera@acaoeducativa.org

■ Resource Persons

Dr. Chander Daswani Remedia Trust, 157 Sahyog Apts., Mayur Vihar-I New Delhi, 110091, India Tel: 91 11 2275 8073 / 2275 7305 Fax: 91 11 26114639 Email: [email protected]

Dr. Clinton Robinson 38 Middlebrook Road, High Wycombe HP13 5NJ - United Kingdom Tel/Fax: 44 1494 637880 Mobile: 44 7766 600 751 Email: [email protected]

■ UNESCO

Mr. John Daniel, Assistant Director General for Education

Ms. Aicha Bah Diallo, Deputy Assistant Director-General for Education

Mr. Shigeru Aoyagi, Chief, Literacy and Non- Formal Education Section

Ms. Margarete Sachs-Israel, Programme Specialist, Literacy and Non-Formal Education Section

Ms. Susanne Schnuttgen, Programme Specialist, Literacy and Non-Formal Education Section

Ms. Ushio Miura, Assistant Programme Specialist, Literacy and Non-Formal Education Section

Mr. Ibrahima Bah Lalya, consultant, Division of Basic Education, UNESCO

UNESCO, Education Sector

7 place de Fontenoy, 75352 Paris 07 SP, France Tel: (33) 1.45.68.10.00 Fax: (33) 1.45.68.56.26/7

Mr. Adama Ouane, Director, UNESCO Institute for Education (UIE) Feldbrunnenstr. 58, 20148 Hamburg, Germany Tel: (49) 40.44.80.41.30 Fax: (49) 40.41.07.723

Mr. Doug Lynd, Senior Programme Specialist, UNESCO Institute for Statistics (UIS)

Ms. Benedicte Terryn, Assistant Programme Specialist, UNESCO Institute for Statistics (UIS)

UNESCO Institute for Statistics (UIS)

5255, Decelles, 7e étage, Montréal (Québec), H3T 2B1, Canada Postal Address: C.P. 6128, Succ. Centre-Ville, Montréal (Québec) H3C 3J7, Canada Tel: (1) 514.343.6880 Fax: (1) 514.343.6882/72

Ms. Mioko Saito, Programme Specialist, International Institute for Educational Planning (IIEP) 7-9, rue Eugène Delacroix, 75116 Paris, France Tel: (33) 1.45.03.77.70 Fax: (33) 1.40.72.28.36

Mr. Camara Boubacar, Senior Programme Specialist, UNESCO Office in Accra 32, Nortei Ababio Street, Airport Residential Area, P.O. Box CT 4949, Accra, Ghana Tel: (233) 21.76.54.97/99 Fax: (233) 21.76.54.98


The designations employed and the presentation of material throughout this publication do not imply the expression of any opinion whatsoever on the part of UNESCO concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries.

The author is responsible for the choice and the presentation of the facts contained in this work and for the opinions expressed therein, which are not necessarily those of UNESCO and do not commit the Organization.

Published by the United Nations Educational, Scientific and Cultural Organization

7 place de Fontenoy, 75352 Paris 07 SP, France

© UNESCO 2005

Printed in France
