
Global Social Policy 2016, Vol. 16(1) 47–67 © The Author(s) 2015

Reprints and permissions: sagepub.co.uk/journalsPermissions.nav

DOI: 10.1177/1468018115571420
gsp.sagepub.com

Tracing the sub-national effect of the OECD PISA: Integration into Canada’s decentralized education system

Clara Morgan
United Arab Emirates University, United Arab Emirates

Abstract
Although education scholars have examined the globalization effect of the Organisation for Economic Co-operation and Development (OECD) Programme for International Student Assessment (PISA) and its impact in several countries, few have explored its effect at the sub-national level. Taking the Canadian federation as its case study, I argue that the PISA, as a universalizing project for education, is being uncritically replicated through the implementation of student assessments at the national level. By drawing on the policy studies and policy sociology literature, I find evidence of policy discursive practices and techniques, which led to the creation and replication of a PISA-modeled assessment sub-nationally in the form of the Pan-Canadian Assessment Program. Three key themes emerge that facilitate the modeling of universalizing educational projects such as the PISA by sub-national entities: (1) a preoccupation with the international benchmarking of student performance, (2) a shift away from a curriculum-based assessment to a competency-based one, and (3) the adoption of organizational systems and processes of assessment aligned with supranational assessment practices. I suggest that domestic conditions in the Canadian federation were conducive to the rapid integration of the PISA sub-nationally despite the decentralized structure of the Canadian elementary and secondary education system.

Keywords
Canada, education policy, federal states, Organisation for Economic Co-operation and Development, Programme for International Student Assessment

Corresponding author:
Clara Morgan, Department of Political Science, United Arab Emirates University, P.O. Box 15551, Al-Ain, United Arab Emirates. Email: [email protected]



Introduction

International studies and global social policy scholars have pointed to the role transnational actors play in the internationalization of domestic policy (see Orenstein, 2008; Skogstad, 2008, 2011), including the internationalization of education policy (Martens, Knodel, and Windzio, 2014). Scholars have noted the increasingly important role an international organization (IO) such as the Organisation for Economic Co-operation and Development (OECD) plays as a transnational actor (Deacon, 2007; Mahon and McBride, 2008) and its capacity to dominate educational discourse and set agendas for educational reform (Rubenson, 2008). As part of the global educational governance toolkit, scholars have identified the growing influence of international comparative indicators (Davis et al., 2012; Henry et al., 2001; Rizvi and Lingard, 2010), particularly the role of the OECD's Programme for International Student Assessment (PISA) (Meyer and Benavot, 2013; Morgan and Shahjahan, 2014; Sellar and Lingard, 2013). They have examined the effect of the PISA on educational governance by studying its regional impact (Grek, 2009) and national effects in countries such as England (Ozga, 2009), Germany (Bieber, Martens, and Niemann, 2014; Ertl, 2006), Japan (Takayama, 2008), Finland (Rautalin and Alasuutari, 2009; Simola, 2005), and Turkey (Gur, Celik, and Ozoglu, 2012). However, except for a few studies at the sub-national level – for example, in the German state of Lower Saxony (Hartong, 2012), in the Province of Ontario, Canada (Martino and Rezai-Rashti, 2013), and in the Swiss cantons (Bieber, 2014) – the repercussions of the PISA in a federal context and at the local scale have not been sufficiently explored by scholars.

My article contributes to the literature on IO transnational governance by examining the sub-national impact of the PISA in the Canadian federation. I am interested in exploring the process of 'governing by numbers' (Grek, 2009; Miller, 2001) via the PISA using a 'locally-framed view' (Schweisfurth, 2013: 121). Through this analysis, I gain an understanding of how educational issues are problematized sub-nationally using measurement practices such as assessment tools, and the ways in which the PISA facilitates a certain understanding of these problems across scales.

PISA's effects and its integration into federal education policies vary across OECD member states. For example, in Germany, the federal ministry of education shares educational jurisdiction with the Länder (states) in several areas, including performance assessment (Federal Ministry of Education and Research, 2012). When the first PISA results were released, the Germans were shocked by their students' performance (Ertl, 2006), and this shock contributed to the integration of PISA benchmarking practices into education policy (Bieber et al., 2014). In the Swiss federation, education is highly decentralized; there, the effect of the PISA was to accelerate neoliberal educational reforms and the harmonization of educational standards among the 26 Swiss cantons (Bieber, 2014: 185).

In Australia, even though the educational system is decentralized, PISA's effect has been very visible. The federal government incorporated into the Australian Education Act 2013 the objective of becoming one of the 'top 5 highest performing countries based on the performance of school students in reading, mathematics and science' by 2025 (Gorur and Wu, 2014: 2). At the same time, since 2008, the federal government has institutionalized a national accountability regime through the National Assessment Program – Literacy and Numeracy (NAPLAN; Lingard, 2010). The federal Australian Curriculum, Assessment and Reporting Authority (ACARA) administers the NAPLAN. ACARA annually tests students in Grades 3, 5, 7, and 9 and publicly reports the results on its My School website (http://www.myschool.edu.au/) by school, suburb, and postal code.

In contrast to other OECD federal states, the PISA effect in the United States has been minimal. The education system is decentralized, with states having primary responsibility for education and with the US Department of Education focused on promoting student achievement and fostering educational excellence (US Department of Education, n.d.). The federal government has had the National Assessment of Educational Progress in place since 1969; it measures student learning outcomes in Grades 4, 8, and 12 (National Center for Education Statistics, 2011). The OECD has connected directly with some local US school boards, which have implemented the PISA-based Test for Schools (Rustkowski, 2014: 3). Interestingly, Canada appears to be the only federation among the ones cited to have created an assessment closely modeled after the PISA.

Grek (2009) identifies PISA's key ingredients as 'de-contextualization, commensurability, and policy orientation' (p. 27). My analysis reveals that the Pan-Canadian Assessment Program (PCAP), which replicates the PISA, has a similar set of ingredients through which it becomes a tool for governing by numbers in the Canadian federal space (Martino and Rezai-Rashti, 2013). The first ingredient is that student knowledge in the PCAP is 'de-contextualized', since what is to be measured is not curricular knowledge but a set of competencies. The second ingredient is that student learning outcomes across provinces and territories are rendered 'commensurable' through a standardized measure of assessment. The third ingredient is that a common 'policy orientation' that is shared sub-nationally is geared toward a 'pragmatic view of education's worth' (Grek, 2009: 27). Such a pragmatic view of education values skills for the labor market and is aligned with the policy frame espoused by the OECD.

Given that political authority in education is shifting from the national to the supranational arena, with IOs such as the OECD positioning themselves as designers of universal educational solutions (Beech, 2009: 345), it is imperative for scholars to critically analyze how educational governance across scales is enacted sub-nationally. As Beech (2009) notes, IOs promote an abstract universal model of education for the information age encompassing 'principles of decentralization, school autonomy, the professionalization of teachers, a curriculum based on competencies and the setting up of central evaluation systems' (p. 348). Through the creation of this model, IOs are in a position to construct indicators for measuring progress toward the model, to define the problems associated with attaining objectives, and to provide solutions for improving educational outcomes. Within this context, PISA becomes a very effective tool for measuring progress toward a universal model of education while at the same time serving as a universalizing educational project as its results reverberate domestically. Researchers have found that the domestic effect of the PISA league tables has generally been to create 'uncertainty' (Hartong, 2012) or 'worry' (Gur et al., 2012) when rankings are low. In the Canadian case, new accountability techniques were instituted despite the fact that the Canadian provinces ranked among the top performers in the PISA league tables.

I begin by providing a brief overview of the OECD, the PISA, and educational governance in the Canadian federation. Drawing on a policy studies and sociology framework, I then elaborate the concepts I use for my analysis and for tracing the domestic conditions for the integration of the PISA. Taking the Canadian federation as my case study, I identify three key themes: (1) a preoccupation with the international benchmarking of student performance, (2) a shift away from a curriculum-based assessment to a competency-based one, and (3) the adoption of organizational systems and processes of assessment aligned with supranational assessment practices. I conclude that domestic conditions in the Canadian federation were conducive to the rapid sub-national integration of the PISA, including the creation of a national assessment modeled after the PISA, despite the decentralized structure of the Canadian elementary and secondary education system and the de jure policy autonomy enjoyed by the provinces and territories over the field of education.

Overview of the OECD, the PISA, and educational governance in Canada

The OECD has 34 member countries, with Canada being one of the 20 original founding members (Gurria, 2011) and the seventh largest contributor to the OECD (White, 2011). Woodward (2009) describes the OECD as an IO that 'sows the seeds of inter-state consensus and cooperation' (p. 5). The OECD governs globally and exercises influence through four inter-related areas: (1) cognitive governance, by engendering a sense of community among its members; (2) normative governance, through knowledge and idea dissemination; (3) legal governance, through the production of 'soft law' with enforcement relying on surveillance and peer pressure; and (4) palliative governance, through its capacity to facilitate processes within the global arena (see Woodward, 2009: Chapter 3).

The OECD Council provides oversight and strategic direction to the OECD (OECD, n.d.). The Council has two sets of members: the Ministerial Council, which meets annually, and the Council of Permanent Representatives, which meets monthly and is equivalent in its role to the 'Board of Directors' of a business corporation (Carroll and Kellow, 2011: 12). Member country representatives adopt three types of 'acts': Decisions, Recommendations, and Resolutions. Decisions are binding, with all members voting on them; Recommendations are non-binding but are 'softly' enforced through peer group pressure and the negative impact on a member's reputation; and Resolutions relate to the internal workings of the organization (Carroll and Kellow, 2011: 13).

The OECD Council delegates responsibilities to committees. Among these committees is the Education Policy Committee (EDPC), which aims to help both members and non-members build efficient and effective educational systems and improve learning outcomes (OECD, 2012: 389). Two seats are informally provided to Canada on the EDPC (other federal states can also request additional seats): a national seat, represented by the federal department of Employment and Social Development Canada, and a sub-national seat, represented by one of the provinces or territories, which is nominated by the Council of Ministers of Education, Canada (CMEC) to speak on behalf of all Canadian provinces and territories. CMEC is an intergovernmental provincial and territorial body that provides a forum for discussing educational issues and for undertaking educational projects and initiatives. National and sub-national discussions take place in advance of EDPC meetings in order to 'speak with one voice' and to ensure an 'approach that is inclusive of both levels of government and that is pan-Canadian' (Personal Correspondence, Foreign Affairs, Trade and Development Canada, November 11, 2014).

The OECD's policy influence includes its role as a purveyor and legitimator of ideas (Mahon and McBride, 2009; Morgan and Shahjahan, 2014) as well as its knowledge-production capacities as an IO (Morgan, 2009; Porter and Webb, 2004). It enacts soft modes of regulation, including the publication of comparative data such as educational and social indicators, and peer reviews involving country and thematic reviews (Kallo, 2009; Mahon and McBride, 2009). Furthermore, its influence derives from demarcating norms and practices that further liberal, market-friendly economic policies (Henry et al., 2001; Rizvi and Lingard, 2010).

The OECD has played a key role in the production and distribution of some of the most significant international educational indicators and assessment tools used in educational policy circles (Sellar and Lingard, 2013). Among these soft governance tools, the PISA has become 'an influential element of education policy making processes at the national level' (Breakspear, 2012: 27). Launched in 1997, PISA is a triennial student assessment of 15-year-olds and is viewed as a global benchmark of student achievement across educational systems. Whereas 32 states participated in the initial PISA in 2000, by 2012 the number had climbed to 65 'economies'. The OECD emphasizes that PISA is not a curriculum-based assessment, but one that assesses students' knowledge and skills in reading, mathematics, and science literacy (Morgan, 2009).

Canada is often showcased as being among the top 10 performers on the PISA, and its educational level of equity is often highlighted (Gurria, 2011). The OECD chose to examine Canada's school system and its educational reform and innovation in its publication, What Makes School Systems Perform? School Systems through the Prism of PISA, because of its outstanding performance on the PISA (OECD, 2004). However, Canada does not have a national department of education. Jurisdiction for education lies at the sub-national level with 13 provinces and territories, each having its own ministry of education with its own minister of education. Despite the decentralized nature of the Canadian education system and the lack of a federal department of education, provinces and territories have maintained a de facto pan-Canadian policy framework for elementary and secondary education (Wallner, 2014: 5). They have adopted similar policies and programs and coordinated educational projects and initiatives via CMEC.

Typically, a ministry of education addresses areas such as planning, school finance, curriculum development and assessment, special education, language programs, and the renovation/construction of school buildings (Levin and Young, 1998: 35). School boards are responsible for the day-to-day operation and administration of schools. A professional administration headed by a superintendent or director of education assists school board members (Levin and Young, 1998: 41). School boards are viewed as important players in educational governance, as leaders of educational change and implementers of provincial innovations (Henley and Young, 2008: 6).

Through CMEC, sub-national entities are active participants in the OECD and are exposed to OECD educational discursive practices and policy prescriptions (Wallner, 2014: 211). The PISA Governing Board (PGB), composed of OECD member states and non-member 'observer' states, determines PISA policy priorities. A federal representative and a CMEC representative participate on the PGB. In terms of national implementation, the PISA National Project Managers for Canada include a representative from CMEC and one from Statistics Canada. The participation of CMEC is crucial to the successful administration of the PISA in Canada's decentralized educational system. It is important to note that the provinces and territories resent any federal interference in education; CMEC is viewed as upholding the exclusive jurisdiction of provinces and territories over education (CMEC, n.d.-a). With respect to student assessments, CMEC is involved in the design, implementation, and analysis of national and international student assessments, including the PCAP, the PISA, the Trends in International Mathematics and Science Study (TIMSS), and the Progress in International Reading Literacy Study (PIRLS; CMEC, n.d.-b). Recently, CMEC launched an assessment newsletter that reports on and explains the results of provincial, territorial, national, and international assessments. By adopting such initiatives, CMEC reinforces the perceived importance of these measurement tools in the Canadian federation.

The adoption of national student assessments further supports Wallner's (2014) claim that Canadian provinces and territories have developed a pan-Canadian policy framework for education. In terms of the national implementation of student assessments, CMEC launched the School Achievement Indicators Program (SAIP) in 1993 with the objective of assessing both 13- and 16-year-olds in mathematics, reading and writing, and science. The SAIP was administered nine times, with the last assessment conducted in 2004. In 2007, CMEC launched a new national assessment, the PCAP, which allows provincial jurisdictions to validate their students' results against the PISA (CMEC, n.d.-c). The PCAP assesses only 13-year-olds in mathematics, reading, and science, with one of these subjects being a major domain every 3 years; this major domain is the same as that of the PISA, which is administered to the same cohort as 15-year-olds two years later. Similar to the PISA, the PCAP produces contextual reports that analyze the results of contextual questionnaires administered to students, teachers, and school principals (CMEC, 2009, 2012b).

Conceptual framework

Before detailing the three themes that facilitate the adoption of the PISA by sub-national entities, I wish to expand on my conceptual framework. My analysis draws on concepts from the policy studies and sociology literature to make sense of educational policy making. I make use of the concept of a 'universal model of education' in order to describe an educational modeling process that has become ubiquitous and globally applicable (Beech, 2009). In the case of the PISA, this universal model also ranks and assigns value to learning outcomes in reading, science, and mathematics. I draw on the concept of 'framing' (Juillet, 2007) to describe how the OECD PISA offers domestic policy-makers an authoritative 'storyline' that helps them legitimize their policy reforms. A policy frame such as the PISA reduces the complexity of educational problems, as it offers a ready-made, credible, legitimate, and widely accepted storyline on educational quality and outcomes whose source is the prestigious OECD (Juillet, 2007: 259). These concepts provide me with the tools to critically analyze how a policy is framed, how it works, how it is implemented, and whose interests it serves, while also examining its relationship with the needs of the state and the economy (Ball, 2008; Juillet, 2007).


I organize the methods used in my analysis into two broad categories. The first set of methods focuses on discursive practices. I analyze the language actors deploy to support a particular policy, who benefits from the policy, and which ideas actors privilege in articulating a particular policy approach. For example, I identify a specific discourse that is supportive of international benchmarking and of alignment with the PISA as means of furthering human capital development by sub-national entities in Canada. The second broad category is associated with techniques that are deployed in association with a policy. These techniques could encompass a set of adjustments, procedures, and capabilities; techniques through which human forces are organized to implement a policy; and the ways a system functions to make policies (Ball, 2008). For example, I analyze how sub-national entities mobilize resources in support of creating a national assessment that is aligned with the PISA.

In order to analyze the conditions for PISA's domestic integration, I reviewed and analyzed documents such as SAIP, PCAP, and PISA assessment reports published by CMEC and the federal government, as well as OECD PISA documents. I also analyzed publicly available information related to these assessments, such as newsletters, website information, news releases, and speeches. I used keywords to code my secondary data and looked for patterns to identify common themes (Bogdan and Biklen, 1992). With these methodological tools in hand, I reveal in the following analytic sections the three key themes that facilitate the modeling of universalizing educational projects by sub-national entities.
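For readers who want a concrete picture of the coding step, the following is a minimal sketch of one way keyword-based coding of documents can be operationalized. The theme labels, keyword lists, and document excerpts are hypothetical illustrations, not the study's actual coding scheme or corpus.

```python
from collections import Counter

# Hypothetical theme dictionary: each theme is flagged by a set of keywords.
# These labels and keywords are illustrative only, not the study's real codes.
THEMES = {
    "benchmarking": {"benchmark", "ranking", "comparable", "top performer"},
    "competencies": {"competency", "skills", "literacy", "generic outcomes"},
    "alignment": {"align", "harmonize", "comparability", "framework"},
}

def code_document(text: str) -> Counter:
    """Count how often each theme's keywords appear in one document."""
    lowered = text.lower()
    counts = Counter()
    for theme, keywords in THEMES.items():
        counts[theme] = sum(lowered.count(kw) for kw in keywords)
    return counts

# Hypothetical excerpts standing in for assessment reports and news releases.
docs = [
    "PCAP allows jurisdictions to benchmark results and align with PISA frameworks.",
    "The assessment measures generic outcomes and skills rather than curriculum.",
]
totals = Counter()
for doc in docs:
    totals.update(code_document(doc))
print(totals.most_common())  # themes ordered by overall keyword frequency
```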

Domestic conditions for the sub-national integration of the PISA

A preoccupation with international benchmarking of student performance

In the Canadian case, the presence of a sub-national testing culture contributes to a policy approach that values international benchmarking of student performance. A testing culture encompasses the widespread belief and acceptance that the standardized testing of students accurately reflects student learning and the quality of schooling (see, for example, Sacks, 1999). Provincial and territorial ministries of education increasingly rely on the results of standardized tests as a foundation for improving both student achievement and schools. Their reliance grew as sub-national entities shifted their educational objectives away from equity toward quality in education as part of the broader accountability movement in education of the 1980s and 1990s (Wallner, 2014: 209). Canadian provinces made investments in developing capabilities for assessing their students' learning outcomes and, through these investments, contributed to fostering a testing culture in their respective provinces and territories (Crundwell, 2005; Klinger, DeLuca and Miller, 2008; Mawhinney, 1998). As can be seen from Table 1, all the provinces/territories, except for Saskatchewan, have instituted sub-national standardized tests in core subject areas across grades.

To support educational quality reforms, provinces and territories made efforts to collaborate on developing educational indicators and a national assessment for testing students. Through CMEC, the provinces and territories delivered standardized tests such as the SAIP from 1993 to 2004 and, more recently, the PCAP starting in 2007. In addition, provincial and territorial engagement in international student assessments such as the International Association for the Evaluation of Educational Achievement's (IEA) PIRLS and TIMSS, as well as the OECD PISA, reinforces the testing culture. The growth of a performative testing culture among Canadian sub-national entities has also created the need for 'bridging' the assessments in order to make sense of results across sub-national, national, and international scales (Brochu, 2007).

The emergence of a sub-national testing culture is closely associated with the transfer of ideas known as the new public management (NPM) that began to appear in the early 1980s. Driven by fiscal constraints and growing deficits, governments turned to NPM, which offered alternative approaches to managing, organizing, and structuring the public service (Barzelay, 2001; Lindquist, 1997). NPM practices included a focus on results, outputs, and performance, and the construction of accountability regimes to address government inefficiencies (Rizvi and Lingard, 2010: 119).

Table 1. Provincial/territorial assessments by subject and grade (2013).

Province/territory | Language/Literacy | Math/Numeracy | Sciences | Social Science | Academic subjects
Alberta | 3, 6, 9 | 3, 6, 9 | 6, 9 | 6, 9 | 11/12
British Columbia | 4, 7, 10 | 4, 7, 10 | 10 | – | 11/12
Manitoba | 3, 8 | 3, 7 | – | – | –
New Brunswick (Anglophone sector) | 2, 7, 9 | 5, 8 | – | – | –
New Brunswick (Francophone sector) | 2, 4, 5, 8 | 3, 5, 8 | 5, 8 | – | 11/12
Newfoundland & Labrador (English school system) | 1, 2, 3, 6, 9 | 3, 6, 9 | – | – | 11/12
Newfoundland & Labrador (French school system) | 6 | 3, 6, 9 | 9 | – | 11/12
Northwest Territories | 3, 6, 9 | 3, 6, 9 | 6, 9 | 6, 9 | 11/12
Nova Scotia | 3, 6, 8, 10 | 4, 6, 8, 10 | – | – | –
Nunavut | 3, 6, 9 | 3, 6, 9 | 6, 9 | 6, 9 | 11/12
Ontario | 3, 6, 10 | 3, 6, 9 | – | – | –
Prince Edward Island | 3, 6, 9 | 3, 6, 9 | – | – | –
Quebec (English school system) | 6 | 6, 10 | 10 | – | 11/12
Quebec (French school system) | 4, 6, 8 | 6, 10 | 10 | – | 11/12
Saskatchewan(a) | – | – | – | – | 11/12
Yukon | 4, 7, 10 | 4, 7, 10 | 10 | – | 11/12

Source: Klinger et al. (2008), CMEC (2013a), updates by the author.
(a) Saskatchewan's government does not have any province-wide standardized tests. The Government of Saskatchewan tried to introduce these in its 2013 Budget but faced strong opposition from parents, teachers, and other stakeholders (see, for example, Markewich, 2013; Spooner and Orlowski, 2013). At this stage, there are no plans to introduce province-wide testing (Stinson, 2014).


More specifically, with the recession of the 1990s and the jobless recovery, the Canadian federal government reduced spending on social programs and financial transfers to the provinces and territories, which affected sub-national budgets (Lindquist, 1997: 53). With tightening educational budgets, ministries of education adopted deficit reduction plans and accountability regimes that included reforms such as consolidating school boards, changing school curricula, and instituting performance-based measures and standards (Lindquist, 1997: 55; Wallner, 2014: 210).

In Ontario, large-scale student assessments were adopted in the early 1990s following a recommendation made by the Royal Commission on Learning, which spurred the creation of the Education Quality and Accountability Office (EQAO), an arm's-length agency that is accountable to Ontario's Minister of Education (EQAO, 2014; Volante, 2007). With a budget of US$32 million, EQAO develops and administers achievement tests to students in Grades 3, 6, and 9 in reading, writing, and mathematics with the aim of 'gaug[ing] the quality in Ontario's publicly funded education system' (EQAO, 2013). Similarly, British Columbia (BC) relies on standardized tests to evaluate how well students are achieving basic skills and to improve student achievement. The BC Foundation Skills Assessment has been administered annually since 2000 to students in Grades 4 and 7 in reading, writing, and numeracy (British Columbia Ministry of Education, n.d.). The last province to implement student testing was Prince Edward Island (PEI), following the recommendations made by the Task Force on Student Achievement to 'administer common assessments to Island students at grades 3, 6, 9, and for designated subjects at the senior high school level' (Kurial, 2005: 4). Think tanks that support market-oriented education policies reinforce the sub-national testing culture by publishing 'report cards'. For example, the Atlantic Institute for Market Studies produces the Report Card on Atlantic Canadian High Schools and the Fraser Institute publishes the School Report Card.

Provinces also participate in international assessments such as the TIMSS, PIRLS, and PISA. In the case of TIMSS and PIRLS, provinces fund their respective participation in these IEA assessments; BC, Alberta, Ontario, and Quebec have participated in TIMSS since its inception in 1995 and in PIRLS since 2001. However, in the case of the PISA, the federal government provides funding for provincial participation (Brochu et al., 2013), which reflects the federal interest in educational outcomes as a measure of human capital performance. Federal and provincial actors frame Canadian participation in the PISA in terms of measuring the value of public educational investments in the interest of Canadians (Knighton, Brochu and Gluszynski, 2010: 10). Involvement in international assessments lends itself to developing practices and an infrastructure for linking outcomes at the international level to outcomes at the national and sub-national levels, further reinforcing a testing and indicators culture.

The reporting of PISA results contributes to the creation and sustenance of a testing and indicators culture, as these results become part of the sub-national and national discourse. CMEC's storyline on educational indicators emphasizes the need for data in order to prepare Canadians for the challenges of an uncertain future (CMEC, 2012a). CMEC's (2012a) language also presents international educational data as 'essential' and as something in the interest of all Canadians. For example, Minister Jennex, the previous Chair of CMEC, explains that such internationally comparable data 'help us understand how our education systems are doing, so we can direct government efforts and taxpayer resources to where they are most needed' (CMEC, 2012a). Even though Canadian students do comparatively well on international assessments, justification for reforming educational systems stems from the rationale that Canadian schools might fall behind if they do not consistently seek improvements. Reporting on the results of PISA 2009, BC's Education Minister Margaret MacDiarmid noted that 'more jurisdictions are improving their results and it's a reminder that while we have a good education system, we need to make improvements to remain competitive on a global scale' (Government of British Columbia, 2010). PISA 2012 results showed that reading scores for Canadian students had generally not improved and, furthermore, that science scores had declined. The reporting of these results created a worrying discourse, with the current Chair of CMEC noting that 'we cannot be complacent in the face of a downward trend' (CMEC, 2013b). CMEC reacted to these results by noting that provinces and territories would 'work together to ensure that Canadian students not only continue to place near the top of PISA performance but improve on their results over time' (CMEC, 2013b). Based on this language, it appears of paramount importance to CMEC that sub-national entities continuously adopt benchmarking practices in order to rank high in the PISA international rankings.

The growing alignment of interests among federal and sub-national entities also contributes to a policy approach that values international benchmarking of student performance. Globalization has encouraged this alignment of interests, given federal and provincial concerns with workforce competitiveness. At both the national and sub-national levels, Canada has increasingly integrated into the regional and global economy through liberalizing trade agreements such as the North American Free Trade Agreement, the World Trade Organization regime (Skogstad, 2008) and, more recently, the Canada–European Union Comprehensive Economic and Trade Agreement. The opening up of the Canadian economy creates pressure for improving Canada's competitiveness and productivity, particularly with regard to its human capital formation. The Council of the Federation, a body representing the 13 Premiers of the provinces and territories, emphasizes its commitment to continue taking action to ensure that Canada is 'well positioned to protect and strengthen [its] economic advantage, which requires a strong and skilled labour force' (Council of the Federation, 2012). A pragmatic view of education in terms of its contribution to human capital and skill formation drives the alignment of interests across scales. The policy discourse links education with labor market policy and is informed by the idea that investments in human capital result in economic growth and prosperity (McBride, 2000). With educational policy indicators becoming surrogates for the economy's strength (Rizvi and Lingard, 2010: 123), it is not surprising that sub-national entities have rapidly integrated the PISA into Canada's decentralized educational structure.

I find that sub-national entities become consumed with international benchmarking when they have invested in pre-existing collaborative efforts to implement large-scale assessments and to produce educational indicators. These investments act as feedback loops, reinforcing the institutionalization of large-scale assessment practices. In the Canadian case, provinces and territories invested resources to develop educational indicators and to create a national assessment, first with the SAIP and subsequently with a PISA-modeled assessment, the PCAP. Canadian provinces and territories have participated in the OECD's Indicators of Education Systems (INES) program since 1988, which provides the data for the OECD's flagship publication, Education at a Glance (Statistics Canada and CMEC, 2014: 11). Through the Pan-Canadian Education Indicators Program (PCEIP), a joint venture between Statistics Canada and CMEC, sub-national entities have collaborated on and provided national educational indicators since 1996. In 2009, the provinces and territories decided to publish another PCEIP product, Education Indicators in Canada: An International Perspective, a report that compares Canadian educational indicators with those of other OECD countries. This report 'harmonizes a selection of indicators from the OECD's Education at a Glance' based on 'policy relevance and the availability of data' (Statistics Canada and CMEC, 2014).

In order to put assessments into place at the national level, provinces and territories worked together to achieve specific objectives that required technical collaboration. In 1993, they launched the SAIP, which aimed at measuring student achievement based on provincial and territorial curricular objectives. In effect, the development of large-scale assessments at the national level contributed to developing common goals among the provinces (Crocker, 2002: 26). Putting in place techniques and practices for national assessments such as the SAIP reaffirms the capacity of the provinces to work together and, at the same time, builds capacity sub-nationally for province-wide assessments (Wallner, 2014: 225). Having taken the step to collaborate nationally and achieve such common goals, sub-national entities can compare themselves internationally and benchmark their performance through similar collaborative efforts. The evidence suggests that by institutionalizing assessment and evaluation practices, sub-national entities created the conditions for institutional path-dependent behavior (Pierson, 2000).

This section focused on analyzing the ways in which a sub-national preoccupation with international benchmarking facilitated the rapid integration of the PISA into the Canadian educational system. I analyzed the practices and discourses of the Canadian testing and indicators culture for evaluating educational performance, the alignment of interests among provincial/territorial jurisdictions on the importance of measuring student learning outcomes in a competitive global economy, and the institutionalization of collaborative efforts and processes among sub-national entities. I turn now to the second theme that facilitated the modeling of a universal educational project at the sub-national level.

A shift toward competency-based assessments

My analysis finds that sub-national entities have shifted away from a curriculum-based assessment to a competency-based one that models the PISA. In initiating discussions on the SAIP in 1989, provinces chose achievement in school subjects as a worthwhile indicator of the performance of an education system, thereby opting for a curriculum-based national assessment (Slevinsky, 1996). As the 1994 Memorandum of Understanding among the CMEC Ministers of Education indicated, SAIP assessment instruments were to reflect the curriculum requirements and orientations of the participating provinces and territories. This was the first time that the provincial ministers of education agreed on the elements of a national assessment (Slevinsky, 1996). More significantly, in establishing a national assessment, CMEC demonstrated its commitment to pursuing an accountability agenda that 'redefine[d] the traditional pattern of provincial control over educational reform agendas' (Mawhinney, 1998: 102). However, the focus remained on measuring achievement based on provincial/territorial curricula, which was also consistent with the overall approach of international large-scale assessments such as the PIRLS and the TIMSS. In developing the SAIP assessment, the test creators were concerned with ensuring that it was consistent with provincial and territorial curricula (CMEC, 2000: 6).

With the provinces participating regularly in the PISA starting in 2000, I find that the PISA approach of assessing student competencies was given a higher degree of legitimacy than the approach of assessing student subject-matter knowledge. A paper produced by the Canadian Education Statistics Council explicitly notes that, due to PISA, there is a need to 'rethink the function of SAIP' and to move toward measuring 'more generic outcomes' or competencies:

The emergence of PISA as an ongoing assessment of core outcomes has led to the need to rethink the function of SAIP. The most useful new direction would be to use this project as a vehicle for developing assessments of more generic outcomes. (Crocker, 2002: 26)

Two years later, in 2004, CMEC administered the final round of the SAIP, and a new assessment, the PCAP, was launched in 2007 to assess only 13-year-old students based on a set of competencies rather than curricular knowledge.

Sub-national entities began to adopt discursive practices around assessing broader goals or 'generic' skills, which are closer to the PISA model. In constructing the PCAP 2007 reading assessment, there was a deliberate attempt to align the test items with several large-scale assessments, including the PISA. For example, the PCAP reading assessment was based on a set of competencies including comprehension, interpretation, and response and reflection as the major organizing aspects of reading literacy (CMEC, 2012c: 3). The PCAP 2007 report emphasized that adopting a definition of scientific literacy similar to that of the PISA enhances the possibility of finding some areas of comparability between PISA and PCAP assessments (CMEC, 2008: 32). PCAP discursive practices bolster the adoption of a competency-based assessment by noting the difficulty and complexity of reconciling school curricula across jurisdictions. The discourse underlines the need for measuring competencies by indicating that Canadian students learn many skills similar to the ones that PCAP assesses, thereby minimizing the importance of assessing curriculum knowledge (CMEC, 2008: 1).

Literacy assessment frameworks translate the competency-based assessment approach into concrete assessment items. These frameworks organize learning competencies into coherent conceptual maps and provide a rationale for assessing a particular subject domain using a set of skills. For example, the purpose of assessment frameworks such as the scientific literacy framework is to define the competencies required by students to be scientifically literate. The PCAP 2007 report states that there are three competencies, reflected in most science curriculum documents, that demonstrate scientific literacy: science inquiry, problem solving, and decision making (CMEC, 2008: 32). The three scientific literacy competencies that the PISA aims to assess include explaining phenomena scientifically, identifying scientific issues, and using scientific evidence (OECD, 2009: 138). Test creators designed similar types of literacy frameworks for reading and mathematics. For example, in mathematics, the three PCAP processes that are assessed include communication/representation, reasoning/connections, and problem solving (CMEC, 2011). For PISA, the three mathematics competency clusters measured are reproduction, connections, and reflection (OECD, 2009). Given that both the PCAP and PISA competencies are organized under broad literacy frameworks, alignment of test items across these two assessments is likely easier than alignment of test items under subject-matter frameworks. In other words, sub-national entities are integrating a universal model of education into their educational systems by standardizing learning outcomes that are based on literacy frameworks.

In summary, I find evidence that sub-national entities shifted away from a curriculum-based assessment to a competency-based one that is aligned with the PISA. Provinces and territories adopted a discourse that revolved around measuring learning outcomes of non-content knowledge and generic skills. They also chose to measure learning outcomes that were based on literacy frameworks rather than subject-matter frameworks. With these adjustments to testing student learning, sub-national entities facilitated the integration of a universal model of education into Canada's decentralized educational system. I now turn to the final theme that supports my argument.

Adoption of organizational systems and processes of assessments

I identify the adoption of organizational systems and processes of assessment as a key theme facilitating the integration of the PISA sub-nationally. I find that ministries of education and local school boards draw on assessment results as part of their institutional practices. As an example, the Ottawa Carleton District School Board makes use of international (PIRLS, TIMSS, and PISA), national (PCAP), and provincial (EQAO) data to inform its improvement plans and its yearly achievement reports. In this way, the reporting of international assessment results is integrated into local educational planning. Specific directorates or divisions within the educational bureaucracies are responsible for developing assessment and evaluation policies and for analyzing the results of large-scale assessments. Both ministries of education and local school boards are employing assessment specialists to crunch the numbers. Furthermore, colleges of education, such as the Ontario Institute for Studies in Education, are expanding their higher education specialization fields in large-scale assessment to meet the increased demand for trained educational psychometricians.

The replication of supranational assessment practices is also manifest in the adoption of more technical and less consultative approaches. In the first national assessment that they developed, sub-national entities opted to use a collaborative process to set levels for student achievement on the SAIP (CMEC, 1999, 2000, 2002, 2003, 2005). For example, for the 1999 SAIP Science assessment, a 'panel' of 93 participants was organized, representing a cross-section of interest groups from across Canada:

This anonymous panel consisted of teachers, students, parents, university academics and curriculum specialists, Aboriginal teacher trainers, business and industry leaders, community leaders, and members of national organizations with an interest in science education. (CMEC, 2000: 27)


Furthermore, SAIP framework development and criteria involved extensive reviews by ministries of education, teachers, and other specialists:

As the SAIP Science Assessment Framework and Criteria evolved, each ministry of education reviewed draft proposals in the context of its curriculum and according to its own consultation procedures. Classroom teachers and professional groups also reviewed the proposed assessment framework and criteria. Student evaluation and curriculum specialists from the universities, science experts, and representatives from nongovernmental organizations also reviewed the criteria. (CMEC, 2005: 15)

With the PCAP, test creators took a different approach to setting the levels of achievement that students need to attain. Instead of a panel approach, the PCAP uses what it terms 'outside validators' to determine the 'cut score'. The method used is the 'bookmark method', a technique for 'determining the relative difficulty of the full set of assessment instruments and delineating the point that defines the achievement of each level of success, thus determining the "cut score"' (CMEC, 2008: 6). In its PCAP reports, CMEC reveals neither who these 'outside validators' are nor which institutions they represent. Overall, the approach appears very technical and less transparent than the SAIP approach to developing assessments.
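The PCAP reports do not spell out the computation, but the bookmark method is well documented in the standard-setting literature: items are ordered from easiest to hardest in an 'ordered item booklet', a validator places a bookmark at the first item a borderline student would no longer master, and the cut score is derived from that item's difficulty at a chosen mastery probability. The sketch below illustrates this generic logic under a Rasch model; the item difficulties, bookmark placement, and response-probability criterion are illustrative assumptions, not CMEC's actual parameters.

```python
import math

# Illustrative (hypothetical) Rasch item difficulties, sorted to form an
# "ordered item booklet" from easiest to hardest.
item_difficulties = sorted([-1.2, -0.6, -0.1, 0.3, 0.8, 1.4, 2.0])

# Response probability criterion: a borderline student should answer the
# bookmarked item correctly with probability RP (RP67 is a common choice).
RP = 2 / 3

def cut_score(bookmark_index: int) -> float:
    """Ability (theta) at which the bookmarked item is mastered at RP.

    Under the Rasch model P(correct) = 1 / (1 + exp(-(theta - b))),
    solving P = RP gives theta = b + ln(RP / (1 - RP)).
    """
    b = item_difficulties[bookmark_index]
    return b + math.log(RP / (1 - RP))

# Example: a validator places the bookmark at the fifth item, meaning a
# borderline student masters the first four items but not the rest.
theta_cut = cut_score(4)
print(f"cut score on the theta scale: {theta_cut:.2f}")
```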

Finally, organizational practices in administering the PISA and the PCAP follow a similar pattern of assessment schedules. The PISA survey takes place triennially, as do the PCAP surveys. PISA tests 15-year-olds, whereas the PCAP tests 13-year-old Canadian students 2 years prior to their taking the PISA. The major domains follow a similar pattern, with the PCAP focusing on the domains that the PISA will test 2 years later. For example, when CMEC first administered the PCAP in 2007, it emphasized reading as its major domain; two years later, the PISA 2009 also focused on reading as its major domain. PCAP 2010 emphasized mathematics, while PISA 2012 had mathematics as its major domain. PCAP 2013 had science as its focus, while the PISA 2015 will also have science as its major domain. This means that the PCAP completed a whole cycle of reading–mathematics–science assessments between 2007 and 2013, whereas the PISA will have completed its second cycle between 2009 and 2015.
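The cycle alignment described above can be summarized programmatically. The sketch below simply encodes the triennial domain rotation and the two-year offset between the PCAP and the PISA that the text describes; any assessment years beyond those named in the text would be extrapolations of the pattern, not confirmed schedules.

```python
# Triennial major-domain rotation shared by PCAP and PISA, per the pattern
# described in the text: PCAP tests 13-year-olds in a major domain, and PISA
# tests the same cohort (then 15-year-olds) in that domain two years later.
DOMAINS = ["reading", "mathematics", "science"]
PCAP_START = 2007  # first PCAP administration (major domain: reading)

def pcap_domain(year: int) -> str:
    """Major domain of the PCAP administered in a given cycle year."""
    assert (year - PCAP_START) % 3 == 0, "PCAP runs every three years"
    return DOMAINS[((year - PCAP_START) // 3) % 3]

for pcap_year in (2007, 2010, 2013):
    pisa_year = pcap_year + 2  # same cohort reaches PISA age two years later
    print(f"PCAP {pcap_year} ({pcap_domain(pcap_year)}) -> "
          f"PISA {pisa_year} (same major domain)")
```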

In this final section, I analyzed how the adoption of specific organizational processes contributed to the adoption of a universal model of education. I noted that sub-national ministries of education and local school boards integrated the analysis of large-scale assessments into their institutional practices. In addition, CMEC adopted a technical approach to developing the PCAP rather than a consultative one. Finally, both the PCAP and the PISA are administered following a similar pattern of assessment schedules. In combination, these organizational practices facilitated the rapid integration of the PISA into the Canadian educational system.

Conclusion

This article traced the sub-national effect of the PISA in Canada's decentralized educational system. It identified three themes that facilitated the modeling of the PISA by sub-national entities: (1) a preoccupation with the international benchmarking of sub-national educational performance, (2) a shift away from a curriculum-based assessment to a competency-based one, and (3) the adoption of organizational systems and processes of assessment aligned with supranational assessment practices.

I suggest that domestic conditions in the Canadian federation were conducive to the rapid integration of the PISA sub-nationally, including the creation of a national assessment modeled after the PISA, despite the decentralized structure of the Canadian elementary and secondary education system and the de jure policy autonomy enjoyed by the provinces and territories over the field of education. I find that the integration of the PISA contributes to the internationalization of domestic educational policy in the Canadian federation. Under the SAIP, provinces and territories created a common set of curriculum categories in administering the assessment, while retaining a locally informed approach to the test. In the case of the PCAP, the thrust was to align the national assessment with a supranational one.

With the adoption of the PCAP, not only was there rapid integration of transnational ideas in terms of competencies and measurement techniques, but there was also an effort to adopt the learning outcomes of the PISA. Yet Canada's sub-national systems of education are not uniform but diverse. This diversity allows for experimentation to occur within each system and for the adoption of promising practices. Since provinces and territories have jurisdiction over education, they are able to create public school systems that reflect their specific demographic, cultural, and regional character (Henley and Young, 2008). As sub-national entities increasingly move toward adopting the universalizing educational principles and concepts espoused in assessments such as the PISA, they risk losing this capacity to experiment. They also risk becoming less attentive to factors that are not measured by the PISA but that play a significant role in student success and well-being (e.g. see Berliner, 2009; Dronkers and De Heus, 2013; Ma, Jong, and Yuan, 2013; Meyer and Schiller, 2013; Zhao and Meyer, 2013). Provinces and territories further risk not taking into account locally informed knowledge about education. A universalizing educational project such as the PISA has the tendency to standardize knowledge and narrow the vision of what is possible in education by limiting the diversity of views and ideas for educational reform.

It is important to consider that staff members representing their countries in IOs are educated domestically and are usually attuned to national developments and debates. More specifically, scholars need to consider that IOs are not operating in a vacuum, isolated from the debates taking place in member states. For example, in Canada, teachers' unions have contested educational reforms and proposed alternative educational visions and objectives to those proposed by ministries of education. Grassroots initiatives such as the Great Schools Project (Chudnovsky, 2013) or People for Education mobilize community members to find alternative ways to improve schools. In Saskatchewan, parents, teachers, and academics mobilized in order to oppose the institutionalization of province-wide standardized testing (Spooner and Orlowski, 2013). Future research could explore in greater depth how domestic debates influence the discursive practices of IOs such as the OECD.

As Mahon and McBride (2009) note, OECD policy advice tends to be most extensive among member states that have aligned with the OECD. Canada and its jurisdictions have generally been supportive of the OECD educational agenda and its educational policy prescriptions and programs. In the Canadian context, the PCAP is a manifestation of how OECD education policy advice is transferred across scales to the sub-national level through an assessment that models the PISA. It is important to analyze and reveal these forms of educational borrowing, as they determine the path taken in future educational reforms and policies. Canada's sub-national jurisdictions need to be cautious about undertaking educational reforms borrowed from transnational actors such as the OECD while neglecting to adopt a locally informed lens – a lens that considers the impact of such reforms from diverse perspectives, including those of students, teachers, and communities.

Acknowledgements

The author is grateful for the valuable comments provided by the reviewers of the journal.

References

Ball SJ (2008) The Education Debate. Bristol: The Policy Press.Barzalay M (2001) The New Public Management: Improving Research and Policy Dialogue.

Berkeley, CA: University of California Press.Beech J (2009) Who is strolling through the global garden? International agencies and educa-

tional transfer. In: Cowen R and Kazamia A (eds) International Handbook of Comparative Education, vol. 22. London and New York: Springer, pp. 341–357.

Berliner DC (2009) Poverty and Potential: Out-of-School Factors and School Success. Boulder, CO: Education and the Public Interest Center, University of Colorado; Tempe, AZ: Education Policy Research Unit, Arizona State University.

Bieber T (2014) Cooperation or conflict? Education politics in Switzerland after the PISA study and the Bologna process. In: Martens K, Knodel P and Windzio M (eds) Internationalization of Education Policy: A New Constellation of Statehood in Education? Basingstoke: Palgrave Macmillan, pp. 179–201.

Bieber T, Martens K and Niemann D (2014) Soft governance through PISA benchmarking – German reforms in secondary education. In: Lawn M and Normand R (eds) Shaping of European Education: Interdisciplinary Approaches. London: Routledge, pp. 50–65.

Bogdan RC and Biklen SK (1992) Qualitative Research for Education. Boston, MA: Allyn & Bacon.

Breakspear S (2012) The policy impact of PISA: An exploration of normative effects of international benchmarking in school system performance. OECD Education Working Papers, no. 71, 22 February. Paris: OECD Publishing.

British Columbia Ministry of Education (n.d.) Foundation Skills Assessment (FSA). Available at: http://www.bced.gov.bc.ca/assessment/fsa (accessed 31 October 2014).

Brochu P (2007) Bridging IEA studies and policy making. Presented at the 48th IEA general assembly meeting, Hong Kong, 8–11 October.

Brochu P, Deussing MA, Chuy M, et al. (2013) Measuring Up: Canadian Results of the OECD PISA Study. The Performance of Canada’s Youth in Mathematics, Reading and Science. 2012 First Results for Canadians Aged 15. Toronto, ON, Canada: CMEC.

Carroll P and Kellow A (2011) The OECD: A Study of Organisational Adaptation. Cheltenham: Edward Elgar Publishing.

Chudnovsky D (2013) The great schools project. Our Schools/Our Selves 2013. Available at: http://www.policyalternatives.ca/sites/default/files/uploads/publications/National%20Office/2010/11/The%20Great%20Schools%20Project%20-%20David%20Chudnovsky.pdf


Council of Ministers of Education, Canada (CMEC) (1999) SAIP 1998: Report on Reading and Writing Assessment. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2000) SAIP 1999: Report on Science Assessment. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2002) Report on Mathematics Assessment III. School Achievement Indicators Program 2001. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2003) SAIP 2002: Report on Writing Assessment III. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2005) Report on Science III Assessment. School Achievement Indicators Program 2004. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2008) PCAP-13 2007: Report on the Assessment of 13-Year-Olds in Reading, Mathematics and Science. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2009) PCAP-13 2007: Contextual Report on Student Achievement in Reading. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2011) PCAP-2010: Report on the Pan-Canadian Assessment of Mathematics, Science, and Reading. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2012a) Canadian education systems make the grade in new OECD report. Available at: http://cmec.ca/278/Press-Releases/Canadian-Education-Systems-Make-the-Grade-in-New-OECD-Report.html?id_article=570 (accessed 31 October 2014).

Council of Ministers of Education, Canada (CMEC) (2012b) PCAP-13 2010: Contextual Report on Student Achievement in Mathematics. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2012c) PCAP-2007: Report on Reading Strategies and Reading Achievement. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2013a) Provincial and Territorial Assessments. Toronto, ON, Canada: CMEC.

Council of Ministers of Education, Canada (CMEC) (2013b) New OECD report shows high levels of achievement by Canadian students. 3 December. Available at: http://cmec.ca/278/Press-Releases/New-OECD-Report-Shows-High-Levels-of-Achievement-by-Canadian-Students.html?id_article=778 (accessed 31 October 2014).

Council of Ministers of Education, Canada (CMEC) (n.d.-a) About. What is CMEC? Available at: http://www.cmec.ca/11/About/index.html (accessed 31 October 2014).

Council of Ministers of Education, Canada (CMEC) (n.d.-b) Assessment. Available at: http://www.cmec.ca/131/Programs-and-Initiatives/Assessment/Overview/index.html (accessed 31 October 2014).

Council of Ministers of Education, Canada (CMEC) (n.d.-c) Pan-Canadian Assessment Program (PCAP). Available at: http://www.cmec.ca/240/ (accessed 31 October 2014).

Council of the Federation (2012) Premiers collaborate to secure Canada’s economic success. Available at: http://www.conseildelafederation.ca/en/latest-news/15-2012/127-premiers-collaborate-to-secure-canada-s-economic-success (accessed 31 October 2014).

Crocker RK (2002) Learning Outcomes: A Critical Review of the State of the Field in Canada. Toronto, ON, Canada: Canadian Education Statistics Council. Available at: http://cesc.ca/pceradocs/2003/LearningOutcomes_StateoftheField_RCrocker2002_e.pdf (accessed 31 October 2014).

Crundwell R (2005) Alternative strategies for large scale student assessment in Canada: Is value-added assessment one possible answer? Canadian Journal of Educational Administration and Policy 41: 1–21.

Davis K, Fisher A, Kingsbury B, et al. (2012) Governance by Indicators: Global Power through Quantification and Rankings. Oxford: Oxford University Press.


Deacon B (2007) Global Social Policy and Governance. London: Sage.

Dronkers J and De Heus M (2013) Immigrant children’s academic performance: The influence of origin, destination and community. In: Meyer HD and Benavot A (eds) PISA, Power, and Policy: The Emergence of Global Educational Governance. Oxford: Symposium Books, pp. 247–265.

EQAO (2013) About EQAO. Available at: http://www.eqao.com/AboutEQAO/AboutEQAO.aspx?Lang=E (accessed 31 October 2014).

EQAO (2014) Governance framework. Available at: http://www.eqao.com/AboutEQAO/GovernanceFramework.aspx?Lang=E (accessed 31 October 2014).

Ertl H (2006) Educational standards and the changing discourse on education: The reception and consequences of the PISA study in Germany. Oxford Review of Education 32(5): 619–634.

Federal Ministry of Education and Research (2012) Cooperation between the Federal Government and the Länder. Available at: http://www.bmbf.de/en/1263.php (accessed 31 October 2014).

Gorur R and Wu M (2014) Leaning too far? PISA, policy and Australia’s ‘top five’ ambitions. Discourse: Studies in the Cultural Politics of Education. Epub ahead of print 30 June. DOI:10.1080/01596306.2014.930020.

Government of British Columbia (2010) B.C. Students continue to do well in global assess-ments. 7 December. Available at: http://www2.news.gov.bc.ca/news_releases_2009-2013/2010EDUC0135-001540.htm (accessed 31 October 2014).

Grek S (2009) Governing by numbers: The PISA ‘effect’ in Europe. Journal of Education Policy 24(1): 23–37.

Gur BS, Celik Z and Ozoglu M (2012) Policy options for Turkey: A critique of the interpretation and utilization of PISA results in Turkey. Journal of Education Policy 27(1): 1–21.

Gurria A (2011) Canada and the OECD: 50 years of converging interests. In: Public Policy Forum conference (Remarks), Ottawa, ON, Canada, 2 June.

Hartong S (2012) Overcoming resistance to change: PISA, school reform in Germany and the example of Lower Saxony. Journal of Education Policy 27(6): 747–760.

Henley D and Young J (2008) School boards and education finance in Manitoba: The politics of equity, access and local autonomy. Canadian Journal of Educational Administration and Policy 72: 1–26.

Henry M, Lingard B, Rizvi F, et al. (2001) The OECD, Globalization and Education Policy. Oxford: Pergamon Press.

Juillet L (2007) Framing environmental policy: Aboriginal rights and the conservation of migratory birds. In: Orsini M and Smith M (eds) Critical Policy Studies: Contemporary Canadian Approaches. Vancouver, BC, Canada and Toronto, ON, Canada: UBC Press, pp. 257–275.

Kallo J (2009) OECD education policy. A comparative and historical study focusing on the thematic reviews of tertiary education. PhD Thesis, Research in Educational Sciences, no. 45, Finnish Educational Research Association, Helsinki.

Klinger D, DeLuca C and Miller T (2008) The evolving culture of large-scale assessments in Canada. Canadian Journal of Educational Administration and Policy 76: 1–34.

Knighton T, Brochu P and Gluszynski T (2010) Measuring Up: Canadian Results of the OECD PISA Study. The Performance of Canada’s Youth in Reading, Mathematics and Science. Ottawa, ON, Canada: Statistics Canada.

Kurial R (2005) Excellence in education: A challenge for Prince Edward Island. Final report of the task force on student achievement. December. Prince Edward Island Task Force on Student Achievement. Available at: http://www.gov.pe.ca/photos/original/task_force_edu.pdf (accessed 9 February 2015).

Levin B and Young J (1998) Understanding Canadian Schools: An Introduction to Educational Administration. Toronto, ON, Canada: Harcourt Brace Canada.


Lindquist E (1997) The bewildering pace of public sector reform in Canada. In: Lane JE (ed.) Public Sector Reform: Rationale, Trends and Problems. London: Sage, pp. 47–63.

Lingard B (2010) Policy borrowing, policy learning: Testing times in Australian schooling. Critical Studies in Education 51(2): 129–147.

Mahon R and McBride S (eds) (2008) The OECD and Transnational Governance. Vancouver, BC, Canada: UBC Press.

Mahon R and McBride S (2009) Standardizing and disseminating knowledge: The role of the OECD in global governance. European Political Science Review 1(1): 83–101.

McBride S (2000) Policy from what? Neoliberal and human-capital theoretical foundations of recent Canadian labour-market policy. In: Burke M, Mooers C and Shields J (eds) Restructuring and Resistance: Canadian Public Policy in the Age of Global Capitalism. Black Point, NS, Canada: Fernwood Publishing, pp. 159–177.

Markewich C (2013) Standardized testing won’t bring anything new to SK teachers. CKOM, 13 February. Available at: www.newstalk650.com/story/standardized-testing-wont-bring-anything-new-sk-teachers/96473 (accessed 31 October 2014).

Martens K, Knodel P, Windzio M, et al. (eds) (2014) Internationalization of Education Policy: A New Constellation of Statehood in Education? Basingstoke: Palgrave Macmillan.

Martino W and Rezai-Rashti G (2013) ‘Gap talk’ and the global rescaling of educational account-ability in Canada. Journal of Education Policy 28(5): 589–611.

Mawhinney H (1998) Patterns of social control in assessment practices in Canadian frameworks for accountability in education. Educational Policy 12(1–2): 98–109.

Ma X, Jong C and Yuan J (2013) Exploring reasons for the East Asian success in PISA. In: Meyer HD and Benavot A (eds) PISA, Power, and Policy: The Emergence of Global Educational Governance. Oxford: Symposium Books, pp. 225–245.

Meyer HD and Benavot A (eds) (2013) PISA, Power, and Policy: The Emergence of Global Educational Governance. Oxford: Symposium Books.

Meyer HD and Schiller K (2013) Gauging the role of non-educational effects in large-scale assessments: Socio-economics, culture and PISA outcomes. In: Meyer HD and Benavot A (eds) PISA, Power, and Policy: The Emergence of Global Educational Governance. Oxford: Symposium Books, pp. 207–224.

Miller P (2001) Governing by numbers: Why calculative practices matter. Social Research 68(2): 379–396.

Morgan C (2009) The OECD Programme for International Student Assessment: Unraveling a Knowledge Network. Saarbrücken: VDM Verlag Dr. Müller.

Morgan C and Shahjahan R (2014) The legitimation of OECD’s global educational governance: Examining PISA and AHELO test production. Comparative Education 50(2): 192–205.

National Center for Education Statistics (2011) The NAEP Primer. Washington, DC: US Department of Education, National Center for Education Statistics. Available at: http://nces.ed.gov/nationsreportcard/about/newnaephistory.aspx#beginning (accessed 31 October 2014).

Orenstein MA (2008) Privatizing Pensions: The Transnational Campaign for Social Security Reform. Princeton, NJ: Princeton University Press.

Organisation for Economic Co-operation and Development (OECD) (2004) What Makes School Systems Perform? Seeing School Systems through the Prism of PISA. Paris: OECD.

Organisation for Economic Co-operation and Development (OECD) (2009) PISA 2009 Assessment Framework: Key Competencies in Reading, Mathematics and Science. Paris: OECD.

Organisation for Economic Co-operation and Development (OECD) (2012) Directory of Bodies of the OECD 2012. Paris: OECD.

Organisation for Economic Co-operation and Development (OECD) (n.d.) Who drives the OECD? Available at: http://www.oecd.org/about/whodoeswhat/ (accessed 31 October 2014).


Ozga J (2009) Governing education through data in England: From regulation to self-evaluation. Journal of Education Policy 24(2): 149–162.

Pierson P (2000) Increasing returns, path dependence, and the study of politics. American Political Science Review 94(2): 251–267.

Porter T and Webb M (2004) The role of the OECD in the orchestration of global knowledge networks. Paper prepared for the International Studies Association annual meeting, 17 March, Montreal, QC, Canada.

Rautalin M and Alasuutari P (2009) The uses of the national PISA results by Finnish officials in central government. Journal of Education Policy 24(5): 539–556.

Rizvi F and Lingard B (2010) Globalizing Education Policy. London and New York: Routledge.

Rubenson K (2008) OECD education policies and world hegemony. In: Mahon R and McBride S (eds) The OECD and Transnational Governance. Vancouver, BC, Canada: UBC Press, pp. 242–259.

Rutkowski D (2014) The OECD and the local: PISA-based Test for Schools in the USA. Discourse: Studies in the Cultural Politics of Education. Epub ahead of print 8 August. DOI:10.1080/01596306.2014.943157.

Sacks P (1999) Standardized Minds: The High Price of America’s Testing Culture and What We Can Do to Change It. Cambridge, MA: Perseus Publishing.

Schweisfurth M (2013) Editorial. The comparative gaze: From telescope to microscope. Comparative Education 49(2): 121–123.

Sellar S and Lingard B (2013) The OECD and the expansion of PISA: New global modes of governance in education. British Educational Research Journal. Epub ahead of print 6 August. DOI:10.1002/berj.3120.

Simola H (2005) The Finnish miracle of PISA: Historical and sociological remarks on teaching and teacher education. Comparative Education 41(4): 455–470.

Skogstad GD (2008) Internationalization and Canadian Agriculture: Policy and Governing Paradigms. Toronto, ON, Canada: University of Toronto Press.

Skogstad GD (ed.) (2011) Policy Paradigms, Transnationalism, and Domestic Politics. Toronto, ON, Canada: University of Toronto Press.

Slevinsky K (1996) The school achievement indicators program 1988–1996. MA Thesis, University of Alberta, Edmonton, AB, Canada.

Spooner M and Orlowski P (2013) Standardized testing (almost) comes to Saskatchewan. Our Schools/Our Selves 2013. Available at: http://www.policyalternatives.ca/sites/default/files/uploads/publications/National%20Office/2013/11/osos113_StandardizedTestingComesToSK.pdf

Statistics Canada and CMEC (2014) Education Indicators in Canada: An International Perspective 2013. Ottawa, ON, Canada: Canadian Education Statistics Council.

Stinson W (2014) Government of Saskatchewan eliminates plan for standardized testing. Global News, 11 April. Available at: http://globalnews.ca/news/1266875/government-eliminates-standardized-testing/ (accessed 31 October 2014).

Takayama K (2008) The politics of international league tables: PISA in Japan’s achievement crisis debate. Comparative Education 44(4): 387–407.

US Department of Education (n.d.) About us. Available at: http://www2.ed.gov/about/landing.jhtml?src=ft (accessed 31 October 2014).

Volante L (2007) Educational quality and accountability in Ontario: Past, present, and future. Canadian Journal of Educational Administration and Policy 58: 1–21.

Wallner J (2014) Learning to School: Federalism and Public Schooling in Canada. Toronto, ON, Canada: University of Toronto Press.


White W (2011) 50 years of productive partnership. OECD Observer, No. 284. Available at: http://www.oecdobserver.org/news/fullstory.php/aid/3523/50_years_of_productive_partnership.html

Woodward R (2009) The Organisation for Economic Co-operation and Development (OECD). London and New York: Routledge.

Zhao Y and Meyer HD (2013) High on PISA, low on entrepreneurship? What PISA does not measure. In: Meyer HD and Benavot A (eds) PISA, Power, and Policy: The Emergence of Global Educational Governance. Oxford: Symposium Books, pp. 267–278.

Author biography

Clara Morgan is Assistant Professor in Political Science at UAE University. Her research focuses on the global governance of education and its regional and local interactions. She has recently published in Comparative Education and Discourse: Studies in the Cultural Politics of Education. Clara is currently pursuing research on education policy in the Arab region. Please address correspondence to: Clara Morgan, UAE University, Department of Political Science, P.O. Box 15551, Al-Ain, UAE. [email: [email protected]]
