portal: Libraries and the Academy, Vol. 17, No. 3 (2017), pp. 569–593. Copyright © 2017 by Johns Hopkins University Press, Baltimore, MD 21218.
A Diagnosis of the Levels of Information Literacy Competency among Social Sciences Undergraduates

María Pinto and Rosaura Fernández-Pascual
abstract: Restricted to five Spanish public universities, this paper examines knowledge about information literacy competencies—that is, the objective dimension—among a population of social sciences students, as well as two subjective dimensions: students’ belief in the importance of information literacy, hereafter called belief-in-importance, and their perceptions of self-efficacy, their confidence in their ability to succeed. Common characteristics and substantial differences among students in different degree programs are also investigated. Understanding the factors underlying the one objective and two subjective dimensions and their mutual relationships was a major objective. Finally, common competencies in the three dimensions are provided.
Introduction
The interest of international organizations such as the Organisation for Economic Co-operation and Development (OECD) and the United Nations Educational, Scientific and Cultural Organization (UNESCO) in research about information
literacy (IL) and its impact on learning processes is continuously growing. At the European level, a recommendation by the European Parliament concluded with the preparation of a document on key competencies for lifelong learning, conceived as “those which all individuals need for personal fulfilment and development, active citizenship, social inclusion and employment” and defined as “a combination of knowledge, skills and attitudes appropriate to the context.” Among this set of key competencies, digital competence “involves the confident and critical use of Information Society Technology (IST) for work, leisure and communication.” It requires “a sound understanding and knowledge of the nature, role and opportunities of IST in everyday contexts” and includes “the ability to search, collect and process information and use it in a critical and systematic way, assessing relevance and distinguishing the real from the virtual.”1 Along these same lines, the Joint Research Centre, the scientific arm of the European Commission, proposed a framework for developing and understanding digital competence in Europe, involving 21 competencies distributed among five specific areas: (1) information, (2) communication, (3) content creation, (4) safety, and (5) problem-solving.2
In Spain, official university programs have introduced the concept of informational and digital competency.3 In parallel, various organizations have developed specific
documents for the integration of information skills within the university curriculum.4
Regardless of the type of competency—informational or digital—the key is its assessment. Assessment is a broad concept that basically should document the most varied circumstances affecting competency improvement, such as working sessions, library instruction, literacy programs, and, logically, student learning outcomes.5 Numerous models and evaluation tools addressing the various aspects of information competency have been developed, as will be discussed later. Responsibility for IL education should lie with “both librarians and teaching faculty members, working as partners to integrate information literacy into their institution’s offerings.”6
Research studies are scarce regarding the assessment of information competencies in two specific circumstances. One is the joint assessment of subjective and objective competency values. Another is the evaluation by fields of study or by learning communities, groups of students who share common academic goals and attitudes. The joint assessment of subjective and objective competencies allows for a comparison of psychological values—for example, students’ motivation and their self-efficacy, or their belief in their ability to master those competencies—with real values related to their actual levels of knowledge and skills. Such assessment can measure the potential impact of instruction on students’ subjective values as a means of improving their learning outcomes in IL competencies. Taking into account that information literacy is discipline-specific, evaluation by disciplinary field also affords a more in-depth understanding of the peculiarities of any learning community with regard to IL. This study focuses on a discipline not well represented in the literature—the social sciences. This study is included in a research and development project sponsored by the Spanish Ministry of Education.
Focusing on the particular learning community formed by social sciences students in the Spanish public universities, the main research questions are:
1. What are the actual levels of knowledge of IL competencies within a sample of social sciences students? Are the levels of knowledge related to their belief in the importance of information literacy and their perceptions of self-efficacy?
2. What are the main common characteristics and substantial differences in actual levels of knowledge of IL competencies among social sciences students, when comparing the various degree programs?
3. Which are the most relevant underlying factors with regard to social sciences students’ knowledge of IL competencies?
4. What relationships exist among the categories of social sciences students’ knowledge of IL competencies? Are these relationships similar to those previously uncovered with regard to social sciences students’ belief-in-importance and self-efficacy for IL competencies?
5. What latent factors—factors that cannot be directly observed but might underlie and explain the students’ perceptions—are common in the three dimensions of belief-in-importance, self-efficacy, and actual knowledge?
Literature Review
A number of bibliometric studies have shown an increase in IL publications over the years, mostly in the United States and United Kingdom.7 Hannelore Rader’s work on the evolution of IL publications, tracing the growing emphasis on assessment of students’ learning outcomes, deserves special attention for its wide scope.8 María Pinto, María Escalona, and Antonio Pulgarin piloted a bibliometric analysis of the scientific literature on IL from 1974 to 2011, including an analysis of research production in the social sciences and health sciences.9 Along this same line, Pinto conducted a study of scientific production in the field of IL assessment from 2000 to 2011, recognizing the most frequent keywords and associations, thus identifying major research trends.10 Other authors documented an increase in the implementation of evaluation models and measuring instruments in higher education.11
Outstanding among the more successful IL assessment tools is the standardized test with interviews and closed questions.12 Melissa Gross and Don Latham compared the estimates obtained in interviews with the analysis of the results of the Information Literacy Test (ILT) developed by James Madison University in Harrisonburg, Virginia, to identify differences between the perceptions of students and their objective capabilities.13 They concluded that students focused on results rather than processes, that motivation was critical to the success of informational tasks, and that students relied on personal sources for information seeking.
A number of standardized tests have become broadly familiar: Project SAILS (Standardized Assessment of Information Literacy Skills),14 the Information Literacy Test (ILT),15 the iSkills Assessment of the Educational Testing Service (ETS),16 ISS (Information Skills Survey),17 and INFOLITRANS (Information Literacy for Translators).18 See Table 1 for a list of the most frequent standardized IL tests.
Regarding the combined application of IL assessment tools, María Pinto and Rosaura Fernández-Pascual conducted a pilot study using the Information Literacy Humanities and Social Sciences (IL-HUMASS) attitude scale and the EVALCI-KN knowledge test, its name a contraction of EVALuación de Competencias Informacionales (Evaluation of Information Competencies). Both tests were designed to collect data in four categories of IL competencies or skills: (1) search, (2) evaluation, (3) processing, and (4) communication-dissemination.19 A structural equation model, a technique for building models based on a hypothesis and testing them, is applied in each of the four dimensions under consideration—belief-in-importance, self-efficacy, knowledge, and skills—to provide an estimation of the existing correlations between the four categories of IL competencies. Results revealed a strong correlation between pairs of categories (search-evaluation and evaluation-communication) concerning the belief-in-importance and self-efficacy scales. For the knowledge scale, the strongest correlations were found among the categories of evaluation, processing, and communication. María Pinto, Rosaura Fernández-Pascual, and Susana Puertas generalized from the earlier analysis, assuming that there is a causal structure between latent variables and providing an outline of the possible influences among competencies. As a consequence, a foundation for subsequent intervention in the teaching-learning processes could be established.20

Table 1. Most frequent standardized tests of information literacy

SAILS (Standardized Assessment of Information Literacy Skills), https://www.projectsails.org/
  Institution: Kent State University, Kent, Ohio
  Standards: Association of College and Research Libraries (ACRL), 2000
  Competencies: need for information; accesses needed information effectively and efficiently; evaluation of information; use information ethically and legally
  Tool: computerized multiple-choice test
  Target population: higher education

ILT (Information Literacy Test), www.madisonassessment.com/assessment-testing/information-literacy-test
  Institution: Center for Assessment & Research Studies, James Madison University, Harrisonburg, Virginia
  Standards: ACRL, 2000
  Competencies: need for information; accesses needed information effectively and efficiently; evaluation of information; use information ethically and legally
  Tool: computerized multiple-choice test
  Target population: higher education

ETS (Educational Testing Service) iSkills Assessment, https://www.ets.org/iskills/about
  Institution: California State University, Long Beach
  Standards: ACRL, 2000
  Competencies: need for information; accesses needed information effectively and efficiently; evaluation of information; use information to accomplish a purpose; use information ethically and legally
  Tool: online test of simulation-based tasks
  Target population: high school and higher education

ISS (Information Skills Survey)
  Institution: Council of Australian University Librarians (CAUL)
  Standards: Australian and New Zealand Institute for Information Literacy (ANZIIL), 2004
  Competencies: accesses needed information effectively and efficiently; evaluation of information; manages information collected or generated; use new information to construct new concepts or create new understandings; use information ethically and legally
  Tool: simulated computer-based test of performance of IL skills
  Target population: higher education

INFOLITRANS (Information Literacy for Translators)
  Institution: University of Granada and University Jaume I of Castellón, Spain
  Competencies: searching information; evaluation; processing; communication of information
  Tool: computerized multiple-choice test
  Target population: higher education
The choice of an instrument for IL assessment, as well as its combination with others, depends on the type of assessment to be made, the available time, and the cost of implementation. A further step in the evolution of IL assessment consists of authentic assessment, in which students are asked to perform real-world tasks that demonstrate meaningful application of essential knowledge and skills. It is a new method that emerged within the context of phenomenography. According to Rosanne Cordell and Linda Fisher, “to assess the state of students’ knowledge of research processes independent of a particular assignment, instruction session, or course would be to attempt an authentic assessment of research behavior.”21 For Judith Gulikers, Theo Bastiaens, and Paul Kirschner, “Assessment involves interesting real-life or authentic tasks and contexts as well as multiple assessment moments and methods to reach a profile score for determining student learning or development.”22 These methods can be used to create portfolios of completed tasks.23 Also, tools called evaluation matrices may be used to objectively evaluate degrees of information competency against a number of criteria that are prioritized before the evaluation, with greater weighting given to the skills of most importance. In this regard, Caroline Timmers and Cees Glas developed a reliable and valid measure to study the information-seeking behavior of Dutch undergraduates, with four underlying scales: (1) search strategies, (2) evaluation information, (3) referring to information, and (4) regulation activities.24
Later phenomenography research explored rubrics, scoring tools that provided clear descriptions of the performance expectations for varying levels of mastery. Jos van Helvoort started a research project based on the idea that “analytical scoring rubrics are good tools for the grading of the students’ information behavior during their study tasks.” He added, “Such an assessment instrument can be a good starting point for the design of information literacy training programmes or for integrating information literacy in discipline based curricula.”25 Nikolas Leichner, Johannes Peter, Anne-Kathrin Mayer, and Günter Krampen aimed “to present information search tasks and accompanying
scoring rubrics that can be used to assess IL in undergraduate psychology students in an ecologically valid fashion as an alternative to the use of standardized tests.”26 The issue of assessment of IL competencies in higher education occupies a central position in recent research, and it remains an open matter that still has a long way to go.
Methodology
EVALCI-KN Knowledge Test Questionnaire
The design of the EVALCI-KN knowledge test questionnaire is based on an extensive body of literature in the field of IL assessment. Conceived in online format, it has two parts. One gathers data about the respondents, and the other comprises 26 questions on students’ actual levels of knowledge about IL competencies, organized into four categories:
1. Searching, which measures the student’s knowledge about using different tools of information search—printed and electronic sources, Internet, catalogs and secondary sources, terminology, and searching strategies;
2. Evaluation, to find out if students are able to determine the quality of resources—recognizing main ideas, looking for currency or updating, and knowing the most relevant authors and institutions in the subject;
3. Processing, to assess the actual capability to organize and manage data—recognizing text structure, schematizing and abstracting, using database and reference managers, and installing and handling statistical programs and spreadsheets; and
4. Communication-diffusion, evaluating a set of issues related to the presentation of new knowledge—communicating in public or in other languages, writing documents and academic presentations, knowing the ethical code of your field and the laws governing the use of information and intellectual property, and disseminating information on the Internet.
This objective test measures undergraduates’ actual levels of knowledge of a series of information competencies. Originally, the questionnaire consisted of 78 items, with each of the 26 competencies addressed by three questions.
Since the structure of the questionnaire had already been validated in previous studies, only the contents of the items and responses were subjected to a validation process.27 Eleven university specialists in information documentation and education were invited to participate, seven of whom agreed to do so. The validation criteria included: (1) rating the questions’ relevance, with regard to comprehension and clarity of expression of the items, using a nine-point scale; and (2) assessment of the level of adequacy of the responses by the criterion of proximity to the correct answer.
The questionnaire obtained good to excellent scores and so was deemed relevant to measure the proposed indicators. Experts’ observations on the lack of conceptual clarity of some items were taken into consideration. Similarly, comments related to the multiple response options of some items were considered. A beta version of the test was developed and piloted within a group of students for a better appraisal of performance.28 Because
students took a long time to complete the test, it was reduced to just one question for each competency instead of three to leave time to answer queries of a more subjective nature. Once again, the advice of the expert panel was considered, individually weighting each of the 78 items and pointing out which best represented the knowledge to be measured in each competency. The resulting questionnaire contained 34 items. To reduce the number of items to 26 (one per competency), difficulty and discrimination indexes were needed.
Difficulty indices represent the ratio between the total number of students who correctly answered the item and the total number of students who answered. The more difficult the item, the lower its rating.29 Discrimination indices indicate the ability of each item to distinguish between students with a high or low score. Values above 0.30 indicate good quality, and greater than 0.39, excellent quality.30 Items entailing greater difficulty and discrimination capacities were selected. We then proceeded to the final design of the instrument in an online format, hosted on the server infocompetencias.org for its dissemination.
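As a rough illustration, both indices can be computed from a 0/1 response matrix. The sketch below is generic; in particular, the 27 percent upper/lower-group convention used for the discrimination index is an assumption, since the paper does not state which variant was applied:

```python
import numpy as np

def item_difficulty(responses):
    """Proportion of students answering each item correctly.

    responses: 2-D array (students x items) of 0/1 scores.
    Lower values mean harder items, matching the convention in the text.
    """
    return responses.mean(axis=0)

def item_discrimination(responses, top_frac=0.27):
    """Classical upper/lower-group discrimination index.

    Compares the proportion correct between the highest- and lowest-scoring
    groups (27 percent tails are a common convention, assumed here).
    Values above 0.30 indicate good quality, above 0.39 excellent.
    """
    totals = responses.sum(axis=1)
    order = np.argsort(totals)                  # students, worst to best
    n = max(1, int(len(totals) * top_frac))
    low, high = responses[order[:n]], responses[order[-n:]]
    return high.mean(axis=0) - low.mean(axis=0)

# Tiny toy data: 6 students x 3 items
resp = np.array([
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 0],
])
print(item_difficulty(resp))      # fractions correct: 4/6, 3/6, 2/6
print(item_discrimination(resp))  # [1. 1. 1.]: top student all correct, bottom all wrong
```

Selecting the 26 final items then amounts to keeping, per competency, the candidate with the strongest difficulty and discrimination values.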
The Sample
The study population consists of students enrolled in programs of information documentation, audiovisual communication, journalism, psychology, primary education, education, social work, and tourism during the 2013–2014 academic year. Five universities in Spain participated: (1) the University of Granada, (2) the University of Murcia, (3) Jaume I University of Castellón, (4) the University of Málaga, and (5) Complutense University of Madrid.
Data selection was made through a process of stratified sampling with proportional allocation, dividing the population into smaller groups called strata and taking random samples of each group in proportion to its size. The process addressed the total number of students per university, degree program, and course of study. Third- and fourth-year students were selected, and a total of 1,575 valid surveys were obtained (see Table 2). The information collected is representative, allowing inferences with sufficient consistency.
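Stratified sampling with proportional allocation can be sketched as follows; the population records, strata, and sizes below are hypothetical placeholders, not the study's enrollment data:

```python
import random
from collections import Counter

def stratified_sample(population, strata_key, total_n, seed=0):
    """Stratified sampling with proportional allocation.

    population: list of records; strata_key: function mapping a record to
    its stratum (e.g. a (university, degree, year) tuple). Each stratum
    contributes a random sample proportional to its share of the population.
    """
    rng = random.Random(seed)
    strata = {}
    for rec in population:
        strata.setdefault(strata_key(rec), []).append(rec)
    sample = []
    for members in strata.values():
        k = round(total_n * len(members) / len(population))
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample

# Hypothetical population: 1,000 students across two universities
pop = [{"univ": "A"}] * 700 + [{"univ": "B"}] * 300
s = stratified_sample(pop, lambda r: r["univ"], 100)
print(Counter(r["univ"] for r in s))  # Counter({'A': 70, 'B': 30})
```

In the study the strata were defined by university, degree program, and course of study rather than by a single key.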
The overall participation was 500 men and 1,075 women. The undergraduates were distributed as follows: 1,101 (70 percent) in the third year and 474 (30 percent) in the fourth year.
Analysis
Once the fieldwork was done and the online questionnaires had been administered, data processing and analysis were performed with SPSS and LISREL (linear structural relations) software. Nonparametric methods, wherein the data were not assumed to fit a normal distribution, were applied to study the possible influence of gender and degree program on the students’ competency scores. Such methods were appropriate due to the ordinal nature of the survey variables. Two statistical tests, the Mann-Whitney U and Kruskal-Wallis tests, allowed for comparison of distributions among groups of students with regard to gender, degree program, and institution.
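A minimal sketch of the two tests, using SciPy on simulated scores rather than the study's SPSS workflow (all group means, spreads, and sizes below are invented for illustration):

```python
import numpy as np
from scipy.stats import mannwhitneyu, kruskal

rng = np.random.default_rng(42)

# Hypothetical competency scores: two gender groups and three degree programs
male = rng.normal(6.9, 1.2, 80)
female = rng.normal(6.8, 1.2, 90)
programs = [rng.normal(m, 1.0, 60) for m in (6.5, 7.0, 8.1)]

u_stat, p_gender = mannwhitneyu(male, female)   # two-group comparison
h_stat, p_degree = kruskal(*programs)           # k-group (here 3) comparison

print(f"Mann-Whitney U p = {p_gender:.3f}")
print(f"Kruskal-Wallis  p = {p_degree:.3g}")
```

Both tests compare rank distributions rather than means, which is why they suit ordinal survey variables that cannot be assumed normal.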
Table 2. Students by university

Degree program              Complutense   Granada   Jaume I of   Málaga   Murcia   Total   Percentage
                            of Madrid               Castellón
Audiovisual communication        33           26        29          82       28      198      13%
Primary education                24          166        55          42       83      370      23%
Information documentation        59           39         0           0       24      122       8%
Pedagogy                         15           45         1          22       30      113       7%
Journalism                       43            0        53          85       55      236      15%
Psychology                       35          131        27          11       19      223      14%
Social work                      27           73         0          18       19      137       9%
Tourism                          30           67        23          36       20      176      11%
Total                           266          547       188         296      278    1,575
Percentage                      17%          35%       12%         19%      18%
Results
The scales have been widely validated in previous studies.31 In the present study, the EVALCI-KN questionnaire had a Cronbach’s alpha—a statistical measure of the extent to which the survey questions measure the same thing—of 0.718, an outcome that confirms the reliability and internal consistency of the instrument. Table 3 shows weighted average scores in each degree program and competency category.
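For reference, Cronbach's alpha can be computed directly from a students-by-items score matrix; this is a generic sketch, not the study's actual computation:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a students x items score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    where k is the number of items.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Perfectly parallel items give the maximum value
x = np.array([[1, 1], [2, 2], [3, 3]])
print(cronbach_alpha(x))  # 1.0
```

Values around 0.7, like the 0.718 reported here, are conventionally read as acceptable internal consistency.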
Examining Gender and Degree Program
The survey data were analyzed to look for differences regarding gender and subject of study. To compare differences between male and female responses, the Mann-Whitney U test was used, while the Kruskal-Wallis test made it possible to determine if there were statistically significant differences among academic degree programs.
Table 3. Competency results by degree program

                            Competency category
Degree program*             Search   Evaluation   Processing   Communication
Audiovisual communication    6.87       8.40         6.23          8.43
Primary education            6.48       8.12         5.79          7.90
Information documentation    8.13       8.50         7.13          8.28
Pedagogy                     6.64       8.31         5.70          7.99
Journalism                   7.07       8.35         6.24          8.26
Psychology                   6.96       8.54         5.82          8.06
Social work                  6.52       8.17         5.23          7.77
Tourism                      6.54       8.12         6.96          7.81
Total                        6.84       8.30         6.10          8.06
p-values†                    0.005      0.000        0.001         0.005

*The Kruskal-Wallis test was used to determine the differences by competency category among degree programs.
†The p-value gives the likelihood that any effect seen in the data, such as a difference between groups, might have occurred by chance.
The statistics revealed no significant differences between female and male students in their levels of knowledge regarding searching and evaluation (Mann-Whitney U, p > 0.05). However, women had higher scores than men in the communication category, while men had higher scores in the processing category (Mann-Whitney U, p < 0.05, see Table 4).
As for the possible influence of the degree programs on the students’ actual levels of knowledge regarding IL competencies, the results show statistically significant differences (Kruskal-Wallis test, p < 0.05) between degree programs in all four categories (see Table 3).
A detailed analysis of the 26 competencies reveals significant differences in the levels reached in most. The only scores of students’ actual knowledge of the competencies that were similar in all fields of study were assessing the quality of information resources (referred to in the study as Kn-9, or knowledge 9), knowing the most relevant authors and institutions within your subject area (Kn-13), communicating in public (Kn-20), writing a document (Kn-22), and knowing the code of ethics in your academic or professional field (Kn-23).
Factors of Knowledge on the Competencies
One of the goals of our research was to determine the underlying structure of social sciences students’ knowledge of IL competencies. Exploratory factor analyses were conducted to uncover the number and composition of the underlying factors within the various degree programs. For the eight degree programs, we obtained an indication of sampling adequacy called the Kaiser-Meyer-Olkin (KMO) measure, the number of factors, and the explained variance (see Table 5).
Table 4. Statistical differences by gender

Competency*      Male     Female   p-value†
Search           6.9190   6.8030   0.053
Evaluation       8.2680   8.3142   0.146
Processing       6.3193   5.8481   0.000
Communication    7.8943   8.1942   0.026

*The Mann-Whitney U test was used to determine the differences between genders by competency category.
†The p-value gives the likelihood that any effect seen in the data, such as a difference between groups, might have occurred by chance.
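The Kaiser-Meyer-Olkin measure used in the factor analyses (reported in Table 5) can be sketched from a correlation matrix as follows; the two matrices here are toy examples, not the survey data:

```python
import numpy as np

def kmo(corr):
    """Kaiser-Meyer-Olkin measure of sampling adequacy.

    Compares squared correlations with squared partial correlations
    (obtained from the inverse of the correlation matrix); values close
    to 1 indicate the data are suitable for factor analysis.
    """
    inv = np.linalg.inv(corr)
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d                      # partial correlation matrix
    off = ~np.eye(corr.shape[0], dtype=bool)
    r2 = (corr[off] ** 2).sum()
    p2 = (partial[off] ** 2).sum()
    return r2 / (r2 + p2)

# Strongly intercorrelated items vs. nearly independent items
strong = np.full((3, 3), 0.6); np.fill_diagonal(strong, 1.0)
weak = np.full((3, 3), 0.05); np.fill_diagonal(weak, 1.0)
print(round(kmo(strong), 3))  # higher adequacy, about 0.72
print(round(kmo(weak), 3))    # lower adequacy, about 0.52
```

Shared variance among items raises KMO, which is why the well-correlated matrix scores higher.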
With regard to social sciences students’ actual knowledge of IL competencies, an exploratory factor analysis applied to the surveyed population confirmed the choice of the six latent categories or factors. Loadings of the levels of knowledge of competencies for each of the six factors, as well as their accumulated variances, are displayed in Table 6. Following Kaiser’s criterion, and taking into account that eigenvalues measure the total variance explained by a particular factor, factors are included when the eigenvalue is greater than 1.0. To facilitate interpretation, only the variables with a factor loading higher than 0.6 are included. The total number of selected competencies from this factor analysis is 21.
Factors are categorized according to the variance explained: communication (26.315 percent), communication-ICT (information and communications technology, 12.652 percent), evaluation (9.46 percent), searching (8.308 percent), searching-ICT (7.691 percent), and processing (4.078 percent). Two new factors, searching-ICT and communication-ICT, appeared in addition to the four previously raised by the survey itself. These two new factors are of a technological nature, being closely linked to ICT and, in addition, to digital literacy. They could be considered technological versions of the search and communication factors previously designed.
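Kaiser's eigenvalue criterion for retaining factors is easy to express in code; the correlation matrix below is a toy example with one strong common factor, not the study's data:

```python
import numpy as np

def kaiser_retained(corr):
    """Number of factors whose eigenvalue exceeds 1.0 (Kaiser's criterion).

    corr must be a symmetric correlation matrix; each eigenvalue measures
    the total variance explained by the corresponding factor.
    """
    eigvals = np.linalg.eigvalsh(corr)
    return int((eigvals > 1.0).sum())

# Hypothetical 4-variable correlation matrix: variables 1-3 share a
# strong common factor, variable 4 is nearly independent
R = np.array([
    [1.0, 0.6, 0.6, 0.1],
    [0.6, 1.0, 0.6, 0.1],
    [0.6, 0.6, 1.0, 0.1],
    [0.1, 0.1, 0.1, 1.0],
])
print(kaiser_retained(R))  # 1: only the common factor passes the cut-off
```

The study applied the same rule, together with the 0.6 loading threshold, to arrive at six factors and 21 retained competencies.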
Table 5. Kaiser-Meyer-Olkin (KMO) measure, number of factors, and explained variance by degree program

Degree program              KMO measure*   Number of factors   Percentage of variance explained
Audiovisual communication      0.844               6                    71.61%
Primary education              0.773               6                    60.61%
Information documentation      0.805               6                    73.11%
Pedagogy                       0.725               6                    66.27%
Journalism                     0.758               6                    66.93%
Psychology                     0.755               6                    56.00%
Social work                    0.800               6                    63.30%
Tourism                        0.705               6                    57.50%

*The Kaiser-Meyer-Olkin measure is an indication of sampling adequacy.
Table 6. Knowledge factors (factor loadings* and accumulated variance)

Factor 1: Communication (accumulated variance 26.315 percent)
  Kn-20† Communicating in public: 0.714
  Kn-21 Communicating in other languages: 0.773
  Kn-24 Knowing the laws on the use of information and intellectual property: 0.802

Factor 2: Communication-ICT (accumulated variance 38.967 percent)
  Kn-22 Writing a document (report, academic work, etc.): 0.642
  Kn-25 Creating academic presentations (PowerPoint, etc.): 0.707
  Kn-26 Disseminating information on the Internet (websites, blogs, etc.): 0.797

Factor 3: Evaluation (accumulated variance 48.427 percent)
  Kn-9 Assessing the quality of information resources: 0.767
  Kn-10 Recognizing the author’s ideas within the text: 0.695
  Kn-11 Knowing the typology of scientific information sources (thesis, proceedings, etc.): 0.879
  Kn-12 Determining whether an information resource is updated: 0.657

Factor 4: Searching (accumulated variance 56.735 percent)
  Kn-2 Entering and using automated catalogs: 0.633
  Kn-5 Consulting and using electronic sources of primary information (journals, etc.): 0.688
  Kn-8 Knowing information search strategies (descriptors, Boolean operators, etc.): 0.678

Factor 5: Searching-ICT (information and communications technology) (accumulated variance 64.426 percent)
  Kn-3 Consulting and using electronic sources of primary information (journals, etc.): 0.742
  Kn-4 Using electronic sources of secondary information (databases, etc.): 0.764
The six factors identified in Table 6 serve as latent categories for the next part of the investigation, a structural equation model that examines the relationships between those six latent categories (sets of competencies) and the specific competencies that contribute to them.
Structural Equation Model for Social Sciences Students
Under a multivariate approach, covering the eight degree programs taken into account in this research, we applied different structural equation models to find the one that best fit the data. The aim was to explore the latent structures of actual knowledge of the social sciences students through the data provided by the questionnaire EVALCI-KN. Thus the intention was to answer research question 4, “What relationships exist among the categories of actual social sciences students’ knowledge of IL competencies, and are these relationships similar to those previously uncovered with regard to social sciences students’ belief-in-importance and self-efficacy for IL competencies?” We refer the reader to in-depth descriptions of this technique.32 This approach, which includes multiple regression and factor analysis, can be found in previous studies in the area.33
The results provide insight into the relationship between latent categories (the factors or sets of competencies identified in Table 6) and the individual competencies (the survey questions). Structural equation modeling treats each latent category as a “pure” construct that is expressed, to a greater or lesser extent, through each of the associated survey questions. Through these models, a better understanding of the internal structure of the processes of knowledge acquisition concerning IL competencies for particular student populations may be obtained, laying the groundwork for the design of new and more effective teaching and learning processes.
Table 6, cont.

  Kn-6 Searching for and retrieving Internet information (advanced searches, directories, portals, etc.): 0.711
  Kn-7 Using informal electronic sources of information (blogs, discussion lists, etc.): 0.764

Factor 6: Processing (accumulated variance 68.504 percent)
  Kn-15 Recognizing text structure: 0.725
  Kn-16 Using database managers (Access, MySQL, etc.): 0.658
  Kn-17 Using bibliographic reference managers (EndNote, Reference Manager, etc.): 0.773
  Kn-19 Installing computer programs: 0.676

*Unweighted least squares extraction; promax (oblique) rotation with Kaiser normalization.
†Kn stands for knowledge and refers to the knowledge of a particular IL competency.
This ms. is peer reviewed, copy edited, and accepted for publication, portal 17.3.
Using LISREL software, the relationships between the observed competencies and the various latent categories are specified through what is known as the “measurement” side of the model. The other side is “structural,” which in this case refers only to correlations between pairs of categories, disregarding the possible causal relationships among them. As required for every structural equation model, indicators of reliability and goodness of fit were examined.34
The goodness of fit statistics are presented in Table 7, along with the standardizations needed to make the coefficients comparable across the six latent categories: (1) searching, (2) searching-ICT, (3) evaluation, (4) processing, (5) communication, and (6) communication-ICT. The goodness of fit between the data and the proposed model can be assessed by a number of indicators: goodness of fit index (GFI), normed fit index (NFI), and root mean square error of approximation (RMSEA). A model is regarded as acceptable if GFI and NFI exceed 0.90.35 RMSEA should be less than 0.08 and ideally less than 0.05.36
According to these criteria, the results show an acceptable fit. GFI and NFI values above 0.90 and RMSEA values below 0.05 indicate that the model fits the data reasonably well. The residuals are normally distributed, with a mean of zero and values ranging from –1.44 to 2.05 (see Table 7).
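The cutoff logic described above can be expressed directly. The helper below is a hypothetical convenience function, not part of LISREL; the thresholds are the ones cited in the text (GFI and NFI above 0.90, RMSEA below 0.08, ideally below 0.05).

```python
def assess_fit(gfi: float, nfi: float, rmsea: float) -> str:
    """Classify model fit using the thresholds cited in the text:
    GFI > 0.90, NFI > 0.90, RMSEA < 0.08 (ideally < 0.05)."""
    if gfi > 0.90 and nfi > 0.90 and rmsea < 0.05:
        return "good"
    if gfi > 0.90 and nfi > 0.90 and rmsea < 0.08:
        return "acceptable"
    return "poor"

# The values reported for this model (Table 7): GFI = 0.93, NFI = 0.92, RMSEA = 0.041
print(assess_fit(gfi=0.93, nfi=0.92, rmsea=0.041))  # -> good
```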
Figure 1 shows the measurement model for the latent categories included in Table 7. Using conventional LISREL notation, the rectangles represent the observed competencies, while the ovals symbolize the latent categories. The direct arrows indicate causal relationships.
Figure 1. Diagram of the relationships for knowledge of IL competencies included in Table 7; rectangles represent the knowledge of competencies observed, and ovals symbolize the latent categories.
Table 7. Measurement equations for the six latent categories of social sciences students’ knowledge of IL competencies

Category             Measurement equation              R²
Searching            *Kn-2 = 1.00 searching            0.74
                     Kn-5 = 0.64 searching             0.87
                     Kn-8 = 0.90 searching             0.72
Searching-ICT        Kn-3 = 1.00 searching-ICT         0.62
                     Kn-4 = 0.70 searching-ICT         0.46
                     Kn-6 = 0.61 searching-ICT         0.58
                     Kn-7 = 0.63 searching-ICT         0.60
Evaluation           Kn-9 = 1.00 evaluation            0.54
                     Kn-10 = 0.78 evaluation           0.64
                     Kn-11 = 0.67 evaluation           0.60
                     Kn-12 = 0.84 evaluation           0.66
Processing           Kn-15 = 0.93 processing           0.64
                     Kn-16 = 0.78 processing           0.80
                     Kn-17 = 1.00 processing           0.61
                     Kn-19 = 0.85 processing           0.82
Communication        Kn-20 = 0.59 communication        0.53
                     Kn-21 = 0.61 communication        0.78
                     Kn-24 = 1.00 communication        0.79
Communication-ICT    Kn-22 = 0.54 communication-ICT    0.63
                     Kn-25 = 0.72 communication-ICT    0.58
                     Kn-26 = 1.00 communication-ICT    0.75

Goodness of fit statistics: GFI = 0.93; RMSEA = 0.041; NFI = 0.92†

*Kn stands for knowledge and refers to the knowledge of a particular IL competency.
†A model is regarded as acceptable if the goodness-of-fit index (GFI) exceeds 0.90, root mean square error of approximation (RMSEA) is less than 0.08 and ideally less than 0.05, and the normed fit index (NFI) exceeds 0.90.
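For readers unfamiliar with the R² column, the share of an indicator’s variance explained by its latent category can be computed from the unstandardized loading, the latent factor’s variance, and the error variance. The numbers below are invented for illustration and are not the estimates behind Table 7, which also depend on each indicator’s estimated error variance.

```python
def indicator_r_squared(loading: float, factor_var: float, error_var: float) -> float:
    """R-squared of a measurement equation x = loading * factor + error:
    the proportion of the indicator's variance explained by the latent factor,
    i.e. loading^2 * Var(factor) / (loading^2 * Var(factor) + Var(error))."""
    explained = loading ** 2 * factor_var
    return explained / (explained + error_var)

# Illustrative values (not estimates from the study): a loading of 0.78 on a
# latent factor with variance 1.0 and an error variance of 0.34.
r2 = indicator_r_squared(loading=0.78, factor_var=1.0, error_var=0.34)
print(round(r2, 2))  # -> 0.64
```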
The correlations among the latent constructs are displayed in Table 8. Strong relationships between searching, searching-ICT, and evaluation can be observed. As expected, the two searching categories are closely related. Five weak correlations may be identified: between searching and processing or communication-ICT; searching-ICT and processing; evaluation and communication-ICT; and processing and communication-ICT.
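This reading of Table 8 can be reproduced programmatically. The cutoffs below (strong at 0.75 or higher, weak below 0.40) are illustrative choices made here to match the authors’ verbal description, not thresholds stated in the paper; the correlation values themselves are taken from Table 8.

```python
# Lower triangle of Table 8: correlations among the six latent categories.
corr = {
    ("searching", "searching-ICT"): 0.85,
    ("searching", "evaluation"): 0.75,
    ("searching", "processing"): 0.24,
    ("searching", "communication"): 0.61,
    ("searching", "communication-ICT"): 0.35,
    ("searching-ICT", "evaluation"): 0.94,
    ("searching-ICT", "processing"): 0.38,
    ("searching-ICT", "communication"): 0.69,
    ("searching-ICT", "communication-ICT"): 0.59,
    ("evaluation", "processing"): 0.73,
    ("evaluation", "communication"): 0.51,
    ("evaluation", "communication-ICT"): 0.31,
    ("processing", "communication"): 0.41,
    ("processing", "communication-ICT"): 0.34,
    ("communication", "communication-ICT"): 0.53,
}

strong = [pair for pair, r in corr.items() if r >= 0.75]  # illustrative cutoff
weak = [pair for pair, r in corr.items() if r < 0.40]     # illustrative cutoff
# Three strong pairs (all among searching, searching-ICT, and evaluation)
# and the five weak pairs listed in the text.
print(len(strong), len(weak))  # -> 3 5
```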
Discussion
Concerning research question 1—“What are the actual levels of knowledge of IL competencies within a sample of social sciences students? Are the levels of knowledge related to their belief in the importance of information literacy and their perceptions of self-efficacy?”—and question 2—“What are the main common characteristics and substantial differences concerning actual levels of knowledge of IL competencies among social sciences students, when comparing the various degree programs?”—global scores show that social sciences students show greater knowledge of IL competencies in the categories of evaluation and communication-dissemination of information. The results by degree program reveal that students enrolled in information documentation show greater knowledge in the categories of searching for and processing information than the participating students in other disciplines. Audiovisual communication students obtain the highest levels in the communication-dissemination category, followed by students of information documentation and journalism. Meanwhile, journalism students achieve their best scores on actual knowledge of the competencies belonging to the category of evaluation, although those same students assigned the highest scores in the belief-in-importance and self-efficacy dimensions to competencies belonging to the processing category.37 At the other end, primary education students show the lowest scores on actual knowledge of competencies within the categories of search and evaluation. Social work students receive minimal scores in the categories of processing and communication-dissemination of information.
For all the university degrees studied, students’ average levels of actual knowledge of information competencies differ significantly across the four categories of competency. Only five competencies—assessing the quality of information resources (Kn-9), knowing the most relevant authors and institutions within your subject area (Kn-13), communicating in public (Kn-20), writing a document (Kn-22), and knowing the code of ethics in your academic or professional field (Kn-23)—were found to have a similar average for students in all the degree programs.
Undergraduates in information documentation show significantly higher levels in objective knowledge in most competencies. This result agrees with those derived by Pinto, Fernández-Pascual, and Puertas in 2016 in relation to average levels in belief-in-importance and self-efficacy concerning IL competencies.38
Table 8. Correlations among the six latent categories

                    Searching  Searching-ICT  Evaluation  Processing  Communication  Communication-ICT
Searching           1.00
Searching-ICT       0.85       1.00
Evaluation          0.75       0.94           1.00
Processing          0.24       0.38           0.73        1.00
Communication       0.61       0.69           0.51        0.41        1.00
Communication-ICT   0.35       0.59           0.31        0.34        0.53           1.00
With regard to question 3—“Which are the most relevant underlying factors with regard to social sciences students’ knowledge of IL competencies?”—six factors were identified, four of them matching the categories of the questionnaires. The other two factors were of a technological-digital nature, bringing together the technological-digital competencies closest to the functions of searching for and communicating information. This is not surprising, since these functions are highly affected by new technological developments. Pinto, Fernández-Pascual, and Puertas had previously identified at least one of these two new factors.39
Structural equation modeling allows us to estimate the correlations between the six latent categories, thereby responding to question 4, “What relationships exist among the categories of actual social sciences students’ knowledge of IL competencies, and are these relationships similar to those previously uncovered with regard to social sciences students’ belief-in-importance and self-efficacy for IL competencies?” Results reveal a strong correlation among the categories searching, searching-ICT, and evaluation. Analysis of these correlation structures reflects the need to improve teaching strategies related to searching, because this competency plays a key role. In addition, the correlation identified between the ICT-related latent categories and evaluation and processing is remarkable.
In relation to question 5—“What latent factors are common in the three dimensions of belief-in-importance, self-efficacy, and actual knowledge?”—a comparison of these results with those described in Pinto, Fernández-Pascual, and Puertas regarding belief-in-importance and self-efficacy highlights that the higher scores pertain to the evaluation and communication categories, while the lower results are for processing. This empirical evidence confirms the relationship between the knowledge and the attitudes and motivations of students regarding IL competencies. It also facilitates “mapping” the strengths and weaknesses of social sciences students learning IL competencies. Notwithstanding, there is a clear structural difference between the factor models fitted to the scale of actual knowledge of IL competencies on the part of social sciences students and those fitted to the subjective scales of belief-in-importance and self-efficacy concerning the same competencies and population.40
In the case of actual knowledge, the competencies are grouped into six factors. The first two are associated with the category of communication. The third is concerned with evaluation, which has a similar role in the belief-in-importance and self-efficacy scales. The fourth and fifth factors are linked to the competencies of information search, and the sixth and least important factor relates to the competencies of processing. Here lie the main differences with the belief-in-importance and self-efficacy scales, because their five-factor models place processing and searching as the categories with the highest factor loads.
Finally, it is important to note the similarities uncovered. A high rate of communality is found. Out of the 26 competencies of the questionnaires, 13 (50 percent) are common to the three factor models (see Table 9). In particular, 76.47 percent of the competencies listed in the belief-in-importance factor model and 68.42 percent of those listed in the self-efficacy factor model also appear in the actual-knowledge factor model. Although their relative position in the factor models differs, these competencies play prominent roles in their respective categories. We may consider these competencies, common to the subjective (belief-in-importance and self-efficacy) and objective (actual knowledge) dimensions, as essential in the process of acquiring information competencies.
Table 9. Competencies common to three factor models on social sciences students’ belief-in-importance, self-efficacy, and actual knowledge regarding IL competencies

Searching
*Kn-3 Consulting and using electronic sources of primary information (journals, etc.)
Kn-4 Using electronic sources of secondary information (databases, etc.)
Kn-6 Searching for and retrieving Internet information (advanced searches, directories, portals, etc.)
Kn-8 Knowing information search strategies (descriptors, Boolean operators, etc.)

Evaluation
Kn-10 Recognizing the author’s ideas within the text
Kn-11 Knowing the typology of scientific information sources (thesis, proceedings, etc.)
Kn-12 Determining whether an information resource is updated

Processing
Kn-16 Using database managers (Access, MySQL, etc.)
Kn-17 Using bibliographic reference managers (EndNote, Reference Manager, etc.)

Communication and diffusion
Kn-20 Communicating in public
Kn-21 Communicating in other languages
Kn-25 Creating academic presentations (PowerPoint, etc.)
Kn-26 Disseminating information on the Internet (websites, blogs, etc.)

*Kn stands for knowledge and refers to the knowledge of a particular IL competency.
Eight of these 13 common competencies (61.53 percent) are closely related to new technological-digital developments (Kn-3, Kn-4, Kn-6, Kn-8, Kn-16, Kn-17, Kn-25, and Kn-26), confirming that the technological-digital aspects of IL competencies are beginning to outperform the purely informational ones.
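The overlap figures quoted in this section follow from simple counts. Working backward from the quoted percentages, the belief-in-importance factor model must list 17 competencies and the self-efficacy model 19 (13/17 ≈ 76.47 percent and 13/19 ≈ 68.42 percent); those two totals are inferred here, not stated in this excerpt. The remaining figures are checked the same way.

```python
# Counts taken from the text: 13 of the 26 questionnaire competencies are
# common to the three factor models, and 8 of those 13 are technology-related.
common, total, tech = 13, 26, 8

# Model sizes inferred from the quoted percentages (not stated in this excerpt).
belief_listed, self_efficacy_listed = 17, 19

share_of_total = round(100 * common / total, 2)                # 50.0
share_of_belief = round(100 * common / belief_listed, 2)       # 76.47
share_of_self = round(100 * common / self_efficacy_listed, 2)  # 68.42
share_tech = round(100 * tech / common, 2)                     # 61.54 (the text rounds to 61.53)

print(share_of_total, share_of_belief, share_of_self, share_tech)
```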
Conclusions
The results described here largely met our research expectations. First, we were able to determine the objective levels of knowledge and skills in the use of information competencies by a representative group of social sciences students, while also gauging the possible influence of attitudes and motivations leading students to achieve these capacities. The implementation of three specific assessment tools, IL-HUMASS, EVALCI-KN, and EVALCI-S, provides for an integrated, holistic approach. This approach relates the subjective perspectives (based on attitudes and motivations) and objective perspectives (based on evidence) of a student’s relationship with the different IL competencies. Among social sciences students, the actual levels of knowledge about IL competencies are related to their attitudes (self-efficacy) and motivations (belief-in-importance), as confirmed by the presence of 13 competencies that are common to these three dimensions in the corresponding factor models.
Second, this research focused on a specific learning community, students in the social sciences, aiming to understand their degree of consistency regarding IL by providing objective and subjective data on their information competencies. As for students’ levels of knowledge of IL, the possibility of constructing a factor model applicable to all eight degree programs involved is confirmed. From the four IL categories of the questionnaires, the various factor analyses applied to the degrees yield a common number of latent factors, six. The two new factors are closely linked to ICT and to the categories search and communication.
Statistically significant differences were detected between men and women concerning the categories of processing and communication of information. Statistically significant differences also appear between the eight degree programs analyzed, affecting four IL categories. However, for five competencies, no differences between degree programs were detected. In the social sciences community, the following may be regarded as “universal”: assessing the quality of information resources; knowing the most relevant authors and institutions within your subject area; communicating in public; writing a document; and knowing the code of ethics in your academic or professional field. After the factor reduction, four other competencies disappear and might be considered irrelevant: using printed sources of information; schematizing and abstracting information; using database managers; and handling statistical programs and spreadsheets. These somewhat surprising results are worthy of future exploration.
Based on the factors found to be common to university students in all the degree programs, a model of structural equations defining the relationship between the six factors and the 20 selected competencies is proposed. In any case, future initiatives along the lines drawn here would be a welcome addition to research on key competencies in the context of information literacy.
María Pinto is a professor in the Faculty of Information Science, University of Granada, Granada, Spain; her e-mail address is: [email protected].
Rosaura Fernández-Pascual is an associate professor in the Faculty of Economic and Business Sciences at the University of Granada, Spain; she may be reached by e-mail at: [email protected]
Notes
1. European Union, Key Competences for Lifelong Learning: European Reference Framework, Education and Training (Luxembourg: Office for Official Publications of the European Communities, 2007).
2. Anusca Ferrari, DIGCOMP: A Framework for Developing and Understanding Digital Competence in Europe, JRC (Joint Research Centre) Scientific and Policy Reports, ed. Yves Punie and Barbara N. Brečko (Luxembourg: Publications Office of the European Union, 2013).
3. Government of Spain, “Real Decreto 1393/2007, Ordenación de las Enseñanzas Universitarias Oficiales [Royal Decree 1393/2007, Organization of Official University Education],” Boletín Oficial del Estado (Official State Gazette) 260 (October 30, 2007).
4. Comisión mixta (Joint Commission) CRUE (Conferencia de Rectores de las Universidades Españolas [Conference of Rectors of Spanish Universities])-TIC (Comisión Sectorial de las Tecnologías de la Información y las Comunicaciones [Sectoral Commission for Information and Communications Technology]) y REBIUN (Red de Bibliotecas Universitarias [university library network]), Competencias Informáticas e Informacionales en los Estudios de Grado (Computer and information competencies in undergraduate studies), 2009.
5. Association of American Colleges & Universities (AAC&U), “Information Literacy VALUE Rubric,” 2005, https://assessment.trinity.duke.edu/documents/InformationLiteracy.pdf; Nancy Wootton Colborn and Rosanne M. Cordell, “Moving from Subjective to Objective Assessments of Your Instruction Program,” Reference Services Review 26, 3 (1998): 125–37; Bonnie Gratch Lindauer, “The Three Arenas of Information Literacy Assessment,” Reference & User Services Quarterly 44, 2 (2004): 122–29.
6. Rebecca S. Albitz, “The What and Who of Information Literacy and Critical Thinking in Higher Education,” portal: Libraries and the Academy 7, 1 (2007): 97–109.
7. Noa Aharony, “Information Literacy in the Professional Literature: An Exploratory Analysis,” Aslib (Association for Information Management) Proceedings 62, 3 (2010): 261–82; Mohammad Nazim and Moin Ahmad, “Research Trends in Information Literacy: A Bibliometric Study,” SRELS [Sarada Ranganathan Endowment for Library Science] Journal of Information Management 44, 1 (2007): 53–62.
8. Hannelore B. Rader, “Information Literacy 1973–2002: A Selected Literature Review,” Library Trends 51, 2 (2002): 242–59.
9. María Pinto, María Isabel Escalona-Fernández, and Antonio Pulgarín, “Information Literacy in Social Sciences and Health Sciences: A Bibliometric Study (1974–2011),” Scientometrics 95, 3 (2013): 1071–94.
10. María Pinto, “Viewing and Exploring the Subject Area of Information Literacy Assessment in Higher Education (2000–2011),” Scientometrics 102, 1 (2015): 227–45.
11. Leo Appleton, “Examination of the Impact of Information-Skills Training on the Academic Work of Health-Studies Students: A Single Case Study,” Health Information & Libraries Journal 22, 3 (2005): 164–72; Alison J. Head and Michael B. Eisenberg, “Project Information Literacy Progress Report: How College Students Evaluate and Use Information in the Digital Age” (Seattle: Information School, University of Washington, 2010), http://projectinfolit.org/pdfs/PIL_Fall2010_Survey_FullReport1.pdf; Els Kuiper, Monique Volman, and Jan Terwel, “Developing Web Literacy in Collaborative Inquiry Activities,” Computers & Education 52, 3 (2009): 668–80; María Pinto, “Design of the IL-HUMASS [Information Literacy Humanities and Social Sciences] Survey on Information Literacy in Higher Education: A Self-Assessment Approach,” Journal of Information Science 36, 1 (2010): 86–103; María Pinto, “An Approach to the Internal Facet of Information Literacy Using the IL-HUMASS Survey,” Journal of Academic Librarianship 37, 2 (2011): 145–54; María Pinto and Dora Sales, “Insights into Translation Students’ Information Literacy Using the IL-HUMASS Survey,” Journal of Information Science 36, 5 (2010): 618–30; María Pinto and Dora Sales, “Uncovering Information Literacy’s Disciplinary Differences through Students’ Attitudes: An Empirical Study,” Journal of Librarianship and Information Science 47, 3 (2015): 204–15; P. K. Rangachari and Usha Rangachari, “Information Literacy in an Inquiry Course for First-Year Science Undergraduates: A Simplified 3C [credibility, content, and currency] Approach,” Advances in Physiology Education 31, 2 (2007): 176–79; Eric Resnis, Katie Gibson, Arianne Hartsell-Gundy, and Masha Misco, “Information Literacy Assessment: A Case Study at Miami University,” New Library World 111, 7–8 (2010): 287–301.
12. Cheryl L. Blevens, “Catching Up with Information Literacy Assessment,” College & Research Libraries News 73, 4 (2012): 202–6; Joanna M. Burkhardt, “Assessing Library Skills: A First Step to Information Literacy,” portal: Libraries and the Academy 7, 1 (2007): 25–49; Nancy W. Noe and Barbara A. Bishop, “Assessing Auburn University Library’s Tiger Information Literacy Tutorial (TILT),” Reference Services Review 33, 2 (2005): 173–87; Anita Ondrusek, Valeda F. Dent, Ingrid Bonadie-Joseph, and Clay Williams, “A Longitudinal Study of the Development and Evaluation of an Information Literacy Test,” Reference Services Review 33, 4 (2005): 388–417.
13. Amalia Monroe-Gulick and Julie Petr, “Incoming Graduate Students in the Social Sciences: How Much Do They Really Know about Library Research?” portal: Libraries and the Academy 12, 3 (2012): 315–35; Melissa Gross and Don Latham, “Undergraduate Perceptions of Information Literacy: Defining, Attaining, and Self-Assessing Skills,” College & Research Libraries 70, 4 (July 1, 2009): 336–50.
14. Brian Lym, Hal Grossman, Lauren Yannotta, and Makram Talih, “Assessing the Assessment: How Institutions Administered, Interpreted, and Used SAILS [Standardized Assessment of Information Literacy Skills],” Reference Services Review 38, 1 (2010): 168–86; Brian Detlor, Heidi Julien, Rebekah Willson, Alexander Serenko, and Maegen Lavallee, “Learning Outcomes of Information Literacy Instruction,” Journal of the American Society for Information Science and Technology 62, 3 (2011): 572–85.
15. Lynn Cameron, Steven L. Wise, and Susan M. Lottridge, “The Development and Validation of the Information Literacy Test,” College & Research Libraries 68, 3 (2007): 229–36; Davida Scharf, Norbert Elliot, Heather A. Huey, Vladimir Briller, and Kamal Joshi, “Direct Assessment of Information Literacy Using Writing Portfolios,” Journal of Academic Librarianship 33, 4 (2007): 462–78.
16. Irvin R. Katz, “Testing Information Literacy in Digital Environments: ETS’s [Educational Testing Service’s] iSkills Assessment,” Information Technology and Libraries 26, 3 (2007): 1–3.
17. Allan Bundy, ed., Australian and New Zealand Information Literacy Framework: Principles, Standards and Practice (Adelaide, Australia: Australian and New Zealand Institute for Information Literacy, 2004); Ralph Catts, Information Skills Survey, Technical Manual (Canberra, Australia: CAUL [Council of Australian University Librarians], 2005).
18. M. Pinto and Dora Sales, “INFOLITRANS [Information Literacy for Translators]: A Model for the Development of Information Competence for Translators,” Journal of Documentation 64, 3 (2008): 413–37.
19. María Pinto and Rosaura Fernández-Pascual, “Information Literacy Competencies among Social Sciences Undergraduates: A Case Study Using Structural Equation Model,” presentation at European Conference on Information Literacy (ECIL), Dubrovnik, Croatia, October 20–23, 2014.
20. María Pinto, Rosaura Fernández-Pascual, and Susana Puertas, “Undergraduates’ Information Literacy Competency: A Pilot Study of Assessment Tools Based on a Latent Trait Model,” Library & Information Science Research 38, 2 (2016): 180–89.
21. Rosanne M. Cordell and Linda F. Fisher, “Reference Questions as an Authentic Assessment of Information Literacy,” Reference Services Review 38, 3 (2010): 474–81.
22. Judith T. M. Gulikers, Theo J. Bastiaens, and Paul A. Kirschner, “A Five-Dimensional Framework for Authentic Assessment,” Educational Technology Research and Development 52, 3 (2004): 67–86.
23. Nikolas Leichner, Johannes Peter, Anne-Kathrin Mayer, and Günter Krampen, “Assessing Information Literacy Programmes Using Information Search Tasks,” Journal of Information Literacy 8, 1 (2014): 3–20; Shikha Sharma, “From Chaos to Clarity: Using the Research Portfolio to Teach and Assess Information Literacy Skills,” Journal of Academic Librarianship 33, 1 (2007): 127–35; Valerie Sonley, Denise Turner, Sue Myer, and Yvonne Cotton, “Information Literacy Assessment by Portfolio: A Case Study,” Reference Services Review 35, 1 (2007): 41–70.
24. Caroline F. Timmers and Cees A. W. Glas, “Developing Scales for Information-Seeking Behaviour,” Journal of Documentation 66, 1 (2010): 46–69.
25. Jos van Helvoort, “A Scoring Rubric for Performance Assessment of Information Literacy in Dutch Higher Education,” Journal of Information Literacy 4, 1 (2010): 22–39.
26. Leichner, Peter, Mayer, and Krampen, “Assessing Information Literacy Programmes Using Information Search Tasks.”
27. J. Richard Landis and Gary G. Koch, “The Measurement of Observer Agreement for Categorical Data,” Biometrics 33 (1977): 159–74; James M. LeBreton and Jenell L. Senter, “Answers to 20 Questions about Interrater Reliability and Interrater Agreement,” Organizational Research Methods 11, 4 (2008): 815–52.
28. María Pinto, José-Antonio Gómez-Hernández, Susana Puertas, David Guerrero, Ximo Granell, Carmen Gómez, Rocío Palomares, and Aurora Cuevas, “Designing and Implementing Web-Based Tools to Assess Information Competences of Social Science Students at Spanish Universities,” in Worldwide Commonalities and Challenges in Information Literacy Research and Practice, ed. Serap Kurbanoglu, Esther Grassian, Diane Mizrachi, Ralph Catts, and Sonja Špiranec (Istanbul, Turkey: Springer, 2013), 443–49; Pinto and Fernández-Pascual, “Information Literacy Competencies among Social Sciences Undergraduates.”
29. Dorothy Adkins Wood, Test Construction: Development and Interpretation of Achievement Tests (Columbus, OH: Charles E. Merrill, 1960).
30. Robert L. Ebel and David A. Frisbie, Essentials of Educational Measurement (Englewood Cliffs, NJ: Prentice Hall, 1986); Eduardo Backhoff Escudero, Norma Larrazolo Reyna, and Martín Rosas Morales, “The Level of Difficulty and Discrimination Power of the Basic Knowledge and Skills Examination (EXHCOBA),” Revista Electrónica de Investigación Educativa (Electronic journal of educational research) 2, 1 (2000).
31. Pinto, “Design of the IL-HUMASS Survey on Information Literacy in Higher Education”; Pinto, “An Approach to the Internal Facet of Information Literacy Using the IL-HUMASS Survey”; Pinto, Gómez-Hernández, Puertas, Guerrero, Granell, Gómez, Palomares, and Cuevas, “Designing and Implementing Web-Based Tools to Assess Information Competences of Social Science Students at Spanish Universities”; Pinto and Fernández-Pascual, “Information Literacy Competencies among Social Sciences Undergraduates.”
32. Judea Pearl, Causality: Models, Reasoning, and Inference, 2nd ed. (New York: Cambridge University Press, 2009); Rex B. Kline, Principles and Practice of Structural Equation Modeling, 4th ed. (New York: Guilford, 2016).
33. O’Connor, Radcliff, and Gedeon, “Applying Systems Design and Item Response Theory to the Problem of Measuring Information Literacy Skills”; Ralph Catts, “Evaluating Information Literacy Initiatives in Higher Education,” in Informaatiolukutaito yliopisto-opetuksessa (Information literacy in higher education), ed. Anne Nevgi (Helsinki, Finland: Yliopistopaino Kustannus/Palmenia-sarja, 2007), 33–52; Nathan P. Podsakoff, Wei Shen, and Philip M. Podsakoff, “The Role of Formative Measurement Models in Strategic Management Research: Review, Critique, and Implications for Future Research,” in Research Methodology in Strategy and Management, ed. David J. Ketchen and Donald D. Bergh (Bradford, U.K.: Emerald Group, 2006), 197–252.
34. Karl G. Jöreskog and Dag Sörbom, PRELIS [preprocessor for LISREL, linear structural relations] 2: User’s Reference Guide (Lincolnwood, IL: Scientific Software International, 2005). Because the sample is not normally distributed, inter-item polychoric correlation matrices and their asymptotic variance-covariance matrices were used, along with weighted least squares estimation.
35. Barbara M. Byrne, Structural Equation Modeling with AMOS [analysis of moment structures]: Basic Concepts, Applications and Programming, 2nd ed. (New York: Routledge Taylor & Francis Group, 2010); Randall E. Schumacker and Richard G. Lomax, A Beginner’s Guide to Structural Equation Modeling, 2nd ed. (Mahwah, NJ: Lawrence Erlbaum, 2004).
36. Michael W. Browne and Robert Cudeck, “Alternative Ways of Assessing Model Fit,” in Testing Structural Equation Models, ed. Kenneth A. Bollen and J. Scott Long (Newbury Park, CA: SAGE, 1993), 136–62; James H. Steiger, “Structural Model Evaluation and Modification,” Multivariate Behavioral Research 25, 2 (1990): 173–80.
37. Pinto, Fernández-Pascual, and Puertas, “Undergraduates’ Information Literacy Competency.”
38. Ibid.
39. Ibid.
40. Ibid.