
AC 2010-373: COMPUTATIONAL THINKING: WHAT SHOULD OUR STUDENTS KNOW AND BE ABLE TO DO?

Dianne Raubenheimer, North Carolina State University
Dr. C. Dianne Raubenheimer received her PhD from the University of Louisville and is Director of Assessment in the College of Engineering and Adjunct Assistant Professor in the Department of Adult and Higher Education at NC State University. Within the College of Engineering she serves as the coordinator of ABET and other accreditation processes, acts as an assessment & evaluation resource/consultant to faculty in different programs, develops and implements assessment plans, and serves as the primary educational assessment data analyst on the Dean's staff. A particular interest is in helping faculty to develop and implement classroom-based assessment and action research plans to establish the effectiveness of instruction and to use the data to improve teaching and student learning. She is currently working with several engineering faculty, researching the impact of in-class use of technology on teaching and student learning. Dianne has also worked as an education consultant for a number of organizations and is currently serving as external evaluator on several grants.

Eric Wiebe, North Carolina State University
Dr. Eric Wiebe is an Associate Professor in the Department of Mathematics, Science, and Technology Education at NC State University. He received his Doctorate in Psychology and has focused much of his research on issues related to the use of technology in the instructional environment. He has also worked on the integration of scientific visualization concepts and techniques into both secondary and post-secondary education. Dr. Wiebe has been a member of ASEE since 1989.

Chia-Lin Ho, North Carolina State University
Chia-Lin Ho joined the Computing across Curricula Team in early 2008 as a research assistant. She is a graduate student in the Industrial/Organizational Psychology Doctoral Program. She received a B.S. in Psychology and a Bachelor of Business Administration at the National Cheng-Chi University in Taiwan in 2002 and her Masters in I/O Psychology at the University of North Carolina at Charlotte in 2005. Her research interests include measurement and evaluation issues, individual differences, leadership, cross-cultural studies, work motivation, and the application of technology to human resources management.

© American Society for Engineering Education, 2010


Computational thinking: What should our students know and be able to do?

Abstract

An NSF-funded project on our campus has two overarching goals: (1) to create a computational thinking thread in engineering programs that spans from the freshman to senior years and bridges the divide between freshman year computing and computing in upper-level classes, and (2) to enable students to take computing competencies to the next level, where they are able to perform high-level computing tasks within the context of a discipline.

To achieve the goals of the project, faculty fellows from different engineering departments participate in a series of seminars relating to computational thinking that are held over the duration of a semester. The faculty fellows also undertake action research projects in their classrooms as they redesign course curricula to integrate computational thinking skills. In order to build relevant curricula and to offer appropriate faculty development sessions, it was first necessary to identify what computational skills and competencies different engineering industries expect of graduates as they enter the workforce and in their first years on the job. This presentation will share the results of the data collection and analysis effort centered on identifying these criteria.

The starting point for identifying industry needs was a workshop held with a panel of industry representatives. Based on the results of the industry workshop, a model of computational capabilities emerged that articulated different levels of computational ability in a problem solving context. Using this framework, a Delphi process was employed to survey a group of practicing engineers employed by companies hiring graduates from our university, to gain consensus about desired computational skills and competencies. In the first round of the Delphi, six open-ended questions were posed. Preliminary versions of the questions were piloted at the industry panel workshop and refined based on the framework for use in the Delphi study. The open-ended responses were coded by three researchers independently and then collectively developed into a consensus coding scheme. The resulting coded responses clustered into five main themes, with interrelationships established between primary themes and subthemes. Using these themes, the second round Delphi survey was developed requesting the same employers to rate the importance of computational skills on a 5-point scale from 'not important' to 'very important'.

In addition to presenting results of both rounds of the Delphi study and the model of computational abilities, we will present the pre- and post-course student survey results conducted in classes where curriculum changes have been piloted based on this model of computational thinking. The purpose of the student surveys was to find out if changes in the courses reflected the computational skills highlighted by industry representatives as essential skills for graduates to possess.

Finally, we will present recommendations for larger scale curricular changes in the different engineering disciplines based on the findings from this study.


1.0 Introduction

The engineering workplace has been impacted by rapidly developing computational technologies that are radically reshaping the nature of the workplace.1 This and other immense changes in global political and economic dynamics mean the 21st century engineer will look very different from their 20th century counterparts.2 While these changes can be seen as a real threat to the engineering job market, engineers who have learned how to harness computational capabilities for advanced analysis and problem-solving will continue to be in great demand for decades to come. However, while broad, general skills such as computational capabilities are recognized as crucial to future careers, there is a dearth of understanding as to how to characterize these abilities and how to integrate them into STEM curricula.3 To make sure that future engineering graduates are properly prepared for the 21st century workplace, our multidisciplinary National Science Foundation project, CPATH: Computing Across Curricula, has a twofold goal: (1) to characterize and develop a computational thinking thread that extends beyond the freshman-year computing course to all levels of the engineering curricula, and (2) to increase students' computational competency by applying appropriate computing approaches in the problem-solving process.

Developing computationally capable engineers requires an understanding of both what capabilities matriculating students bring with them4 and what engineering employers expect from their new hires. In addition to being able to characterize students' computational capabilities at these endpoints, engineering faculty and instructors need to be able to employ a framework of computational capabilities that both provides a vehicle for measuring capabilities of students at the beginning and end of their courses and helps guide their strategies around pedagogical innovations intended to impact these capabilities. With the purpose of specifying the trajectory of computational capabilities through an undergraduate engineering program, the project gathered data from numerous sources: 1) to understand the endpoint trajectory of graduating engineers by engaging representatives of companies which historically have hired our graduates, and 2) to develop a theoretical framework of computational capabilities and an instructional model that can be used to characterize student abilities and guide instructional design.

To this end, the research reported here provides background on the overall approach to the research project, results to date, and new data aimed at answering the following research questions:

1. What computational capabilities do future managers of engineering graduates expect new hires to possess?

2. Can the instructional model of computational capabilities, developed during the research process, be used to characterize the experiences, abilities, and the perceptions of engineering students along dimensions of interest to future employers?

2.0 Approach

Figure 1 depicts the overall approach to data collection and analysis to address the two primary goals of this project: (1) to create a computational thinking thread in engineering programs that spans from the freshman to senior years and bridges the divide between freshman year computing and computing in upper-level classes, and (2) to enable students to take computing competencies to the next level, where they are able to perform high-level computing tasks within the context of a discipline.

Both the experience of engineering education faculty on the project and ongoing reporting in the literature5, 6 indicated that our strategy should involve the examination of both commonalities and differences in computational literacy goals across a broad range of engineering disciplines. The Delphi process7, a method used to gain consensus on a topic among a group of experts, was selected as the most robust approach to inductively identify the specific needs of engineering-oriented industries that have been ill-defined in the literature. A literature review and a Delphi process were paired with the implementation of a “Computing across Curricula” (CAC) community which sponsored engineering faculty fellows to participate in a seminar series and conduct action research projects in their classrooms. The CAC community was a crucial test bed for developing and testing a Computational Capabilities Theoretical Framework.

It was our intention that the emerging theoretical framework and the research results from this project be used for further research, curriculum decision making, and classroom change. This is reflected in the schematic diagram below, where research results have informed both classroom interventions and the design of subsequent stages of the research process.

Figure 1: Schematic showing the overall approach to data collection and analysis.


2.1 Initial Industry Panel and Computational Capabilities Instructional Model

With the goal of defining an initial set of computational capabilities around which the Delphi questions could be developed, an industry panel was convened.8 The industry panel was also asked to identify individuals who would participate in the Delphi study. The industry panel included thirteen participants and represented companies in the computing, energy, textile and healthcare industries. Participants included senior executives, as well as first-line engineering managers, and represented five different engineering disciplines. A small brainstorming activity was facilitated in each subgroup using Affinity Diagrams to answer three open-ended questions drafted by our project team, with the potential for those questions to be used later in the Delphi process. The three questions posed to the industry panel were:

• What proficiencies and fluencies are required for new hires in your company?
• What proficiencies and fluencies do you expect your workers to develop during their first years on the job?
• What new proficiencies and fluencies do you see emerging in the next couple of years in your field?

From the combination of results from both subgroups, some common themes emerged, as shown in Table 1. These results and feedback from the workshop were utilized to construct the first Delphi.

Table 1: Common Themes from the Industry Panel Workshop

New hires                                    | After first year on job                            | Next few years
Specific applications (domain knowledge)     | Technological tools                                | Architecture & technology skills
Problem solving skills (critical thinking)   | Systems knowledge                                  | Soft skills (global issues)
Communication skills                         | Self motivated innovation                          | Accountability
Knowledge of a programming language          | Understanding business needs (value proposition)   | Data exploration
Database management skills                   | Data reporting                                     |

In parallel with the industry panel work was a comprehensive literature review pertaining to computer competency, proficiency, and fluency at the university level. The results of the literature review revealed broad and inconsistent interpretations of the terms competency, proficiency, and fluency, with very little material that spoke to the specific needs of the engineering profession. While numerous articles were collected as part of the literature review, two documents ended up being central to the model development: an influential National Research Council report9 and more recent work done by Dougherty and colleagues.10

Using the outcomes of the industry panel workshop and literature review, a first draft of a Computational Capabilities Instructional Model was completed (Figure 2). This model is part one of the Computational Capabilities Theoretical Framework that emerged.

The instructional model we developed looks at computational capabilities needed in a problem-solving context, central to both professional engineering practice and, appropriately, engineering education.11 Basic, relatively stable intellectual capabilities are recognized as essential for problem solving, including the general cognitive abilities necessary for learning and applying declarative and procedural knowledge as well as for engaging in the problem-solving process. Technical skills refer to the abilities pertaining to manipulating a specific software tool or programming in a particular language to solve the problem. Two types of specific knowledge also need to be applied to the problem. Conceptual knowledge is higher-level knowledge (i.e., understanding at a more abstract level) of computing systems and languages in general. Application domain knowledge is knowledge of the engineering discipline where the problem resides (e.g., polymer synthesis, circuit design, mechanical coupling design).

Figure 2: The Computational Capabilities Instructional Model

Using this model, three levels of Computational Capability were defined (Table 2), and this forms part two of the Computational Capabilities Instructional Model. The goal was to establish terminology to describe curricula that identify what general capabilities should be assumed as students leave secondary education and matriculate into an undergraduate engineering program, and that can be used to describe competencies as these same students enter the workforce.


Table 2: Levels of Computational Capability

Competency: The individual has technical skill mastery of certain computational tools and/or programming languages. Limits in conceptual knowledge mean that they are limited to solving well-defined tasks with specified tools. When faced with more open-ended or complex problems, limits in conceptual knowledge will mean they will probably not be able to solve the problem.

Proficiency: The individual has some conceptual knowledge of both computing systems and their application domain. When presented with a problem, they are able to select the appropriate tool(s), seek the necessary information, and present a solution. The regularly used technical skills are committed to memory, and external information resources are not needed in these cases. More complex problems, and problems with multiple possible solution paths for which they have to evaluate the quality of the different solution paths, will create difficulties for the individual. Overall intellectual capability may be a limiting factor.

Fluency: The individual has extensive knowledge of the technical tools and conceptual aspects of both computer systems and the application domain of their profession. Within their professional area, they are able to design and evaluate multiple solution paths to complex problems. They are well versed in general knowledge in the problem space and do not need to refer to external resources for common problems. New computing tools are readily evaluated and integrated into their existing tool set. Limits to problem-solving usually result from moving outside their professional application domain or the bounds of general intellectual capabilities.

During their four years in an engineering program, students continue to develop a competency level of both general capabilities useful in many areas of their education, as well as specific capabilities for their chosen discipline. The instructional model assumes that many of the computational capabilities that a student develops and applies will be in a problem-solving context. It also assumes that as students move through their undergraduate program and into workplace settings, these problem-solving scenarios will become increasingly ill-defined and complex12, 13, requiring the proficiency level of computational capability. It is important to note that the assumption (based on feedback from the industry panel) is that few students will develop capabilities at the fluency level prior to embarking on a professional engineering career.

2.2 Delphi Round 1

The first round of a Delphi study involved posing a set of open-ended questions to a group of experts.14 The details of the Delphi process we used are described in last year's conference paper.15 In summary, our experts were engineering line managers who directly supervise new engineering hires in a company, including graduates from our university. Using the initial industry workshop and the model presented above as the framework, we generated six open-ended questions regarding the competencies, proficiencies and fluencies that industry expects of (a) new engineering hires and (b) employees after a few years on the job. We requested the Delphi participants to answer these questions, providing as much detail as possible.


These six Delphi questions were:

1. What computing competencies are required for new technical hires at your company?
2. What computing proficiencies do you expect your technical employees to develop during their first few years on the job?
3. What new computing skills and processes do you see emerging in the next couple of years in your field?
4. Once fluent, what types of problems do you expect your technical employees (with 3-5 years of experience) to solve using computing tools?
5. Once fluent, what types of projects do you expect your technical employees (with 3-5 years of experience) to design using computing tools?
6. What computing capabilities do you expect technical employees to use to be successful in a global work environment?

All survey responses were content analyzed by three researchers who read all responses, generated a coding list, coded all responses, checked for inter-rater agreement, sought consensus if there was any disagreement about coding statements, generated a set of themes, and then explored the relationships among the themes. Thematic clustering of all responses, developed through consensus, was used to summarize these relationships.
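The paper does not name the agreement statistic used when checking inter-rater agreement; as an illustration only, the minimal sketch below computes simple percent agreement and Cohen's kappa for a pair of coders over a shared set of theme codes. The coder lists and code labels are hypothetical, not the study data.

```python
from collections import Counter

def percent_agreement(codes_a, codes_b):
    """Share of responses to which two coders assigned the same theme code."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders (Cohen's kappa)."""
    n = len(codes_a)
    p_observed = percent_agreement(codes_a, codes_b)
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    # Expected agreement if each coder assigned codes independently
    # at their observed marginal rates.
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                     for c in set(codes_a) | set(codes_b))
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical example: two coders labelling the same five responses
# with theme codes from Appendix 1.
coder_1 = ["DAS", "MOT", "BKP", "DAS", "WS"]
coder_2 = ["DAS", "MOT", "PPL", "DAS", "WS"]
print(percent_agreement(coder_1, coder_2))           # 0.8
print(round(cohens_kappa(coder_1, coder_2), 2))      # ~0.74
```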

2.3 Delphi Round 2

For the second round of the Delphi, we focused on the computing competencies required for new technical hires (Question 1 in Delphi 1) and those that were expected to be developed during the first few years on the job (Question 2 in Delphi 1). Themes emerging more than once from any question in Delphi Round 1 were used to develop a Likert scale survey, which was sent back for completion by the respondents from the first Delphi. [We also recruited additional respondents in this round who represented non-computing industries, specifically textile engineering.]

The survey contained three sections: the first section included three personal identification questions (i.e., respondent name, job title, and company name), the second section had 12 questions pertaining to the computing capabilities of new hires, and the third section had 14 questions pertaining to computing capabilities expected to develop during the first years on the job. Each response was assigned a value (1 = not important, 2 = slightly important, 3 = average importance, 4 = important, 5 = very important) and the mean rating given by all respondents was calculated for each question. Responses with a mean value higher than 4.0 and with a standard deviation less than 1.0 indicate a high level of consensus among participants about the importance of that particular item, while responses with lower means and higher standard deviations reflect lower levels of consensus.14
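As a minimal sketch of the consensus rule just described, the per-item mean and standard deviation can be computed and checked against the mean > 4.0 and SD < 1.0 thresholds; the item names and ratings below are placeholders, not the study data.

```python
import statistics

# Hypothetical ratings on the 5-point importance scale, keyed by survey item;
# these are placeholder values, not the actual Delphi responses.
ratings = {
    "Ability to Learn & Adaptability": [5, 5, 4, 5, 4, 5],
    "Database Management":             [2, 4, 1, 3, 5, 2],
}

for item, scores in ratings.items():
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)  # sample standard deviation
    consensus = mean > 4.0 and sd < 1.0  # consensus criterion described above
    print(f"{item}: M = {mean:.2f}, SD = {sd:.2f}, consensus = {consensus}")
```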

Results were further analyzed by type of engineering industry, with computer science, electrical engineering, computer engineering, information technology, and engineering computer services responses aggregated separately from the responses of other industries, which included chemical engineering, textile engineering, mechanical engineering, civil engineering, chemical and biomolecular engineering, and industrial and systems engineering. This decision to analyze the computer science-related and other engineering responses separately came out of the first round of the Delphi analysis, with the conclusion that these engineering areas had unique computational capability needs.

2.4 Student Surveys

During the seminar series, the faculty fellows were exposed to the Computational Capabilities Instructional Model (Figure 2 and Table 2) and the Delphi results. The industry needs, along with other information about computing innovations, were presented to inspire these faculty members to consider appropriate classroom pedagogical changes that would enhance the computational capabilities of their students. Each faculty fellow who has participated in the seminar series has undertaken some instructional change in at least one course that s/he is teaching in order to enhance the computational capabilities of their students, with each engaging in a parallel action research project. [The detailed results from each action research project undertaken, though beyond the scope of this paper, will be reported in the future.] In courses where instructional changes were implemented, students were recruited to take a pre- and a post-test survey, to examine whether changes in the course reflected the computational skills highlighted by industry representatives as essential skills for graduates to possess.

Construction of the initial student survey was based on the most common themes that emerged from the first round of the Delphi and on the Computational Capabilities Instructional Model, and contained five sections. The survey was piloted in summer 2009 and changes were made based on student feedback. The final survey was deployed in fall 2009, and again in spring 2010, although the latter results are not yet available. In the first section of the pre-survey, students rated the frequency of use of particular computing processes/tools, thinking about the semester prior to the current one, while in the second section they rated their level of skill in using those processes/tools in the previous semester. In the third section, students rated the perceived value of those processes/tools to their learning in the previous semester. The fourth section contained attitudinal questions about learning and about computer tools, and the fifth section asked demographic questions. The post-test contained the same items, but in this instance, students were asked to answer the questions as they pertained to the “enhanced computational capabilities” course in which they were enrolled. The mean ratings of the pre- and post-survey were calculated and compared using a paired t-test.
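A minimal sketch of this pre/post comparison, using SciPy's paired-samples t-test on one item's ratings, is shown below; the scores are placeholders rather than the actual survey responses.

```python
from scipy import stats

# Hypothetical pre- and post-course ratings from the same ten students for a
# single survey item (e.g., frequency of Web Search use on the 1-5 scale).
pre  = [5, 4, 5, 5, 4, 5, 3, 5, 4, 5]
post = [3, 4, 3, 4, 2, 4, 3, 3, 4, 3]

t_stat, p_value = stats.ttest_rel(pre, post)  # paired (dependent) samples t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```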

3.0 Results

The Delphi Round 1 research (previously reported in more detail15) resulted in the identification of five thematic cluster areas that were used both to inform the Delphi Round 2 research and to shape the implementation of the student surveys. These three areas of data collection and analysis are reported in more detail below.

3.1 Delphi 1 Results

A total of 19 participants in the first round of the Delphi represented six different engineering disciplines, with work experience ranging from 1 to 20 years and positions ranging from first-line engineers to senior managers.


There were 30 themes identified across the Delphi 1 survey questions, and each theme was given a code. Five overlapping clusters emerged from our examination of the relationships among themes.15 These five meta-themes were the result of clustering the larger set of more granular codes, listed in Appendix 1, which emerged from the Delphi Round 1 coding. These larger thematic clusters were:

• Computer Science (including Microsoft Office Tools, Basic Knowledge of Architecture, Knowledge of Programming, Basic Operating Systems),
• Data Analysis (including Data Analysis Skills, Database Fundamentals, Database Management),
• Design Modeling and Simulation (including Ability to Use Simulation Packages, Process Modeling and Design, Proficiency in Simulations),
• Core Individual Work Skills (including Web Searching, Problem Solving),
• Meta-Project Level (including Project Management Applications, Communication Tools).

This thematic analysis revealed there was a core set of computing skills common to all engineering disciplines. For instance, across the Computer Science cluster, there were Core Individual Work Skills (e.g., Microsoft Office Applications) common to all engineering disciplines and there were also themes related to engineering problem-solving common to all disciplines. While the specific tools varied, all disciplines needed to manage engineering data as part of their problem-solving processes (e.g., Database Management, Database Fundamentals). Meta-Project Level themes were also common across disciplines and relate to connecting engineering design work to other aspects of the companies' work. Many of these “soft skills”, while not directly related to engineering problem-solving, have long been recognized by engineering educators as key capabilities valued by the engineering profession.

As noted above, for computer science and computer and electrical engineering, much of the day-to-day computational work stayed within the Computer Science cluster. However, other engineering areas weighed heavily on themes that appeared in the Design Modeling and Simulation cluster. Data Analysis held themes that were related to both Computer Science capabilities and, somewhat more indirectly, to the design process, such as managing engineering data used in the decision-making process. As noted above, design data management, whether it was related to computer science/computer and electrical engineering or to engineering areas making use of modeling and simulation tools, made use of database management tools that also resided in the Data Analysis theme.

3.2 Delphi 2 Results

To answer the first research question, relating to the computational capability expectations of future employers of engineering graduates, a total of 22 respondents representing nine different engineering disciplines from different organizations completed the second round of the Delphi.

Of the 12 questions pertaining to computing capabilities of new hires, no response had a mean value higher than 4.0 together with a standard deviation less than 1.0, indicating that no consensus was achieved among participants about the relative importance of different computing capabilities of new hires (see the ranking of the capabilities in Table 3). In contrast, of the 14 computing capabilities expected to develop during the first years on the job, there was consensus across all industries on four capabilities (as indicated by a mean above 4.0 and a standard deviation less than 1.0): Ability to Learn and Adaptability, Microsoft Office Tools, Industry-Specific Tools, and Data Analysis Skills. These were considered important skills to develop in those first years (see the ranking of the computing capabilities in Table 4).

To analyze responses by type of engineering industry, the responses from the computer science and computer/electrical engineering industries were clustered together; these indicated consensus on one capability, Basic Knowledge of Programming, as an important computing ability for new hires (see Table 3). There was consensus that five capabilities, Proficiency in Programming Languages, Ability to Learn & Adaptability, Basic Knowledge of Programming, Integrated View of Systems/Applications, and Web Programming & Language, were to be developed during the first few years on the job (see Table 4).

The results of the responses from all other engineering industries, analyzed together, indicated consensus that two capabilities, Microsoft Office Tools and Web Search, are important skills for new hires (see Table 3). There was also consensus that five capabilities, Microsoft Office Tools, Ability to Learn & Adaptability, Industry-Specific Tools, Data Analysis Skills, and Communication Tools, were expected to develop during the first years on the job (see Table 4).

Table 3: Computational capabilities expected for new hires

                                         All (N=22)        CSC Only (N=8)     Non-CSC (N=14)
Computing Capability                     Rank  M    SD     Rank  M    SD      Rank  M    SD
Web Search                                1    4.2  1.2     3    3.6  1.8      2    4.6  0.7
Microsoft Office Tools                    2    4.1  1.2     5    3.1  1.3      1    4.6  0.6
Data Analysis Skills                      3    3.9  1.1     4    3.5  1.1      3    4.1  1.1
Basic Knowledge of Programming            4    3.6  1.3     1    4.6  0.5      7    3.1  1.3
Proficiency in Programming Languages      5    3.4  1.3     6    3.1  1.3      7    2.8  1.2
Communication Tools/Organization          6    3.3  1.4     8    2.6  1.4      4    3.7  1.2
Queries Debugging/Testing                 7    3.2  1.4     2    4.0  1.3      7    2.8  1.3
Web Programming & Language                8    3.1  1.2     4    3.5  0.8      7    2.8  1.3
Database Fundamentals                     9    3.0  1.1     7    2.9  1.3      6    3.1  1.1
Process Modeling & Design                10    2.9  1.5     9    2.0  1.2      5    3.4  1.5
Software Systems Design                  11    2.8  1.4     3    3.6  1.2      8    2.3  1.3
Basic Knowledge of Architectures         12    2.6  1.3     4    3.5  1.2      9    2.1  1.0


Table 4: Computing capabilities expected to develop during the first few years on the job

                                             All (N=22)        CSC Only (N=8)     Non-CSC (N=14)
Computing Capability                         Rank  M    SD     Rank  M    SD      Rank  M    SD
Ability to Learn & Adaptability               1    4.7  0.6     1    4.8  0.5      2    4.6  0.6
Microsoft Office Tools                        2    4.3  0.9     7    3.5  1.1      1    4.7  0.5
Data Analysis Skills                          3    4.2  0.8     6    3.6  0.9      3    4.5  0.5
Industry-specific Tools                       3    4.2  0.9     6    3.6  0.9      3    4.5  0.7
Communication Tools/Organization              4    3.9  1.1     8    3.1  1.3      4    4.3  0.8
Basic Knowledge of Programming                5    3.6  1.4     3    4.5  0.8      8    3.1  1.4
Proficiency in Programming Languages          6    3.5  1.5     2    4.8  0.5     11    2.7  1.4
Process Modeling & Design                     7    3.3  1.6    11    2.6  1.5      5    3.7  1.6
Project Management Applications               8    3.3  1.2     9    2.9  1.1      6    3.5  1.2
Web Programming & Language                    8    3.3  1.5     5    4.0  0.9     10    2.9  1.6
Integrated View of Systems/Applications       9    3.2  1.4     4    4.1  0.8     11    2.7  1.4
Basic Knowledge of Architectures             10    2.9  1.6     4    4.1  1.1     12    2.2  1.4
Proficiency in Simulations                   11    2.8  1.3    12    2.1  1.3      7    3.2  1.2
Database Management                          12    2.7  1.1    10    2.4  0.9      9    2.9  1.1

3.3 Student Survey Results

To answer the second research question in this project (whether the previously developed instructional model of computational capabilities can be used to characterize the experiences, abilities, and the perceptions of engineering students along dimensions of interest to future employers), a sample of 92 students completed a pre- and post-course survey. The survey was constructed by selecting the six themes that appeared most frequently in the industry responses to Delphi Round 1. Most students surveyed were between 19 and 21 years of age (78.9%), male (82.4%), and white (84.9%; 5.8% were African-American). Students (35.2% sophomore; 40.7% junior; 24.2% senior) were recruited from five courses (including graphical communications, chemical engineering, mechanical engineering, and textile engineering courses), with a mean GPA of 3.27 (SD = 0.05).

The results indicated that, on average, students applied two computing capabilities, Web Search and Communication Tools/Organization, more than once a week during classes in the previous semester (Spring 2009). However, for the semester in which they were enrolled in an enhanced computational capabilities course (Fall 2009), students reported significantly less usage of computing abilities related to web searches, communication tools, and basic knowledge of programming (see Table 5).


Table 5: The frequency with which students applied the computing capabilities during the class

                                       Pre-test        Post-test
Computing Capability                   M      SD       M      SD
Database Fundamentals                  1.84   0.99     1.74   1.01
Process Modeling & Design              2.01   1.27     1.96   1.21
Basic Knowledge of Programming a       1.95   1.23     1.36   0.64
Data Analysis Skills                   2.85   1.14     2.63   1.25
Communication Tools/Organization a     4.11   1.23     3.39   1.47
Web Search a                           4.69   0.73     3.39   1.45

Note. Items are rated on a 5-point scale, ranging from 1 (Never) to 5 (Daily). a indicates pre- and post-test means are significantly different from each other at p < .05.

In terms of self-reported skill level, students initially rated themselves as competent in Communication Tools/Organization and as proficient in conducting Web Searches in the semester prior to taking the enhanced computational capabilities course. However, in the semester of taking the enhanced computational capabilities course, the self-rating of Web Search skills significantly decreased, while the ratings of abilities related to Database Fundamentals, Basic Knowledge of Programming, and Data Analysis Skills increased significantly (see Table 6).

Table 6: The self-reported skill level in applying the computing capabilities during the class

                                       Pre-test        Post-test
Computing Capability                   M      SD       M      SD
Database Fundamentals a                2.11   1.07     2.34   1.11
Process Modeling & Design              2.43   1.31     2.59   1.19
Basic Knowledge of Programming a       1.82   1.01     2.07   1.15
Data Analysis Skills a                 2.51   1.01     2.95   1.09
Communication Tools/Organization       3.75   1.02     3.64   1.12
Web Search a                           4.43   0.73     4.10   1.08

Note. Items are rated on a 5-point scale, ranging from 1 (Novice) to 5 (Expert). a indicates pre- and post-test means are significantly different from each other at p < .05.

As regards the perceived value of different computing capabilities in facilitating their college education, students gave high ratings to Data Analysis Skills, Communication Tools/Organization, and Web Search at the beginning of the semester. However, at the end of the semester, the perceived value of the six computing capabilities in facilitating the student's college education had decreased significantly for all six dimensions (see Table 7).


Table 7: The perceived value of the computing capabilities in facilitating students' college education

                                       Pre-test        Post-test
Computing Capability                   M      SD       M      SD
Database Fundamentals a                2.46   0.89     2.08   0.99
Process Modeling & Design a            2.88   0.98     2.24   1.11
Basic Knowledge of Programming a       2.25   0.84     1.79   0.86
Data Analysis Skills a                 3.19   0.79     2.76   1.04
Communication Tools/Organization a     3.54   0.76     3.08   1.02
Web Search a                           3.67   0.56     3.10   1.06

Note. Items are rated on a 4-point scale, ranging from 1 (Not at all) to 4 (Very valuable). a The pre- and post-test means of all computing capabilities are significantly different from each other at p < .05.

The authors also examined the impacts of the course intervention on (a) students' self-efficacy about learning in the discipline of engineering/computer science (9 questions) and (b) students' self-efficacy in using computers (7 questions). The results, aggregated across the questions in each scale, are shown in Table 8. No change in either of the scales was found after implementing the course interventions.

Table 8: Self-reported engineering/computing self-efficacy and computer self-efficacy

                                                 Pre-test         Post-test
Scale                                            M       SD       M       SD
(a) Engineering/computer science self-efficacy   32.38   4.73     32.73   4.36
(b) Computer use self-efficacy                   27.67   4.23     27.46   3.84

Note. Items are rated on a 5-point scale, ranging from 1 (Strongly Disagree) to 5 (Strongly Agree). There were no significant differences in pre- and post-test results.

4.0 Discussion

Overall, the iterative process outlined in Figure 1 has provided a robust approach (a) for developing a Theoretical Framework, (b) for eliciting feedback from industry professionals about specific industry needs, (c) for using results to inform subsequent research, (d) for making decisions about curriculum and instructional practices, and (e) for assessing whether curricular and pedagogical innovations incorporated into engineering courses, in response to these findings, have impacted the computational capabilities of students.

Page 15: AC 2010-373: COMPUTATIONAL THINKING: WHAT SHOULD …rouskas.csc.ncsu.edu/Projects/CAC/Pubs/ASEE-CAC-2010.pdfenvironment. He has also worked on the integration of scientific visualization

We will now respond to the two research questions initially posed:

1. What computational capabilities do future managers of engineering graduates expect new hires to possess?

Round one of the Delphi effectively gathered the range of computational capabilities perceived by industry representatives to be important for their specific disciplinary area. The first two questions, relating to expected capabilities of new hires and to expected capabilities needed after a few years on the job, were particularly relevant 1) for understanding the endpoint trajectory of our graduates, and 2) for developing a framework of computational capabilities to be used in characterizing student abilities and guiding instructional design. The broad themes that emerged were particularly important for faculty fellows as they considered the content changes and the instructional modifications necessary for developing the computational capabilities needed for the future employment of students.

Similarly, the Delphi Round 2 highlighted areas of consensus and non-consensus across industries and within similar industries. It seems that the Delphi process employed in this project has effectively captured the broad computational capability expectations of future employers of engineering graduates. It is not surprising that there was little consensus across the spectrum of computer science/computer engineering and the range of other engineering disciplines, because each specific area has unique requirements. However, consensus on the expectation of some computing capabilities was reached, relating to new hires and to employees who have worked for a few years, for both the computer science/computer engineering industries and the non-computer science industries. For faculty, the industry consensus around specific skills relevant to their discipline makes it clear what skills students need to acquire before graduation. The consensus items provide a set of priorities for student learning, and thus for course design and instructional emphasis.

The only “general job skill” not tied to computational capabilities that was included on the survey was “Ability to Learn and Adaptability,” in part because so many respondents in Round 1 of the Delphi brought up this qualification. Not surprisingly, there was also consensus across all industries in Round 2 on the importance of this soft skill being developed in the first few years of employment. This capability was also a consensus theme for both industry groups when the data were analyzed by industry category.

There was no overall consensus on any capabilities for new hires. The other overall consensus capabilities across all industries related to the first years on the job, and these were the use of Microsoft Office Tools, Data Analysis Skills, and Industry-specific Tools. However, when the computer science/computer engineering industries were analyzed separately, these were not their major consensus capabilities, indicating that they have very specific computer science related expectations of capabilities that are more important: Basic Knowledge of Programming, Proficiency in Programming Languages, Web Programming and Languages, and Integrated View of Systems/Applications.

Overall, there was an expectation, perhaps not surprisingly, that Industry-specific Tools would be learned on the job, but that new recruits would bring with them to their new job the general abilities to use MS Office and Web Search tools. There was overall consensus on the importance of Data Analysis Skills being enhanced during the first years on the job, but this was not a consensus capability as a prior skill to be learned at university (although it did rate fairly highly). Perhaps Data Analysis Skills are considered to be more industry specific, although there was consensus for non-computer science jobs around the importance of this capability. Perhaps this is less valued in some computer science areas, and so did not show up as a consensus capability for the computer science industries alone.

Not surprisingly, basic programming was important for new hires into computer science jobs, with higher level proficiencies being developed on the job. Interestingly, there was not any consensus about basic programming for new hires in non-computer science industries.

2. Can the instructional model of computational capabilities, developed during the research process, be used to characterize the experiences, abilities, and the perceptions of engineering students along dimensions of interest to future employers?

With respect to the second research question, the variations among respondents and the differences between the six computing capabilities framed in the student survey seem to indicate that the Computational Capabilities Instructional Model can be used to characterize some of the experiences, abilities, and perceptions of engineering students. Overall, it appears that the survey was useful in tracing the frequency of use of different capabilities over different semesters and for tracking the perceived skill level between semesters. However, the results about the perceived value of the different capabilities are somewhat difficult to interpret, with a decrease in all dimensions by the end of the semester in which the course was taught. The engineering/computing self-efficacy questions did not show any differences between semesters.

In all cases, the survey results showed a decrease in the frequency of use of the different computing capabilities between the previous semester and the end of the semester in which the enhanced course was implemented. Three of these, Basic Knowledge of Programming, Communication Tools/Organization, and Web Search Tools, showed statistically significantly lower levels of use. The decrease in these specific capabilities is not too surprising, particularly for the latter two, because many instructors may assume students already have these capabilities and so did not specifically target them, and programming would be less of a focus in the non-computer science courses. However, it is somewhat surprising that there was no increased frequency of use for any of the capabilities. It may be important to discuss with faculty implementing enhanced courses whether they are actually teaching towards these capabilities, or whether additional capabilities need to be added to the survey.

Students self-rated their skill levels for Database Fundamentals and Basic Knowledge of Programming in the advanced beginner range, indicating they are still developing competencies in these areas. Process Modeling and Design and Data Analysis Skills were rated in the competent range, while students rated themselves in the proficient range for Communication Tools/Organization and Web Searches. However, the relatively small difference in scores between the pre- and post-tests, for all six computing capabilities, indicates that the shift towards increased competency, proficiency or even fluency does not occur in a single semester. Indeed, there may even be a self-reported backward movement in their development, as seen in the case of the Web Search capability, which may not have been used much in the semester concerned. The movement towards greater computational ability likely occurs across a range of courses during the student's program, where the different computational capabilities are used and expanded upon at successive levels in the curriculum. Thus, it would be valuable to track students' self-reported capabilities over time.

The survey results show alignment between industry expectations and student perceptions about the importance of Microsoft Office/Communication tools and Web Searching tools. Students seem to come into their classes with high ratings both of how often they used communication, web, and programming tools in the previous semester and of their perceived abilities with them. On one hand, they probably used some of these tools less than they did in the previous semester, but their use of them perhaps also demonstrated that they did not know as much as they thought they did.

The lower perceived values of the six computing capabilities and the higher self-reported skill levels of the computing capabilities in the semester of taking the enhanced computational capabilities course may demonstrate that the value of learning a particular computing skill/tool decreases as the corresponding skill level increases. Of note, in the area of more specialized engineering tools (database, programming, data analysis), the courses seemed to be successful in providing enhanced (self-reported) skills. So, another explanation for the lower frequency of use may be that while the frequency of use was less, the level of application was deeper and more intensive, leading to greater ratings of skill levels for those items.

It also seems that the classroom pedagogical changes may impact students' self-efficacy about learning in the discipline of engineering/computer science and their computer use self-efficacy in the long term, rather than over one semester. Again, tracking students over many semesters would help to determine if there are shifts in efficacy over time.

5.0 Recommendations and Future Work

The two rounds of the Delphi process have provided a benchmark against which innovations in enhancing computational capabilities can be developed across engineering disciplines. Future Delphi studies may target specific engineering disciplines or sub-areas within disciplines, so that specific engineering curricula or curriculum concentrations/tracks may be better served by understanding the more detailed nuances of particular industries.

The first implementation of a student survey instrument has shown promise in revealing the impact of pedagogical changes in the classroom on students' skill levels in some of the computational capabilities indicated as important by industry representatives. Clearly more than one semester with a relatively small student sample will be needed to see if the innovations explored by our CAC Fellows are effective. Also, tracking students over successive semesters is needed to trace the development of computational capabilities throughout their curriculum.

The results of the first implementation of the student survey, the findings from the second round of the Delphi, and discussions about the student survey results with faculty implementing the enhanced courses should all be used to further refine the student survey instrument prior to a larger scale roll-out. However, even our initial data has indicated that our instructional model of computational capabilities has provided a useful framework for generating formative and summative data.

The instructional model within the larger theoretical framework of computational capabilities, plus the benchmarks established by the Delphi process, provides a mechanism for beginning to flesh out a progression of increased computational skills and knowledge over the undergraduate years. As more classroom experiments are conducted and more feedback from faculty is elicited, progressions that differentiate not only between years but also between disciplinary areas will provide rich, detailed data around which strategic instructional decisions can be made. A crucial feature of these progressions is the need for instructors to continuously emphasize, throughout each year and across all core classes, the use and further development of these computational capabilities. As industry needs continue to evolve, ongoing data collection from students and faculty, as well as from future Delphi rounds, will provide feedback to enhance the development of curriculum progression strategies and models.

Acknowledgements

This work is supported by NSF (CISE #0722192) as part of the CISE Pathways to Revitalized Undergraduate Computing Education (CPATH) program. The project team would also like to extend its sincere thanks to our partners in industry who served on our panels and our CAC fellows who are implementing their innovations in their classrooms.

Bibliography

1. Levy, F., & Murnane, R. J. (2004). The new division of labor: How computers are creating the next job market. Princeton, NJ: Princeton University Press.
2. Vest, C. M. (2008). Context and Challenge for Twenty-First Century Engineering Education. Journal of Engineering Education, 97(3), 235-236.
3. NRC, National Research Council. (2008). Research on Future Skill Demands: A Workshop Summary. Washington, DC: Center for Education, Division of Behavioral and Social Sciences and Education.
4. Gonzales, R. F., & Renshaw, S. (2005). Connected Computing - A Hierarchy of Pre-Engineering Computing Skill Competencies. Paper presented at the Annual Meeting of the ASEE.
5. Davison, L., & Porritt, N. (1999). Using computers to teach. Proceedings of the Institution of Civil Engineers - Civil Engineering, 132(1), 24-30.
6. Kassim, H. O., & Cadbury, R. G. (1996). The place of the computer in chemical engineering education. Computers & Chemical Engineering, 20, S1341-S1346.
7. Linstone, H. A., & Turoff, M. (1975). The Delphi method: Techniques and applications. Downloaded 1/4/2008 from http://is.njit.edu/pubs/delphibook/
8. Authors. (2008). Paper presented at the ASEE Annual Conference, Pittsburgh, PA.
9. NRC-CITL, National Research Council - Committee on Information Technology Literacy. (1999). Being fluent with information technology. Washington, DC: National Academies Press.
10. Dougherty, J. P., Dececchi, T., Clear, T., Richards, B., Cooper, S., & Wilusz, T. (2003). Information Technology Fluency in Practice. ACM SIGCSE Bulletin, 35(2), 153-171.
11. Atman, C. J., Adams, R. S., Cardella, M. E., Turns, J., Mosborg, S., & Saleem, J. (2007). Engineering Design Processes: A Comparison of Students and Expert Practitioners. Journal of Engineering Education, 96(4), 359-379.
12. Stevens, R., O'Connor, K., Garrison, L., Jocuns, A., & Amos, D. M. (2008). Becoming an Engineer: Toward a Three Dimensional View of Engineering Learning. Journal of Engineering Education, 97(3), 355-368.
13. Strobel, J., & Cardella, M. (2008). Compound Problem Solving: Workplace Lessons for Engineering Education. Paper presented at the ASEE Annual Meeting, Pittsburgh, PA.
14. McKitrick, S. (2007). Assessing 'ineffable' general education outcomes using the Delphi approach. Paper presented at the NCSU Assessment Symposium, Cary, NC.
15. Authors. (2009). Paper presented at the ASEE Annual Conference.


Appendix 1: Finalized Theme Categories from the 1st Delphi Survey

Theme Code

1. Analyze & Evaluate Existing Process AEEP

2. Ability to Learn & Adaptability ALA

3. Ability to Use Simulation Packages AUSP

4. Basic Knowledge of Architectures BKA

5. Basic Knowledge of Programming BKP

6. Basic Operation System BOS

7. Communication Tools/organization COT

8. Data Analysis Skills DAS

9. Driver Concept DC

10. Database Fundamentals DF

11. Database Management DM

12. Forecasting F

13. Financial/Interdisciplinary Knowledge FIK

14. General: Teamwork, Problem solving (not computing competencies) G

15. Industry-specific Tools IT

16. Integrated View of Systems/Applications IVSA

17. Knowledge of Architectures KA

18. Microsoft Office Tools MOT

19. None/Not relevant N

20. Project Management Applications PMA

21. Process Modeling & Design PMD

22. Problem-solving & Problem-shooting PP

23. Proficiency in Programming Languages PPL

24. Proficiency in Simulations PS

25. Queries Debugging/Testing QDT

26. Security Control SC

27. Software Systems Design SSD

28. Virtualization V

29. Web Programming & Language WPL

30. Web Search WS