Computer Science Theory & Introduction
Week 1 Lecture Material – F'13 Revision
Doug Hogan
Penn State University
CMPSC 201 – C++ Programming for Engineers
CMPSC 202 – FORTRAN Programming for Engineers
Hardware vs. Software
Hardware: essentially, things you can touch; input, output, and storage devices; memory
Software: essentially, what the computer knows; data (0s and 1s); programs (this is a software course)
Categories of Memory
Read-only memory (ROM): can only read data
Random-access memory (RAM): can read and write information; primary storage, the computer’s main memory; volatile
Sequential Access vs. Random Access Memory
Sequential access: must access each location in memory in order
Random access: can access memory locations using addresses, in any order
Speed implications?
Measuring Memory
base unit: 1 bit = binary digit, 0 or 1
8 bits = 1 byte (B)
1000 bytes ≈ 1 kilobyte (KB)
1000 KB ≈ 1,000,000 B ≈ 1 megabyte (MB)
1000 MB ≈ 1,000,000,000 B ≈ 1 gigabyte (GB)
1000 GB ≈ 1,000,000,000,000 B ≈ 1 terabyte (TB)
Storage Device Capacities
Floppy disk (the old 3.5” ones) 1.44 MB
Compact disc (CD) 650-700 MB
Digital Versatile/Video Disc (DVD) 4.7 GB
Hard disks, flash drives: typically sized in GB
Software Overview
System software: controls the basic operations of the computer
The operating system manages memory, files, and application software
File management tasks: deleting, etc.
Application software: not essential to the system running; enables you to perform specific tasks
Examples: office software, web browsers, media players, games
Algorithms and Languages
An algorithm is a set of instructions to solve a problem. Think recipes. Many algorithms may solve the same problem; how do we choose?
We use a programming language to explain our algorithms to the computer and write programs.
Programming Paradigms/Models
Imperative programming: specify steps to solve the problem; use methods; methods could get long
Object-oriented programming (OOP): create objects to model real-world phenomena; send messages to objects; typically shorter methods
Event-driven programming: create methods that respond to events like mouse clicks, key presses, etc.
Others: functional, logic, etc.
Compiled vs. Interpreted Languages
Interpreted language: requires software called an interpreter to run the code. Code is checked for errors as it runs (erroneous code: do the best we can…). Examples: HTML, JavaScript, PHP.
Compiled language: requires software called a compiler to run the code. Code must be compiled into an executable before running (and thus be free of syntax errors). Examples: C, C++, Pascal, Fortran, BASIC.
Compiling Process
Source Code (C++, Fortran, …) → compiler → Object Code
Object Code + Object Code from Libraries → linker → Executable Program
Errors
Syntax errors: misuse of the language, much like using incorrect punctuation in English. The compiler reports them; the program won’t run until they’re resolved.
Logic errors: the program doesn’t solve the problem at hand correctly.
Runtime errors: errors that occur while the program is running, e.g. problems accessing memory and files, divide by zero.
Abstraction
Poll: Who can use a CD player? Who can explain how a CD player works? Who can drive a car? Who is an auto mechanic?
Abstraction: the principle of ignoring details that allows us to use complex devices. Focus on the WHAT, not the HOW. Fundamental to CS. Other examples?
Levels of Abstraction
0. Digital Logic
1. Microprocessor
2. Machine Language
3. Operating System
4. Assembly Language
5. High-Level Language
6. Application Software
Binary Numbers
Uses two symbols: 0 and 1; base 2.
Compare with the decimal number system: uses symbols 0, 1, 2, 3, 4, 5, 6, 7, 8, 9; base 10.
At the lowest level of abstraction, everything in a computer is expressed in binary.
Binary Numbers, ctd.
Counting in binary: 0, 1, 10, 11, 100, 101, 110, 111, 1000, 1001, 1010, 1011, 1100, 1101, 1110, 1111, 10000
Places: Decimal: 1s, 10s, 100s, etc. Binary: 1s, 2s, 4s, 8s, etc.
Conversion between decimal and binary uses powers of 2: add up the place values where a 1 appears (binary → decimal), or repeatedly divide by 2 (decimal → binary). “There are 10 kinds of people in the world…”
Other Number Systems
Any positive integer could be the base of a number system. (Big topic in number theory.)
Others used in computer science:
Octal: base 8
Hexadecimal: base 16; new symbols A, B, C, D, E, F
ASCII
Every character on a computer -- letters, digits, symbols, etc. -- is represented by a numeric code behind the scenes.This system of codes is called ASCII, short for American Standard Code for Information Interchange.We’ll learn more in lab…
# Transistors on a Processor
Processor Date Number of Transistors
4004 1971 2,250
8008 1972 2,500
8080 1974 5,000
8086 1978 29,000
286 1982 120,000
386 1985 275,000
486 DX 1989 1,180,000
Pentium 1993 3,100,000
Pentium II 1997 7,500,000
Pentium III 1999 24,000,000
Pentium 4 2000 42,000,000
Data for Intel processors from Section 4.1 of:
Yates, Daniel S., David S. Moore, and Daren S. Starnes. The Practice of Statistics. 2nd ed. New York: Freeman, 2003.
A Graphical View
Graph from Intel's web site (http://www.intel.com/technology/mooreslaw/index.htm); Retrieved 9/24/2006
Pay attention to the units on the axes…
Moore’s Law
Prediction from Gordon Moore of Intel in 1965. Implication: the speed of processors doubles roughly every 12 to 18 months. The data show an exponential relationship.
For the curious: the regression equation from the data two slides back is ŷ = 1648.16 · (1.393)^x.
Can this go on forever?