
ETHICS01 - Introduction to Computer Ethics


An overview of issues related to ethics and the psychology of immorality. Intended for computing students as part of a professional and ethical issues module.


Page 1: ETHICS01 - Introduction to Computer Ethics

+Introduction

Ethical and Professional Computing
Michael Heron

Page 2: ETHICS01 - Introduction to Computer Ethics

+Introduction

The world has moved on since the first software developers wrote the first pieces of software.

It was a dramatically different culture in the early days of computing: mostly academic, mostly reputation driven, software rarely patented, source code freely given away.

Nowadays, software developers work within a much more complicated environment. This course is aimed at helping you navigate that environment.

Page 3: ETHICS01 - Introduction to Computer Ethics

+Module Introduction

This module has the following learning outcomes:

Discuss the ethical and moral obligations that an individual has as a computer professional.

Discuss the ethical and moral obligations that an organisation has within computing.

Analyse case studies of the application of computer ethics to given circumstances, actual and hypothetical.

Advise management on the ethical and professional factors which should be taken into account when planning corporate mission statements and codes of conduct in an organisation.

Page 4: ETHICS01 - Introduction to Computer Ethics

+Module Introduction

The assessment for the module is broken up into two parts.

An individual research study with a word equivalence of 2500. This is aimed primarily at learning outcome 1.

A group report and presentation (word equivalence 2500). This is aimed at learning outcomes 2, 3 and 4.

Both pieces of assessment are equally weighted: 50% each.

You will get the outlines of these in weeks 7 and 8, during class time.

Page 5: ETHICS01 - Introduction to Computer Ethics

+A Complex Environment

Nowadays, the laissez-faire culture has become increasingly interlinked with legislation and ethical considerations. It’s not possible to simply develop ‘in a vacuum’.

Even those who don’t commercialise their code must be aware of legal issues that simply weren’t an issue as recently as when I did my degree. AS RECENTLY.

Large multinational companies have become increasingly litigious with regard to patents and trademarks. Cf. Apple versus Samsung.

Page 6: ETHICS01 - Introduction to Computer Ethics

+A Complex Environment

A culture that is evolving down increasingly collaborative routes requires an understanding of contracts and licences. The open source world in particular has an often pathological focus on the exact terms under which source code can be used.

Companies are increasingly seeing the data that they store and manipulate as belonging to them. Who owns your Facebook data?

Do you even know?

EULAs often require users to sign over their privacy. And most people don’t read them.

Page 7: ETHICS01 - Introduction to Computer Ethics

+A Complex Environment

The world of a computing professional is complicated.

Professional organisations such as the BCS, the ACM and the IEEE all have ethical guidelines to which members are expected to adhere. However, guidelines are a poor substitute for informed decision making.

In this module, we’re not going to look at lists of rules and obligations. Instead, our goal is to get you to think about the implications of decisions, and where responsibility lies in certain scenarios.

Page 8: ETHICS01 - Introduction to Computer Ethics

+Ethics

Ethics are the codes of behaviour that guide a person when nobody is looking. It’s easy to be upstanding when under scrutiny.

Ethics are not the same thing as morals. They are often used interchangeably, but they are distinct terms.

Morals are beliefs, developed from personal experience and teachings, and serve as a measure of personal character. We won’t get too deeply into this… it’s a tinderbox area.

Moral behaviour underpins ethical behaviour, for the most part.

Page 9: ETHICS01 - Introduction to Computer Ethics

+Ethics

Ethics, by contrast, are codes and set principles, in a variety of categories. You may have a personal ethical code which is largely indistinguishable from your morality.

Some of these codes and principles are imposed from without. Professional organisations expect their members to behave in particular ways, whether or not they agree with the morality of the rules.

A lawyer may consider a certain kind of crime reprehensible (personal ethics/morality) but still be required to mount a vigorous defence for his or her client (professional ethics).

Page 10: ETHICS01 - Introduction to Computer Ethics

+Professional Ethical Codes

We will assume in this module that you are capable of reading these as they relate to individual organisations: the ACM, the BCS and the IEEE.

It is enough to know that you will be expected to hold true to these if you become a member of any professional body.

Our focus in this module is instead going to be on how you make ethical decisions, independent of any particular context.

Page 11: ETHICS01 - Introduction to Computer Ethics

+Ethical Decisions

It is important to realise at this point that an ethical decision is often a judgement call. And that call is going to be different from person to person.

Preconceptions, presuppositions and your own sense of what is right and wrong are going to come into play.

In this module we’re going to explore the topic from a number of perspectives: legal, moral and ethical.

Page 12: ETHICS01 - Introduction to Computer Ethics

+Ethical Decisions

First, let’s define what we mean by an ethical decision.

It means ‘doing the right thing’. But how do we judge that?

By what criteria do we define something to be right? Personal morality? Professional obligation? Good to society? Good to an individual?

While your professional ethics may bind you to a particular action, they don’t bind you to a particular judgement.

Page 13: ETHICS01 - Introduction to Computer Ethics

+Show of Hands!

Should a doctor who believes life begins at conception perform an abortion?

Should a lawyer who believes his client to be guilty (but has no evidence) ‘throw’ the trial?

Should a low level employee in a strict, punitive hierarchy be held responsible for following immoral instructions from his or her boss?

Is it okay to tell a small lie if it brings a great good?

Is it okay to tell a big lie if it brings a great good?

Page 14: ETHICS01 - Introduction to Computer Ethics

+No Wrong Answers

There are no wrong answers to any of these questions. Although there are answers that will be more or less wrong depending on how you view the decision. Let’s leave the discussion of moral relativism for another time…

However, in all of these cases it is likely that the real answer is more nuanced. Yes, but… No, unless…

How do we decide what is right and what is wrong in these kinds of scenarios?

Page 15: ETHICS01 - Introduction to Computer Ethics

+What is Right?

Let’s break it down a little into groups of stakeholders. Groups who are impacted by a decision.

In no particular order: the individual making the decision; the people directly impacted by the decision; the society in which the decision maker functions; the network of societies in which the decision maker’s society functions; interested observers; the people who are within the decision maker’s field. Any others?

Page 16: ETHICS01 - Introduction to Computer Ethics

+What is Right?

Decision making rarely reduces to a simple hierarchy of X beats Y beats Z. Unfortunately.

However, we need to assess the impact on each of these stakeholders to arrive at a gauge of the overall impact of a decision.

We also need to assess the intention behind a decision. This in itself is a minefield.

At its simplest, we can break it down like this: Was a decision made for the right reasons? Was a decision made for the wrong reasons?

Page 17: ETHICS01 - Introduction to Computer Ethics

+Right and Wrong Reasons

Some reasons are more ‘right’ than others. We’ll evolve this statement as time goes by.

Almost invariably, people are hard-wired to think of the common good of a decision. A decision taken for selfish reasons (my own personal advancement) is less ‘right’ than a decision taken for social reasons (to advance society).

But, with complex groups of stakeholders, this becomes very difficult to assess. Is what is good for the company always good for the individual? Is it okay to be selfish if it puts you in a position later on down the line to do more good?

Page 18: ETHICS01 - Introduction to Computer Ethics

+Intention and Outcome

What was the outcome of a decision, on each of the individual stakeholders? In an ideal world, it was good for everyone.

But that won’t necessarily make it moral or ethical! Aaaaa!

If it was a good outcome, does it matter if the decision was made for the wrong reasons?

If it was a bad outcome, is it as bad if the person making the decision thought they were doing the right thing?

All of this is tremendously important. And very difficult to pin down.

Page 19: ETHICS01 - Introduction to Computer Ethics

+This Module

This module is shared between myself, Dr Man Qi and Pauline Belford. We’ll each be taking you for different parts of it.

For my parts, they will generally follow a simple pattern. I talk for a while. (Too long, will undoubtedly be the consensus view.) I pass some ethical questions on to you for discussion.

These will be related to the topic of the lecture, but also draw in things that have been previously discussed.

The discussions will be small group based. And I will ask groups to report on whatever conclusions they have reached.

Page 20: ETHICS01 - Introduction to Computer Ethics

+Discussion Intentions

The intention of these discussions is not to arrive at a communal decision. We’re not going to be able to do that, and to try would be madness.

Instead, it’s to ‘explore the space’ around the decision.

If someone has a different view from you, you should try to understand the perspective from which it emerges.

‘Education is the ability to hold two contradictory ideas in mind without getting angry’

No-one is right, no-one is wrong.

Page 21: ETHICS01 - Introduction to Computer Ethics

+Areas of Concern

The module content is wide-ranging, and will touch on a whole host of issues: hacking, past and present; whistleblowing; spamming; free speech versus censorship; pornography and content control; sharing and privacy; intellectual property.

The content is varied because ethical decisions have to be made all the time. Often when they’re not expected.

Page 22: ETHICS01 - Introduction to Computer Ethics

+But I’m a good person.

It is important from the start that we dispense with one particular counter-argument. Isn’t this all common sense? After all, I’m basically a good person.

The problem is that we live in an incredibly complex world within phenomenally powerful social contexts. Anyone can do bad things, given the right combination of environment, opportunity, motive and personality.

It is important to understand this is not an issue that only bad people need to worry about.

Page 23: ETHICS01 - Introduction to Computer Ethics

+Social Contexts

Back in 1971, a Stanford psychologist by the name of Philip Zimbardo conducted an experiment. It is now known as the Stanford Prison Experiment.

In it, he recruited undergraduates to play the part of prisoners or prison guards.

A two-week, live-in experiment, for which participants were financially recompensed. Everyone was a volunteer.

This is one of the most studied experiments in social psychology.

Page 24: ETHICS01 - Introduction to Computer Ethics

+The Stanford Prison Experiment

There were no significant differences between people who were prisoners and those who were guards.

People with psychological problems, disabilities or criminal records were excluded from the study.

People were randomly assigned to groups. Nobody chose for themselves what they were going to be. There was no self-selection bias.

A common problem in data gathering when participation in some aspect of the data gathering is chosen by the participant.

Page 25: ETHICS01 - Introduction to Computer Ethics

+The Stanford Prison Experiment

Prisoners were collected from their homes by police car. None of their neighbours knew why.

They were given a stern speech by the Warden (Zimbardo) about the seriousness of their situation. Given with all the trappings of power that would normally go along with this kind of encounter.

They then underwent a strip search and delousing. They took this experiment seriously, both the participants and the investigator.

At the end of this process, they were put into the prison. And the experiment got underway.

Page 26: ETHICS01 - Introduction to Computer Ethics

+The Stanford Prison Experiment

The prison was constructed within Stanford: a blocked-off corridor, and lab rooms with barred doors for cells. Each cell was bugged with an intercom.

The corridor was ‘the yard’, the only place where prisoners were allowed to walk, eat or exercise. Events in the hall were videotaped through a small opening.

A closet was used for ‘the hole’: two feet wide by two feet deep.

Page 27: ETHICS01 - Introduction to Computer Ethics

+The Stanford Prison Experiment

We don’t have time to do this experiment full justice. Please do read up on it outside of class. It’s hugely important.

Suffice it to say, the experiment ended prematurely after six days. Too many of the prisoners were suffering real psychological damage.

The guards relentlessly bullied and belittled the prisoners. Remember, only the toss of a coin separated them from being prisoners themselves.

The prisoners had bought into the myth that they were actual prisoners. Even going so far as to beg the warden for release, when they were free to leave at any time.

Page 28: ETHICS01 - Introduction to Computer Ethics

+The Stanford Prison Experiment

None of the guards were ‘bad people’. Although they did bad things.

None of the prisoners were ‘weak’. Although they were hugely impacted by the scenario.

It was the power of the fiction that had been created that resulted in the abuses. Guards and prisoners bought into their roles with conviction.

The social context was what drove the participants to the edge. And this is often true outside of experimental scenarios.

Page 29: ETHICS01 - Introduction to Computer Ethics

+The Lucifer Effect

Zimbardo postulates in his book ‘The Lucifer Effect’ that situations like Abu Ghraib and the abuses at Guantanamo are not the result of ‘a few bad apples’, as is often claimed.

Instead, it’s the result of a rotten barrel. Which turns good apples bad.

When the wrong kind of pressures are applied in the wrong kinds of ways, everyone can be a villain.

With the wrong kind of culture for the wrong kind of reasons, every company can be an Enron.

http://www.youtube.com/watch?v=Z0jYx8nwjFQ

Page 30: ETHICS01 - Introduction to Computer Ethics

+The Milgram Experiment

One of the most famous psychological experiments is the Milgram experiment. The chances are high that you will have encountered this somewhere in your life.

For the uninitiated: two ‘volunteers’ are brought into an experiment.

One is a patsy. A stooge. They are assigned ‘randomly’ to the roles of teacher and learner.

The real volunteer is always the teacher. They are asked to participate in an experiment on the effect of electrical shock on recall. Asked by a man in a white doctor’s coat.

Page 31: ETHICS01 - Introduction to Computer Ethics

+The Milgram Experiment

The teacher is placed in front of a machine with various switches on it. The switches have values indicating the strength of the electrical shock.

They go all the way up to 450 volts.

They are asked to provide the ‘learner’ with various recall tasks. Every time the answer is wrong, they flick the next highest switch.

The question is – how many people would deliver a lethal electrical shock just because a man in a white coat asked them to? Let’s see what happens:

http://www.youtube.com/watch?v=IzTuz0mNlwU&feature=relmfu

Page 32: ETHICS01 - Introduction to Computer Ethics

+The Milgram Experiment

Before the experiment, Milgram polled some senior psychology majors to determine what they believed the rate of delivering a ‘lethal shock’ would be. On average, they predicted that 1.2% would inflict a lethal shock.

Around the same percentage as is generally given for psychopaths in society.

Of 40 professional psychiatrists polled: by 300 volts, only 3.73% would continue, and ‘a little over one tenth of one percent’ would inflict a lethal shock.

‘It just wasn’t going to happen’ was the general view of the pre-polling.

Page 33: ETHICS01 - Introduction to Computer Ethics

+Actual Result

In the first set of experiments, sixty-five percent inflicted the final, lethal 450-volt shock. Although they usually did so under considerable protestation.

The experiment was conducted many times around the world. With the results being replicated regardless of country.

It has been repeated many times since. It is one of the best supported results in the psychological literature. Over the years, the rate has remained remarkably consistent: between 61 and 66% will go all the way to the end.

Page 34: ETHICS01 - Introduction to Computer Ethics

+Discussion Questions

What ethical issues are being raised by conducting the Milgram experiment? Who is being hurt?

Physically? Emotionally?

What ethical issues are being raised through the Stanford Prison Experiment? What are the ethical obligations on Zimbardo? What are the ethical obligations on the people in the study?

Should these experiments have been conducted?

Page 35: ETHICS01 - Introduction to Computer Ethics

+Conclusion

It’s hard to do the right thing.

Mainly because it’s hard to judge what the right thing is.

And even if you can, other people won’t think it’s right.

We do the best we can, with the tools we have.

Man, there are a lot of bullet points expected in this template.

Look, another one. And it wants one after that! I’m going to end with a joke.

A man walks into a bar with a giraffe under his arm. The bartender says ‘hey, you can’t leave that lying there’, and oh no, I’ve run out of space.