
Page 1

A Foundation for System Security

Clark Thomborson, 5 August 2009

This presentation is based on “A Framework for System Security”, in Handbook of Computer and Information Security, ed. Mark Stamp, Springer, to appear 2009. A preprint version is available at http://www.cs.auckland.ac.nz/~cthombor/Pubs/Foundation/foundationv21.pdf.

Page 2

Questions to be (Partially) Answered

What is security? What is trust?

“What would be the shape of an organisational theory applied to security?” [Anderson, 2008]

What would be the shape of a security theory applied to an organisation?


Page 3

The Importance of Modelling

Assertion: A human can analyse simple systems (≤ 7 elements or concepts).

Implications:

If we want to analyse complex systems, we must use models (simplifications).

If we want to have confidence in our analyses, we must validate our models.

Validation: Do our analytic results (predictions) match our observations?

Error sources: model, application, observation.

Page 4

Human-based security!

Axioms:

1. Security and distrust are determined by human fears.

2. Functionality and trust are determined by human desires.

Justification (by the Socratic method): If nobody can be harmed or helped by a system, then why should this system be considered secure, insecure, functional, or non-functional?


Page 5

Interactions

Axiom 3: System activity can be decomposed into interactions: A: M(B) → C.

A, B, and C are systems. Note: A, B, or C may be null, e.g. M → C.

M is a message: information (mass, or energy) that is transmitted from A to C, and which may be a function of B.

B is the subject of the message. For example, “A introduces B to C”.
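
A minimal sketch (ours, not from the paper) of how Axiom 3's notation could be encoded as a record type; the Python names are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative: a system is named by a string; None stands for the null system.
System = str

@dataclass
class Interaction:
    """One interaction A: M(B) -> C, per Axiom 3."""
    sender: Optional[System]    # A: may be null, as in M -> C
    message: str                # M: information transmitted from A to C
    subject: Optional[System]   # B: what the message is about (may be null)
    receiver: Optional[System]  # C: may be null

# "A introduces B to C":
intro = Interaction(sender="A", message="introduction", subject="B", receiver="C")
```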


Page 6

Owners and Sentience

Axiom 4: Every system has an owner, and every owner is a system.

Definitions:

If B owns A, then we say that “A is a subsystem of B”.

If a constitutional actor C is a subsystem of itself (i.e. if C owns C, and |C| = 1), then we say that “C is a sentient actor”. We use sentient actors to model humans.

If a system contains a sentient actor, we call it a “sentient system”.
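
A toy encoding of these definitions, under our simplifying assumption that ownership is recorded as a mapping and system size |x| is recorded separately; none of the names come from the paper.

```python
# Hypothetical encoding: owner[x] is the system that owns x, size[x] is |x|.
owner = {"A": "B", "B": "B", "C": "C"}
size = {"A": 3, "B": 5, "C": 1}

def is_subsystem(a: str, b: str) -> bool:
    """A is a subsystem of B iff B owns A."""
    return owner.get(a) == b

def is_sentient_actor(c: str) -> bool:
    """C owns itself and |C| = 1: a constitutional actor that is its own owner."""
    return owner.get(c) == c and size.get(c) == 1

assert is_sentient_actor("C") and not is_sentient_actor("B")
```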


Page 7

Judgement Actors

Axiom 5: Every system has a distinguished actor called its “judgement actor”, which specifies its security and functionality requirements.

When a judgement actor is sent a message containing a list of actions, it may reply to the sender with a judgement.

A list of actions resulting in a positive judgement is a functional behaviour.

A list of actions resulting in a negative judgement is a security fault.
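
A toy judgement actor, under our simplifying assumption that the owner's requirements are given as a set of forbidden actions:

```python
from enum import Enum

class Judgement(Enum):
    POSITIVE = "functional behaviour"
    NEGATIVE = "security fault"

def judgement_actor(actions: list[str], forbidden: set[str]) -> Judgement:
    """Reply to a list of actions with a judgement (Axiom 5)."""
    if any(a in forbidden for a in actions):
        return Judgement.NEGATIVE
    return Judgement.POSITIVE

# A list of actions resulting in a negative judgement is a security fault:
print(judgement_actor(["login", "read_payroll"], forbidden={"read_payroll"}))
# -> Judgement.NEGATIVE
```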


Page 8

Analyses

A descriptive and interpretive report of a judgement actor's (likely) responses to a (possible) series of system events is called an analysis of this system.

If an analysis considers only security faults, then it is a security analysis.

If an analysis considers only functional behaviour, then it is a functional analysis.

The set of environmental assumptions on the system is the workload of the analysis.


Page 9

Requirements Elicitation

An analyst has two preliminary tasks:

Specify constitutions (= system architectures), either by examining design documents or by observations of an actual system;

Specify judgement actors (= system requirements), by interviewing or observing the relevant humans.

The task of specifying a judgement actor is called requirements elicitation.


Page 10

Qualitative vs. Quantitative Analysis

A quantitative analysis is numerical, requiring an analyst to estimate the probabilities of relevant classes of events in relevant populations, and also to estimate the owner's costs and benefits in all of the likely scenarios.

A qualitative analysis is verbal, providing the semantics required to explain (or conduct) a quantitative analysis.

A useful framework will support both types.
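
For instance, a quantitative security analysis might reduce to an expected-loss computation; the probabilities and costs below are invented purely for illustration.

```python
# Invented numbers, purely to illustrate a quantitative analysis:
scenarios = [
    (0.10, 20_000),   # p(minor data leak per year), owner's cost if it happens
    (0.01, 500_000),  # p(major breach per year), owner's cost if it happens
]
expected_annual_loss = sum(p * cost for p, cost in scenarios)
print(expected_annual_loss)  # 7000.0
```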

Page 11

System Architecture

Actors have three types of relationships with each other:

1. Hierarchical: a superior (owning) actor, and its inferior actors (subsystems).

2. Peering: an equality relation among peers, with voting and membership processes.

3. Aliased: the connection between the different roles played by the same human or real-world system.


Page 12

Graphical Representation

[Figure: a node e1 with edges to its Superior, its Inferiors, its Peers, and an alias e1’; e1’ has its own Peers and Inferiors.]

This is a digraph embedded in a pseudosurface: the nodes are located at points where the space differs from a surface. Peerages are cliques.
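
One way (ours, not the paper's) to encode the three relationship types is as an edge-labelled digraph; since peerages are cliques, peerage edges run in both directions between every pair of peers.

```python
# Edge-labelled digraph sketch: (from_node, to_node, edge_type)
edges = [
    ("Superior", "e1", "hierarchy"),   # superior owns inferior e1
    ("e1", "Inferior1", "hierarchy"),
    ("e1", "PeerA", "peerage"),        # peerages are cliques, so peer
    ("PeerA", "e1", "peerage"),        #   edges run in both directions
    ("e1", "e1_prime", "alias"),       # two roles of one real-world system
]
```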

Page 13


The Hierarchy

Control is exerted by a superior power.

Prospective controls are not easy to evade.

Retrospective controls are punishments.

The Hierarch grants allowances to inferiors.

The Hierarch: King, President, Chief Justice, Pope, or …

The lowest inferiors: Peons, illegal immigrants, felons, excommunicants, or …

The Hierarch can impose and enforce obligations.

In the Bell-LaPadula model, the Hierarch is concerned with confidentiality. Inferiors are prohibited from reading superiors’ data. Superiors are allowed to read their inferiors’ data.
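
A minimal read-access check matching the slide's Bell-LaPadula statement: the classic “no read up” rule, sketched by us with integer levels (higher = more superior).

```python
def may_read(reader_level: int, data_level: int) -> bool:
    """Superiors may read inferiors' data; inferiors may not read up."""
    return reader_level >= data_level

assert may_read(reader_level=2, data_level=1)      # reading down: allowed
assert not may_read(reader_level=1, data_level=2)  # reading up: prohibited
```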

Page 14


The Alias (in an email use case)

We use aliases every time we send personal email from our work computer.

We have a different alias in each organisation.

We are prohibited from revealing “too much” about our organisations.

We are prohibited from accepting dangerous goods and services.

[Figure: the same human C appears twice: as C acting as a governmental agent in Agency X, and as C acting as a Gmail client.]

Each of our aliases is in a different security environment.

Managing aliases is difficult, and our computer systems aren’t very helpful…

Page 15


The Peerage

The peers define the goals of their peerage.

If a peer misbehaves, their peers may punish them (e.g. by expelling them).

Peers can trade goods and services.

The trusted servants of a peerage do not exert control over peers.

The trusted servants may be aliases of peers, or they may be automata.

Examples of trusted servants: Facilitator, Moderator, Democratic Leader, …

Examples of peers: Group members, Citizens of an ideal democracy, …

Page 16


Example: A Peerage Exerting Audit Control on a Hierarchy

[Figure: a hierarchy with an OS Root Administrator and an Auditor, audited by a peerage of Users/Peers with a Chair of the User Assurance Group and elected Inspector-Generals IG1 and IG2.]

• Peers elect one or more Inspector-Generals.

• The OS Administrator makes a Trusting appointment when granting auditor-level Privilege to an alias of an Inspector-General.

• The Auditor discloses an audit report to their Inspector-General alias.

• The audit report can be read by any Peer.

• Peers may disclose the report to non-Peers.

Page 17

Owner-Centric Security

Axiom 6: The judgement actor of a system is a representation of the desires and fears of its owner.

Implication: If the system’s owner is unaware of their system, then the judgement actor will make no judgements.

If the system’s owner is inconsistent or incoherent, then their system has indefinite security and functionality.


Page 18

What can an owner do?

An owner might fulfil their desires by modifying their system or by controlling its environment. These are functional enhancements.

A fearful owner may seek security enhancements, by architectural modifications on their own system, or by exerting control over other systems.


Page 19

Lessig’s Taxonomy of Control

Computers make things easy or difficult.

The world’s economy makes things inexpensive or expensive.

Governments make things legal or illegal.

Our culture makes things moral or immoral.


Page 20

Temporal & Organisational Dimensions

Prospective controls: Architectural security (easy/difficult), Economic security (inexpensive/expensive).

Retrospective controls: Legal security (legal/illegal), Normative security (moral/immoral).

Temporality = {prospective, retrospective}. Organisation = {hierarchy, peerage}.
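
Combining the two dimensions with Lessig's four modalities gives a 2x2 grid; the hierarchy/peerage assignments below are inferred from Pages 32-33 of this deck, and the dictionary form is ours.

```python
# (temporality, organisation) -> modality of control
control = {
    ("prospective",   "hierarchy"): "architectural security (easy/difficult)",
    ("prospective",   "peerage"):   "economic security (inexpensive/expensive)",
    ("retrospective", "hierarchy"): "legal security (legal/illegal)",
    ("retrospective", "peerage"):   "normative security (moral/immoral)",
}
```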


Page 21

Security Properties (Traditional)

1. Confidentiality: no one is allowed to read, unless they are authorised.

2. Integrity: no one is allowed to write, unless they are authorised.

3. Availability: all authorised reads and writes will be performed by the system.

Authorisation: giving someone the authority to do something.

Authentication: being assured of someone’s identity.

Identification: knowing someone’s name or ID#.

Auditing: maintaining (and reviewing) records of security decisions.

Page 22

Micro to Macro Security

“Static security”: system properties (confidentiality, integrity, availability).

“Dynamic security”: system processes (Authentication, Authorisation, Audit). Beware the “gold-plated” system design!

“Security Governance”: human oversight:

Specification, or Policy (answering the question of what the system is supposed to do),

Implementation (answering the question of how to make the system do what it is supposed to do), and

Assurance (answering the question of whether the system is meeting its specifications).


Page 23

Clarifying Static Security

Confidentiality, Integrity, and Availability are appropriate for read/write data.

What about security for executables? Unix directories have “rwx” permission bits: XXXity!

What about security for directories, services, ...?

Each level of a taxonomy should have a few categories which cover all the possible cases. Each case should belong to one category.

Confidentiality, Integrity, XXXity, “etc”ity are all Prohibitions. Availability is a Permission.

[Figure: two taxonomies of static security (SS): one groups Confidentiality, Integrity, and XXXity under Prohibitions (Pro) and Availability under Permissions (Per); the other lists C, I, X, A as siblings.]
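
The “rwx” bits the slide mentions can be inspected directly; a small sketch using Python's standard library (the path is an arbitrary example, and the mapping in the comments follows the slide's argument).

```python
import os
import stat

st = os.stat("/bin/ls")  # any example path
mode = st.st_mode
print("r:", bool(mode & stat.S_IRUSR),   # read    -> Confidentiality
      "w:", bool(mode & stat.S_IWUSR),   # write   -> Integrity
      "x:", bool(mode & stat.S_IXUSR))   # execute -> the slide's "XXXity"
```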


Page 24

Prohibitions and Permissions

Prohibition: forbid something from happening. Permission: allow something to happen.

There are two types of P-secure systems:

In a prohibitive system, all operations are forbidden by default. Permissions are granted in special cases.

In a permissive system, all operations are allowed by default. Prohibitions are special cases.

Prohibitive systems have permissive subsystems. Permissive systems have prohibitive subsystems.

Prohibitions and permissions are properties of hierarchies, such as a judicial system. Most legal controls (“laws”) are prohibitive. A few are permissive.
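
The two defaults, written as two one-line policies (a sketch, with operation names of our choosing):

```python
def prohibitive_allows(op: str, permissions: set[str]) -> bool:
    """Prohibitive system: forbidden by default; permissions are special cases."""
    return op in permissions

def permissive_allows(op: str, prohibitions: set[str]) -> bool:
    """Permissive system: allowed by default; prohibitions are special cases."""
    return op not in prohibitions

assert not prohibitive_allows("read_file", permissions=set())  # default deny
assert permissive_allows("read_file", prohibitions=set())      # default allow
```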


Page 25

Extending our Requirements Taxonomy

Contracts are non-hierarchical: agreed between peers. Obligations are promises to do something in the future. Exemptions are exceptions to an obligation.

There are two types of O-secure systems. Obligatory systems have exemptive subsystems. Exemptive systems have obligatory subsystems.

Can peerages be P-secure, and can hierarchies be O-secure? Yes, in general, peerages will have some prohibitions and permissions. Yes, superiors will often impose obligations on their inferiors.

So... the type of organisation correlates with, but does not define, the type of requirement. We need a clearer criterion for our classification, if we want a clear taxonomy.


Page 26

Inactions and Actions

Four types of static security requirements:

Obligations are forbidden inactions, e.g. “I.O.U. $1000.”

Exemptions are allowed inactions, e.g. “You need not repay me if you have a tragic accident.”

Prohibitions are forbidden actions.

Permissions are allowed actions.

Two classification axes: Strictness = {forbidden, allowed}, Activity = {action, inaction}.

“Natural habitat” of these requirements: Peerages typically forbid and allow inactions; Hierarchies typically forbid and allow actions.
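
The two axes yield a 2x2 grid of requirement types (our tabulation of the slide's definitions):

```python
# (strictness, activity) -> static security requirement type
requirement_type = {
    ("forbidden", "action"):   "prohibition",
    ("allowed",   "action"):   "permission",
    ("forbidden", "inaction"): "obligation",   # e.g. "I.O.U. $1000."
    ("allowed",   "inaction"): "exemption",    # e.g. relief from an obligation
}
```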


Page 27

Reviewing our Framework

1. What is security? Three layers: static, dynamic, governance. Static security requirements: (forbidden, allowed) x (action, inaction). Research question: how to characterise dynamic and governance requirements?

2. How can owners gain security or functionality? Controls: (prospective, retrospective) x (hierarchy, peerage).

3. What is trust?


Page 28

Niklas Luhmann, on Trust

A prominent, and controversial, sociologist.

Thesis: Modern systems are so complex that we must use them, or avoid using them, without carefully examining all risks, benefits, and alternatives.

Trust is a reliance without an assessment. We cannot control any risk we haven’t assessed. We trust any system which might harm us. (This is the usual definition.)

Distrust is an avoidance without an assessment.


Page 29

Security, Trust, Distrust, ...

The fifth dimension in our framework is assessment, with three cases:

Cognitive assessment (of security & functionality),

Optimistic non-assessment (of trust & coolness),

Pessimistic non-assessment (of distrust & uncoolness).


Page 30

Security vs. Functionality

Sixth dimension: Feedback (negative vs. positive) to the owner of the system.

We treat security as a property right. Every system must have an owner, if it is to have any security or functionality.

The owner reaps the benefits from functional behaviour, and pays the penalties for security faults. (Controls are applied to the owner, ultimately.)

The analyst must understand the owner’s desires and fears.


Page 31

Summary of our Taxonomy

Requirements:

Strictness = {forbidden, allowed},
Activity = {action, inaction},
Feedback = {negative, positive},
Assessment = {cognitive, optimistic, pessimistic}.

Controls:

Temporality = {prospective, retrospective},
Organisation = {hierarchy, peerage}.

Layers = {static, dynamic, governance}.
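
The whole taxonomy, gathered into one structure (a restatement of this slide in code form):

```python
taxonomy = {
    "requirements": {
        "strictness": {"forbidden", "allowed"},
        "activity":   {"action", "inaction"},
        "feedback":   {"negative", "positive"},
        "assessment": {"cognitive", "optimistic", "pessimistic"},
    },
    "controls": {
        "temporality":  {"prospective", "retrospective"},
        "organisation": {"hierarchy", "peerage"},
    },
    "layers": {"static", "dynamic", "governance"},
}
```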

Page 32

Application: Access Control

An owner may fear losses as a result of unauthorised use of their system.

This fear induces an architectural requirement (prospective, hierarchical): Accesses are forbidden, with allowances for specified users.

It also induces an economic requirement, if access rights are traded in a market economy. If the peers are highly trusted, then the architecture need not be very secure.


Page 33

Access Control (cont.)

Legal requirement (retrospective, hierarchical): Unauthorised users are prosecuted. Must collect evidence – this is another architectural requirement.

Normative requirement (retrospective, peering): Unauthorised users are penalised. Must collect deposits and evidence, if peers are not trusted.

Page 34

Functions of Access Control

If an owner desires authorised accesses, then there will be functional requirements. Forbidden inaction, positive feedback (“reliability”).

If an owner fears losses from downtime, then there are also security requirements. Forbidden inaction, negative feedback (“availability”).

Security and functionality are intertwined! The analyst must understand the owner’s motivation, before writing the requirements. The analyst must understand the likely attackers’ motivation and resources, before prioritising the requirements.


Page 35

Summary

What is security? What is trust?

Four qualitative dimensions in requirements: Strictness, Activity, Feedback, and Assessment.

Two qualitative dimensions in control: Temporality, and Power.

Can security be organised? Can organisations be secured? Yes: Static, Dynamic, and Governance levels. Hybrids of peerages and hierarchies seem very important.

Page 36

Applications / Questions

1. An employee accessing an outsourced service: System architecture? Judgement actor for employer? Judgement actor for employee? Judgement actor for service provider?

2. A bank vault. Can you define a “trust boundary”?


Page 37

Applications (2)

3. An access control system?

4. An access control system with an auditor?

5. A Bell-LaPadula system with three levels of authority?

6. A Biba system with three levels?

7. A prisoner-warden system? See [Yu et al., 2009].

8. A “Chinese wall”?


Page 38

Open Questions

Can our framework be extended to dynamic systems, e.g. Clark-Wilson?

How should we model introspection? How should judgement actors be changed?

Hohfeldian analysis (of laws, and of the law-making process) seems a very promising approach …
