
IT Help and Support

University Information Services
 


 

A

Acceptable level of risk
The level of risk which is acceptable to you, your institution and/or the University, defined as being at or below the risk threshold.

Access control
To ensure that access to assets is authorized and restricted based on business and security requirements (ISO27000, 3.1)

Asset
Something that has value to you or your institution.

Attack
Attempt to destroy, expose, alter, disable, steal or gain unauthorized access to or make unauthorized use of an asset

Anonymised data
Information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable (Recital 26, UK GDPR 2021)

i.e. data which can never be attributed to a person, and for which there is no 'key' that can re-personalise the data.

Anonymisation
The Information Commissioner's Office (ICO) has written a guidance paper on anonymisation, pseudonymisation and privacy enhancing technologies, which is soon to be published. In the May 2021 draft introduction, it states:

"Data protection law also does not specifically define ‘anonymisation’. However, its meaning for the purposes of the UK data protection framework is clear from the wording of Recital 26 of the UK GDPR. It is the way in which you turn personal data into anonymous information, so that it then falls outside the scope of data protection law.

You can consider data to be effectively anonymised when it:

  • does not relate to an identified or identifiable individual; or
  • is rendered anonymous in such a way that individuals are not (or are no longer) identifiable.

We use the broad term ‘anonymisation’ to cover the techniques and approaches you can use in the pursuit of these aims – ie, of preventing the identification of the individuals the data relates to, taking into account all relevant factors.

This includes making sure that even if the names of the people or participants are removed that nothing else about the data could possibly identify a person (e.g. the only person over 96 and male in a given street; the only person with a heart bypass and 6ft 7in in a given ward)."
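The 'nothing else could possibly identify a person' check described above can be sketched as a simple uniqueness test over quasi-identifiers. This is a minimal, hypothetical illustration: the field names and the threshold `k` are assumptions for the example, not ICO-prescribed values.

```python
from collections import Counter

def unique_combinations(records, quasi_identifiers, k=2):
    """Return quasi-identifier combinations shared by fewer than k records.

    Any combination appearing fewer than k times could single someone out,
    so the dataset is not effectively anonymised for those fields.
    """
    counts = Counter(
        tuple(record[field] for field in quasi_identifiers)
        for record in records
    )
    return [combo for combo, n in counts.items() if n < k]

# Invented example records: the first is the 'only person over 96' case.
records = [
    {"age": 97, "sex": "M", "street": "Mill Road"},
    {"age": 45, "sex": "F", "street": "Mill Road"},
    {"age": 45, "sex": "F", "street": "Mill Road"},
]
print(unique_combinations(records, ["age", "sex", "street"]))
# [(97, 'M', 'Mill Road')] - this record is identifiable even without a name
```

In practice anonymisation assessments consider far more than exact-match uniqueness (linkage to other datasets, inference, and so on), but the check above captures the basic idea in the quoted guidance.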

Authenticity
Assurance that an asset or its source is genuine.

Availability
Reliable access to an asset, as and when required.

Awareness and awareness materials
Awareness is the term used for bringing topics of interest to members of staff or students of the University via many different routes: a mention by a colleague; a poster on a College or Department wall; a newsletter or an email; webpages of information, pictures and videos; or specific training courses, face-to-face meetings, training sessions and workshops. The information and cyber security team have collated a wealth of cyber security awareness materials for use by departmental administrators, heads of institutions, heads of institutional or collegiate IT, or anyone else with an interest in communicating cyber security and information security awareness across a department, division or office. Please see our website for more details.

B

Biometric data
Personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data; (Article 4(14), UK GDPR 2021)

Business continuity plan
A business continuity plan (BCP) is a plan for restoring an entity to a working state as soon as possible after a disaster is detected. This could mean moving people and resources to a different location temporarily, while the disaster is still ongoing, so that the organisation can continue to function in some way.

BYOD
Bring your own device. This is a term used to distinguish devices used in the University which are not University owned. See our best practice guide for using your own device for University business (for example a mobile phone for reading your work email, or your own laptop or desktop when working from home). 

C

Confidential data
Data which, if it were disclosed, could cause harm to the University.

Confidentiality
Assurance that only those who are authorised to access an asset can do so.

Container
This is the Octave Allegro term for where the data resides. There are technical containers, physical containers and human containers.

Critical information asset
A dataset or information that is critical to your institution.

In other words, a critical information asset is a dataset that is of great value to you – for example, critical and confidential research data, a departmental 'people' database, CAD data, genome data or contracts HR data. 

Further examples: animal data that is crucial for the completion of your institutional Home Office report; genomic data that must be available for your research group to function; participant data running under NHS governance or ethics requirements that is critical to keep confidential and whose integrity must be maintained; disability data on students and staff that is critical for lecturers and critical to keep confidential; chemicals data that is crucial for compliance with the Counter-Terrorism Act.

CSIRT
Computer Security Incident Response Team (CSIRT) is a team of IT security staff that co-ordinates the response to IT security incidents, providing a second-line IT support team to college and departmental IT staff. CSIRT is able to draw upon the expertise of other UIS teams such as Server Infrastructure, Networks and the Service Desk. It liaises with recognised IT contacts in every University institution. IT staff who need or wish to contact CSIRT should email: csirt@uis.cam.ac.uk

CSIRT is not an IT Service Desk and it does not provide direct support to general staff or students. General staff or students should report an IT incident via our Incident Reporting Process.

Current impact
The consequence to you and/or the institution, should the threat happen with the current level of controls in place to reduce the impact.

Current likelihood
The chances that the threat could happen with the current level of controls in place.

Current risk
The risk with the current level of controls in place. Sometimes called the net risk.

Cyber Essentials
The UK Government standard for cyber security. It has 5 sections: firewalls; secure configuration; user access control; malware protection; security update management. There are two certification options: a) self-certification or b) externally verified certification; the latter is called Cyber Essentials Plus. The National Cyber Security Centre (NCSC) website gives more information on Cyber Essentials, what the standard involves and how to achieve the standard.

Cyber security
Cyber security is defined by the National Institute of Standards and Technology (NIST) as the "ability to protect or defend the use of cyberspace from cyber attacks". It's concerned with online information and protecting against attacks from cyberspace. It's not concerned with other aspects of security, such as paper or offline resources, or accidental damage caused by internal or external users. It is, therefore, a subset of information security, albeit a large one. The UK Government standard for cyber security is Cyber Essentials.

Cyber security crisis
An abnormal and unstable situation that threatens the organisation's strategic objectives, reputation or viability (BS 11200, the British Standard for crisis management). There is no need to respond to every security incident as a crisis. Incidents are normal, and although they may cause disruption and impact normal business, they do not typically threaten to run out of control or lead to a strategic impact. Incidents could include, but are not limited to, vulnerability exploitation, computer misuse, unauthorized account use, unauthorized use of privileges and malicious code. The severity of a security incident is defined in the Appendix 3: UIS CIMP Quick Reference Categorising Cyber Security Incidents by Criticality table.

Cyber security information event
In the Cyber Security Incident Management Plan (CIMP) this is defined as an occurrence not yet assessed that may affect the performance of the University managed IT services. A cyber security information event sometimes provides an indication that a security incident is occurring.

Cyber security incident
An adverse event that might cause disruption, loss or emergency, but which does not meet the organization’s criteria for, or definition of, a crisis. (BS 11200:2014 Crisis Management – Guidance and Good Practice) In the Cyber Security Incident Management Plan (CIMP), a security incident is defined as an assessed occurrence having potential to cause or causing adverse effects on the functioning of the University managed IT services. The severity of the security incident is defined in the Appendix 3: UIS CIMP Quick Reference Categorising Cyber Security Incidents by Criticality table.

See: Reporting an IT security incident and reporting and managing critical cyber security incidents.

Cyber Security Incident Management Plan (CIMP)
The University's CIMP documents the high-level procedures to coordinate the critical cyber security incident management process.
It outlines the response process for managing critical cyber security incidents efficiently and effectively to minimize adverse impact on University-wide systems and services managed by UIS. It is owned by the Registrary. UIS' CIMP is owned by the Director of UIS.

Cyber Security Training and Awareness
The Information and Cyber Security Team have created two cyber security training courses: one for Staff and one for Students. There is also a wealth of material regarding best practice guides on topics such as working from home, choosing a strong password and step-by-step guides on topics such as how to encrypt your laptop and how to set up a user account and much more. Please visit our cyber security awareness and training page for more details.

D

Data
There are many 'definitions' of, and views on, the difference between data and information. Basically, data is thought of as an unorganised set of facts, whereas information is organised and in some form of context; data is raw, individual, sometimes unrelated and on its own meaningless, whereas information is processed, related and has meaning. For example, the NIST Glossary lists about 10 different definitions of data, depending upon the context.

However you think of it, in terms of information and cyber security, it doesn't actually matter whether you call it 'data' or 'information'. What is important is that, if it is of value, it is an asset to the University and should be secured appropriately to ensure the risk to you, your institution or the University, should it be compromised in some way, is at an acceptable level. See cyber security awareness and training, classify and store your data, information asset tools, data security for researchers, and risk assessment for security.

Data Protection
The Information Compliance Office of the University has extensive information about Data Protection and what it means for us all in the University.

Data protection impact assessment (DPIA)
The DPIA considers risks, impact, likelihood and threats. Unlike the Information Security Risk Assessment (ISRA), it focuses on the risks to the data subjects should data be compromised (that is, disclosed, lost or corrupted), rather than the risks to you or the University.

Data Protection Training
There is information about Data Protection Training from the Information Compliance Office of the University. 

Destruction or loss of an information asset
This is where the information asset is totally destroyed, destroyed enough to give no value to the organisation, or lost completely. This means that you cannot access the information asset at all, ever; there will never be a time when you can access it in the future. For clarity, 'lost' here means lost forever, e.g. a phone dropped over the side of a ferry in the middle of Loch Ness, where it would never be accessible again. It does not mean lost on a train, where someone else could find it and possibly access it: that would be the 'disclosure' option.

Disaster recovery plan
A disaster recovery plan (DRP) is a plan for how to restore a system, department or full organisation back to its original state after a disaster. The organisation's DRP is usually made up of a host of smaller, more focused DRPs for the departments, and theirs, in turn, are made up of smaller DRPs for all their systems and processes. Each system DRP is usually concerned with restoring data from backups. The terms recovery point objective (RPO) and recovery time objective (RTO) describe the point from which the backup can be restored and the time taken to restore the data, showing how much data and how much time will be lost before resumption of normal University business.
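As a hypothetical illustration (the dates and objectives below are invented for this example, not real University figures), the data-loss window and recovery deadline implied by a backup schedule and an RTO can be worked out directly:

```python
from datetime import datetime, timedelta

# Invented figures for illustration only.
last_backup = datetime(2024, 1, 15, 2, 0)    # nightly backup completed at 02:00
disaster = datetime(2024, 1, 15, 14, 30)     # failure occurs at 14:30
rto = timedelta(hours=4)                      # agreed recovery time objective

# Work done since the last restorable point is lost (the RPO in effect).
data_lost_window = disaster - last_backup
# The service should be restored within the RTO of the disaster.
service_restored_by = disaster + rto

print(data_lost_window)      # 12:30:00 - up to 12.5 hours of work lost
print(service_restored_by)   # 2024-01-15 18:30:00
```

Shortening the backup interval reduces the RPO (less data lost); investing in faster restore procedures reduces the RTO (less downtime).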

Disclosure of an information asset
This is where the information asset is, whether accidentally or maliciously, accessible by those who are not supposed to be able to access it. This is a breach of confidentiality and in some circumstances can be very costly (a personal data breach can cost the University up to £72 million; a breach of information prior to the filing of a patent could potentially lose those involved millions of pounds).

G

Genetic data (UK GDPR)
Personal data relating to the inherited or acquired genetic characteristics of a natural person which give unique information about the physiology or the health of that natural person and which result, in particular, from an analysis of a biological sample from the natural person in question; (Article 4(13), UK GDPR 2021)

H

Human container
This is an Octave Allegro term for those people who have critical information in their heads, such that their absence may cause an impact on the department/institution/research project. It is mentioned in ISRA1, but not in any of the other forms at this time. Maybe something to think about, though, when considering risks to the institution or project.

Examples:

  • DAs may have their own filing systems to do important and urgent submissions
  • PIs may have their own files for grant submissions and research data or certain researchers may have these
  • Academics may have lecture notes 'in their heads' that are critical to a certain section of the Tripos, and no-one else knows where these files, notes or datasets are stored or what they are called, should they suddenly become ill or unavailable

These DAs, PIs, academics and certain researchers are all critical information assets in their own right.

I

ICO (University)
Information Compliance Office. This office is part of the Governance and Compliance Division of the University of Cambridge.

ICO (UK)
Information Commissioner's Office. The ICO is the UK's independent body set up to uphold information rights.

IDS
Intrusion Detection System. A system which detects possible intrusion based on past usage. In the University, the IDS was a precursor to the IPS (Intrusion Prevention System).

IEC
The International Electrotechnical Commission (IEC) is the world's leading organization preparing and publishing International Standards for all electrical, electronic and related technologies, collectively known as "electrotechnology". Referenced in the full ISO 27001 standard: BS ISO/IEC 27001:2022.

Impact
The consequence to you and/or the institution, should the threat actually happen.

Incident response playbook
An incident response playbook is a very detailed, step-by-step procedure for how to respond to a specific incident. It's owned by the IT specialist group who would be doing the technical recovery in an actual incident (for example, the Computer Security Incident Response Team for a ransomware incident response playbook).

Information
There are many 'definitions' of, and views on, the difference between data and information. Basically, data is thought of as an unorganised set of facts, whereas information is organised and in some form of context; data is raw, individual, sometimes unrelated and on its own meaningless, whereas information is processed, related and has meaning. For example, the NIST Glossary lists 11 different definitions of information, depending upon the context.

However you think of it, in terms of information and cyber security, it doesn't actually matter whether you call it 'data' or 'information'. What is important is that, if it is of value, it is an asset to the University and should be secured appropriately to ensure the risk to you, your institution or the University of it being compromised in some way is at an acceptable level. See information asset, cyber security awareness and training, classify and store your data, information asset tools, data security for researchers, and risk assessment for security.

Information asset
A dataset or information that has value to you or your institution.

Examples: Research datasets that have value to you, the Research Lead; student needs dataset that has value to you, a lecturer; training records that have value to your office; HR data and committee minutes that have value to your department or institution.

Information security
Information security is about protecting the confidentiality, integrity and availability of information no matter what form that information is in (for example, paper, video, audio, data in a database, on- or off-line). This triad is sometimes called the CIA of information. ISO 27001 is an international standard for information security.

Integrity
Assurance that an asset is trustworthy and complete.

Interruption (of an Information Asset)
This is where the information asset is not accessible for a period of time. It is not destroyed and not lost forever. There will be a time when the full asset is accessible once more.

IPS
Intrusion Prevention System. A system which attempts to prevent intrusion based on past usage. The University IPS inspects all data that crosses our network boundary, applying a variety of detection methods to identify security threats or incidents. When a security threat is identified, the IPS will either automatically block the malicious network traffic or create an alert, depending on the incident and configuration.

Examples of security threats include SQL injection attacks on a website, malware communications or a botnet scanning for vulnerabilities.

ISO
The International Organization for Standardization (ISO) develops and publishes international standards, notably ISO 27001:2022. See www.iso.org/home.html for more details.

ISO27001:2013 and ISO27001:2022
The international standard for information security, which has 7 clauses covering risk and governance, plus Annex A controls to reduce risk: 114 controls in the 2013 version and 93 in the 2022 version. Annex A control areas in 2013 include: mobile devices; asset management; access control; cryptography; physical security; operations security; communications security; systems acquisition, development and maintenance; information security incident management; and compliance. Annex A control areas in 2022 align directly with clauses 5 to 8 of ISO 27002:2022 and are: Organizational Controls, People Controls, Physical Controls and Technological Controls.

You would need to establish and continually improve an information security management system (ISMS) to be certified to this standard, and only an external certified auditing body (for example, BSi) can give this certification. The process is laborious and document-heavy. However, as the processes and procedures are embedded into the workings of the organisation and as producing evidence for audit becomes second nature, the documentary burden reduces dramatically and the ISMS can be seen to be efficient and beneficial to the organisation and its end-users.
Receiving ISO certification provides an assurance of the security of the workings of the organisation, but only within the scope of the ISO certificate.

You can read the standard via the University of Cambridge Library idiscover website (choose A–Z and select B for British Standard). Further help on the Annex A controls is given in ISO27002, also available via the library. We examine the Annex A control areas as part of our guidance on further reducing the risk.

There are many standards in the ISO27000 stable, including ISO27005 on risk management.

ISRA
Information Security Risk Assessment. Unlike the Data Protection Impact Assessment (DPIA), the information security risk assessment considers the risks to you and the University, rather than the data subject, should data be compromised.

We have developed two versions of the ISRA using the Octave Allegro Methodology, which is cited as one of 6 risk assessment methodologies on the National Cyber Security Centre (NCSC) website. We chose this methodology as the easiest to use, quickest to do, practical, clearest in terms of vocabulary and repeatable.

  • ISRA on Microsoft Forms: This is the easiest and quickest format of the ISRA and has many guidance pages to help. Start with the introduction.
     
  • The Assistive ISRA on Microsoft Excel, using macros to assist the process. This has the advantage that it can be used for more than one asset, with up to 10 scenarios in one file. It is easy to see everything at a glance, and it takes the assessor through listing the assets, creating the threat scenarios, listing the controls alongside expected risk reductions, assessing the risks and then deciding on the most appropriate risk treatments. It can be used by departments managing ISO 27001 platforms through to small services, but most people have found that they need initial personal guidance before completing it for the first time. We recommend this ISRA for use within a project for a new large system or service, or one that needs a very thorough risk assessment.

IT incident
An IT incident is an unplanned interruption to, or quality reduction of, a UIS service. This includes all incidents (from trivial up to major) for all UIS-provided technical solutions that underpin University services, whether hosted by UIS or managed by UIS through third-party providers. Contact our Service Desk to report an IT incident.

IT Incident Management (IM) process
The IM process documents the high-level procedures to coordinate incident management and aims to manage the lifecycle of all incidents. Its aim is to return the IT service to users as quickly as possible and minimise the adverse impact on business operations, ensuring that appropriate levels of service quality are maintained. It is owned by the most senior member(s) of the incident scope.

IM sits within and across any response process, ensuring all stages are handled, according to the National Cyber Security Centre (NCSC). It deals with any communications, media handling, escalations and any reporting issues, pulling the whole response together, coherently and holistically. By comparison, incident response (IR) includes triage, in-depth analysis, technical recovery actions and more.

The University's incident management plans tend to be a full umbrella plan for managing an incident, including communications, escalations and reporting. They sometimes include a high-level process to show how the plan is invoked and ended, but also may include, or link to, the main incident response plan as a more detailed process of who does what and when, in an incident (that is, the triage, analysis and technical recovery actions) and also incident response playbooks or links to them or their owners. See also incident response playbook.

IT security incident
An IT security incident is an event that, whether suspected or actual, is likely to compromise the confidentiality, integrity or availability of the University's data or systems, such as:

  • accidentally disclosing your password to a fake website
  • suspecting malware on your device
  • unauthorised disclosure of University data
  • loss of your device containing University data.

We have instructions for reporting an IT security incident.

L

Level of risk (risk score)
The magnitude of a risk expressed in terms of the combination of consequences and their likelihood (ISO27000). We have used 'impact' as the 'consequences' in this statement and so the measurement of risk is the product of impact and likelihood (risk = impact x likelihood), which we call the risk score.
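A minimal sketch of this calculation follows; the 1-5 rating scales and the threshold value are illustrative assumptions for the example, not University-defined figures.

```python
def risk_score(impact: int, likelihood: int) -> int:
    """Risk score as the product of impact and likelihood, each rated 1-5."""
    for value in (impact, likelihood):
        if not 1 <= value <= 5:
            raise ValueError("impact and likelihood must be rated 1-5")
    return impact * likelihood

# Assumed acceptable-risk threshold for this example.
RISK_THRESHOLD = 10

score = risk_score(impact=4, likelihood=3)
print(score, score <= RISK_THRESHOLD)  # 12 False - above the threshold
```

A score above the threshold would indicate that further controls are needed to reduce the impact and/or likelihood until the risk is at an acceptable level.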

Likelihood
The chances that the threat could happen.

M

Major Incident Management Plan (MIP)
A major IT incident is an issue with any service that could significantly disrupt users across the University, or impact the University's reputation. The major incident management plan (MIP) is a plan or framework that shows the process of managing a major incident. For example, if someone suspects a major service incident, they speak to the UIS Service Desk first, and the Service Desk will decide whether to assemble an incident team. If the incident is also a cyber security incident, then the CIMP may be invoked, depending upon the impact threshold.

Minimum Security Requirements for Systems and Services
The minimum security requirements for systems and services help service owners, project and technical leads implement effective security standards for their services. They are known in University Information Services (UIS) as the security non-functional requirements (NFRs) or the security NFRs. See How to meet the minimum security requirements for systems and services.

Modification (of an information asset)
This is where the information asset has been altered in some way, either maliciously or accidentally, breaching its integrity. You could argue there is some loss of availability of the exact sections of data which have been modified, but it is generally the untrustworthiness of the data which is the major concern here. In other words, if your data has been modified such that you don't know which bits have been modified and which haven't, then it has lost all credibility. Many researchers put integrity as the top priority, as their research grants depend upon the integrity of the data, so modification may well be their highest area of concern.

Motivation (of a threat actor)
The motivation which leads the threat actor to make the threat happen. In Octave Allegro there are only two motivations: Accidental and Deliberate. Different motivations may lead to different impacts and different likelihoods and thus different risks.

Also, different motivations may lead to different controls which we need to put in place to mitigate these different risks. For example, to control against accidental modification to an information asset, training would reduce the likelihood and thus reduce the risk; against deliberate modification, no amount of training would reduce the risk, in fact it could even increase the risk; we would need employee controls via HR, line management controls and/or access controls to reduce the likelihood of it occurring and thus reduce the risk. 

N

Non Functional Requirements (NFRs)
When a new system is proposed for development or purchase, a set of functional requirements is created which defines what the system should do for the users and for the benefit of the University. However, there are also requirements a system should meet which are not obviously for the benefit of the user but are definitely for the benefit of the University, because they meet University IT aims and objectives now and in the future. For example, can the supplier assure us that a compromise can be detected and the service recovered to a clean state quickly, or that the system is written to a minimum standard of development such that the likelihood of a successful hacker attack is minimised? These are called non-functional requirements, or NFRs. Those concerning information and cyber security are called the security NFRs or the minimum security requirements for systems and services.

Non-repudiation
The ability to prove the occurrence of a claimed event or action and its originating entities (ISO27000)

P

Personal data
Any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person; (Article 4(1), UK GDPR 2021). See the ICO website for the full definition of personal data and further guidance.

Personal data breach
A breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed; (Article 4(12), UK GDPR 2021)

i.e. where personal data held by the University is lost, stolen, inadvertently disclosed to an external party, or accidentally published. The University's Information Compliance website provides examples of data breaches and guidance on who should report a data breach and how to do it.

Physical container
The physical container is the physical storage location – that is, actually where an asset is, physically – including the live version, copies and backup(s).

Examples for live versions: Institution laptop, personal laptop, tablets, phones, USB sticks and pen drives; other portable media; Department's servers; UIS-managed server; Clinical School Computing Service server; one of the data centres, such as the West Cambridge Data Centre, Soulsby, New Museum Site; a cloud service server, such as ones that hold the University OneDrive for Business, Dropbox Business and G Suite, but also could be the cloud service for people's personal versions of Dropbox, Google Drive or OneDrive and some could be other third-party cloud services.

Examples of backup locations: Server rooms on another disk or mirror or tape; in a safe in a storage room; in someone else's server room on tape or on other machines; USB sticks/pen drives (for your laptop).

PID
Personally Identifiable Data. NHS East Suffolk and North Essex Hospital defines it as: "Personal Identifiable Data (often known as PID) is any information that is personal to you and would identify you as an individual." Additionally, the term PII is used to mean data which can identify a natural person, and, under the UK GDPR (2021), the term personal data is used.

PII
Personally Identifiable Information. NIST, the US National Institute of Standards and Technology, defines PII in both NIST SP 800-79-2 and NIST SP 800-122. The first definition is: "Any representation of information that permits the identity of an individual to whom the information applies to be reasonably inferred by either direct or indirect means."

Under the UK GDPR (2021), personally identifiable information is called personal data. See the ICO website for the full definition of personal data.

Pseudonymised data
Data which cannot identify a person in this form but there is a 'key' which can re-personalise the data.

Pseudonymised data Key
The key to the data which allows the holder to re-personalise the data. 

Example: A list of identifiers (say numbers) alongside the first-names and surnames of the participants in a certain study.

Pseudonymisation
The processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person. (Article 4(5), UK GDPR 2021)

i.e. it is making personally identifiable data not identifiable, including ensuring combinations of unobscured data do not identify a person (see anonymisation), while holding a 'key' (additional information) which can re-personalise the data. It is important to keep the 'key' as securely as the original personally identifiable data would need to be kept. See the Classify and store your data, Storing and sharing personal data, Data security for researchers and Risk assessment for data security pages on our website.

The Information Commissioner's Office (ICO) has written a guidance paper on anonymisation, pseudonymisation and privacy enhancing technologies, which is soon to be published. In the May 2021 draft: Chapter 3 Pseudonymisation it states:

  • "Pseudonymisation refers to techniques that replace, remove or transform information that identifies individuals, and keep that information separate. 
  • Data that has undergone pseudonymisation remains personal data and is in scope of data protection law. 
  • Take care not to confuse pseudonymisation with anonymisation. Ultimately, pseudonymisation is a way of reducing risk and improving security. It is not a way of transforming personal data to the extent the law no longer applies. 
  • However, you may be able to disclose a pseudonymised dataset (without the separate identifiers) on the basis that it is effectively anonymised from the recipient’s perspective.
  • The DPA 2018 contains two specific criminal offences to address the potential for harm resulting from unauthorised reversal of pseudonymisation. This applies to the reversal of pseudonymised data and any further processing of it, without first obtaining consent from the responsible controller."

R

Raw impact
The consequences to you and/or the institution, should the threat happen and you have no controls in place to reduce the impact.

Raw likelihood
The chances that the threat could happen with no controls in place to reduce those chances.

Raw risk
The risk before any controls are in place. Sometimes known as gross risk.

Recovery Point Objective (RPO)
The recovery point objective defines the maximum acceptable time between backups. So, for daily backups, you would restore to the point yesterday when the backup was taken, and any data created between then and now would be lost.

Recovery Time Objective (RTO)
The recovery time objective states how long it should take to recover/restore data from your backup. So, if it takes 3 hours to restore the backup, the RTO cannot be less than 3 hours.
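The relationship between the two objectives can be sketched with simple arithmetic. The figures below are illustrative assumptions, not recommended values:

```python
from datetime import timedelta

# Illustrative figures (assumptions, not University policy).
backup_interval = timedelta(hours=24)  # backups taken once a day
restore_time = timedelta(hours=3)      # measured time to restore from backup

# RPO: in the worst case, everything written since the last backup is lost,
# so the maximum possible data loss equals the backup interval.
rpo = backup_interval

# RTO: cannot be shorter than the time the restore itself takes.
minimum_rto = restore_time

print(f"Worst-case data loss (RPO): {rpo}")
print(f"Minimum achievable RTO: {minimum_rto}")
```

If a dataset could not tolerate losing a day's work, the backup interval (and hence the RPO) would need to be shortened accordingly.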

Residual impact
The consequences to you and/or the institution, should the threat happen after the treatment proposed to mitigate the risk and reduce the impact has been implemented.

Residual likelihood
The chances that the threat could happen after the proposed treatment of the risk has been implemented.

Residual risk
Risk that is left after the treatment of the risk – usually the implementation of further controls.

Risk
The effect of uncertainty on objectives (ISO27000).

Risk acceptance
Informed decision to take a particular risk (ISO27000).

Risk appetite
The 'appetite' the organisation has for this type of risk. The University's risk appetite for cyber security is "Averse".

Risk score
See Level of Risk

Risk threshold
The limit at which the organisation will accept a level of risk.

For example, in a 5 x 5 impact and likelihood risk score table, the risk threshold may be set to the score of 10. If there are any assessed risks with a risk score over this value, then the risk is deemed unacceptable and must be treated (see Risk treatment). The risk threshold is usually defined with respect to the risk appetite of the organisation. The University's risk appetite for cyber security is "Averse". 
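The 5 x 5 scoring described above can be sketched in a few lines of Python. The function names are hypothetical, and the threshold of 10 is the example value given in the text:

```python
def risk_score(impact, likelihood):
    """Risk score in a 5 x 5 matrix: impact (1-5) multiplied by likelihood (1-5)."""
    assert 1 <= impact <= 5 and 1 <= likelihood <= 5
    return impact * likelihood

RISK_THRESHOLD = 10  # example threshold; scores over this must be treated

def is_acceptable(impact, likelihood):
    """True if the risk is at or below the threshold."""
    return risk_score(impact, likelihood) <= RISK_THRESHOLD

print(is_acceptable(4, 4))  # score 16 exceeds 10: must be treated
print(is_acceptable(2, 3))  # score 6 is within the threshold
```

The same arithmetic applies to raw and residual risk: treatment aims to reduce the impact and/or likelihood inputs until the residual score falls at or below the threshold.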

Risk treatment
The treatment of the current risk. Options are:

  • Reduce: to reduce a risk, you would implement additional controls either to reduce the likelihood of it occurring or to reduce the impact should it occur, or both.
  • Avoid: this is where you avoid the threat scenario from happening, completely, by not doing the thing that created the risk in the first place.
  • Transfer: this is where you transfer the risk to a third party, usually by insuring against the risk such that, should it happen, the insurance company absorbs most of the financial impact.
  • Accept: this is where you accept the risk as it is.

Full explanation and examples are in the guidance notes to the ISRA.

S

Security NFRs (Non Functional Requirements)
The minimum security requirements for systems and services help service owners, project and technical leads implement effective security standards for their services. They are known in University Information Services (UIS) as the security non-functional requirements (NFRs) or the security NFRs.

Sensitive data
There is no official definition of sensitive data. It is a general term usually used to mean one or more of: confidential data, special category personal data, or data which could have an impact on the department/institution/research group, should it be compromised in some way.

See also: Data security classifications, Classify and store your data or storing and sharing personal data.

Special category personal data
The UK GDPR defines the following as special category personal data:

  • personal data revealing racial or ethnic origin;
  • personal data revealing political opinions;
  • personal data revealing religious or philosophical beliefs;
  • personal data revealing trade union membership;
  • genetic data;
  • biometric data (where used for identification purposes);
  • data concerning health;
  • data concerning a person’s sex life; and
  • data concerning a person’s sexual orientation.

Personal data relating to criminal convictions and offences are not technically special category personal data under the UK GDPR but such data are afforded a similarly sensitive status under both the UK GDPR and related UK legislation.

T

Technical container
A technical container means the logical storage location – that is, the name by which you would tell colleagues in your institution where you store certain data – rather than the actual physical location of the data.

Examples: my home directory, the department's shared drive or folder; the office's r:\ drive; the institution's database or system; the name of a third-party database or system; a virtual machine; my inbox.

Threat
This is what you are worried about happening.

Threat actor
Someone or something that 'delivers' the threat: it either exploits a vulnerability in the system, or occurs because of a vulnerability in the system.

Examples: fire, flood, wind, system failure, power cut, hacker, employee, contractor, student, visitor.

Threat scenario
This puts the threat into context, including a threat actor, some form of 'exploitation' or causation, and the resultant loss of confidentiality, integrity or availability of the dataset – that is, the 'what ifs'.

Examples (variations of which have actually happened in the University in the last five years):

What if a hacker decided to send phishing emails to someone in my institution and gains access to the exams database, with all my questions in it, and then publishes them all on the internet?

What if I had a major section of my research data on a USB stick and now I can't find it?

What if the next-door building caught fire and the fire brigade flooded our building, including our server room, trying to put it out, resulting in serious disruption to accessing our critical data for days?

What if a disgruntled employee planted a logic bomb months ago in my research dataset, left tomorrow to go on holiday, but sends a message to say he's not coming back, and the bomb times-out and deletes the entire dataset?

What if a cyber-criminal compromises the integrity of an existing website such that they embed malware that is then served up to all visitors to the website?

What if my senior research associate had the only copy of the raw data from my research project on her own personal Dropbox account, I realise I need it to find an error in the analysis, but she left the University last month and although I try to contact her I don't get a reply?

What if the department's RAID controller fails in such a way as to lose the integrity of the data on the server, over some period of time, including my critical scientific research data?

Training
There are two cyber security training courses: one for Staff and one for Students. There is also Data Protection Training available from the Information Compliance Office.

U

University Emergency Action Plan
The University Emergency Action Plan is a set of plans for managing an emergency that may or may not escalate into a disaster. It sets up bronze, silver and gold teams dependent upon the scale of the emergency. Its focus is to protect the health and safety of the members of the University (such as staff, students and visitors) along with its infrastructure and assets. It includes a link to invoke the cyber security incident management plan, should the need arise.

V

Vulnerability
Weakness in the 'system' that can be exploited by the threat actor.

Examples: frayed wires, unpatched operating system, open doors and windows, unmaintained sprinkler or air-conditioning systems.

The Cyber Security Incident Management Plan (CIMP) further defines vulnerabilities, thus:

  • Vulnerability (Technical) – a technical vulnerability is a hardware, firmware or software or design weakness that leaves the system open to exploitation by adversaries.
  • Vulnerability (Administrative) – an administrative vulnerability is a security weakness caused by incorrect or poor implementation of the security controls by system administrators or security officers. It is not a design deficiency. Full remediation is possible through changing the implementation or procedure.